Net54baseball.com Forums (http://www.net54baseball.com/index.php)
-   WaterCooler Talk- Off Topics (http://www.net54baseball.com/forumdisplay.php?f=29)
-   -   Who's Responsible if a Tesla on Autopilot Kills Someone? (http://www.net54baseball.com/showthread.php?t=316638)

GasHouseGang 03-14-2022 02:03 PM

Who's Responsible if a Tesla on Autopilot Kills Someone?
 
With the price of gas heading skyward, you might be considering a Tesla electric vehicle. Just be aware that some of the systems on these new cars might get you in trouble if you don't use them the way they were intended. I thought this was an interesting article, and I'm not even a lawyer. :D

Who's Responsible if a Tesla on Autopilot Kills Someone?

Courts may decide, says a professor of civil litigation.

Vehicular manslaughter charges filed in Los Angeles earlier this year mark the first felony prosecution in the US of a fatal car crash involving a driver-assist system.
In late 2019, Kevin George Aziz Riad’s car sped off a California freeway, ran a red light, and crashed into another car, killing the two people inside. Riad’s car, a Tesla Model S, was on autopilot.
Los Angeles County prosecutors filed two charges against Riad, now 27. The case is also the first criminal prosecution of a crash involving Tesla’s autopilot function, which is found on over 750,000 cars in the US. Meanwhile, the crash victims’ family is pursuing civil suits against both Riad and Tesla.
Tesla is careful to distinguish between its autopilot function and a driverless car, comparing its driver-assist system to the technology airplane pilots use when conditions are clear.
“Tesla autopilot relieves drivers of the most tedious and potentially dangerous aspects of road travel,” states Tesla online. “We’re building autopilot to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable.… The driver is still responsible for, and ultimately in control of, the car.”
The electric vehicle manufacturer clearly places the onus of safety on the driver, but research suggests that humans are susceptible to automation bias, an over-reliance on automated aids and decision support systems.
Now it’s up to the courts to decide who is culpable when the use of those systems results in fatal errors. Currently, Riad is out on bail and pleading not guilty to manslaughter charges.
Here, Mark Geistfeld, professor of civil litigation at New York University and author of a new paper in the California Law Review, talks about the significance of the criminal charges and what they might mean for the future of consumer trust in new tech:

Q: Can you shed some light on the legal precedent the criminal prosecution of Kevin George Aziz Riad sets? What message does it send to consumers and manufacturers of similar technology?
A: First, the criminal charges are surprising, based on what we know—the criminal charging documents, as usual, provide no details. Typically, if you weren’t paying attention, ran a red light and hit somebody—as tragic as it is—you wouldn’t get a criminal charge out of that behavior in the vast majority of cases. You really don’t see many criminal prosecutions for motor vehicle crashes outside of drunk-driving cases.
If the driver is found guilty of manslaughter, this case could really be the most disruptive, the most novel, the most groundbreaking precedent. It’s a strong departure from the past, if in fact the criminal prosecution is simply based on his relying on autopilot when he should have taken over. If that’s what is going on, you might see a lot more criminal prosecutions moving forward than we do today.
Tort liability, or civil charges, by contrast, is very commonplace. That’s when the defendant would pay damages for injuries caused. The majority of tort suits in state courts across the country are from motor vehicle crashes in which one driver is alleged to have negligently caused the crash, which clearly occurred in this case because the driver went through a red light.
If this case somehow signals that criminal liability is more possible simply by relying on the technology, then that could become a profound shift in the nature of legal liabilities moving forward.

Q: What obligation does an advanced tech company such as Tesla have in informing drivers, whether directly or through advertising and marketing messages, that they are liable for all damages, regardless of whether the car is on autopilot?
A: They clearly have an obligation to warn the person sitting in the driver’s seat to take over the vehicle—that it’s not capable of doing everything on its own. You see that warning in Tesla vehicles, and almost all vehicles have that type of warning. For example, when you use a map function while driving, many cars will offer a warning: “This will distract you, pay attention to the road.”
Manufacturers also have an obligation to keep in mind the sense of complacency that comes with driving technology while designing the car. Tesla or any other manufacturer can’t just say, “Hey, pay attention, that’s your responsibility.”
They actually have to try to put something into the design to make sure that drivers are staying attentive. So different manufacturers are taking different approaches to this problem—some cars will pull over if your hands are not on the steering wheel, and other cars have cameras that will start beeping if you’re not paying attention.
Under current law, if the driver gets in a crash and there was an adequate warning, and the design itself is adequate to keep the driver attentive, the car manufacturer is not going to be liable. But there’s one possible exception here: there is a formulation of the liability rule that is pretty widely adopted across the country, including in California, where this case will take place. Under this rule, the inquiry is based on what consumers expect the manufacturer to do. And consumer expectations can be strongly influenced by marketing and advertising and so on.
For example, if Tesla were to advertise that autopilot never gets in a crash, and then a consumer does get in a crash, Tesla would be liable for having frustrated those expectations.

Q: In this case, the driver was charged based on the idea that he was over-reliant on his car’s autopilot. What does this say about our basic assumptions about whether humans or tech are more trustworthy?
A: There’s an important distinction between overreliance and complacency. I think complacency is just a natural human reaction to the lack of stimulus—in this case, the lack of responsibility for executing all of the driving tasks. You can get bored and lulled into a sense of complacency, but I don’t think that behavior is being overly reliant on technology.
The idea of overreliance comes into play with the potential nature of the wrongdoing here. Maybe the driver in this case will defend himself by saying he reasonably thought the car had everything under control, was fully capable of solving this problem, and so he didn’t have to worry about reacting if things turned out otherwise.
Now at that point, he would be placing his faith in the technology instead of in his own ability to stop the vehicle and get out of the problem in a safe way. If there is blind faith in the technology rather than in taking over when you could have done so, and if you are liable as a consequence, that becomes a very profound, interesting kind of message that the law is sending.

Q: Do you think this shift in liability will hurt business for companies like Tesla?
A: The big issue that autonomous vehicle manufacturers like Tesla face right now is gaining consumer trust when they’re introducing a new technology to the market. The need for trust in the early stages of these products is massively important. And all the manufacturers are worried about that problem because they know that if there are some horrific crashes, consumers are going to lose trust in the product.
Ultimately the technology will end up taking over; it’s just a question of whether it’s sooner rather than later. And time is money in this context—so if you just get slower adoption because consumers are very concerned about the safety performance of the technology, that’s going to hurt the industry. They obviously want to avoid that outcome. There are just so many advantages to using autonomous vehicles, including in the safety dimension.

Q: Of its autopilot and full self-driving capability, Tesla says: “While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.” What liability issues do you foresee if/when these vehicles do become autonomous?
A: It’s a complicated question, and that is the issue that everybody is interested in. Once these vehicles become fully autonomous, then there’s just the car. The human in the car isn’t even an element in the situation.
So the big question is: once those vehicles crash, who pays? You’d think the manufacturer would be liable—and that’s going to increase the cost of these vehicles and make them a lot harder to distribute. There are a lot of people who think that in the event of a crash, the manufacturer should be liable all of the time. I am strongly skeptical about that conclusion, because I think it’s a much closer call than most people make it out to be.
Ultimately, these issues depend on how federal regulators like the National Highway Traffic Safety Administration regulate the vehicle. They will have to set a safety performance standard which the manufacturer has to satisfy before it can commercially distribute the product as fully autonomous.
The question is where the regulators set that standard, and I don’t think it’s easy to get right. At that point there will be a good debate to be had: Did they get it right or not? We’re still a few years out. I think we’ll all be having these conversations in 2025.

Smarti5051 03-14-2022 02:13 PM

I think 99.99% of Tesla owners know enough about the current limitations of the self-driving technology. Anybody that has logged more than 100 miles with Tesla on auto-pilot knows it is not a "set it and forget it" experience. I believe it does greatly reduce the stress of driving (particularly long distances), but you would have to be a special kind of stupid to think the current system is foolproof. The drivers pointing fingers at Tesla for causing accidents are just trying to deflect from their own negligence.

Hopefully, someday the technology will be there for you to literally take a nap in the car while it drives and charges itself cross-country. At that point, I suspect the company will be liable for any damages, and the company would have an insurance policy covering all of the cars it sells. So, the cost of the insurance will be built into the price of the car. But this should be more than offset by the savings for the car buyer, since he would no longer be responsible for carrying auto insurance personally (unless he intends to continue manually driving automobiles).

vintagetoppsguy 03-14-2022 02:28 PM

If a pilot is responsible for a plane that crashes while on autopilot, then I would assume a driver that crashes a vehicle while on autopilot is as well.

irv 03-14-2022 02:36 PM

U.S. clears way for truly driverless vehicles without steering wheels. Federal vehicle safety regulators have cleared the way for the production and deployment of truly driverless vehicles that do not include manual controls such as steering wheels or pedals.
https://www.cnbc.com/2022/03/11/us-c...20or%20pedals.

U.S. eliminates steering wheel requirement for fully automated vehicles
The country's safety regulator is revising rules that made human controls in cars mandatory, in light of the rise of new tech

https://driving.ca/auto-news/technol...mated-vehicles

GasHouseGang 03-14-2022 03:36 PM

Quote:

Originally Posted by Smarti5051 (Post 2205659)
I think 99.99% of Tesla owners know enough about the current limitations of the self-driving technology. Anybody that has logged more than 100 miles with Tesla on auto-pilot knows it is not a "set it and forget it" experience.

While I would agree with you, some people certainly haven't gotten the message. Here's a link to a video about multiple wrecks caused by this behavior, including one driver and his passenger sound asleep behind the wheel.

https://www.youtube.com/watch?v=NHUZxeSUFUk

Smarti5051 03-14-2022 04:08 PM

People fall asleep and get in accidents in all vehicles. The only difference with Tesla is that the idiot driver can try to dupe non-Tesla owners into believing he was misled into thinking the car drives itself so well that it is perfectly fine to be literally asleep at the wheel or having sex in the back seat. Even they know the claim is BS, but people eat it up. In the case of Tesla, the company has been so successful over the past decade that it has become a sport to find things wrong with the cars, whether it be the build, the technology, the environmental impact, or the cause of accidents.

In America, no matter how stupid one is, someone else is always the reason for any problem.

GasHouseGang 03-14-2022 04:35 PM

Unfortunately it's not just Tesla owners.

https://www.youtube.com/watch?v=bovqT3PDiCU

This woman set the cruise control in her RV and went to make a cup of tea!

UKCardGuy 03-14-2022 07:26 PM

Quote:

Originally Posted by GasHouseGang (Post 2205707)
Unfortunately it's not just Tesla owners.

https://www.youtube.com/watch?v=bovqT3PDiCU

This woman set the cruise control in her RV and went to make a cup of tea!

There's a runner-up candidate for the Darwin Awards.

Casey2296 03-14-2022 09:52 PM

Unfortunately the government, corporations, and lawyers have sent the message for the last 40 years that we're not responsible for our actions. It's shameful and a cancer on our society.

icurnmedic 03-15-2022 08:33 AM

Quote:

Originally Posted by Casey2296 (Post 2205808)
Unfortunately the government, corporations, and lawyers have sent the message for the last 40 years that we're not responsible for our actions. It's shameful and a cancer on our society.

Damn, that was well said!!

mrreality68 06-30-2022 04:53 PM

Quote:

Originally Posted by icurnmedic (Post 2205889)
Damn, that was well said!!

+1, agreed, and that video is a little scary.
If I were driving near that car, I would get away from it.

Michael B 06-30-2022 04:59 PM

One of my newer mantras: "You can't unteach stupid!"

mrreality68 07-01-2022 05:39 AM

Quote:

Originally Posted by Michael B (Post 2238779)
One of my newer mantras: "You can't unteach stupid!"

That's good, but it's amazing how easy it is to teach stupid.

tschock 07-02-2022 06:07 AM

Quote:

Originally Posted by Casey2296 (Post 2205808)
Unfortunately the government, corporations, and lawyers have sent the message for the last 40 years that we're not responsible for our actions. It's shameful and a cancer on our society.

Supported by "The electric vehicle manufacturer clearly places the onus of safety on the driver, but research suggests that humans are susceptible to automation bias, an over-reliance on automated aids and decision support systems."

And it's why we have disclaimers such as 'do not take this medication if you are allergic to it' as well. Seriously? People have to be told that?

steve B 07-03-2022 10:37 AM

Quote:

Originally Posted by irv (Post 2205664)
U.S. clears way for truly driverless vehicles without steering wheels. Federal vehicle safety regulators have cleared the way for the production and deployment of truly driverless vehicles that do not include manual controls such as steering wheels or pedals.
https://www.cnbc.com/2022/03/11/us-c...20or%20pedals.

U.S. eliminates steering wheel requirement for fully automated vehicles
The country's safety regulator is revising rules that made human controls in cars mandatory, in light of the rise of new tech

https://driving.ca/auto-news/technol...mated-vehicles

Is there some sort of service override control? Like if it breaks down while the wheels are in a turn, moving it around the lot at a service place will be challenging.

The tech is cool, but I'll keep doing my own driving.

BobC 07-03-2022 12:27 PM

Quote:

Originally Posted by steve B (Post 2239378)
Is there some sort of service override control? Like if it breaks down while the wheels are in a turn, moving it around the lot at a service place will be challenging.

The tech is cool, but I'll keep doing my own driving.

Unfortunately, that doesn't help if the car in the lane next to you is a driverless car that suddenly has a failure and takes you out.

If they begin building cars that have no manual controls for the passengers, then clearly any accidents caused by them are not the fault of the passengers, unless the passenger/owner was supposed to keep up some maintenance/repair regime that the manufacturer can claim they failed to follow, thus pushing the fault (and liability) over to the owner/passenger. And we all know that will be one of the first things any manufacturer, and its insurance company, will try to do.

And in the end, that "cost" will be considered as nothing more than a cost of doing business, and will ultimately be passed on to consumers buying driverless vehicles. Someone will come up with statistics showing that fewer people end up dying or being hurt by driverless vehicles than by drunk or distracted drivers, and therefore the additional costs are all worth it. Meanwhile, big business saves money by no longer having to employ people to drive trucks and delivery vehicles, operate taxis or Ubers, or deliver food to people's doors, and even more people are out of jobs.

Technology is great, until it isn't. While technology and computers can make our lives easier and better, and tremendously increase production and efficiency, they also put people out of work, make us more dependent on them, and leave all of us subject to much greater overall harm and damage, not if, but when, such technologies and their related systems ultimately fail or break down.

From what I've seen, we finally need to amend the list of certainties in everyone's lives beyond just death and taxes, to now include that any technology being used will eventually get hacked and/or fail!

Mark17 07-03-2022 01:47 PM

Quote:

Originally Posted by BobC (Post 2239411)

And in the end, that "cost" will be considered as nothing more than a cost of doing business, and will ultimately be passed on to consumers buying driverless vehicles.

I wonder what the impact on car insurance would be. And would anyone need a driver's license?

BobC 07-03-2022 07:26 PM

Quote:

Originally Posted by Mark17 (Post 2239439)
I wonder what the impact on car insurance would be. And would anyone need a driver's license?

Great questions. I would assume anyone purchasing their own driverless car would still be responsible for purchasing insurance on it in case it does any damage to others or their property. As to the potential impact on the cost of that insurance, lord knows. It would likely depend a lot on how the courts view responsibility and liability in cases involving driverless vehicles, and I'm guessing we haven't seen enough to fully determine where that will all end up in the near future.

As for driver's licenses, if you have a vehicle you truly aren't actively driving, I'd have to believe a driver's license wouldn't be needed. However, a big difference may be whether you actually purchase and own a driverless vehicle, as opposed to just hiring and using one like an Uber or a taxi. As the owner of a driverless vehicle, I would assume the state would require someone, either the owner or manufacturer, to have the driving system periodically checked and tested to make sure it is properly updated for current rules and regulations, and that it is operating in accordance with required rules and laws. Some type of operating license for the vehicle will likely still be required, but who will be responsible for getting and paying for it may be another question.

As for people who now own and operate a vehicle with assisted/driverless systems, what I would like to know is why states still seem to require only a standard driver's license to own and operate one. If we are really going forward with such a dramatic change in how vehicles will be operated on our streets in the future, wouldn't you think the states should have immediately jumped on creating a whole new set of tests and requirements for licensing people to own and operate such driverless and driving-assisted vehicles? Like the earlier story about the woman who put her vehicle on cruise control and then went to make a cup of tea, there are obviously a lot of things people don't know or always understand about operating such vehicles. You would think (hope) the states would have realized that they should have immediately instituted new training, tests, and licensing procedures for this entirely new type of vehicle, its operation, and its operators, which is going to become the norm whether we all like it or not.

JustinD 07-05-2022 06:06 PM

Perhaps?

https://static.independent.co.uk/s3f...=982:726,smart

mrreality68 07-11-2022 01:52 PM

How would you feel if you were the other person in the accident and responsibility had yet to be determined? What would they do, and/or would they have to attempt to sue everyone?
These types of issues should be resolved before the technology is out there in full use.

