Thursday, November 02, 2023

WAIT, WHAT?!
Jury acquits Tesla Autopilot in fatal crash

Tesla's driver-assist technology has been at the center of several lawsuits stemming from fatal crashes in which the carmaker's software was being used. 
The popular Model 3 is shown on display in Beijing on Sunday, January 5, 2020. 
Photo by Stephen Shaver/UPI | License Photo

Nov. 1 (UPI) -- A California jury ruled Tuesday that Tesla and its driver-assistance software were not responsible for a 2019 crash that killed a Model 3 owner and seriously injured two passengers.

The jury's decision could presage how other juries decide on similar cases, in which Tesla's technology has been blamed for fatal crashes.

In this case, decided in California, the lawsuit was filed by Lindsay Molander and her son, Parker Austin. In June 2019, they were passengers in a Tesla Model 3 driven by Micah Lee, who died after the car suddenly veered off a highway, smashed into a palm tree and burst into flames. Molander and her son were severely injured.

Lee was using Tesla's Autopilot system, a name that can create the perception that the car is driving itself, much like an airplane on autopilot. But Tesla's system is merely driver-assistance technology, which allows the car to operate with a degree of autonomy but has been criticized as unreliable.

This is the first verdict involving a fatal crash in which lawyers for the victims have blamed Tesla's Autopilot system. Molander's lawyers attributed the crash to a malfunction in the car's driver-assistance software. During closing arguments, attorney Jonathan Michaels, who represents Molander and her son, cited internal Tesla documents that he said showed the company was aware of a defect in the software that could cause the car to veer suddenly, which is what he argued happened in the accident that killed Lee.

Lawyers for Tesla argued human error caused the deadly crash and said the software was not even capable of causing the Model 3 to veer suddenly as it had in this case.

Michael Carey, a lawyer representing Tesla, blamed Lee, who he said had consumed several alcoholic beverages at a restaurant near Disneyland just prior to the accident.

Carey told jurors Lee had alcohol in his blood several hours after the accident, though it was below the threshold for legal intoxication in California.

"When someone gets in a crash after they have had a few drinks, we don't accept their excuses," Carey told jurors.

"This product is doing a really good job," Carey added of Tesla's technology. "Autopilot is helping people and making the world safer for all of us."

While it was the first case to focus on the Autopilot technology, the verdict was Tesla's second victory in a series of cases arguing that the company should be held liable when its vehicles crash while using advanced driver-assist systems. Earlier this year, a jury ruled against plaintiff Justine Hsu, who sued Tesla after her vehicle hit a median while using Autopilot.

While juries have ruled in Tesla's favor so far in cases involving driver-assist technology, the deadly crashes could color consumer perceptions and confidence about the quality of the vehicles and the safety of the technology.

Half of electric car sales in the United States are Teslas, often touted for their technology, even though driver-assist safety and reliability remain an open question given the spate of lethal accidents involving the company's software.

Tesla chief executive Elon Musk has told Wall Street that self-driving technology will be a lucrative revenue stream for the company. It charges $199 per month for the top-of-the-line driver-assistance package, called Full Self-Driving.

Despite their names, Tesla recommends that drivers remain engaged with the steering wheel while using both Full Self-Driving and the less expensive Autopilot option, and that they stay fully prepared to regain complete control of the vehicle at a moment's notice.

Tesla remains under criminal investigation by the U.S. Department of Justice for its self-driving features. In 2021, the U.S. National Highway Traffic Safety Administration launched an investigation into the Autopilot system following collisions with parked emergency vehicles.

California's Department of Motor Vehicles has accused Tesla of making false claims about the capabilities of its Autopilot and Full Self-Driving systems.

Musk has been flippant about accepting liability for crashes involving the company's driver-assist technology. "There's a lot of people who assume we have legal liability judging by the lawsuits," Musk quipped.
