
Tesla’s biggest vehicle recall ever threatens to hurt the company’s defense in several high-profile lawsuits it faces over crashes linked to its automated driver-assistance program Autopilot.

The automaker’s recall of 2 million cars comes after a top U.S. auto-safety regulator found that Autopilot failed to ensure drivers stay attentive. Lawyers representing crash victims and family members who have sued Tesla, some over fatal accidents, say the determination bolsters their claims that the Autopilot system is defective and contributed to collisions.

An increased threat of losses for Tesla in Autopilot litigation across the U.S. could embolden others to challenge the technology in court and potentially force the company to pay victims millions of dollars in compensatory damages.

Half a dozen lawsuits headed to trial in the next year in Florida, California, and Texas allege that Tesla allowed Autopilot to be used on roads for which it wasn’t designed and that the technology failed to send sufficient warnings when drivers became disengaged. Lawyers leading the cases say these very issues are mirrored in the recall.

“We believe that Tesla knew there was a problem and that this recall validates that,” said Adam Boumel, an attorney preparing for a trial against the company next year over a 2019 accident in Key Largo, Florida, in which a Tesla Model S struck a parked car, killing a woman and badly injuring a man who were standing next to it.

Tesla didn’t immediately respond to a request for comment.


Autopilot is a crucial part of the Elon Musk-led company’s efforts to stand out from industry rivals and a significant factor in its almost $800 billion valuation. In a recall report, Tesla said it doesn’t agree with the analysis by the National Highway Traffic Safety Administration but undertook the recall voluntarily in the interest of resolving the agency’s investigation.

The company said it will deploy an over-the-air software remedy, which includes additional driver controls and alerts. The company’s acknowledgment in the notice that safeguards around its Autosteer feature “may not be sufficient to prevent driver misuse” amounts to an admission of a safety defect, said Michael Brooks, executive director of the Center for Auto Safety.

CRACKS OPEN UP

In another case, the family of a Model 3 owner killed in 2019 when the car plowed into the underbelly of a tractor-trailer claims Autopilot failed to detect the truck as it crossed a divided Florida highway. If the driver were still alive, his car would now be covered by the recall, said Trey Lytal, the family’s lawyer.

There’s also a suit in California involving a driver killed in 2018 when his Model X ran into a highway barrier as he was playing a video game on his phone. Another in Texas was filed by five police officers hit by a Model X while searching a car for illegal drugs on the side of a freeway. The officers allege that the Tesla driver was intoxicated and that the vehicle failed to engage Autopilot safety features to avoid the accident.

Bryant Walker Smith, a University of South Carolina law professor who has monitored years of controversy over Autopilot, said the recall is “not conclusive” as legal proof of wrongdoing but is “helpful” for the lawsuits against Tesla.


“Regulatory actions and lawsuits can be very symbiotic in that one can help the other,” he said. “What we are seeing is several cracks starting to open up against Tesla, and not all of them will manifest.”

Plaintiffs can use facts of the recall and Tesla’s move to fix the driver-engagement issue as evidence in certain states, including California, said John Uustal, a trial attorney who specializes in product liability law and isn’t involved in ongoing Autopilot litigation.

“It can provide proof to help win the lawsuit, but it’s not the basis for the lawsuit,” Uustal said.

TESLA’S DEFENSE

The NHTSA has been investigating whether defects in Autopilot have contributed to at least 17 deaths. Tesla also faces regulatory probes over claims made about the automated driving capabilities of its cars.

In its courtroom defense against claims that Autopilot’s marketing lulls drivers into a false sense of security, Tesla has argued that its vehicle owner’s manuals and the company’s website clearly state that the features require active driver supervision.


That helped Tesla prevail this year in trials over two separate California crashes, one fatal and one that caused injuries, in which plaintiffs blamed Autopilot for steering the vehicles off the road.

Don Slavik, a lawyer involved in multiple crash suits against Tesla, credited the company with making improvements to driver monitoring over the years, including by using cameras to detect whether drivers are paying attention to the road.

“They’ve made incremental improvements, but have they gone full-out?” Slavik said. “I think the gold standard right now is the Ford and GM systems.”

 

Bloomberg writer Madlin Mekelburg contributed to this article.
