Tesla argues human error caused fatal 2019 crash, not Autopilot: report

Credit: Jeremy from Sydney, Australia, CC BY 2.0, via Wikimedia Commons

Tesla now awaits a jury’s verdict in a trial alleging that Autopilot caused a fatality, and the outcome is expected to set a precedent for future cases involving advanced driver assistance systems (ADAS). During closing arguments on Tuesday, an attorney for the plaintiffs pointed to an analysis Tesla conducted two years before the accident, claiming that the automaker knowingly sold the Model 3 with a steering-related safety issue.

The trial began in California late last month and stems from a 2019 incident in which a Model 3 driven by 37-year-old Micah Lee veered off a highway outside Los Angeles at 65 miles per hour, struck a palm tree, and burst into flames. According to court documents, the crash killed Lee and injured both of his passengers, one of whom was an 8-year-old boy.

Lee’s passengers and estate initiated a civil lawsuit against Tesla, alleging that the company knew that Autopilot and its other safety systems were defective when it sold the Model 3.

Tesla has denied any liability in the accident, claiming that Lee had consumed alcohol before getting behind the wheel and saying it is unclear whether Autopilot was engaged at the time of the crash.

This and other trials come as regulatory requirements for ADAS suites are still emerging, and the outcomes are expected to guide future court cases involving accidents in which the systems were engaged.

According to Reuters, the attorney for the plaintiffs, Jonathan Michaels, showed the jury an internal Tesla safety analysis from 2017 during closing arguments, in which employees identified “incorrect steering command” as a potential safety issue. Michaels said the issue involved an “excessive” steering wheel angle, arguing that Tesla was aware of related safety problems before selling the Model 3.

“They predicted this was going to happen. They knew about it. They named it,” Michaels said.

Michaels also said that Tesla created a specific protocol for dealing with affected customers and instructed workers to avoid accepting liability for the issue. He echoed prior arguments as well, saying that Tesla knew it was releasing Autopilot in an experimental state but did so anyway to boost market share.

“They had no regard for the loss of life,” Michaels added.

Michael Carey, Tesla’s attorney, said that the 2017 analysis did not identify a defect but was instead meant to help the company avoid potential safety issues that could theoretically occur. Carey also said that Tesla developed a system to prevent Autopilot from making the same turn that caused the crash.

Carey said that the subsequent development of the safety system “is a brick wall standing in the way of plaintiffs’ claim,” adding that there haven’t been any other cases where a Tesla has maneuvered the way that Lee’s did.

Instead, Carey argued to the jury that the crash’s simplest explanation was human error, asking jurors not to award damages out of sympathy for the victims’ severe injuries.

“Empathy is a real thing, we’re not saying it’s not,” Carey argued. “But it does not make cars defective.”

Earlier this month, a federal judge in California ruled in Tesla’s favor in a similar case examining whether the automaker misled consumers about its Autopilot system’s capabilities. In that case, which could have become a class-action lawsuit, the judge ruled that most of the plaintiffs had agreed to an arbitration clause when purchasing their vehicles, requiring the claims to be settled outside of court.

The cases are expected to set precedents in court for future trials involving Tesla’s Autopilot and Full Self-Driving (FSD) beta systems and the degree of the automaker’s responsibility in accidents related to their engagement. Tesla is also facing additional information requests from the U.S. Department of Justice related to its Autopilot and FSD beta.

What are your thoughts? Let me know at zach@teslarati.com, find me on X at @zacharyvisconti, or send your tips to us at tips@teslarati.com.
