A federal judge in California has ruled in favor of Tesla in a case examining whether the company misled consumers about the capabilities of its Autopilot advanced driver assistance system (ADAS). The ruling shields Tesla from class-action claims by larger groups blaming the Autopilot system for accidents; the judge instead required the plaintiffs to pursue individual arbitration outside of court.
On Saturday, U.S. District Judge Haywood Gilliam said that four Tesla owners who were pursuing a class-action lawsuit against the company agreed to arbitrate legal claims when they accepted the terms and conditions of their online vehicle purchases (via Reuters). Although a fifth plaintiff did not sign an arbitration agreement, Gilliam ruled that the Tesla owner had waited too long to file a lawsuit against the company.
Plaintiffs initially alleged that they were misled into believing the company was on the cusp of full autonomy when they purchased the ADAS, but were instead sold unreliable technology that has resulted in multiple accidents, injuries, and deaths.
Tesla has denied doing anything wrong throughout the process.
During the case, Tesla cited the arbitration agreements signed by four of the plaintiffs, and Gilliam rejected the plaintiffs’ argument that those agreements weren’t enforceable.
Tesla did not immediately respond to Reuters’ requests for comment on Monday, nor did lawyers for the plaintiffs.
The victory could set a precedent for ongoing and future cases surrounding Tesla’s Autopilot system, and it comes amid the first U.S. trial in which plaintiffs claimed that the feature led to fatalities due to what they called untested, experimental technology.
The plaintiffs in that case say Autopilot caused their Model 3 to veer off a highway in Southern California at 65 miles per hour, strike a tree, and catch fire. The 2019 accident killed driver and vehicle owner Micah Lee and severely injured two passengers, including an 8-year-old boy.
Tesla noted that Lee had consumed alcohol before getting behind the wheel, though plaintiffs contend that Lee’s blood alcohol content (BAC) was under the legal limit. Tesla also argued that human error caused the accident, adding that it could not be determined whether Autopilot was actually engaged at the time of the crash.
Crucially, Tesla does not claim that Autopilot or its Full Self-Driving (FSD) beta are fully autonomous systems, and the company alerts drivers that they must be ready to take complete control of the vehicle at all times.