News
Tesla posts stern response to Washington Post’s article on alleged Autopilot dangers
Tesla has posted a stern response to a recent article from The Washington Post that suggested that the electric vehicle maker is putting people at risk because it allows systems like Autopilot to be used in areas they were not designed for. The publication noted that it was able to identify about 40 fatal or serious crashes since 2016, at least eight of which happened on roads where Autopilot was not designed to be used in the first place.
Overall, the Washington Post article argued that while Tesla does inform drivers that they are responsible for their vehicles while Autopilot is engaged, the company is nonetheless also at fault since it allows its driver-assist system to be deployed irresponsibly. “Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software,” the article read.
In its response, which was posted through its official account on X on December 12, 2023, Tesla highlighted that it takes the safety of both its customers and pedestrians very seriously. The company noted that the data is clear: systems like Autopilot, when used safely, drastically reduce the number of accidents on the road. The company also reiterated that features like Traffic-Aware Cruise Control are Level 2 systems, which require constant supervision from the driver.
Following is the pertinent section of Tesla’s response.
While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context.
We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.
Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways.
Below are some important facts, context and background.
Background
1. Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged.
a. In the 4th quarter of 2022, we recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.40 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.
b. The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users. Anecdotes from the WaPo article come from plaintiff attorneys—cases involving significant driver misuse—and are not a substitute for rigorous analysis and billions of miles of data.
c. Recent Data continues this trend and is even more compelling. Autopilot is ~10X safer than US average and ~5X safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.
2. Autopilot features, including Traffic-Aware Cruise Control and Autosteer, are SAE Level 2 driver-assist systems, meaning –
a. Whether the driver chooses to engage Autosteer or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.
b. Despite the driver being responsible for control of the vehicle, Tesla has a number of additional safety measures designed to monitor that drivers engage in active driver supervision, including torque-based and camera-based monitoring. We have continued to make progress in improving these monitoring systems to reduce misuse.
c. Based on the above, among other factors, the data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.
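As a rough sanity check of the per-mile figures quoted above, the sketch below works out the safety multiples implied by the numbers Tesla cites for Q4 2022 against the 2021 NHTSA/FHWA baseline. The "~10X" and "~5X" figures in point 1c refer to newer data that Tesla says has not yet been published, so they cannot be reproduced here; the variable names are purely illustrative.

```python
# Back-of-the-envelope ratios from the figures quoted in Tesla's statement (Q4 2022).
# These are simple divisions of the numbers cited above, not an independent analysis.

AUTOPILOT_MILES_PER_CRASH = 4_850_000       # Tesla, Autopilot engaged
NO_AUTOPILOT_MILES_PER_CRASH = 1_400_000    # Tesla, Autopilot not engaged
US_AVERAGE_MILES_PER_CRASH = 652_000        # NHTSA/FHWA 2021 figure cited by Tesla

vs_us_average = AUTOPILOT_MILES_PER_CRASH / US_AVERAGE_MILES_PER_CRASH
vs_no_autopilot = AUTOPILOT_MILES_PER_CRASH / NO_AUTOPILOT_MILES_PER_CRASH

print(f"Autopilot vs. US average: ~{vs_us_average:.1f}x more miles per crash")        # ~7.4x
print(f"Autopilot vs. Tesla without Autopilot: ~{vs_no_autopilot:.1f}x")              # ~3.5x
```

On the Q4 2022 numbers alone, the multiples come out to roughly 7.4x the national average and 3.5x a Tesla without Autopilot engaged; the larger ~10X and ~5X figures Tesla mentions are attributed to more recent, still-unreleased data.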
Tesla also provided context about some of the crashes that were highlighted by The Washington Post. As per the electric vehicle maker, the incidents that the publication cited involved drivers who were not using Autopilot correctly. Tesla argued that the publication omitted several important facts when framing its narrative around Autopilot’s alleged risks.
Following is the pertinent section of Tesla’s response.
The Washington Post leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem. The article got it wrong, misreporting what’s actually alleged in the pending lawsuit and omitting several important facts:
1. Contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain.
2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.
3. Mr. Angulo and the parents of Ms. Benavides who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.
4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”
5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf. He took responsibility.
6. The Post had the driver’s statements to police and reports that he said he was “driving on cruise.” They omit that he also admitted to police “I expect to be the driver and be responsible for this.”
7. The driver later testified in the litigation he knew Autopilot didn’t make the car self-driving and he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant or complacent. He readily and repeatedly admitted:
a. “I was highly aware that was still my responsibility to operate the vehicle safely.”
b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”
c. “I would say specifically I was aware that the car was my responsibility. I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”
8. The Post also failed to disclose that Autopilot restricted the vehicle’s speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, “Cruise control will not brake.”
Elon Musk
Delaware Supreme Court reinstates Elon Musk’s 2018 Tesla CEO pay package
The unanimous decision criticized the prior total rescission as “improper and inequitable,” arguing that it left Musk uncompensated for six years of transformative leadership at Tesla.
The Delaware Supreme Court has overturned a lower court ruling, reinstating Elon Musk’s 2018 compensation package originally valued at $56 billion but now worth approximately $139 billion due to Tesla’s soaring stock price.
The unanimous decision criticized the prior total rescission as “improper and inequitable,” arguing that it left Musk uncompensated for six years of transformative leadership at Tesla. Musk quickly celebrated the outcome on X, stating that he felt “vindicated.” He also expressed his gratitude to TSLA shareholders.
Delaware Supreme Court makes a decision
In a 49-page ruling Friday, the Delaware Supreme Court reversed Chancellor Kathaleen McCormick’s 2024 decision that voided the 2018 package over alleged board conflicts and inadequate shareholder disclosures. The high court acknowledged varying views on liability but agreed rescission was excessive, stating it “leaves Musk uncompensated for his time and efforts over a period of six years.”
The 2018 plan granted Musk options on about 304 million shares upon hitting aggressive milestones, all of which were achieved ahead of schedule. Shareholders overwhelmingly approved the plan in 2018 and ratified it again in 2024 after the Delaware lower court struck it down. The case against Musk’s 2018 pay package was filed by plaintiff Richard Tornetta, who held just nine shares when the compensation plan was approved.
A hard-fought victory
As noted in a Reuters report, Tesla’s win avoids a potential $26 billion earnings hit from replacing the award at current prices. Tesla, now incorporated in Texas, had hedged with interim plans, including a November 2025 shareholder-approved package that is potentially worth $878 billion and tied to Robotaxi, Optimus, and other extremely aggressive operational milestones.
The saga surrounding Elon Musk’s 2018 pay package ultimately damaged Delaware’s corporate appeal, prompting a number of high-profile firms, such as Dropbox, Roblox, Trade Desk, and Coinbase, to follow Tesla out of the state. What added more fuel to the issue was the fact that Tornetta’s legal team, following the lower court’s 2024 decision, filed a fee request for more than $5.1 billion worth of TSLA stock, equal to an hourly rate of over $200,000.
News
Tesla Cybercab tests are going into overdrive with production-ready units
Tesla is ramping its real-world tests of the Cybercab, with multiple sightings of the vehicle being reported across social media this week.
Tesla is ramping its real-world tests of the Cybercab, with multiple sightings of the autonomous two-seater being reported across social media this week. Based on videos of the vehicle that have been shared online, it appears that Cybercab tests are underway across multiple states.
Recent Cybercab sightings
Reports of Cybercab tests have ramped up this week, with a vehicle that looked like a production-ready prototype being spotted at Apple’s Visitor Center in California. The vehicle in this sighting was notable as it was equipped with a steering wheel. It also featured some changes to the design of its brake lights.
The Cybercab was also filmed testing at the Fremont factory’s test track, which also seemed to involve a vehicle that looked production-ready. This also seemed to be the case for a Cybercab that was spotted in Austin, Texas, which happened to be undergoing real-world tests. Overall, these sightings suggest that Cybercab testing is fully underway, and the vehicle is really moving towards production.
Production design all but finalized?
Recently, a near-production-ready Cybercab was showcased at Tesla’s Santana Row showroom in San Jose. The vehicle was equipped with frameless windows, dual windshield wipers, powered butterfly door struts, an extended front splitter, an updated lightbar, new wheel covers, and a license plate bracket. Interior updates include redesigned dash/door panels, refined seats with center cupholders, updated carpet, and what appeared to be improved legroom.
There seems to be a pretty good chance that the Cybercab’s design has been all but finalized, at least considering Elon Musk’s comments at the 2025 Annual Shareholder Meeting. During the event, Musk confirmed that the vehicle will enter production around April 2026, and its production targets will be quite ambitious.
News
Tesla gets a win in Sweden as union withdraws potentially “illegal” blockade
As per recent reports, the Vision union’s planned anti-Tesla action might have been illegal.
Swedish union Vision has withdrawn its sympathy blockade against Tesla’s planned service center and showroom in Kalmar. As per recent reports, the Vision union’s planned anti-Tesla action might have been illegal.
Vision’s decision to pull the blockade
Vision announced the blockade in early December, stating that it was targeting the administrative handling of Tesla’s facility permits in Kalmar municipality. The sympathy measure was expected to start Monday, but was formally withdrawn via documents sent to the Mediation Institute and Kalmar Municipality last week.
As noted in a Dagens Arbete report, plans for the strike were ultimately pulled after employer group SKR highlighted potential illegality under the Public Employment Act. Vision stressed its continued backing for the Swedish labor model, though deputy negotiation manager Oskar Pettersson explained that the Vision union and IF Metall made the decision to cancel the planned strike together.
“We will not continue to challenge the regulations,” Pettersson said. “The objection was of a technical nature. We made the assessment together with IF Metall that we were not in a position to challenge the legal assessment of whether we could take this particular action against Tesla. Therefore, we chose to revoke the notice itself.”
The SKR’s warning
Pettersson also stated that SKR’s technical objection to the Vision union’s planned anti-Tesla strike framed the protest as an unauthorized act. “It was a legal assessment of the situation. Both for us and for IF Metall, it is important to be clear that we stand for the Swedish model. But we should not continue to challenge the regulations and risk getting judgments that lead nowhere in the application of the regulations,” he said.
Vision ultimately canceled its planned blockade against Tesla on December 9. With Vision’s withdrawal, few obstacles remain for Tesla’s long-planned Kalmar site. A foreign electrical firm completed work this fall, and Tesla’s Careers page currently lists a full-time service manager position based there, signaling an imminent opening.