
Tesla posts stern response to Washington Post’s article on alleged Autopilot dangers

(Credit: Tesla)


Tesla has posted a stern response to a recent article from The Washington Post that suggested the electric vehicle maker is putting people at risk by allowing systems like Autopilot to be used in areas they were not designed for. The publication noted that it was able to identify about 40 fatal or serious crashes since 2016, at least eight of which happened on roads where Autopilot was not designed to be used in the first place. 

Overall, the Washington Post article argued that while Tesla does inform drivers that they are responsible for their vehicles while Autopilot is engaged, the company is nonetheless also at fault since it allows its driver-assist system to be deployed irresponsibly. “Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software,” the article read. 

In its response, which was posted through its official account on X, Tesla highlighted that it is very serious about keeping both its customers and pedestrians safe. The company noted that the data is clear: systems like Autopilot, when used safely, drastically reduce the number of accidents on the road. The company also reiterated that features like Traffic Aware Cruise Control are Level 2 systems, which require constant supervision from the driver. 

Following is the pertinent section of Tesla’s response.

While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context. 


We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury. 

Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways. 

Below are some important facts, context and background.

Background

1. Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged.


a. In the 4th quarter of 2022, we recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.40 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.

b. The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users. Anecdotes from the WaPo article come from plaintiff attorneys—cases involving significant driver misuse—and are not a substitute for rigorous analysis and billions of miles of data.

c. Recent data continues this trend and is even more compelling. Autopilot is ~10X safer than US average and ~5X safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.

2. Autopilot features, including Traffic-Aware Cruise Control and Autosteer, are SAE Level 2 driver-assist systems, meaning –

a. Whether the driver chooses to engage Autosteer or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.


b. Despite the driver being responsible for control of the vehicle, Tesla has a number of additional safety measures designed to monitor that drivers engage in active driver supervision, including torque-based and camera-based monitoring. We have continued to make progress in improving these monitoring systems to reduce misuse.

c. Based on the above, among other factors, the data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.
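The crash-rate figures Tesla cites can be sanity-checked with simple arithmetic. A minimal sketch, using only the Q4 2022 miles-per-crash numbers quoted above (note that these yield roughly 7.4x and 3.5x; Tesla's "~10X" and "~5X" claims refer to more recent, not-yet-published data):

```python
# Relative-safety ratios derived from the Q4 2022 crash-rate figures
# quoted in Tesla's response. Values are millions of miles per crash.
AUTOPILOT_MILES_PER_CRASH = 4.85     # Tesla, Autopilot engaged
NO_AP_TESLA_MILES_PER_CRASH = 1.40   # Tesla, Autopilot not engaged
US_AVG_MILES_PER_CRASH = 0.652       # NHTSA/FHWA 2021 US average (~652,000 mi)

# More miles per crash means fewer crashes per mile, so dividing the two
# figures gives the relative crash-rate improvement.
vs_us_average = AUTOPILOT_MILES_PER_CRASH / US_AVG_MILES_PER_CRASH
vs_no_ap_tesla = AUTOPILOT_MILES_PER_CRASH / NO_AP_TESLA_MILES_PER_CRASH

print(f"Autopilot vs. US average: {vs_us_average:.1f}x fewer crashes per mile")
print(f"Autopilot vs. Tesla without AP: {vs_no_ap_tesla:.1f}x fewer crashes per mile")
```

One caveat worth keeping in mind: these are raw ratios, not a controlled comparison, since Autopilot miles skew toward highway driving, where crash rates are lower to begin with.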

Tesla also provided some context about some of the crashes that were highlighted by The Washington Post. As per the electric vehicle maker, the incidents that the publication cited involved drivers who were not using Autopilot correctly. The publication, therefore, omitted several important facts when it was framing its narrative around Autopilot’s alleged risks, Tesla argued. 

Following is the pertinent section of Tesla’s response.

The Washington Post leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem. The article got it wrong, misreporting what’s actually alleged in the pending lawsuit and omitting several important facts:


1. Contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain.

2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.

3. Mr. Angulo and the parents of Ms. Benavides who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.

4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”

5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf. He took responsibility.


6. The Post had the driver’s statements to police and reports that he said he was “driving on cruise.” They omit that he also admitted to police “I expect to be the driver and be responsible for this.”

7. The driver later testified in the litigation he knew Autopilot didn’t make the car self-driving and he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant or complacent. He readily and repeatedly admitted:

a. “I was highly aware that was still my responsibility to operate the vehicle safely.”

b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”

c. “I would say specifically I was aware that the car was my responsibility. I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”


8. The Post also failed to disclose that Autopilot restricted the vehicle’s speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, “Cruise control will not brake.”




Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.



Tesla CEO Elon Musk sends rivals dire warning about Full Self-Driving

Credit: Tesla

Tesla CEO Elon Musk revealed today on the social media platform X that legacy automakers, such as Ford, General Motors, and Stellantis, do not want to license the company’s Full Self-Driving suite, at least not without a long list of their own terms.

“I’ve tried to warn them and even offered to license Tesla FSD, but they don’t want it! Crazy,” Musk said on X. “When legacy auto does occasionally reach out, they tepidly discuss implementing FSD for a tiny program in 5 years with unworkable requirements for Tesla, so pointless.”

Musk made the remark in response to a note from Melius Research that we covered earlier today, in which analyst Rob Wertheimer said, “Our point is not that Tesla is at risk, it’s that everybody else is,” in terms of autonomy and self-driving development.

Wertheimer believes there are hundreds of billions of dollars in value headed Tesla’s way because of its prowess with FSD.

A few years ago, Musk first remarked that Tesla was in early talks with one legacy automaker regarding licensing Full Self-Driving for its vehicles. Tesla never confirmed which company it was, but given Musk’s ongoing talks with Ford CEO Jim Farley at the time, it seemed the Detroit-based automaker was the likely suspect.

Tesla’s Elon Musk reiterates FSD licensing offer for other automakers


Ford has been perhaps the most aggressive legacy automaker in terms of its EV efforts, but it recently scaled back its electric offensive due to profitability issues and weak demand. It simply was not making enough vehicles, nor selling the volume needed to turn a profit.

Musk truly believes that many of the companies turning their backs on FSD now will suffer for it in the future, much as their delayed EV efforts have already cost them.

Unfortunately, they got started too late and are now playing catch-up with Tesla, XPeng, BYD, and the other dominating forces in EVs across the globe.


Tesla backtracks on strange Nav feature after numerous complaints


Credit: Tesla

Tesla is backtracking on a strange adjustment it made to its in-car Navigation feature after numerous complaints from owners.

Tesla’s in-car Navigation is catered to its vehicles, as it routes Supercharging stops and preps your vehicle for charging with preconditioning. It is also very intuitive, with extras like weather radar and a detailed map outlining points of interest.

However, a recent change to the Navigation by Tesla did not go unnoticed, and owners were really upset about it.

Tesla’s Navigation gets huge improvement with simple update

For trips that required multiple Supercharger stops, Tesla implemented a naming change that no longer showed the city or state of each charging stop. Instead, it showed only the business where the Supercharger was located, giving many owners an unwelcome surprise.


However, Tesla’s Director of Supercharging, Max de Zegher, admitted the update was a “big mistake on our end,” and said a fix would roll out within 24 hours.

The missing city names caused some short-term confusion for owners. Some drivers said it became harder to recognize stops at familiar locations that were special to them; others simply did not like not knowing where along their trip they would be stopping.

Tesla moved quickly to resolve the issue, with de Zegher saying that most in-car touchscreens would receive the fix within one day of the change being rolled out.


Additionally, there will be even more improvements in December, as Tesla plans to show the common name/amenity below the site name as well, which will give people a better idea of what to expect when they arrive at a Supercharger.


Dutch regulator RDW confirms Tesla FSD February 2026 target

The regulator emphasized that safety, not public pressure, will decide whether FSD receives authorization for use in Europe.


The Dutch vehicle authority RDW responded to Tesla’s recent updates about its efforts to bring Full Self-Driving (Supervised) to Europe, confirming that February 2026 remains the target month for Tesla to demonstrate regulatory compliance. 

While acknowledging the tentative schedule with Tesla, the regulator emphasized that safety, not public pressure, will decide whether FSD receives authorization for use in Europe.

RDW confirms 2026 target, warns Feb 2026 timeline is not guaranteed

In its response, which was posted on its official website, the RDW clarified that it does not disclose details about ongoing manufacturer applications due to competitive sensitivity. However, the agency confirmed that both parties have agreed on a February 2026 window during which Tesla is expected to show that FSD (Supervised) can meet required safety and compliance standards. Whether Tesla can satisfy those conditions within the timeline “remains to be seen,” RDW added.

RDW also directly addressed Tesla’s social media request encouraging drivers to contact the regulator to express support. While thanking those who already reached out, RDW asked the public to stop contacting them, noting these messages burden customer-service resources and have no influence on the approval process. 

“In the message on X, Tesla calls on Tesla drivers to thank the RDW and to express their enthusiasm about this planning to us by contacting us. We thank everyone who has already done so, and would like to ask everyone not to contact us about this. It takes up unnecessary time for our customer service. Moreover, this will have no influence on whether or not the planning is met,” the RDW wrote. 


The RDW shares insights on EU approval requirements

The RDW further outlined how new technology enters the European market when no existing legislation directly covers it. Under EU Regulation 2018/858, a manufacturer may seek an exemption for unregulated features such as advanced driver assistance systems. The process requires a Member State, in this case the Netherlands, to submit a formal request to the European Commission on the manufacturer’s behalf.

Approval then moves to a committee vote. A majority in favor would grant EU-wide authorization, allowing the technology across all Member States. If the vote fails, the exemption is valid only within the Netherlands, and individual countries must decide whether to accept it independently.

Before any exemption request can be filed, Tesla must complete a comprehensive type-approval process with the RDW, including controlled on-road testing. Provided that FSD Supervised passes these regulatory evaluations, the exemption could be submitted for broader EU consideration.
