News
Tesla posts stern response to Washington Post’s article on alleged Autopilot dangers
Tesla has posted a stern response to a recent article from The Washington Post that suggested that the electric vehicle maker is putting people at risk because it allows systems like Autopilot to be deployed in areas they were not designed for. The publication noted that it was able to identify about 40 fatal or serious crashes since 2016, and at least eight of them happened on roads where Autopilot was not designed to be used in the first place.
Overall, the Washington Post article argued that while Tesla does inform drivers that they are responsible for their vehicles while Autopilot is engaged, the company is nonetheless also at fault since it allows its driver-assist system to be deployed irresponsibly. “Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software,” the article read.
In its response, which was posted through its official account on X, Tesla highlighted that it is very serious about keeping both its customers and pedestrians safe. The company noted that the data is clear about the fact that systems like Autopilot, when used safely, drastically reduce the number of accidents on the road. The company also reiterated the fact that features like Traffic Aware Cruise Control are Level 2 systems, which require constant supervision from the driver.
Following is the pertinent section of Tesla’s response.
While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context.
We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.
Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways.
Below are some important facts, context and background.
Background
1. Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged.
a. In the 4th quarter of 2022, we recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.40 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.
b. The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users. Anecdotes from the WaPo article come from plaintiff attorneys—cases involving significant driver misuse—and are not a substitute for rigorous analysis and billions of miles of data.
c. Recent data continues this trend and is even more compelling. Autopilot is ~10X safer than US average and ~5X safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.
2. Autopilot features, including Traffic-Aware Cruise Control and Autosteer, are SAE Level 2 driver-assist systems, meaning –
a. Whether the driver chooses to engage Autosteer or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.
b. Despite the driver being responsible for control of the vehicle, Tesla has a number of additional safety measures designed to monitor that drivers engage in active driver supervision, including torque-based and camera-based monitoring. We have continued to make progress in improving these monitoring systems to reduce misuse.
c. Based on the above, among other factors, the data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.
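For context, the safety multiples implied by the figures above can be computed directly. This is just arithmetic on the quoted Q4 2022 miles-per-crash values, not an independent analysis:

```python
# Miles driven per recorded crash, from the quoted Q4 2022 / 2021 figures.
AUTOPILOT_MILES_PER_CRASH = 4_850_000    # Tesla, Autopilot engaged
TESLA_NO_AP_MILES_PER_CRASH = 1_400_000  # Tesla, Autopilot not engaged
US_AVG_MILES_PER_CRASH = 652_000         # NHTSA/FHWA, 2021

# How many times farther a car travels per crash with Autopilot engaged.
vs_no_ap = AUTOPILOT_MILES_PER_CRASH / TESLA_NO_AP_MILES_PER_CRASH
vs_us_avg = AUTOPILOT_MILES_PER_CRASH / US_AVG_MILES_PER_CRASH

print(f"Autopilot vs. Tesla without Autopilot: {vs_no_ap:.1f}x")  # ≈ 3.5x
print(f"Autopilot vs. US average: {vs_us_avg:.1f}x")              # ≈ 7.4x
```

Note that these Q4 2022 numbers work out to roughly 3.5x and 7.4x; Tesla's "~5X" and "~10X" claims refer to more recent data it says will be published later.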
Tesla also provided some context about some of the crashes that were highlighted by The Washington Post. As per the electric vehicle maker, the incidents that the publication cited involved drivers who were not using Autopilot correctly. The publication, therefore, omitted several important facts when it was framing its narrative around Autopilot’s alleged risks, Tesla argued.
Following is the pertinent section of Tesla’s response.
The Washington Post leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem. The article got it wrong, misreporting what’s actually alleged in the pending lawsuit and omitting several important facts:
1. Contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain.
2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.
3. Mr. Angulo and the parents of Ms. Benavides, who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.
4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”
5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf. He took responsibility.
6. The Post had the driver’s statements to police and reports that he said he was “driving on cruise.” They omit that he also admitted to police “I expect to be the driver and be responsible for this.”
7. The driver later testified in the litigation he knew Autopilot didn’t make the car self-driving and he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant or complacent. He readily and repeatedly admitted:
a. “I was highly aware that was still my responsibility to operate the vehicle safely.”
b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”
c. “I would say specifically I was aware that the car was my responsibility. I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”
8. The Post also failed to disclose that Autopilot restricted the vehicle’s speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, “Cruise control will not brake.”
— Tesla (@Tesla) December 12, 2023
Don’t hesitate to contact us with news tips. Just send a message to simon@teslarati.com to give us a heads up.
News
Tesla Full Self-Driving shows stunning maneuver in Europe to silence skeptics
Tesla Full Self-Driving, fresh on the heels of its approval for operation on European roads for the first time, showed off a stunning maneuver that will certainly silence any skeptics on the continent.
Fresh off its approval in the Netherlands, Full Self-Driving is working toward a significant expansion into more parts of Europe.
In a striking demonstration of autonomous driving prowess, Tesla’s Full Self-Driving (FSD) system recently showcased its capabilities on the narrow rural roads of the Netherlands. Captured in two in-car videos, the system encountered scenarios that would challenge even the most experienced human drivers.
In the first clip, a wide tractor occupied more than half the lane on a tight two-way road. Rather than braking abruptly or risking a collision, FSD smoothly edged the vehicle onto the adjacent bike path—using the extra space with precision—before seamlessly returning to the lane once clear.
The second clip was equally demanding: while overtaking a group of cyclists, an oncoming car approached at speed.
FSD maintained a safe, minimal buffer to the cyclists while timing the pass perfectly, avoiding any swerve or hesitation that could unsettle passengers or other road users.
People wonder if FSD is safe on narrow European roads. Well have a look what it did when a tractor took up more than half of the road or when overtaking bicycles with fast oncoming traffic. pic.twitter.com/z37Csa09sP
— Chanan Bos (@ChananBos) April 14, 2026
This maneuver highlights FSD’s advanced spatial reasoning and predictive planning. On roads often under three meters wide, with no room for error, the system calculated available clearance in real time, incorporated shoulder and path geometry, and executed a controlled deviation without compromising safety.
It treated the bike path as a legitimate extension of navigable space, something many drivers might hesitate to do, while respecting Dutch road norms and cyclist priority.
Such feats align closely with a growing library of impressive FSD maneuvers documented on camera worldwide.
In urban Amsterdam, for instance, FSD has navigated the world’s densest cyclist environments, weaving through hundreds of unpredictable bike movements on canal-side streets with tram tracks and pedestrians.
One uncut drive showed it yielding smoothly at crossings, overtaking where needed, and even handling a near-perfect auto-park in a tight residential spot, demonstrating the same low-speed precision seen in the rural clips.
Teslas using FSD have tackled turbo roundabouts in the Netherlands, complex multi-lane circles notorious for geometry challenges, merging confidently while yielding to traffic. Similar clips depict smooth handling of construction zones, emergency vehicle pull-overs, and gated parking barriers, where the car stops precisely, waits for clearance, and proceeds without driver input.
Collectively, these examples illustrate FSD’s evolution toward handling the unpredictable.
The rural Netherlands maneuvers aren’t isolated. Instead, they reflect a pattern of spatial awareness, cyclist deference, and traffic anticipation seen from city streets to highways.
As FSD continues refining through real-world data, videos like this one are certainly building a compelling case for its readiness on Europe’s varied roads.
News
Tesla utilizes its ‘Rave Cave’ for new awesome safety feature
Tesla is utilizing its ‘Rave Cave’ for an awesome new safety feature that will arrive with the upcoming Spring Update for 2026.
Part of the massive interior overhaul of both the Model 3 “Highland” and Model Y “Juniper” was the addition of interior accent lighting to help bring out the mood of the vehicle, increase the customization of the interior, and to create a unique listening experience.
Tesla added a Sync Lights feature that will strobe the accent strips with the beat of the music.
It is one of the most distinctive and coolest non-functional features of a Tesla: it does not improve how the vehicle drives, but it adds a fun, personal touch to the interior.
However, Tesla is going to take it one step further, as the Rave Cave lights will now be used for blind spot recognition. This feature will be added as the Spring 2026 Update starts to roll out.
A lot of CRAZY new features coming with Tesla’s 2026 Spring Update, including a new FSD app!
– Self-Driving App (AI4 hardware): New app in App Launcher > Self-Driving for one-tap FSD subscriptions, activation guides, and ongoing stats.
– “Hey Grok”: Voice-activated Grok with… https://t.co/ljeYPlq9Qt— TESLARATI (@Teslarati) April 13, 2026
Tesla writes:
“Accent lights now turn red when an object is in your blind spot and your turn signal is engaged, or when an approaching object is detected while parked.”
This neat new safety feature will increase the likelihood that a driver operating their Tesla manually will see the blind spot warnings that are currently available on the A-pillar and on the center touchscreen.
These new alerts will now warn drivers of cross traffic as they back out of a parking space with little to no visibility of what is coming. It is a great new addition that will only increase the safety of the vehicles, while also utilizing something that is already installed in these specific Model 3 and Model Y units.
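The trigger conditions in Tesla's release note reduce to simple boolean logic. A hypothetical sketch follows; the function and signal names are illustrative, not Tesla's actual firmware API:

```python
def accent_lights_red(object_in_blind_spot: bool,
                      turn_signal_on: bool,
                      parked: bool,
                      approaching_object: bool) -> bool:
    """Return True when the accent strips should turn red, per the behavior
    described in Tesla's release notes: an object in the blind spot with the
    turn signal engaged, or an approaching object detected while parked.
    (Illustrative logic only; signal names are assumptions.)"""
    driving_case = object_in_blind_spot and turn_signal_on
    parked_case = parked and approaching_object
    return driving_case or parked_case

# Examples:
print(accent_lights_red(True, True, False, False))   # True: blind spot + turn signal
print(accent_lights_red(True, False, False, False))  # False: no turn signal engaged
print(accent_lights_red(False, False, True, True))   # True: approaching object while parked
```

The key detail is the conjunction with the turn signal: an occupied blind spot alone does not trigger the red lights, only a blind spot combined with an indicated lane change.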
The Model 3 and Model Y were the central focus of the Spring 2026 Update, especially since the Model S and Model X are basically gone, with only a few hundred units left. Additionally, Tesla included new Immersive Sound and Car Visualization for the Model 3 and Model Y specifically in this new update.
News
Tesla parked 50+ Cybercabs outside its Texas Factory with some crash tested
Dozens of Tesla Cybercabs have been spotted at the Giga Texas crash-testing facility ahead of launch.
Drone footage captured by longtime Giga Texas observer Joe Tegtmeyer shows over 50 units of Tesla Cybercab at the Austin factory campus, including several units clustered by Tesla’s on-site crash testing facility.
The outbound lot at Gigafactory Texas sits just outside the factory exit and serves as the primary staging area where finished vehicles are held before being loaded onto transport carriers or dispatched for validation testing. On any given day, the lot holds a mix of Model Y and Cybertruck units alongside the growing Tesla Cybercab fleet, as can be seen in the drone footage captured by Joe Tegtmeyer.
Roughly 50 Cybercab units are visible across the campus, parked in tight, organized rows. Most of the units visible still carry steering wheels and pedals, temporary additions Tesla included to satisfy current safety regulations while the vehicles accumulate real-world data ahead of full regulatory approval for a steering wheel-free design.

Tesla operates dedicated Crash Labs at both its Giga Texas and Fremont facilities that are purpose-built for controlled structural crash tests. Historically, automakers begin intensive crash testing roughly one to two months before volume production kicks off. The Cybertruck followed almost exactly that pattern, and the Cybercab appears to be on the same track at the facility we first saw back in October 2025.

The first production Cybercab rolled off the Giga Texas line on February 17, 2026. Volume production is now targeted for April. Musk previously wrote on X that “the early production rate will be agonizingly slow, but eventually end up being insanely fast,” and separately stated Tesla is targeting at least 2 million Cybercab units per year. Commercial robotaxi service in Austin is targeted for late 2026.


