Tesla posts stern response to Washington Post’s article on alleged Autopilot dangers
Tesla has posted a stern response to a recent article from The Washington Post that suggested the electric vehicle maker is putting people at risk by allowing systems like Autopilot to be used in areas they were not designed for. The publication noted that it was able to identify about 40 fatal or serious crashes since 2016, at least eight of which happened on roads where Autopilot was not designed to be used in the first place.
Overall, the Washington Post article argued that while Tesla does inform drivers that they are responsible for their vehicles while Autopilot is engaged, the company is nonetheless also at fault since it allows its driver-assist system to be deployed irresponsibly. “Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software,” the article read.
In its response, which was posted through its official account on X on December 12, 2023, Tesla highlighted that it takes the safety of both its customers and pedestrians very seriously. The company noted that the data clearly shows that systems like Autopilot, when used properly, drastically reduce the number of accidents on the road. The company also reiterated that features like Traffic Aware Cruise Control are Level 2 systems, which require constant supervision from the driver.
Following is the pertinent section of Tesla’s response.
While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context.
We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.
Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways.
Below are some important facts, context and background.
Background
1. Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged.
a. In the 4th quarter of 2022, we recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.40 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.
b. The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users. Anecdotes from the WaPo article come from plaintiff attorneys—cases involving significant driver misuse—and are not a substitute for rigorous analysis and billions of miles of data.
c. Recent Data continues this trend and is even more compelling. Autopilot is ~10X safer than US average and ~5X safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.
2. Autopilot features, including Traffic-Aware Cruise Control and Autosteer, are SAE Level 2 driver-assist systems, meaning –
a. Whether the driver chooses to engage Autosteer or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.
b. Despite the driver being responsible for control for the vehicle, Tesla has a number of additional safety measures designed to monitor that drivers engage in active driver supervision, including torque-based and camera-based monitoring. We have continued to make progress in improving these monitoring systems to reduce misuse.
c. Based on the above, among other factors, the data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.
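For context on how such multipliers are derived, below is a minimal, purely illustrative Python sketch that computes the crash-rate ratios implied by the Q4 2022 figures quoted above (the ~10X and ~5X figures Tesla cites refer to more recent data that the company has not yet published):

# Crash-rate ratios implied by Tesla's Q4 2022 figures
# (expressed as miles driven per reported crash; higher is safer).
miles_per_crash = {
    "Autopilot engaged": 4_850_000,      # Tesla, Q4 2022
    "Autopilot not engaged": 1_400_000,  # Tesla, Q4 2022
    "US average": 652_000,               # NHTSA/FHWA, 2021
}

baseline = miles_per_crash["US average"]
for label, miles in miles_per_crash.items():
    ratio = miles / baseline
    print(f"{label}: {ratio:.1f}x the US-average miles per crash")

# Prints roughly: Autopilot engaged ~7.4x the US average, Autopilot not engaged ~2.1x.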
Tesla also provided some context about some of the crashes that were highlighted by The Washington Post. As per the electric vehicle maker, the incidents that the publication cited involved drivers who were not using Autopilot correctly. Tesla argued that the publication also omitted several important facts when it framed its narrative around Autopilot's alleged risks.
Following is another pertinent section of Tesla's response.
The Washington Post leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem. The article got it wrong, misreporting what’s actually alleged in the pending lawsuit and omitting several important facts:
1. Contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain.
2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.
3. Mr. Angulo and the parents of Ms. Benavides who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.
4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”
5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf. He took responsibility.
6. The Post had the driver’s statements to police and reports that he said he was “driving on cruise.” They omit that he also admitted to police “I expect to be the driver and be responsible for this.”
7. The driver later testified in the litigation he knew Autopilot didn’t make the car self-driving and he was the driver, contrary to the Post and Angulo claims that he was mislead, over-reliant or complacent. He readily and repeatedly admitted:
a. “I was highly aware that was still my responsibility to operate the vehicle safely.”
b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”
c. “I would say specifically I was aware that the car was my responsibility. I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”
8. The Post also failed to disclose that Autopilot restricted the vehicle’s speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, “Cruise control will not brake.”
Tesla (TSLA) receives “Buy” rating and $551 PT from Canaccord Genuity
Canaccord Genuity analyst George Gianarikas raised his Tesla (NASDAQ:TSLA) price target from $482 to $551. He also maintained a “Buy” rating for TSLA stock, citing the company’s improving long-term outlook, which is driven by autonomy and robotics.
The analyst’s updated note
Gianarikas lowered his 4Q25 delivery estimates but pointed to several positive factors in the Tesla story. He noted that EV adoption in emerging markets is gaining pace, and that progress in FSD and the Robotaxi rollout in 2026 represent major upside drivers. Further progress in the Optimus program next year could also add momentum for the electric vehicle maker.
“Overall, yes, 4Q25 delivery expectations are being revised lower. However, the reset in the US EV market is laying the groundwork for a more durable and attractive long-term demand environment.
“At the same time, EV penetration in emerging markets is accelerating, reinforcing Tesla’s potential multi‑year growth runway beyond the US. Global progress in FSD and the anticipated rollout of a larger robotaxi fleet in 2026 are increasingly important components of the Tesla equity story and could provide sentiment tailwinds,” the analyst wrote.
Tesla’s busy 2026
The upcoming year is shaping up to be a busy one for Tesla, considering the company’s plans and targets. According to Elon Musk at the 2025 Annual Shareholder Meeting, the autonomous two-seat Cybercab is confirmed to start production sometime in Q2 2026.
Apart from this, Tesla is also expected to unveil the next-generation Roadster on April 1, 2026, and to start high-volume production of the Tesla Semi in Nevada next year.
Beyond vehicle launches, Tesla has expressed its intention to significantly ramp the rollout of FSD to several regions worldwide, such as Europe. Plans are also underway to launch Robotaxi networks in more key areas across the United States.
Waymo sues Santa Monica over order to halt overnight charging sessions
Waymo has filed a lawsuit against the City of Santa Monica in Los Angeles County Superior Court, seeking to block an order that requires the company to cease overnight charging at two facilities.
In its complaint, Waymo argued that its self-driving cars’ operations do not constitute a public nuisance, and compliance with the city’s order would cause the company irreparable harm.
Nuisance claims
As noted in a report from the Los Angeles Times, Waymo’s two charging sites at Euclid Street and Broadway have operated for about a year, supporting the company’s growing fleet with round-the-clock activity. This has reportedly left residents in the area unable to sleep due to incessant beeping from the self-driving taxis moving in and out of the charging stations at all hours.
Frustrated residents have protested against the Waymos by blocking the vehicles’ paths, placing cones, and “stacking” cars to create backups. This has also resulted in multiple calls to the police.
Last month, the city issued an order to Waymo and its charging partner, Voltera, to cease overnight operations at the charging locations, stating that the self-driving vehicles’ activities at night were a public nuisance. A December 15 meeting yielded no agreement on mitigations like software rerouting. Waymo proposed changes, but the city reportedly insisted that nothing would satisfy the irate residents.
“We are disappointed that the City has chosen an adversarial path over a collaborative one. The City’s position has been to insist that no actions taken or proposed by Waymo would satisfy the complaining neighbors and therefore must be deemed insufficient,” a Waymo spokesperson stated.
Waymo pushes back
In its legal complaint, Waymo stated that its “activities at the Broadway Facilities do not constitute a public nuisance.” The company also noted that it “faces imminent and irreparable harm to its operations, employees, and customers” from the city’s order. The suit also stated that the city was fully aware that the Voltera charging sites would be operating around the clock to support Waymo’s self-driving taxis.
The company highlighted over one million trips in Santa Monica since launch, with more than 50,000 rides starting or ending there in November alone. Waymo also criticized the city for adopting a contentious strategy against businesses.
“The City of Santa Monica’s recent actions are inconsistent with its stated goal of attracting investment. At a time when the City faces a serious fiscal crisis, officials are choosing to obstruct properly permitted investment rather than fostering a ‘ready for business’ environment,” Waymo stated.
Tesla FSD v14.2.2 is getting rave reviews from drivers
Tesla Full Self-Driving (Supervised) v14.2.2 is receiving positive reviews from owners, with several drivers praising the build’s lack of hesitation during lane changes and its smoother decision-making, among other improvements.
The update, which started rolling out on Monday, also adds features like dynamic arrival pin adjustment. So far, early testers have reported buttery-smooth drives with confident performance, even at night or on twisty roads.
Owners highlight major improvements
Longtime Tesla owner and FSD user @BLKMDL3 shared detailed impressions from 10 hours of driving on FSD v14.2.2, noting that the system exhibited “zero lane change hesitation” and “extremely refined” lane choices. He praised Mad Max mode’s performance, stellar parking at locations such as ticket dispensers, and impressive canyon runs even in dark conditions.
Fellow FSD user Dan Burkland reported an hour of nighttime driving on FSD v14.2.2 with “zero hesitations” and “buttery smooth” confidence reminiscent of Robotaxi rides in areas such as Austin, Texas. Veteran FSD user Whole Mars Catalog also demonstrated voice navigation via Grok, while Tesla owner Devin Olsen completed a nearly two-hour drive with FSD v14.2.2 through heavy traffic and rain with strong performance.
Closer to unsupervised
FSD has been receiving rave reviews, even from Tesla’s competitors. Xpeng CEO He Xiaopeng, for one, offered fresh praise for FSD v14.2 after visiting Silicon Valley. Following extended test drives of Tesla vehicles running the latest FSD software, He stated that the system has made major strides, reinforcing his view that Tesla’s approach is indeed the proper path towards autonomy.
According to He, Tesla’s FSD has evolved from a smooth Level 2 advanced driver assistance system into what he described as a “near-Level 4” experience in terms of capabilities. While acknowledging that there is still room for improvement, the Xpeng CEO stated that FSD’s current iteration significantly surpasses last year’s capabilities. He also reiterated his belief that Tesla’s strategy of using the same autonomous software and hardware architecture across private vehicles and robotaxis is the right long-term approach, as it would allow users to bypass intermediate autonomy stages and move closer to Level 4 functionality.