This is a preview from our weekly newsletter. Each week I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
Earlier this week, NTSB Chief Jennifer Homendy made some disparaging comments regarding Tesla’s use of the term “Full Self-Driving” to describe its semi-autonomous driving suite. Homendy’s remarks suggest that Tesla may not get a fair chance when it ultimately comes to proving the effectiveness of its FSD program, especially when agency officials, who should remain impartial, are already making misdirected comments about the suite’s name.
In an interview with the Wall Street Journal, Homendy commented on the company’s use of the phrase “Full Self-Driving.” While Tesla’s FSD suite is admittedly not capable of Level 5 autonomy, the goal of the program is to eventually deliver fully autonomous driving to those who choose to invest in the company’s software. However, instead of focusing on the program’s effectiveness, or commending Tesla, arguably the leader in self-driving development, Homendy concentrated on the terminology.
Homendy called Tesla’s use of the term “Full Self-Driving” “misleading and irresponsible,” despite the company confirming with every driver who buys the capability that the program is not yet fully autonomous. Drivers are explicitly told to remain vigilant and keep their hands on the wheel at all times. This is a requirement for using Autopilot or FSD, and failure to comply can land you in “Autopilot jail” for the duration of your trip. Nobody wants that.
However, despite the way some media outlets and others describe Tesla’s FSD program, the company’s semi-autonomous driving functionalities are extraordinarily safe and among the most complex on the market. Tesla is one of the few companies attempting to solve the riddle that is self-driving, and the only one, to my knowledge, that has chosen not to use LiDAR in its efforts. Additionally, Tesla ditched radar just a few months ago in the Model Y and Model 3, meaning cameras are the only hardware the company plans to use to keep its cars moving. Several drivers have reported improvements due to the removal of radar.
My point regarding FSD and Autopilot is simple: The terminology is not the focus; the facts are. And the facts show that Tesla Autopilot just recorded one of its safest quarters: according to the most recently released statistics, an accident occurred on Autopilot only once every 4.19 million miles. The national average, per the NHTSA, is one accident every 484,000 miles.
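For perspective, those two figures imply roughly an 8.7x gap. This is a quick back-of-the-envelope calculation using the numbers cited above (Tesla's own reported figures, which measure reported accidents per mile and don't control for road type or driver demographics):

```python
# Back-of-the-envelope comparison of the two accident rates cited above.
autopilot_miles_per_accident = 4_190_000   # Tesla's reported quarterly figure
national_miles_per_accident = 484_000      # NHTSA national average

ratio = autopilot_miles_per_accident / national_miles_per_accident
print(f"Autopilot logs ~{ratio:.1f}x more miles between accidents")
# → Autopilot logs ~8.7x more miles between accidents
```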
This isn’t to say that things don’t happen. Accidents on Autopilot and FSD do occur, and the NHTSA is currently probing twelve incidents in which Autopilot was active during a crash. While the conditions and situations vary in each case, several have already been shown to be the result of driver negligence, including a few in which drivers were operating the vehicle without a license or under the influence of alcohol. Now, remind me: When a BMW driver is drunk and crashes into someone, do we blame BMW? I’ll let that rhetorical question sink in.
Of course, Homendy has a Constitutional right to say whatever is on her mind. It is perfectly reasonable to be skeptical of self-driving systems. I’ll admit, the first time I experienced one, I was not a fan, but it wasn’t because I didn’t trust it. It was because I was familiar with controlling a vehicle and not having it manage things for me. However, just like anything else, I adjusted and got used to the idea, eventually becoming accustomed to the new feelings and sensations of having my car assist me in navigating to my destination.
To me, it is simply unfortunate for an NTSB official to claim that Tesla “has clearly misled numerous people to misuse and abuse technology.” One, because it isn’t possible; two, because it would be a massive liability for the company; and three, because Tesla has never claimed that its cars can drive themselves, nor has it ever advised a driver to attempt a fully autonomous trip to a destination.
The numerous safety features and additions to the FSD suite have only solidified Tesla’s position as one of the safest car companies out there. With in-cabin cameras that monitor driver attentiveness and numerous other safety checks that drivers must respond to correctly, Tesla’s FSD suite and its Autopilot program are among the safest around. It does not reflect well on the NTSB for its head to comment this way, especially as it seems detrimental not only to Tesla’s pursuit of Level 5 autonomy but to the entire self-driving effort as a whole.
A big thanks to our long-time supporters and new subscribers! Thank you.
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
-Joey
News
Tesla is not sparing any expense in ensuring the Cybercab is safe
Images shared by a longtime Giga Texas watcher showed 16 Cybercab prototypes parked near the factory’s dedicated crash test facility.
The Tesla Cybercab could very well be the safest taxi on the road when it is released and deployed for public use. This was, at least, hinted at by the intensive safety tests that Tesla seems to be putting the autonomous two-seater through at its Giga Texas crash test facility.
Intensive crash tests
As per recent images from longtime Giga Texas watcher and drone operator Joe Tegtmeyer, Tesla seems to be very busy crash testing Cybercab units. Images shared by the longtime watcher showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility just before the holidays.
Tegtmeyer’s aerial photos showed the prototypes clustered outside the factory’s testing building. Some uncovered Cybercabs showed notable damage, and one even had its airbags deployed. With Cybercab production expected to start in about 130 days, it appears that Tesla is very busy ensuring that its autonomous two-seater ends up becoming the safest taxi on public roads.
Prioritizing safety
With no human driver controls, the Cybercab demands exceptional active and passive safety systems to protect occupants in any scenario. Considering Tesla’s reputation, it is then understandable that the company seems to be sparing no expense in ensuring that the Cybercab is as safe as possible.
Tesla’s focus on safety was recently highlighted when the Cybertruck achieved a Top Safety Pick+ rating from the Insurance Institute for Highway Safety (IIHS). This was a notable victory for the Cybertruck, as critics have long claimed that the vehicle would be one of, if not the, most unsafe trucks on the road due to its appearance. The vehicle’s Top Safety Pick+ rating, if anything, simply proved that Tesla never neglects to make its cars as safe as possible, and that definitely includes the Cybercab.
Elon Musk
Tesla’s Elon Musk gives timeframe for FSD’s release in UAE
Provided that Musk’s timeframe proves accurate, FSD would be able to start saturating the Middle East, starting with the UAE, next year.
Tesla CEO Elon Musk stated on Monday that Full Self-Driving (Supervised) could launch in the United Arab Emirates (UAE) as soon as January 2026.
Musk’s estimate
In a post on X, UAE-based political analyst Ahmed Sharif Al Amiri asked Musk when FSD would arrive in the country, quoting an earlier post where the CEO encouraged users to try out FSD for themselves. Musk responded directly to the analyst’s inquiry.
“Hopefully, next month,” Musk wrote. The exchange attracted a lot of attention, with numerous X users sharing their excitement at the idea of FSD being brought to a new country. FSD (Supervised), after all, would likely allow hands-off highway driving, urban navigation, and parking under driver oversight in traffic-heavy cities such as Dubai and Abu Dhabi.
Musk’s comments about FSD’s arrival in the UAE were posted following his visit to the Middle Eastern country. Over the weekend, images were shared online of Musk meeting with UAE Defense Minister, Deputy Prime Minister, and Dubai Crown Prince HH Sheikh Hamdan bin Mohammed. Musk also posted a supportive message about the country, posting “UAE rocks!” on X.
FSD recognition
FSD has been getting quite a lot of support from foreign media outlets. FSD (Supervised) earned high marks from Germany’s largest car magazine, Auto Bild, during a test in Berlin’s challenging urban environment. The demonstration highlighted the system’s ability to handle dense traffic, construction sites, pedestrian crossings, and narrow streets with smooth, confident decision-making.
Journalist Robin Hornig was particularly struck by FSD’s superior perception and tireless attention, stating: “Tesla FSD Supervised sees more than I do. It doesn’t get distracted and never gets tired. I like to think I’m a good driver, but I can’t match this system’s all-around vision. It’s at its best when both work together: my experience and the Tesla’s constant attention.” Only one intervention was needed when the system misread a route, showcasing its maturity while relying on vision-only sensors and over-the-air learning.
News
Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco
“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle in FSD navigating pitch-black San Francisco streets during the city’s widespread power outage.
While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only approach kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.
Tesla FSD handles total darkness
The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As could be seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, but the vehicle drove confidently and cautiously, just like a proficient human driver.
Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Musk and the Tesla AI team’s posts highlight the idea that FSD operates much like an experienced human driver. Because the system relies on vision alone rather than a complicated symphony of sensors, its vehicles can adapt to challenging circumstances as they emerge. That certainly seemed to be the case in San Francisco.
Waymo’s blackout struggles
Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do when the lights went out.
In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”
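Waymo’s described fallback can be sketched as a simple decision rule. This is a hypothetical illustration of the stated policy, not Waymo’s actual code; the state names, timing threshold, and function signature are all assumptions:

```python
from enum import Enum

class SignalState(Enum):
    GREEN = "green"
    RED = "red"
    DARK = "dark"   # signal unpowered or unreadable

def handle_intersection(signal: SignalState, seconds_stopped: float,
                        intersection_clear: bool) -> str:
    """Decide an action at an intersection.

    Per common traffic law (and Waymo's stated policy), a dark signal is
    treated as an all-way stop: come to a full stop, confirm the
    intersection is clear, then proceed. The confirmation wait models the
    longer-than-usual pauses Waymo described during the outage.
    """
    if signal == SignalState.GREEN:
        return "proceed"
    if signal == SignalState.RED:
        return "stop"
    # DARK: behave as an all-way (four-way) stop
    if seconds_stopped < 2.0:      # complete a full stop first
        return "stop"
    if not intersection_clear:     # wait to confirm right of way
        return "wait"
    return "proceed_with_caution"
```

A citywide outage turns nearly every intersection into this DARK branch at once, which is one plausible reading of why vehicles "remained stationary longer than usual" at scale.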
A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”