This is a preview from our weekly newsletter. Each week I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
Earlier this week, NTSB Chief Jennifer Homendy made some disparaging comments regarding Tesla’s use of the term “Full Self-Driving” to describe its semi-autonomous driving suite. The remarks from Homendy suggest that Tesla may not get a fair chance when it ultimately comes to proving the effectiveness of its FSD program, especially considering agency officials, who should remain impartial, are already making misdirected comments about the name of the suite.
In an interview with the Wall Street Journal, Homendy commented on the company’s use of the phrase “Full Self-Driving.” While Tesla’s FSD suite is admittedly not capable of Level 5 autonomy, the goal is to eventually deliver fully autonomous driving to those who choose to invest in the company’s software. However, instead of focusing on the program’s effectiveness and commending Tesla, arguably the leader in self-driving development, Homendy concentrates on the terminology.
Homendy said Tesla’s use of the term “Full Self-Driving” was “misleading and irresponsible,” despite the company confirming with each driver who buys the capability that the program is not yet fully autonomous. Drivers are explicitly told to remain vigilant and keep their hands on the wheel at all times. It is a requirement to use Autopilot or FSD, and failure to do so can result in being locked in “Autopilot jail” for the duration of your trip. Nobody wants that.
However, despite the way some media outlets and others describe Tesla’s FSD program, the company’s semi-autonomous driving functionalities are extraordinarily safe and among the most complex on the market. Tesla is one of the few companies attempting to solve the riddle that is self-driving, and, to my knowledge, the only one that has chosen not to use LiDAR in its efforts. Additionally, Tesla ditched radar just a few months ago in the Model Y and Model 3, meaning cameras are the only sensor hardware the company plans to use to keep its cars moving. Several drivers have reported improvements due to the lack of radar.
My point regarding FSD and Autopilot is simple: the terminology is not the focus; the facts are. The truth is, Tesla Autopilot recorded one of its safest quarters: according to the most recently released statistics, an accident occurred on Autopilot just once every 4.19 million miles. The national average is one accident every 484,000 miles, the NHTSA says.
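For a rough sense of scale, here is a back-of-the-envelope check of what those two reported figures imply. This is just arithmetic on the numbers quoted above, not an independent safety analysis:

```python
# Comparing the two figures quoted above: Tesla's reported rate of one
# accident per 4.19 million miles on Autopilot versus the NHTSA national
# average of one accident per 484,000 miles.
autopilot_miles_per_accident = 4_190_000
national_avg_miles_per_accident = 484_000

ratio = autopilot_miles_per_accident / national_avg_miles_per_accident
print(f"Autopilot logs roughly {ratio:.1f}x more miles between accidents")
# Caveat: the two statistics are not a perfect apples-to-apples comparison,
# since Autopilot is used mostly on highways, where accident rates differ.
```

In other words, taken at face value, the reported Autopilot figure is nearly nine times the national average.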
That isn’t to say that things don’t happen. Accidents on Autopilot and FSD do occur, and the NHTSA is currently probing twelve incidents in which Autopilot was active during a crash. While the conditions and situations vary in each accident, several have already been proven to be the result of driver negligence, including a few in which drivers were operating the vehicle without a license or under the influence of alcohol. Now, remind me: when a BMW driver is drunk and crashes into someone, do we blame BMW? I’ll let that rhetorical question sink in.
Of course, Homendy has a Constitutional right to say whatever is on her mind. It is perfectly reasonable to be skeptical of self-driving systems. I’ll admit, the first time I experienced one, I was not a fan, but it wasn’t because I didn’t trust it. It was because I was familiar with controlling a vehicle and not having it manage things for me. However, just like anything else, I adjusted and got used to the idea, eventually becoming accustomed to the new feelings and sensations of having my car assist me in navigating to my destination.
To me, it is simply unfortunate for an NTSB official to claim that Tesla “has clearly misled numerous people to misuse and abuse technology.” One, because it isn’t possible; two, because it would be a massive liability for the company; and three, because Tesla has never claimed that its cars can drive themselves, nor has it ever advised a driver to attempt a fully autonomous trek to a destination.
The numerous safety features and additions to the FSD suite have only solidified Tesla’s position as one of the safest car companies out there. With in-cabin cameras to monitor driver attentiveness and numerous other safety thresholds that drivers must respond to with the correct behaviors, Tesla’s FSD suite and its Autopilot program are among the safest around. It is unfortunate for NTSB head Homendy to comment in this way, especially as it seems detrimental not only to Tesla’s attempts to achieve Level 5 autonomy but to the self-driving effort as a whole.
A big thanks to our long-time supporters and new subscribers! Thank you.
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
-Joey
Cybertruck
Tesla Cybertruck gets long-awaited safety feature
Tesla has announced the rollout of its innovative anti-dooring protection feature to the Cybertruck via the 2026.8 software update.
Tesla is rolling out a new and long-awaited feature to the Cybertruck all-electric pickup, and it is a safety addition geared toward pedestrian and cyclist safety, as well as accidents with other vehicles.
This safety enhancement uses the vehicle’s existing cameras to detect approaching cyclists, pedestrians, or vehicles in the blind spot while parked. Upon attempting to open a door, if a hazard is detected, the system activates: the blind spot indicator light flashes, an audible chime sounds, and the door will not open on the initial button press.
Drivers must wait briefly and press the button again to override, providing crucial seconds to avoid an accident.
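The behavior described above amounts to a small piece of state logic: block the first door-open press while a hazard is detected, warn the driver, and let a second press override. Here is a minimal sketch in Python; all class and method names are invented for illustration (and the brief mandatory wait is omitted), so this is emphatically not Tesla’s actual implementation:

```python
# Illustrative sketch of the anti-dooring logic described above.
# This is NOT Tesla's code; names and details are invented assumptions.

class DoorController:
    def __init__(self):
        self.pending_override = False  # set after a blocked first press

    def press_open_button(self, hazard_detected: bool) -> bool:
        """Return True if the door opens on this press."""
        if not hazard_detected:
            self.pending_override = False
            return True  # no hazard: open normally
        if self.pending_override:
            # Second press while the hazard persists: driver overrides.
            self.pending_override = False
            return True
        # First press with a hazard approaching: warn and hold the door.
        self._flash_blind_spot_indicator()
        self._sound_chime()
        self.pending_override = True
        return False

    def _flash_blind_spot_indicator(self):
        pass  # placeholder for the warning light

    def _sound_chime(self):
        pass  # placeholder for the audible alert

door = DoorController()
assert door.press_open_button(hazard_detected=True) is False  # blocked, warns
assert door.press_open_button(hazard_detected=True) is True   # override
assert door.press_open_button(hazard_detected=False) is True  # opens normally
```

The key design point is that the system never permanently locks the driver in; it just inserts a deliberate pause and a warning between intent and action.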
Anti-dooring protection now rolling out to @Cybertruck
This feature comes standard on every new Model 3, Model Y & Cybertruck – using cameras to delay door opening if a cyclist, pedestrian or other vehicle is detected approaching in your blind spot
— Tesla North America (@tesla_na) March 17, 2026
The feature, also known as Blind Spot Warning While Parked, comes standard on every new Model 3 and Model Y, and is now extending to the Cybertruck. Leveraging Tesla’s vision-based system without requiring new hardware, it represents a cost-effective software solution that builds on community suggestions dating back to 2018.
This technology addresses the persistent danger of “dooring,” where a driver opens a car door into the path of a passing cyclist or pedestrian.
Dooring incidents are alarmingly common in urban environments.
According to Chicago data, in 2011 alone, there were 344 reported dooring crashes, accounting for approximately 20 percent of all bicycle crashes in the city, nearly one incident per day.
While numbers have fluctuated (dropping to 11 percent in 2014 before rising again), dooring consistently represents 10-20 percent of bike-related crashes in major cities.
A national analysis of emergency department data estimates over 17,000 dooring-related injuries treated in the U.S. over a decade, with many involving fractures, contusions, and head trauma, particularly affecting upper extremities.
By automatically intervening, Tesla’s system not only protects vulnerable road users but also safeguards its owners from potential liability and enhances overall road safety.
As cities promote cycling for sustainable transport, features like this demonstrate how advanced driver assistance and camera systems can evolve beyond highway driving to everyday urban scenarios.
Enthusiastic responses on social media highlight appreciation for the proactive safety measure, with some calling for broader rollout to older models where hardware permits. Tesla continues to push the boundaries of vehicle safety through over-the-air updates, making its fleet smarter and safer over time.
Elon Musk
Tesla Roadster is ‘sorcery and magic’ and might be worth the wait, Uber founder says
Perhaps the wait will be worth it, especially according to Uber founder Travis Kalanick, who recently teased the Roadster’s potential capabilities based on what he has heard from internal Tesla sources.
Tesla is planning to unveil the Roadster in late April after years of waiting. But the wait might be worth it, according to Travis Kalanick, the founder of Uber, who recently shed some light on his expectations for the all-electric supercar.
We all know the Roadster is supposed to have some serious capability. CEO Elon Musk has said on numerous occasions that the Roadster will be unlike anything else ever produced. It might go from 0-60 MPH in about a second; it might hover; it might have SpaceX cold gas thrusters.
However, the constant delays in the Roadster program and its unveiling event have left Tesla fans confused, because they’re just not sure when, or if, they’ll ever see the finished product.
Kalanick said on X:
When I’ve run into people who are in the know, I inquire, they tell me nothing, but their eyebrows raise and their eyes widen in a way that can only mean something of sorcery and magic is coming…
— travis kalanick (@travisk) March 17, 2026
Musk has said this vehicle is not going to be geared for safety, warning that “If safety is your number one goal, do not buy the Roadster.”
There has been so much hype regarding the Roadster that it is hard to believe the company won’t come through with some kind of crazy feature set for the vehicle.
However, Tesla’s most recent delay of the unveiling event is definitely eye-opening, especially considering it is only the latest in a series of pushbacks the company has put on the vehicle over the past several years.
Tesla has made notable progress on the Roadster project over the past few months, ramping up hiring for the vehicle and applying for a patent on a new seat design.
The car has been a back-burner project for Tesla, as it has been focusing primarily on autonomy and the rollout of Robotaxi and Cybercab. Additionally, its other vehicle projects, like the Model 3 and Model Y refreshes, took precedence.
Tesla still plans to unveil the Roadster next month, so we can hope the company can stick to this timeframe.
Cybertruck
Elon Musk clarifies viral Tesla Cybertruck accident with driver logs
Musk has come out to say that the driver logs have already shown that the driver “disengaged Autopilot four seconds before crashing,” in a post on X.
Tesla CEO Elon Musk has clarified some details regarding the viral Tesla Cybertruck accident with company driver logs, which show various metrics at the time of an incident.
The logs have been used in the past to shift responsibility away from Tesla when the automaker’s Full Self-Driving (Supervised) or Autopilot platforms were blamed for a collision. It appears this case will be no different.
On Tuesday, a video of a Cybertruck crashing into an overpass barrier in August 2025 was shared by Fox Business in a story that reported a woman was suing the automaker for $1 million in a liability and negligence case.
In the suit, Justine Saint Amour said that “Something terrifying happened, without warning, the vehicle attempted to drive straight off an overpass.” Her attorney, Bob Hilliard, said Saint Amour “tried to take control, but crashed into the barrier and was seriously injured (mostly her shoulder, neck, and back).”
Tesla vehicle crashes are popular fodder for mainstream media outlets because of their sensational nature. Oftentimes, these outlets will put Tesla in the headline because it piques the interest of the masses, as many who read the story are waiting to see the claim that Autopilot or Full Self-Driving was the culprit of the accident.
However, Tesla has access to the logs of every vehicle in its fleet, which show various metrics: whether FSD or Autopilot was active, whether the accelerator was pressed, the vehicle’s speed, and other important factors.
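To illustrate the kind of check such logs enable, here is a toy example. Tesla’s actual log format is not public, so every field name and value below is a hypothetical stand-in, not real crash data:

```python
# Hypothetical vehicle-log snapshot; fields and values are invented to
# illustrate the kind of metrics described above, not real Tesla data.
from dataclasses import dataclass

@dataclass
class LogEntry:
    seconds_before_impact: float
    autopilot_active: bool
    accel_pedal_pct: float
    speed_mph: float

log = [
    LogEntry(6.0, True,  0.0, 62.0),
    LogEntry(4.0, False, 0.0, 63.0),   # Autopilot disengaged here
    LogEntry(0.5, False, 18.0, 70.0),  # manual input just before impact
]

# Earliest point (most seconds before impact) where Autopilot was off:
disengaged_at = max(e.seconds_before_impact for e in log
                    if not e.autopilot_active)
print(f"Autopilot disengaged {disengaged_at:.0f}s before impact")
```

A record like this is what would let an automaker say, with timestamps, whether the driver or the software was in control at the moment of a crash.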
In a post on X, Musk said that the driver logs have already shown that the driver “disengaged Autopilot four seconds before crashing.”
Logs show driver disengaged Autopilot four seconds before crashing
— Elon Musk (@elonmusk) March 18, 2026
If the logs do show this, which Tesla will likely have to prove in court, the real question is why Saint Amour disengaged the suite.
Tesla’s Full Self-Driving suite is still not fully autonomous, meaning the driver cannot pull attention away from the road and must be ready to take over the vehicle at all times.
It will be interesting to see how this particular case pans out, especially considering the clip that was released by the law firm starts at about four seconds before the collision. Tesla logs have dispelled media reports in the past that have accused the company’s suite of being responsible for an accident, so there will be some major attention on what is proven in this particular case.