This is a preview from our weekly newsletter. Each week I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
Earlier this week, NTSB Chief Jennifer Homendy made some disparaging comments regarding Tesla’s use of “Full Self-Driving” to describe its semi-autonomous driving suite. The remarks show that Tesla may not get a fair chance when it ultimately comes to proving the effectiveness of its FSD program, especially when agency officials, who should remain impartial, are already making misdirected comments about the suite’s name.
In an interview with the Wall Street Journal, Homendy commented on the company’s use of the phrase “Full Self-Driving.” While Tesla’s FSD suite is admittedly not capable of Level 5 autonomy, the idea is to eventually roll out fully autonomous driving to those who choose to invest in the company’s software. However, instead of focusing on the program’s effectiveness and commending Tesla, arguably the leader in self-driving development, Homendy concentrated on the terminology.
Homendy said Tesla’s use of the term “Full Self-Driving” was “misleading and irresponsible,” despite the company confirming with each driver who buys the capability that the program is not yet fully autonomous. Drivers are explicitly told to remain vigilant and keep their hands on the wheel at all times. Doing so is a requirement for using Autopilot or FSD, and failure to comply can land you in “Autopilot jail” for the duration of your trip. Nobody wants that.
However, despite the way some media outlets and others describe Tesla’s FSD program, the company’s semi-autonomous driving features are extraordinarily safe and among the most sophisticated on the market. Tesla is one of the few companies attempting to solve the riddle that is self-driving, and the only one, to my knowledge, that has chosen not to use LiDAR in its efforts. Additionally, Tesla ditched radar just a few months ago in the Model Y and Model 3, meaning cameras are the only sensors the company plans to use to keep its cars moving. Several drivers have reported improvements since the removal of radar.
My response to these comments regarding FSD and Autopilot is simple: the terminology is not the focus; the facts are. The truth is, Tesla Autopilot just recorded one of its safest quarters. According to the most recently released statistics, an accident occurred on Autopilot just once every 4.19 million miles. The national average, per the NHTSA, is one accident every 484,000 miles.
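For a sense of scale, here is a quick back-of-the-envelope comparison of those two figures. It is a rough illustration only; it does not control for road type, vehicle age, or driver behavior, so treat the ratio as directional rather than definitive.

```python
# Quick comparison of the accident-rate figures cited above.
autopilot_miles_per_accident = 4_190_000  # Tesla's reported Autopilot figure
national_miles_per_accident = 484_000     # NHTSA national average

ratio = autopilot_miles_per_accident / national_miles_per_accident
print(f"Autopilot: one accident per {autopilot_miles_per_accident:,} miles")
print(f"National average: one accident per {national_miles_per_accident:,} miles")
print(f"Autopilot goes roughly {ratio:.1f}x farther between accidents")
# -> roughly 8.7x
```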
That isn’t to say things never happen. Accidents on Autopilot and FSD do occur, and the NHTSA is currently probing twelve incidents in which Autopilot was active during a crash. While the conditions and situations vary in each case, several have already been shown to be the result of driver negligence, including a few in which drivers were operating the vehicle without a license or under the influence of alcohol. Now, remind me: when a BMW driver is drunk and crashes into someone, do we blame BMW? I’ll let that rhetorical question sink in.
Of course, Homendy has a constitutional right to say whatever is on her mind, and it is perfectly reasonable to be skeptical of self-driving systems. I’ll admit, the first time I experienced one, I was not a fan, but it wasn’t because I didn’t trust it. It was because I was used to controlling a vehicle myself, not having it manage things for me. However, just like anything else, I adjusted, eventually growing accustomed to having my car assist me in navigating to my destination.
To me, it is simply unfortunate for an NTSB official to claim that Tesla “has clearly misled numerous people to misuse and abuse technology.” One, because the company’s own warnings make that implausible; two, because it would be a massive liability for the company; and three, because Tesla has never claimed that its cars can drive themselves, nor has it ever advised a driver to attempt a fully autonomous trip to a destination.
The numerous safety features added to the FSD suite have only solidified Tesla’s position as one of the safest car companies out there. With in-cabin cameras to monitor driver attentiveness and numerous other safety checks that drivers must respond to correctly, Tesla’s FSD suite and its Autopilot program are among the safest around. It is unfortunate for NTSB head Homendy to comment in this way, especially as it seems detrimental not only to Tesla’s attempts to achieve Level 5 autonomy, but to the self-driving effort as a whole.
A big thanks to our long-time supporters and new subscribers!
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
-Joey
Elon Musk
Tesla’s Elon Musk: 10 billion miles needed for safe Unsupervised FSD
As per the CEO, roughly 10 billion miles of training data are required due to reality’s “super long tail of complexity.”
Tesla CEO Elon Musk has provided an updated estimate for the training data needed to achieve truly safe unsupervised Full Self-Driving (FSD): roughly 10 billion miles, owing to reality’s “super long tail of complexity.”
10 billion miles of training data
Musk’s comment came as a reply to Apple and Rivian alum Paul Beisel, who posted an analysis on X about the gap between tech demonstrations and real-world products. In his post, Beisel highlighted Tesla’s data-driven lead in autonomy and argued that it would not be easy for rivals to quickly become legitimate competitors to FSD.
“The notion that someone can ‘catch up’ to this problem primarily through simulation and limited on-road exposure strikes me as deeply naive. This is not a demo problem. It is a scale, data, and iteration problem— and Tesla is already far, far down that road while others are just getting started,” Beisel wrote.
Musk responded to Beisel’s post, stating that “Roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving. Reality has a super long tail of complexity.” This is quite interesting considering that in his Master Plan Part Deux, Elon Musk estimated that worldwide regulatory approval for autonomous driving would require around 6 billion miles.
FSD’s total training miles
As 2025 came to a close, Tesla community members observed that FSD was nearing 7 billion miles driven, with over 2.5 billion of those miles coming from city streets. The 7-billion-mile mark was passed just a few days later. This suggests that Tesla likely has more real-world training data for its autonomous driving program than any other company today.
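To put Musk’s target in perspective against those community estimates, here is a minimal sketch of the remaining gap. The daily-mileage figure is a purely hypothetical placeholder for illustration, not anything Tesla has disclosed.

```python
# Gap between FSD's approximate fleet total and Musk's stated target.
TARGET_MILES = 10_000_000_000   # Musk's threshold for safe unsupervised FSD
CURRENT_MILES = 7_000_000_000   # approximate fleet total cited above

remaining = TARGET_MILES - CURRENT_MILES
print(f"Remaining training miles: {remaining:,}")  # 3,000,000,000

# Hypothetical fleet-wide accumulation rate (an assumption, not a Tesla figure).
assumed_miles_per_day = 15_000_000
days = remaining / assumed_miles_per_day
print(f"At {assumed_miles_per_day:,} miles/day, the gap closes in about {days:.0f} days")
```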
The difficulty of achieving autonomy was referenced by Elon Musk recently, when he commented on Nvidia’s Alpamayo program. As per Musk, “they will find that it’s easy to get to 99% and then super hard to solve the long tail of the distribution.” These sentiments were echoed by Tesla VP of AI Software Ashok Elluswamy, who noted on X that “the long tail is sooo long, that most people can’t grasp it.”
News
Tesla earns top honors at MotorTrend’s SDV Innovator Awards
MotorTrend’s SDV Awards were presented during CES 2026 in Las Vegas.
Tesla emerged as one of the most recognized automakers at MotorTrend’s 2026 Software-Defined Vehicle (SDV) Innovator Awards.
As noted in a press release from the publication, two key Tesla employees were honored for their work on AI, autonomy, and vehicle software. The awards were presented during CES 2026 in Las Vegas.
Tesla leaders and engineers recognized
The fourth annual SDV Innovator Awards celebrate pioneers and experts who are pushing the automotive industry deeper into software-driven development. Among the most notable honorees for this year was Ashok Elluswamy, Tesla’s Vice President of AI Software, who received a Pioneer Award for his role in advancing artificial intelligence and autonomy across the company’s vehicle lineup.
Tesla also secured recognition in the Expert category, with Lawson Fulton, a staff Autopilot machine learning engineer, honored for his contributions to Tesla’s driver-assistance and autonomous systems.
Tesla’s software-first strategy
While automakers like General Motors, Ford, and Rivian also received recognition, Tesla’s multiple awards stood out given the company’s outsized role in popularizing software-defined vehicles over the past decade. From frequent OTA updates to its data-driven approach to autonomy, Tesla has consistently treated vehicles as evolving software platforms rather than static products.
This has made Tesla’s vehicles stand out in their respective segments, as they are arguably the only cars that objectively get better over time. That is especially true for vehicles equipped with the company’s Full Self-Driving system, which are becoming progressively more intelligent and autonomous. The majority of Tesla’s updates are also free, which customers worldwide very much appreciate.
Elon Musk
Judge clears path for Elon Musk’s OpenAI lawsuit to go before a jury
The decision preserves Musk’s claims that OpenAI’s shift toward a for-profit structure violated early assurances made to him as a co-founder.
A U.S. judge has ruled that Elon Musk’s lawsuit accusing OpenAI of abandoning its founding nonprofit mission can proceed to a jury trial.
The decision preserves Musk’s claims that OpenAI’s shift toward a for-profit structure violated early assurances made to him as a co-founder, claims that OpenAI directly disputes.
Judge says disputed facts warrant a trial
At a hearing in Oakland, U.S. District Judge Yvonne Gonzalez Rogers stated that there was “plenty of evidence” suggesting that OpenAI leaders had promised that the organization’s original nonprofit structure would be maintained. She ruled that those disputed facts should be evaluated by a jury at a trial in March rather than decided by the court at this stage, as noted in a Reuters report.
Musk helped co-found OpenAI in 2015 but left the organization in 2018. In his lawsuit, he argued that he contributed roughly $38 million, or about 60% of OpenAI’s early funding, based on assurances that the company would remain a nonprofit dedicated to the public benefit. He is seeking unspecified monetary damages tied to what he describes as “ill-gotten gains.”
OpenAI, however, has repeatedly rejected Musk’s allegations. The company has stated that Musk’s claims were baseless and part of a pattern of harassment.
Rivalries and Microsoft ties
The case unfolds against the backdrop of intensifying competition in generative artificial intelligence. Musk now runs xAI, whose Grok chatbot competes directly with OpenAI’s flagship ChatGPT. OpenAI has argued that Musk is a frustrated commercial rival who is simply attempting to slow down a market leader.
The lawsuit also names Microsoft as a defendant, citing its multibillion-dollar partnerships with OpenAI. Microsoft has urged the court to dismiss the claims against it, arguing there is no evidence it aided or abetted any alleged misconduct. Lawyers for OpenAI have also pushed for the case to be thrown out, claiming that Musk failed to show sufficient factual basis for claims such as fraud and breach of contract.
Judge Gonzalez Rogers, however, declined to end the case at this stage, noting that a jury would also need to consider whether Musk filed the lawsuit within the applicable statute of limitations. Still, the dispute between Elon Musk and OpenAI is now headed for a high-profile jury trial in the coming months.