News
Tesla fires back at Fortune with cheeky “Misfortune” blog post
The drama continues between Tesla and Fortune after the media outlet published a story questioning Tesla’s ethics, claiming the company sold $2 billion worth of stock but failed to disclose that it was under investigation by the National Highway Traffic Safety Administration (NHTSA) after Joshua Brown was killed when his Model S, operating in Autopilot mode, crashed into a tractor-trailer.
Since the story was published, Tesla CEO Elon Musk has defended the company’s position that news surrounding the Autopilot-related death was not material to its stock price. Fortune disagreed, citing a $6-per-share drop in the stock after news broke that the NHTSA was in fact investigating evidence surrounding Brown’s death. That’s when Musk fired back via email, choosing some pointed words for Fortune’s writer: “Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”
The Tesla vs. Fortune debacle spilled over into the public Twittersphere, with Fortune editor Alan Murray and Elon Musk trading tweets throughout Wednesday as Murray defended the media outlet’s position that Tesla did not disclose news of the Autopilot death. Fortune went so far as to quote statements made in a Tesla SEC filing that warned investors a fatal crash related to its Autopilot feature would be a material event for the company’s brand, business, and operating results. Tesla would later bring to light that Fortune had mischaracterized the quote within the SEC filing.
Tesla has since released a blog post on this matter titled “Misfortune”.
Misfortune
Fortune’s article is fundamentally incorrect.
First, Fortune mischaracterizes Tesla’s SEC filing. Here is what Tesla’s SEC filing actually says: “We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.” [full text included below] This is just stating the obvious. One of the risks facing Tesla (or any company) is that someone could bring product liability claims against it. However, neither at the time of this SEC filing, nor in the several weeks to date, has anyone brought a product liability claim against Tesla relating to the crash in Florida.
Next, Fortune entirely ignores what Tesla knew and when, nor have they even asked the questions. Instead, they simply assume that Tesla had complete information from the moment this accident occurred. This was a physical impossibility given that the damage sustained by the Model S in the crash limited Tesla’s ability to recover data from it remotely.
When Tesla told NHTSA about the accident on May 16th, we had barely started our investigation. Tesla informed NHTSA because it wanted to let NHTSA know about a death that had taken place in one of its vehicles. It was not until May 18th that a Tesla investigator was able to go to Florida to inspect the car and the crash site and pull the complete vehicle logs from the car, and it was not until the last week of May that Tesla was able to finish its review of those logs and complete its investigation. When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article.
Here’s what we did know at the time of the accident and subsequent filing:
- That Tesla Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.
- That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all.
- That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society.
Given the fact that the “better-than-human” threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.
Finally, the Fortune article makes two other false assumptions. First, they assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.
Fortune never even addresses that point. Second, Fortune assumes that, putting all of these other problems aside, a single accident involving Autopilot, regardless of how many accidents Autopilot has stopped and how many lives it has saved, is material to Tesla’s investors. On the day the news broke about NHTSA’s decision to initiate a preliminary evaluation into the incident, Tesla’s stock traded up, not down, confirming that not only did our investors know better, but that our own internal assessment of the performance and risk profile of Autopilot was in line with market expectations.
The bottom line is that Fortune jumped the gun on a story before they had the facts. They then sought wrongly to defend that position by plucking boilerplate language from SEC filings that have no bearing on what happened, while failing to correct or acknowledge their original omissions and errors.
Full text referenced above:
We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.
“Product liability claims could harm our business, prospects, operating results and financial condition. The automobile industry experiences significant product liability claims and we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death. We also may face similar claims related to any misuse or failures of new technologies that we are pioneering, including autopilot in our vehicles and our Tesla Energy products. A successful product liability claim against us with respect to any aspect of our products could require us to pay a substantial monetary award. Our risks in this area are particularly pronounced given the limited number of vehicles and energy storage products delivered to date and limited field experience of our products. Moreover, a product liability claim could generate substantial negative publicity about our products and business and would have material adverse effect on our brand, business, prospects and operating results. We self-insure against the risk of product liability claims, meaning that any product liability claims will have to be paid from company funds, not by insurance.”
News
Tesla is not sparing any expense in ensuring the Cybercab is safe
Images shared by a longtime Giga Texas watcher showed 16 Cybercab prototypes parked near the factory’s dedicated crash test facility.
The Tesla Cybercab could very well be the safest taxi on the road when it is released and deployed for public use. This, at least, is hinted at by the intensive safety tests that Tesla appears to be putting the autonomous two-seater through at its Giga Texas crash test facility.
Intensive crash tests
Recent drone images from longtime Giga Texas watcher Joe Tegtmeyer suggest that Tesla has been busy crash-testing Cybercab units. The images, captured just before the holidays, showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility.
Tegtmeyer’s aerial photos showed the prototypes clustered outside the factory’s testing building. Some uncovered Cybercabs showed notable damage, and one even had its airbags deployed. With Cybercab production expected to start in about 130 days, Tesla appears determined to ensure that its autonomous two-seater ends up being the safest taxi on public roads.
Prioritizing safety
With no human driver controls, the Cybercab demands exceptional active and passive safety systems to protect occupants in any scenario. Considering Tesla’s reputation, it is understandable, then, that the company seems to be sparing no expense in ensuring that the Cybercab is as safe as possible.
Tesla’s focus on safety was recently highlighted when the Cybertruck achieved a Top Safety Pick+ rating from the Insurance Institute for Highway Safety (IIHS). This was a notable victory for the Cybertruck, as critics have long claimed that the vehicle would be one of the most unsafe trucks on the road, if not the most unsafe, due to its appearance. The vehicle’s Top Safety Pick+ rating, if anything, simply proved that Tesla never neglects to make its cars as safe as possible, and that definitely includes the Cybercab.
Elon Musk
Tesla’s Elon Musk gives timeframe for FSD’s release in UAE
Provided that Musk’s timeframe proves accurate, FSD could begin rolling out across the Middle East, starting with the UAE, next year.
Tesla CEO Elon Musk stated on Monday that Full Self-Driving (Supervised) could launch in the United Arab Emirates (UAE) as soon as January 2026.
Provided that Musk’s timeframe proves accurate, FSD could begin rolling out across the Middle East, starting with the UAE, next year.
Musk’s estimate
In a post on X, UAE-based political analyst Ahmed Sharif Al Amiri asked Musk when FSD would arrive in the country, quoting an earlier post where the CEO encouraged users to try out FSD for themselves. Musk responded directly to the analyst’s inquiry.
“Hopefully, next month,” Musk wrote. The exchange attracted a lot of attention, with numerous X users sharing their excitement at the idea of FSD being brought to a new country. FSD (Supervised), after all, would likely allow hands-off highway driving, urban navigation, and parking under driver oversight in traffic-heavy cities such as Dubai and Abu Dhabi.
Musk’s comments about FSD’s arrival in the UAE were posted following his visit to the Middle Eastern country. Over the weekend, images were shared online of Musk meeting with HH Sheikh Hamdan bin Mohammed, the Crown Prince of Dubai, who also serves as the UAE’s Deputy Prime Minister and Minister of Defense. Musk also shared a supportive message about the country, posting “UAE rocks!” on X.
FSD recognition
FSD has been getting quite a lot of favorable coverage from foreign media outlets. FSD (Supervised) earned high marks from Germany’s largest car magazine, Auto Bild, during a test in Berlin’s challenging urban environment. The demonstration highlighted the system’s ability to handle dense traffic, construction sites, pedestrian crossings, and narrow streets with smooth, confident decision-making.
Journalist Robin Hornig was particularly struck by FSD’s superior perception and tireless attention, stating: “Tesla FSD Supervised sees more than I do. It doesn’t get distracted and never gets tired. I like to think I’m a good driver, but I can’t match this system’s all-around vision. It’s at its best when both work together: my experience and the Tesla’s constant attention.” Only one intervention was needed, when the system misread a route, underscoring its maturity while relying on a vision-only sensor suite and over-the-air learning.
News
Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco
“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle running FSD as it navigated pitch-black San Francisco streets during the city’s widespread power outage.
While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only system kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.
Tesla FSD handles total darkness
The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, yet the vehicle drove confidently and cautiously, just like a proficient human driver.
Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Musk’s and the Tesla AI team’s posts highlight the idea that FSD operates much like an experienced human driver. Because the system relies on cameras and neural networks rather than a complicated suite of sensors and external infrastructure, its vehicles can adapt to challenging circumstances as they emerge. That certainly seemed to be the case in San Francisco.
Waymo’s blackout struggles
Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed fleets of Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do once the lights went out.
In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”
A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”