News
Tesla fires back at Fortune with cheeky “Misfortune” blog post
The drama continues between Tesla and Fortune after the media outlet published a story questioning Tesla’s ethics, claiming the company sold $2 billion worth of stock but failed to disclose that it was under investigation by the National Highway Traffic Safety Administration (NHTSA) after Joshua Brown was killed when his Model S, operating in Autopilot mode, crashed into a tractor-trailer.
Since the story was published, Tesla CEO Elon Musk has defended the company’s position that news of the Autopilot-related death was not material to its stock price. Fortune disagreed, citing a $6-per-share drop after news broke that the NHTSA was in fact investigating evidence surrounding Brown’s death. Musk then fired back via email, with choice words for Fortune’s writer: “Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”
The Tesla vs. Fortune debacle spilled over into the public Twittersphere, with Fortune editor Alan Murray and Elon Musk trading tweets throughout Wednesday as Murray defended the outlet’s position that Tesla did not disclose news of the Autopilot death. Fortune went as far as quoting statements from an SEC filing in which Tesla warned investors that a fatal crash related to its Autopilot feature could be a material event for the company’s brand, business, and operating results. Tesla would later argue that Fortune mischaracterized the quote from the SEC filing.
Tesla has since released a blog post on this matter titled “Misfortune”.
Misfortune
Fortune’s article is fundamentally incorrect.
First, Fortune mischaracterizes Tesla’s SEC filing. Here is what Tesla’s SEC filing actually says: “We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.” [full text included below] This is just stating the obvious. One of the risks facing Tesla (or any company) is that someone could bring product liability claims against it. However, neither at the time of this SEC filing, nor in the several weeks to date, has anyone brought a product liability claim against Tesla relating to the crash in Florida.
Next, Fortune entirely ignores what Tesla knew and when, nor have they even asked the questions. Instead, they simply assume that Tesla had complete information from the moment this accident occurred. This was a physical impossibility given that the damage sustained by the Model S in the crash limited Tesla’s ability to recover data from it remotely.
When Tesla told NHTSA about the accident on May 16th, we had barely started our investigation. Tesla informed NHTSA because it wanted to let NHTSA know about a death that had taken place in one of its vehicles. It was not until May 18th that a Tesla investigator was able to go to Florida to inspect the car and the crash site and pull the complete vehicle logs from the car, and it was not until the last week of May that Tesla was able to finish its review of those logs and complete its investigation. When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article.
Here’s what we did know at the time of the accident and subsequent filing:
- That Tesla Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.
- That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all.
- That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society.
Given the fact that the “better-than-human” threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.
Finally, the Fortune article makes two other false assumptions. First, they assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.
Fortune never even addresses that point. Second, Fortune assumes that, putting all of these other problems aside, a single accident involving Autopilot, regardless of how many accidents Autopilot has stopped and how many lives it has saved, is material to Tesla’s investors. On the day the news broke about NHTSA’s decision to initiate a preliminary evaluation into the incident, Tesla’s stock traded up, not down, confirming that not only did our investors know better, but that our own internal assessment of the performance and risk profile of Autopilot were in line with market expectations.
The bottom line is that Fortune jumped the gun on a story before they had the facts. They then sought wrongly to defend that position by plucking boilerplate language from SEC filings that have no bearing on what happened, while failing to correct or acknowledge their original omissions and errors.
Full text referenced above:
We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.
“Product liability claims could harm our business, prospects, operating results and financial condition. The automobile industry experiences significant product liability claims and we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death. We also may face similar claims related to any misuse or failures of new technologies that we are pioneering, including autopilot in our vehicles and our Tesla Energy products. A successful product liability claim against us with respect to any aspect of our products could require us to pay a substantial monetary award. Our risks in this area are particularly pronounced given the limited number of vehicles and energy storage products delivered to date and limited field experience of our products. Moreover, a product liability claim could generate substantial negative publicity about our products and business and would have material adverse effect on our brand, business, prospects and operating results. We self-insure against the risk of product liability claims, meaning that any product liability claims will have to be paid from company funds, not by insurance.”
News
Tesla Full Self-Driving v14.2.2.5 might be the most confusing release ever
Tesla Full Self-Driving v14.2.2.5 hit my car back on Valentine’s Day, February 14, and since I’ve had it, it has become, in my opinion, the most confusing release I’ve ever had.
With each Full Self-Driving release, I am realistic. I know some things are going to get better, and I know some things will regress slightly. However, these instances of improvements are relatively mild, as are the regressions. Yet, this version has shown me that it contains extremes of both.
It has been about three weeks of driving on v14.2.2.5; I’ve used it for nearly every mile traveled since it hit my car. I’ve taken short trips of 10 minutes or less, I’ve taken medium trips of an hour or less, and I’ve taken longer trips that are over 100 miles per leg and are over two hours of driving time one way.
These are my thoughts on it thus far:
Speed Profiles Are a Mixed Bag
Speed Profiles are something Tesla seems to tinker with quite frequently, and each version tends to behave drastically differently from the previous one.
I do a vast majority of my FSD travel using Standard and Hurry modes, although in bad weather, I will scale it back to Chill, and when it’s a congested city on a weekend or during rush hour, I’ll throw it into Mad Max so it takes what it needs.
Early on, Speed Profiles really felt great. This is one of the most subjective parts of FSD: one person might think a mode travels too quickly, while another might see the identical performance as too slow or just right.
I would like to see more consistency on them from release to release, but overall, things are pretty good. There are no real complaints on my end, unlike with previous releases.
In a past release, Mad Max traveled under the speed limit quite frequently, and I only had that experience because Hurry was acting the same way. I’ve had no instances of that with v14.2.2.5.
Strange Turn Signal Behavior
This is the first Full Self-Driving version where I’ve had so many weird things happen with the turn signals.
Two things come to mind: using a turn signal on a sharp turn, and ignoring the navigation while putting on the wrong turn signal. I’ve encountered both on v14.2.2.5.
On my way to the Supercharger, I take a road that has one semi-sharp right-hand turn with a driveway entrance right at the beginning of the turn.
Only recently, with the introduction of v14.2.2.5, have I had FSD put on the right turn signal when going around this turn. It’s obviously a minor issue, but it still happens, and it’s not standard practice:
How can we get Full Self-Driving to stop these turn signals?
There’s no need to use one here; the straight path is a driveway, not a public road. The right turn signal here is unnecessary pic.twitter.com/7uLDHnqCfv
— TESLARATI (@Teslarati) February 28, 2026
When sharing this on X, I had Tesla fans (the ones who refuse to acknowledge that the company can make mistakes) tell me that it’s a “valid” behavior that would be taught to anyone who has been “professionally trained” to drive.
Apparently, if you complain about this turn signal, you are also claiming you know more than Tesla engineers…okay.
Nobody in their right mind has ever gone around a sharp turn when driving their car and put on a signal when continuing on the same road. You would put a left turn signal on to indicate you were turning into that driveway if that’s what your intention was.
Like I said, it’s a totally minor issue. However, it’s not really needed, nor is it normal. If I were in the car with someone who took a simple turn on a road they were traveling and signaled because the turn was sharp, I’d be scratching my head.
I’ve also had three separate instances of the car completely ignoring the navigation and putting on a signal that is opposite to what the routing says. Really quite strange.
Parking Performance is Still Underwhelming
Parking has been a complaint of mine with FSD for a long time, so much so that it is pretty rare that I allow the vehicle to park itself. More often than not, it is because I want to pick a spot that is relatively isolated.
However, in the times I allow it to pull into a spot, it still does some pretty head-scratching things.
Recently, it tried to back into a spot that was ~60% covered in plowed snow. The snow was piled about six feet high in a Target parking lot.
A few days later, it tried backing into a spot where someone failed the universal litmus test of returning their shopping cart. Both choices were baffling and required me to manually move the car to a different portion of the lot.
I used Autopark on both occasions once I picked a different spot, and it did a great job of getting in. Parking performance is noticeably better when I manually choose the spot than when the car handles the entire process of choosing the spot and parking in it.
It’s Doing Things (For Me) It’s Never Done Before
Two things that FSD has never done before, at least for me, are slowing down in school zones and avoiding deer. The first is something I usually handle by taking over manually, and the second, surprisingly, is a situation I had not encountered until now.
Yesterday, for the first time, my Tesla slowed down in a school zone, traveling at 20 MPH rather than the 15 MPH the sign suggested, matching the speed of the other cars in the zone. It was impressive to experience.
I would like to see this behavior more consistently, and I think school zones should be one of those areas where, no matter what, FSD travels at the posted speed limit and no faster.
Last night, FSD v14.2.2.5 recognized a deer in a roadside field and slowed down for it:
🚨 Cruising home on a rainy, foggy evening and my Tesla on Full Self-Driving begins to slow down suddenly
FSD just wanted Mr. Deer to make it home to his deer family ❤️ pic.twitter.com/cAeqVDgXo5
— TESLARATI (@Teslarati) March 4, 2026
Navigation Still SUCKS
Navigation will be a complaint until Tesla proves it can fix it. For now, it’s just terrible.
It still has not figured out how to leave my neighborhood. I give it the opportunity to prove me wrong each time I leave my house, and it just can’t do it.
It always tries to go out of the primary entrance/exit of the neighborhood when the route needs to take me left, even though that exit is a right turn only. I always leave a voice prompt for Tesla about it.
It still picks incredibly baffling routes for simple navigation. It’s the one thing I still really want Tesla to fix.
Investor's Corner
Tesla gets tip of the hat from major Wall Street firm on self-driving prowess
Tesla received a tip of the hat from major Wall Street firm Bank of America on Wednesday, as it reinitiated coverage on Tesla shares with a bullish stance that comes with a ‘Buy’ rating and a $460 price target.
In a new note that marks a sharp reversal from its neutral position earlier in 2025, the bank declared Tesla’s Full Self-Driving (FSD) technology the “leading consumer autonomy solution.”
Analysts highlighted Tesla’s camera-only architecture, known as Tesla Vision, as a strategic masterstroke. While technically more challenging than the multi-sensor setups favored by rivals, the vision-based approach is dramatically cheaper to produce and maintain.
This cost edge, combined with Tesla’s rapidly expanding real-world data engine, positions the company to scale robotaxis far more profitably than competitors, BofA argues in the new note:
“Tesla is at the forefront of autonomous driving, supported by a camera-only approach that is technically harder but much cheaper than the multi-sensor systems widely used in the industry. This strategy should allow Tesla to scale more profitably compared to Robotaxi competitors, helped by a growing data engine from its existing fleet.”
The bank now attributes roughly 52% of Tesla’s total valuation to its Robotaxi ambitions. It also flagged meaningful upside from the Optimus humanoid robot program and the fast-growing energy storage business, suggesting the auto segment’s recent headwinds, including expired incentives, are being eclipsed by these higher-margin opportunities.
Tesla’s own data underscores exactly why Wall Street is waking up to FSD’s potential. According to Tesla’s official safety reporting page, the FSD Supervised fleet has now surpassed 8.4 billion cumulative miles driven.
That total ballooned from just 6 million miles in 2021 to 80 million in 2022, 670 million in 2023, 2.25 billion in 2024, and a staggering 4.25 billion in 2025 alone. In the first 50 days of 2026, owners added another 1 billion miles — averaging more than 20 million miles per day.
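The per-day average follows directly from the figures above; this short Python sketch simply reproduces the arithmetic, using only the numbers reported in the article.

```python
# Annual FSD (Supervised) fleet mileage figures reported by Tesla, per the article.
annual_miles = {
    2021: 6e6,     # 6 million
    2022: 80e6,    # 80 million
    2023: 670e6,   # 670 million
    2024: 2.25e9,  # 2.25 billion
    2025: 4.25e9,  # 4.25 billion
}

# Miles added in the first 50 days of 2026.
miles_2026 = 1e9
days_2026 = 50

per_day = miles_2026 / days_2026
print(f"{per_day / 1e6:.0f} million miles per day")  # → 20 million miles per day

# Summing the annual figures plus the 2026-to-date miles gives roughly
# 8.26 billion, in the same ballpark as the 8.4 billion cumulative total
# cited above (which would also include miles outside these figures).
cumulative = sum(annual_miles.values()) + miles_2026
print(f"{cumulative / 1e9:.2f} billion miles")  # → 8.26 billion miles
```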
This avalanche of real-world, camera-captured footage, much of it on complex city streets, gives Tesla an unmatched training dataset. Every mile feeds its neural networks, accelerating improvement cycles that lidar-dependent rivals simply cannot match at scale.
Tesla owners themselves will tell you that the suite gets better with every release, each one bringing new features and improvements to the self-driving experience.
The $460 target implies roughly 15 percent upside from recent trading levels around $400. While regulatory and safety hurdles remain, BofA’s endorsement signals growing institutional conviction that Tesla’s data advantage is not hype; it’s a tangible moat already delivering billions of miles of proof.
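The implied upside is easy to reproduce; a minimal sketch using the round figures cited above (the $400 level is approximate, as the article notes).

```python
price_target = 460.0  # BofA's new price target
recent_price = 400.0  # approximate recent trading level cited above

upside = (price_target - recent_price) / recent_price
print(f"{upside:.0%} implied upside")  # → 15% implied upside
```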
News
Tesla to discuss expansion of Samsung AI6 production plans: report
Tesla is reportedly discussing an expansion of its next-generation AI chip supply deal with Samsung Electronics.
According to a report from Korean industry outlet The Elec, Tesla purchasing executives are scheduled to meet Samsung officials this week to negotiate additional production volume for the company’s upcoming AI6 chip.
Industry sources cited in the report stated that Tesla is pushing to increase the production volume of its AI6 chip, which will be manufactured using Samsung’s 2-nanometer process.
Tesla previously signed a long-term foundry agreement with Samsung covering AI6 production through December 31, 2033. The deal was reportedly valued at about 22.8 trillion won (roughly $16–17 billion).
Under the existing agreement, Tesla secured approximately 16,000 wafers per month from the facility. The company has reportedly requested an additional 24,000 wafers per month, which would bring total production capacity to around 40,000 wafers per month if finalized.
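The reported capacity math, tallied in a trivial sketch (figures as cited from The Elec’s report):

```python
existing_wafers_per_month = 16_000  # secured under the current agreement
requested_additional = 24_000       # extra monthly volume Tesla is reportedly seeking

total_if_finalized = existing_wafers_per_month + requested_additional
print(f"{total_if_finalized:,} wafers per month")  # → 40,000 wafers per month
```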
Tesla purchasing executives are expected to discuss detailed supply terms during their visit to Samsung this week.
The AI6 chip is expected to support several Tesla technologies. Industry sources stated that the chip could be used for the company’s Full Self-Driving system, the Optimus humanoid robot, and Tesla’s internal AI data centers.
The report also indicated that AI6 clusters could replace the role previously planned for Tesla’s Dojo AI supercomputer. Instead of a single system, multiple AI6 chips would be combined into server-level clusters.
Tesla’s semiconductor collaboration with Samsung dates back several years. Samsung participated in the design of Tesla’s HW3 (AI3) chip and manufactured it using a 14-nanometer process. The HW4 chip currently used in Tesla vehicles was also produced by Samsung using a 5-nanometer node.
Tesla previously planned to split production of its AI5 chip between Samsung and TSMC. However, the company reportedly chose Samsung as the primary partner for the newer AI6 chip.