The Tesla Autopilot Excuse: How EV ignorance created the perfect storm for a misinformation nightmare

Credit: Tesla

It was only a few hours after the accident and a bold statement was already making its rounds in the mainstream media. Another Tesla has crashed, and this time, it took the lives of two individuals from Texas. Facing inquiries from journalists eager for some clarity as to what happened in the tragic incident, Harris County Pct. 4 Constable Mark Herman shared a surprisingly confident and bold statement: there was no one in the ill-fated Model S’ driver seat when it crashed. 

“They are 100% certain that no one was in the driver seat driving that vehicle at the time of impact. They are positive. And again, the height from the back seat to the front seat, that would be almost impossible, but again our investigators are trained. They handle collisions. Several of our folks are reconstructionists, but they feel very confident just with the positioning of the bodies after the impact that there was no one driving that vehicle,” Herman said, also noting that the electric car’s fire was out of control for four hours. 

This statement, as well as the headlines that followed it, has since been proven false. Today, they stand as a remarkable case study in how misinformation spreads, and how the truth, even if it eventually emerges from legitimate sources, is largely ignored. This is the story of a Model S crash, rushed statements, and how general ignorance of electric vehicles can result in a massive misinformation nightmare.

But to get a complete view of this story, one has to go back to that fateful night of April 17, 2021, when two men, a 59-year-old Tesla owner and his 69-year-old passenger, crashed after traveling only about 550 feet. The vehicle departed the road on a curve, drove over a curb, hit a drainage culvert and a raised manhole, and smashed into a tree. It was ablaze following the crash.

The location where the accident happened. (Credit: NTSB)

The Accident

As it is with other Tesla crashes, the Model S crash in Texas immediately caught the attention of the national media. It did not take long before even foreign outlets were running with the story. It was during this initial wave of media attention that Constable Mark Herman noted that investigators were 100% sure that there was no one driving the car when it crashed. This statement was gold to numerous media outlets, with some, like the New York Post, posting tweets claiming that the ill-fated Tesla was on Autopilot. It's pertinent to note that the Constable never mentioned Autopilot, though his statement that there was no one in the driver's seat seemed like a strong enough link to the driver-assist suite.

Soon, even organizations such as Consumer Reports joined the fray, graciously demonstrating that Autopilot could indeed be “fooled” into operating without a human in the driver’s seat. Consumer Reports‘ walkthrough was thorough, showing audiences exactly what needs to be done to defeat Autopilot’s safety measures. This stunt caught the eye of both national and international media as well, and by this time, the narrative was set: Teslas can drive themselves without a driver, and Autopilot could kill. It’s a chilling thought, but it is one that seemed to be casually supported by Ford CEO Jim Farley, who shared Consumer Reports‘ Autopilot defeat device walkthrough on his personal Twitter page. 

This is not to say that the narrative surrounding the fatal Model S crash in Texas was ironclad, however. Just days after the initial crash, Palmer Buck, fire chief for The Woodlands Township Fire Department, told the Houston Chronicle that, contrary to some reports in the media, the ill-fated Model S was not ablaze for four hours. The fire chief also stated that firefighters did not call Tesla for help, and that he was unaware of any hotline for tips on how to control a battery fire.


The First Cracks — And A Persistent Misunderstanding

Interestingly enough, even Constable Herman himself seemed less sure about his information later on, noting in a statement to Reuters that his investigators were “almost 99.9% sure” that there was no one in the driver’s seat of the ill-fated car. This was despite Herman noting that they had executed a search warrant on Tesla to secure data about the tragic incident. Meanwhile, Elon Musk went on Twitter to state that data logs so far showed that the ill-fated vehicle was not on Autopilot when it crashed. 

Tesla’s online community took it upon themselves to make sense of the situation, which seemed to have red flags all over the place. The Constable’s statements seemed premature at best, and reports about the vehicle’s fire had been proven false by the fire chief. Couple this with Elon Musk noting that Autopilot was not involved, and it was no surprise that the crash became a topic for analysis and conversations among Tesla supporters. These efforts, however, were largely dismissed if not mocked, with media outlets such as VICE stating that the behavior of the Tesla sleuths was akin to those who believe in conspiracy theories.

“Rather than waiting for the two different federal authorities investigating the crash to publish their findings, some Tesla owners are engaging in the classic behavior of conspiracy theorists and amateur internet sleuths in an apparent attempt to cast doubt on even the most basic facts surrounding the crash,” the publication noted. 

More cracks in the initial "Autopilot crash" narrative emerged during the company's Q1 2021 earnings call. Lars Moravy, Tesla's vice president of vehicle engineering, stated that the company had conducted tests with investigators and determined that Autosteer could not be engaged in the area. He also stated that, judging by the distance from the owner's home to the crash site, the Model S would only have accelerated to about 30 mph over the entire 550-foot distance had Adaptive Cruise Control been used. This was undoubtedly a clarification about the incident, but like many things in this story, it too was misunderstood.

Not long after Tesla's Q1 2021 earnings call, CBS published a piece titled "At Least One Tesla Autopilot Feature Was Active During Texas Crash That Killed 2." It's definitely a catchy headline, and one that was sure to draw plenty of eyes. There was only one problem: the whole premise of the article was false. To rub salt in the wound, Texas Rep. Kevin Brady shared the CBS piece on Twitter, noting that "Despite early claims by (Tesla and Elon Musk), Autopilot WAS engaged in (the) tragic crash in The Woodlands. We need answers."

A Grassroots Movement

In a world where misinformation is prevalent, even among media outlets that may or may not be incentivized to publish completely accurate reports, citizen journalism has the potential to become a voice of reason. In the case of the Tesla Texas crash, it certainly was. After conversations with sources, some of whom opted to remain anonymous, Teslarati could surmise that the efforts of regular people, from electric vehicle advocates to space enthusiasts inspired by Elon Musk's SpaceX, may have ultimately helped get the right information about the incident to the right place.

Days after the incident, and a few weeks before the release of the National Transportation Safety Board's (NTSB) preliminary report, @GoGundam1, a Texas-based SpaceX advocate, heard alarm bells when Constable Herman confidently declared that he was 100% sure there was no one in the driver's seat of the ill-fated Model S. Familiar with Elon Musk's companies, the SpaceX enthusiast was also knowledgeable about Tesla and its products, which made the Constable's statements seem disingenuous at best. Annoyed by the noticeably false narrative that was forming, the space advocate sent out some feelers.

The story that emerged was quite remarkable. Information gathered by citizen informants suggested that by April 22, Constable Herman's office was already in possession of video evidence that directly contradicted the narrative initially presented to the media. It was a disturbing thought, but informants also suggested that the Constable's office intended to sit on the information for as long as possible. Granted, these events may seem like the plot of a semi-decent movie, but considering the Constable's relative silence after his statements about the search warrant served on Tesla, it does seem like the motivation for a follow-up report clarifying the incident was not really there.

Pertinent information about the Tesla Texas crash, no matter how valuable, would be next to useless if it did not catch the attention of the right entities. And thus, with the information gathered, the SpaceX enthusiast decided to reach out to members of the Tesla community for help. It was a challenging task, but eventually, @LordPente, a longtime Tesla advocate, decided to lend a hand. After numerous messages to other members of the Tesla community, the longtime EV advocate appeared to hit a breakthrough by (seemingly) reaching someone at Tesla. The SpaceX enthusiast, for his part, failed to get in touch with Tesla but was able to send a report to the NTSB, tipping off the agency about the additional video evidence in the Constable’s office. 

During Teslarati’s conversation with the informant and the Tesla advocate, both noted that they were not really sure if their information reached the right entities. However, something happened not long after which suggested that it did. 

The remains of the ill-fated Tesla Model S (Credit: NTSB)

The Lie Unravels

On May 10, 2021, the National Transportation Safety Board (NTSB) published its preliminary report about the Tesla Model S’ fatal Texas crash. As per the NTSB’s report, “footage from the owner’s home security camera shows the owner entering the car’s driver’s seat and the passenger entering the front passenger seat.” Apart from this, the NTSB also noted that tests of a similar vehicle at the crash location showed that Autopilot could not be engaged in the area, just as Tesla and the electric vehicle community suggested amidst the initial wave of “Autopilot crash” reports. The investigation is ongoing, of course, but based on what the NTSB has published so far, it appears that Autopilot has been absolved in the incident. 

The findings presented in the NTSB's report all but confirmed what Elon Musk and Tesla supporters had been arguing online. It may be disappointing to media outlets like VICE, but the conspiracy theorist-like behavior exhibited by some Tesla sleuths online turned out to be justified. There really was misinformation being floated around, and if it were not for the efforts of a few individuals, pertinent information about the incident might not have been submitted to Tesla or the NTSB in time.

Interestingly enough, Harris County Pct. 4 Constable Mark Herman has remained silent for now. Teslarati attempted to reach his office through email but was unsuccessful. The Constable, at least for now, has yet to issue a correction or retraction of his initial, now-debunked statements about the incident. Individuals such as Texas Rep. Kevin Brady have not admitted to making a mistake either.

How Misinformation Becomes Truth

Tesla, being a rather unorthodox company led by an equally unorthodox man, tends to fall victim to misinformation, lots and lots of it. The story of the Texas crash is a great example, but it is one drop in a whole bucket of inaccurate reports about the company. Tesla CEO Elon Musk has seemingly thrown in the towel on mainstream media coverage, reportedly abolishing Tesla's PR department last year. This, of course, has pretty much opened the doors to even more misinformation, and to a point, even disinformation, which, in turn, becomes the general public's truth.

For professional insights on how misinformation becomes accepted, Teslarati reached out to Stephen Benning, a Professor of Psychology at the University of Nevada, Las Vegas. Professor Benning explained that humans tend to have an anchoring bias, in which the first information used to make a judgment influences it. While anchoring bias is typically discussed in the context of numerical judgments (like estimates of how much something is worth), it can also play out when people hear the first reports of what happened. This is most notable if the event is memorable, like a fatal Tesla crash. The initial information is likely to stick in people's minds and create a framework that sets their beliefs about the event.

“Because initial reports set people’s prior beliefs, additional information has to weigh against established beliefs. People might have additional biases at play, like the confirmation bias that filters out information that isn’t consistent with a previous set of beliefs. It’s as if people put up filters to help themselves maintain the consistency of their beliefs at the expense of their potential correspondence with reality. The initial crash reports were also likely more vivid than the drier details of the subsequent investigation, so the availability heuristic might make those initial reports more vivid and accessible in people’s memories when they think about the crash – even if they’ve followed the subsequent reports,” he wrote. 


Emma Frances Bloomfield (Ph.D.), currently an Assistant Professor of Communication Studies at the University of Nevada, Las Vegas with expertise in strategies for combating misinformation, explained to Teslarati that misinformation and disinformation ultimately travel very quickly because they tend to be compelling and engaging, all while confirming an audience's biases. This made the Texas crash a perfect storm of sorts, as it was a compelling event that catered to biases against Tesla and its Autopilot system. Unfortunately, Assistant Professor Bloomfield also highlighted that once misinformation sets in, it takes a ton of effort to overturn.

“To address misinformation, people can create more complete stories that replace the incorrect one, provide trustworthy authority figures to deliver the message, and not repeat the false information when making the correction. You can also emphasize the importance of accurate information to make the best decisions moving forward and highlight how those changes might benefit the audience/consumer. We also say, ‘correct early and correct often’ to try and get ahead of the temporal advantage misinformation has and to counter the repetition of the false information,” she wrote. 

A Battle That Tesla Doesn’t Need To Lose

If there is one thing highlighted by Professor Benning and Assistant Professor Bloomfield, it is that misinformation is hard to battle once it has settled in. And for a lie to settle in, it has to be repeated. The Texas crash demonstrated this: it didn't start with a lie, but with a premature, careless statement that could easily be twisted into one.

The Constable's certainty that there was no one in the driver's seat was premature at best. Reports framing the incident as an Autopilot crash were likewise premature at best, or a lie at worst, and reports about an uncontrollable blaze burning for four hours were false as well. Yet the narrative was hammered down so thoroughly, and challenged so little, that even when the NTSB preliminary report came out, the needle barely moved.

Elon Musk's reservations about maintaining a relationship with the media are understandable. Years of inaccurate reports tend to do that to a person. However, Tesla could also adopt a much more assertive anti-misinformation strategy. Tesla China has been doing this as of late, to great effect. Anyone following the Tesla China story would know that the company was embroiled in a PR storm involving alleged "brake failure" incidents with the company's vehicles. But after an assertive legal campaign from Tesla China, media outlets have issued apologies for misreporting on the company, and social media personalities have admitted to making up alleged incidents that painted the company's vehicles in a negative light. Granted, such strategies may not be as effective in the United States, but something has to be done. What that something is remains an open question.

Do you have anything to share with the Teslarati Team? We’d love to hear from you, email us at tips@teslarati.com.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.


Tesla Full Self-Driving v14.2.2.5 might be the most confusing release ever

With each Full Self-Driving release, I am realistic. I know some things are going to get better, and I know some things will regress slightly. However, these instances of improvements are relatively mild, as are the regressions. Yet, this version has shown me that it contains extremes of both.

Credit: Tesla

Tesla Full Self-Driving v14.2.2.5 hit my car back on Valentine's Day, February 14, and in the time since, it has become, in my opinion, the most confusing release I've ever used.


It has been about three weeks of driving on v14.2.2.5; I’ve used it for nearly every mile traveled since it hit my car. I’ve taken short trips of 10 minutes or less, I’ve taken medium trips of an hour or less, and I’ve taken longer trips that are over 100 miles per leg and are over two hours of driving time one way.

These are my thoughts on it thus far:

Speed Profiles Are a Mixed Bag

Speed Profiles are something Tesla seems to tinker with quite frequently, and each version tends to show a drastic difference in how each one behaves compared to the previous version.

I do a vast majority of my FSD travel using Standard and Hurry modes, although in bad weather, I will scale it back to Chill, and when it’s a congested city on a weekend or during rush hour, I’ll throw it into Mad Max so it takes what it needs.

Early on, Speed Profiles really felt great. This is one of those really subjective parts of FSD where one person might think a mode travels too quickly, while another might see identical performance as too slow or just right.

Personally, I would like to see more consistency in them from release to release, but overall, things are pretty good. There are no real complaints on my end, unlike with previous releases.

In a past release, Mad Max traveled under the speed limit quite frequently, and I only had that experience because Hurry was acting the same way. I’ve had no instances of that with v14.2.2.5.

Strange Turn Signal Behavior

This is the first Full Self-Driving version where I’ve had so many weird things happen with the turn signals.

Two things come to mind: using a turn signal on a sharp turn, and ignoring the navigation while putting on the wrong turn signal. I've encountered both on v14.2.2.5.

On my way to the Supercharger, I take a road that has one semi-sharp right-hand turn with a driveway entrance right at the beginning of the turn.

Only recently, with the introduction of v14.2.2.5, have I had FSD put on the right turn signal when going around this turn. It's obviously a minor issue, but it still happens, and it's not standard practice.

When sharing this on X, I had Tesla fans (the ones who refuse to acknowledge that the company can make mistakes) tell me that it’s a “valid” behavior that would be taught to anyone who has been “professionally trained” to drive.

Apparently, if you complain about this turn signal, you are also claiming you know more than Tesla engineers…okay.

Nobody in their right mind goes around a sharp turn and puts on a signal while continuing on the same road. You would put on a left turn signal to indicate you were turning into that driveway, if that was your intention.

Like I said, it's a totally minor issue. However, it's not needed, nor is it normal. If I were in the car with someone who was taking a simple turn on a road they were traveling, and they signaled just because the turn was sharp, I'd be scratching my head.

I’ve also had three separate instances of the car completely ignoring the navigation and putting on a signal that is opposite to what the routing says. Really quite strange.

Parking Performance is Still Underwhelming

Parking has been a complaint of mine with FSD for a long time, so much so that it is pretty rare that I allow the vehicle to park itself. More often than not, it is because I want to pick a spot that is relatively isolated.

However, in the times I allow it to pull into a spot, it still does some pretty head-scratching things.

Recently, it tried to back into a spot that was ~60% covered in plowed snow. The snow was piled about six feet high in a Target parking lot.


A few days later, it tried backing into a spot where someone failed the universal litmus test of returning their shopping cart. Both choices were baffling and required me to manually move the car to a different portion of the lot.

On both occasions, I then used Autopark on a spot I picked myself, and it did a great job of getting into it. I notice that parking performance is much better when I manually choose the spot than when the car handles the entire process, meaning choosing the spot and parking in it.

It’s Doing Things (For Me) It’s Never Done Before

Two things that FSD has never done before, at least for me, are slow down in School Zones and avoid deer. The first is something I usually take over manually, and the second I surprisingly have not had to deal with yet.

I had my Tesla slow down in a school zone yesterday for the first time, traveling at 20 MPH rather than the 15 MPH the sign required, matching the speed of the other cars in the school zone. This was impressive and the first time I had experienced it.

I would like to see this more consistently, and I think School Zones should be one of those areas where, no matter what, FSD will only travel the speed limit.

Last night, FSD v14.2.2.5 recognized a deer in a roadside field and slowed down for it.

Navigation Still SUCKS

Navigation will be a complaint until Tesla proves it can fix it. For now, it’s just terrible.

It still has not figured out how to leave my neighborhood. I give it the opportunity to prove me wrong each time I leave my house, and it just can’t do it.

It always tries to go out of the primary entrance/exit of the neighborhood when the route needs to take me left, even though that exit is right-turn-only. I always leave a voice note for Tesla about it.

It still picks incredibly baffling routes for simple navigation. It’s the one thing I still really want Tesla to fix.


Tesla gets tip of the hat from major Wall Street firm on self-driving prowess


Credit: Tesla

Tesla received a tip of the hat from major Wall Street firm Bank of America on Wednesday, as it reinitiated coverage on Tesla shares with a bullish stance that comes with a ‘Buy’ rating and a $460 price target.

In a new note that marks a sharp reversal from its neutral position earlier in 2025, the bank declared Tesla’s Full Self-Driving (FSD) technology the “leading consumer autonomy solution.”

Analysts highlighted Tesla’s camera-only architecture, known as Tesla Vision, as a strategic masterstroke. While technically more challenging than the multi-sensor setups favored by rivals, the vision-based approach is dramatically cheaper to produce and maintain.

This cost edge, combined with Tesla’s rapidly expanding real-world data engine, positions the company to scale robotaxis far more profitably than competitors, BofA argues in the new note:

“Tesla is at the forefront of autonomous driving, supported by a camera-only approach that is technically harder but much cheaper than the multi-sensor systems widely used in the industry. This strategy should allow Tesla to scale more profitably compared to Robotaxi competitors, helped by a growing data engine from its existing fleet.”

The bank now attributes roughly 52% of Tesla’s total valuation to its Robotaxi ambitions. It also flagged meaningful upside from the Optimus humanoid robot program and the fast-growing energy storage business, suggesting the auto segment’s recent headwinds, including expired incentives, are being eclipsed by these higher-margin opportunities.

Tesla’s own data underscores exactly why Wall Street is waking up to FSD’s potential. According to Tesla’s official safety reporting page, the FSD Supervised fleet has now surpassed 8.4 billion cumulative miles driven.


That total ballooned from just 6 million miles in 2021 to 80 million in 2022, 670 million in 2023, 2.25 billion in 2024, and a staggering 4.25 billion in 2025 alone. In the first 50 days of 2026, owners added another 1 billion miles — averaging more than 20 million miles per day.
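As a back-of-the-envelope check, those figures can be tallied in a few lines. The numbers below are the article's rounded per-year totals, so the sum only approximates the 8.4 billion cumulative figure cited; this is plain arithmetic, not Tesla data.

```python
# Rounded annual FSD (Supervised) mileage figures, as cited above.
annual_fsd_miles = {
    2021: 6e6,
    2022: 80e6,
    2023: 670e6,
    2024: 2.25e9,
    2025: 4.25e9,
}
miles_added_2026 = 1e9   # added in the first 50 days of 2026
days_elapsed_2026 = 50

cumulative = sum(annual_fsd_miles.values()) + miles_added_2026
daily_average = miles_added_2026 / days_elapsed_2026

print(f"Cumulative: ~{cumulative / 1e9:.2f}B miles")       # ~8.26B, in line with the 8.4B cited
print(f"2026 pace: {daily_average / 1e6:.0f}M miles/day")  # 20M miles/day
```

The small gap between ~8.26 billion and the 8.4 billion headline figure comes from the rounding in the per-year numbers.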

This avalanche of real-world, camera-captured footage, much of it on complex city streets, gives Tesla an unmatched training dataset. Every mile feeds its neural networks, accelerating improvement cycles that lidar-dependent rivals simply cannot match at scale.

Tesla owners themselves will tell you the suite gets better with every release, bringing new features and improvements to its self-driving project.

The $460 target implies roughly 15 percent upside from recent trading levels around $400. While regulatory and safety hurdles remain, BofA’s endorsement signals growing institutional conviction that Tesla’s data advantage is not hype; it’s a tangible moat already delivering billions of miles of proof.
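The implied upside is easy to verify; the $400 figure below is the article's approximation of recent trading levels, not an exact quote.

```python
price_target = 460.0  # BofA's price target
recent_price = 400.0  # approximate recent trading level, per the article
upside = price_target / recent_price - 1
print(f"Implied upside: {upside:.0%}")  # roughly 15%, as stated
```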


Tesla to discuss expansion of Samsung AI6 production plans: report

Tesla has reportedly requested an additional 24,000 wafers per month, which would bring total production capacity to around 40,000 wafers if finalized.

Credit: Tom Cross

Tesla is reportedly discussing an expansion of its next-generation AI chip supply deal with Samsung Electronics. 

As per a report from Korean industry outlet The Elec, Tesla purchasing executives are reportedly scheduled to meet Samsung officials this week to negotiate additional production volume for the company’s upcoming AI6 chip.

Industry sources cited in the report stated that Tesla is pushing to increase the production volume of its AI6 chip, which will be manufactured using Samsung’s 2-nanometer process.

Tesla previously signed a long-term foundry agreement with Samsung covering AI6 production through December 31, 2033. The deal was reportedly valued at about 22.8 trillion won (roughly $16–17 billion).

Under the existing agreement, Tesla secured approximately 16,000 wafers per month from the facility. The company has reportedly requested an additional 24,000 wafers per month, which would bring total production capacity to around 40,000 wafers if finalized.
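The reported wafer figures line up, as a quick sanity check shows (numbers are the report's, and the totals are monthly capacities):

```python
current_allocation = 16_000  # wafers/month secured under the existing agreement
requested_addition = 24_000  # additional wafers/month reportedly requested
total_if_finalized = current_allocation + requested_addition
print(total_if_finalized)    # 40000, matching the ~40,000 wafers in the report
```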

Tesla purchasing executives are expected to discuss detailed supply terms during their visit to Samsung this week.

The AI6 chip is expected to support several Tesla technologies. Industry sources stated that the chip could be used for the company’s Full Self-Driving system, the Optimus humanoid robot, and Tesla’s internal AI data centers.

The report also indicated that AI6 clusters could replace the role previously planned for Tesla’s Dojo AI supercomputer. Instead of a single system, multiple AI6 chips would be combined into server-level clusters.

Tesla’s semiconductor collaboration with Samsung dates back several years. Samsung participated in the design of Tesla’s HW3 (AI3) chip and manufactured it using a 14-nanometer process. The HW4 chip currently used in Tesla vehicles was also produced by Samsung using a 5-nanometer node.

Tesla previously planned to split production of its AI5 chip between Samsung and TSMC. However, the company reportedly chose Samsung as the primary partner for the newer AI6 chip.
