The Tesla Autopilot Excuse: How EV ignorance created the perfect storm for a misinformation nightmare

Credit: Tesla

It was only a few hours after the accident, and a bold statement was already making its rounds in the mainstream media. Another Tesla had crashed, and this time, the accident took the lives of two men in Texas. Facing inquiries from journalists eager for clarity about what happened in the tragic incident, Harris County Pct. 4 Constable Mark Herman shared a surprisingly confident statement: there was no one in the ill-fated Model S’ driver’s seat when it crashed. 

“They are 100% certain that no one was in the driver seat driving that vehicle at the time of impact. They are positive. And again, the height from the back seat to the front seat, that would be almost impossible, but again our investigators are trained. They handle collisions. Several of our folks are reconstructionists, but they feel very confident just with the positioning of the bodies after the impact that there was no one driving that vehicle,” Herman said, also noting that the electric car’s fire was out of control for four hours. 

This statement, as well as the headlines that followed it, has since been proven false. Today, it stands as a remarkable case study in how misinformation spreads, and how the truth, even when it eventually emerges from legitimate sources, can be largely ignored. This is the story of a Model S crash, rushed statements, and how general ignorance of electric vehicles resulted in a massive misinformation nightmare. 

But to get a complete view of this story, one has to go back to the night of April 17, 2021, when two men, a 59-year-old Tesla owner and his 69-year-old passenger, crashed after traveling only about 550 feet. The Model S departed the road on a curve, drove over a curb, hit a drainage culvert and a raised manhole, and smashed into a tree. The vehicle caught fire following the crash.

The location where the accident happened. (Credit: NTSB)

The Accident

As with other Tesla crashes, the Model S crash in Texas immediately caught the attention of the national media. It did not take long before even foreign outlets were running with the story. It was during this initial wave of media attention that Constable Mark Herman noted that investigators were 100% sure that there was no one driving the car when it crashed. This statement was gold to numerous media outlets, with some, like the New York Post, posting a tweet claiming that the ill-fated Tesla was on Autopilot. It’s pertinent to note that the Constable never mentioned Autopilot, though his statement declaring that there was no one in the driver’s seat seemed like a strong enough link to the driver-assist suite. 

Soon, even organizations such as Consumer Reports joined the fray, graciously demonstrating that Autopilot could indeed be “fooled” into operating without a human in the driver’s seat. Consumer Reports’ walkthrough was thorough, showing audiences exactly what needs to be done to defeat Autopilot’s safety measures. This stunt caught the eye of both national and international media as well, and by this time, the narrative was set: Teslas can drive themselves without a driver, and Autopilot could kill. It’s a chilling thought, but it is one that seemed to be casually supported by Ford CEO Jim Farley, who shared Consumer Reports’ Autopilot defeat device walkthrough on his personal Twitter page. 

This is not to say that the narrative surrounding the fatal Model S crash in Texas was ironclad, however. Just days after the initial crash, Palmer Buck, fire chief for The Woodlands Township Fire Department, told the Houston Chronicle that, contrary to some reports in the media, the ill-fated Model S was not ablaze for four hours. The fire chief also stated that firefighters did not call Tesla for help, and that he was unaware of any hotlines for tips on how to control a battery fire. 

The First Cracks — And A Persistent Misunderstanding

Interestingly enough, even Constable Herman himself seemed less sure of his information later on, noting in a statement to Reuters that his investigators were “almost 99.9% sure” that there was no one in the driver’s seat of the ill-fated car. This was despite Herman noting that his office had served a search warrant on Tesla to secure data about the tragic incident. Meanwhile, Elon Musk went on Twitter to state that the data logs recovered so far showed that the ill-fated vehicle was not on Autopilot when it crashed. 

Tesla’s online community took it upon themselves to make sense of the situation, which seemed to have red flags all over the place. The Constable’s statements seemed premature at best, and reports about the vehicle’s fire had been proven false by the fire chief. Couple this with Elon Musk noting that Autopilot was not involved, and it was no surprise that the crash became a topic of analysis and conversation among Tesla supporters. These efforts, however, were largely dismissed, if not mocked, with media outlets such as VICE stating that the behavior of the Tesla sleuths was akin to that of conspiracy theorists.

“Rather than waiting for the two different federal authorities investigating the crash to publish their findings, some Tesla owners are engaging in the classic behavior of conspiracy theorists and amateur internet sleuths in an apparent attempt to cast doubt on even the most basic facts surrounding the crash,” the publication noted. 

More cracks in the initial “Autopilot crash” narrative emerged during the company’s Q1 2021 earnings call. Lars Moravy, Tesla’s Vice President of Vehicle Engineering, stated that the company had conducted tests with investigators and determined that Autosteer could not be engaged in the area. He also stated that, judging by the distance from the owner’s home to the crash site, the Model S could only have accelerated to 30 mph over the 550-foot distance using Adaptive Cruise Control. This was undoubtedly a clarification of the incident, but like many things in this story, it too was misunderstood. 

Not long after Tesla’s Q1 2021 earnings call, CBS published a piece titled “At Least One Tesla Autopilot Feature Was Active During Texas Crash That Killed 2.” It’s definitely a catchy headline, and one that was sure to draw a decent amount of eyes. There was only one problem: the whole premise of the article was false. To rub salt into the wound, Texas Rep. Kevin Brady shared the CBS piece on Twitter, noting that “Despite early claims by (Tesla and Elon Musk), Autopilot WAS engaged in (the) tragic crash in The Woodlands. We need answers.” 

A Grassroots Movement

In a world where misinformation is prevalent, even from media outlets that may or may not be incentivized to publish completely accurate reports, citizen journalism has the potential to become a voice of reason. In the case of the Tesla Texas crash, it certainly was. After conversations with sources, some of whom opted to remain anonymous, Teslarati could surmise that the efforts of regular people, from electric vehicle advocates to space enthusiasts inspired by Elon Musk’s SpaceX, may have ultimately helped get the right information about the incident to the right place. 

Days after the incident, and a few weeks before the release of the National Transportation Safety Board’s (NTSB) preliminary report, @GoGundam1, a Texas-based SpaceX advocate, heard alarm bells after Constable Herman confidently declared that he was 100% sure there was no one in the driver’s seat of the ill-fated Model S. Familiar with Elon Musk’s companies, the SpaceX enthusiast was also knowledgeable about Tesla and its products, which made the Constable’s statements seem disingenuous at best. Annoyed by the noticeably false narrative that was forming, the space advocate sent out some feelers to test the waters. 

The story that emerged was quite remarkable. Information gathered by citizen informants suggested that by April 22, Constable Herman’s office was already in possession of video evidence that directly contradicted the narrative initially presented to the media. More disturbingly, informants also suggested that the Constable’s office intended to sit on the information for as long as possible. Granted, these events may seem like the plot of a semi-decent movie, but considering the Constable’s relative silence after noting that a search warrant had been served on Tesla, it does seem that the motivation for a follow-up report clarifying the incident was not really there. 

Pertinent information about the Tesla Texas crash, no matter how valuable, would be next to useless if it did not catch the attention of the right entities. And thus, with the information gathered, the SpaceX enthusiast decided to reach out to members of the Tesla community for help. It was a challenging task, but eventually, @LordPente, a longtime Tesla advocate, decided to lend a hand. After numerous messages to other members of the Tesla community, the longtime EV advocate appeared to hit a breakthrough by (seemingly) reaching someone at Tesla. The SpaceX enthusiast, for his part, failed to get in touch with Tesla but was able to send a report to the NTSB, tipping off the agency about the additional video evidence in the Constable’s office. 

During Teslarati’s conversation with the informant and the Tesla advocate, both noted that they were not really sure if their information reached the right entities. However, something happened not long after which suggested that it did. 

The remains of the ill-fated Tesla Model S (Credit: NTSB)

The Lie Unravels

On May 10, 2021, the National Transportation Safety Board (NTSB) published its preliminary report about the Tesla Model S’ fatal Texas crash. As per the NTSB’s report, “footage from the owner’s home security camera shows the owner entering the car’s driver’s seat and the passenger entering the front passenger seat.” Apart from this, the NTSB also noted that tests of a similar vehicle at the crash location showed that Autopilot could not be engaged in the area, just as Tesla and the electric vehicle community suggested amidst the initial wave of “Autopilot crash” reports. The investigation is ongoing, of course, but based on what the NTSB has published so far, it appears that Autopilot has been absolved in the incident. 

The findings presented in the NTSB’s report all but confirmed what Elon Musk and Tesla supporters had been arguing online. It may be disappointing to media outlets like VICE, but as it turned out, the conspiracy theorist-like behavior exhibited by some Tesla sleuths online was justified. There really was misinformation floating around, and were it not for the efforts of a few individuals, pertinent information about the incident might not have been submitted to Tesla or the NTSB in time. 

Interestingly enough, Harris County Pct. 4 Constable Mark Herman has remained silent for now. Teslarati attempted to reach his office through email but was unsuccessful. The Constable, at least for now, has yet to issue a correction or retraction of his initial, now-debunked statements about the incident. Individuals such as Texas Rep. Kevin Brady have not admitted to making a mistake either. 

How Misinformation Becomes Truth

Tesla, being a rather unorthodox company led by an equally unorthodox man, tends to fall victim to misinformation — lots and lots of it. The story of the Texas crash is a great example, but it is one drop in a whole bucket full of inaccurate reports about the company. Tesla CEO Elon Musk has seemingly thrown in the towel on mainstream media coverage, reportedly abolishing Tesla’s PR department last year. This, of course, has pretty much opened the door to even more misinformation — and to a point, even disinformation — which, in turn, becomes the general public’s truth. 

For professional insights on how misinformation becomes accepted, Teslarati reached out to Stephen Benning, a Professor of Psychology at the University of Nevada, Las Vegas. Professor Benning explained that humans tend to have an anchoring bias, in which the first information used to make a judgment influences it. While anchoring bias is typically studied in numerical judgments (like estimates of how much something is worth), it can also play out when people hear the first reports of what happened. This is most notable if the event is memorable, like a fatal Tesla crash. The initial information would likely stick in people’s minds and create a framework that sets their beliefs about the event. 

“Because initial reports set people’s prior beliefs, additional information has to weigh against established beliefs. People might have additional biases at play, like the confirmation bias that filters out information that isn’t consistent with a previous set of beliefs. It’s as if people put up filters to help themselves maintain the consistency of their beliefs at the expense of their potential correspondence with reality. The initial crash reports were also likely more vivid than the drier details of the subsequent investigation, so the availability heuristic might make those initial reports more vivid and accessible in people’s memories when they think about the crash – even if they’ve followed the subsequent reports,” he wrote. 

Emma Frances Bloomfield (Ph.D.), currently an Assistant Professor of Communication Studies at the University of Nevada, Las Vegas with expertise in strategies for combating misinformation, explained to Teslarati that misinformation and disinformation ultimately travel very quickly because they tend to be compelling and engaging, all while confirming an audience’s biases. This made the Texas crash a perfect storm of sorts, as it was a compelling event that catered to biases against Tesla and its Autopilot system. Unfortunately, Assistant Professor Bloomfield also highlighted that once misinformation sets in, it takes a ton of effort to overturn. 

“To address misinformation, people can create more complete stories that replace the incorrect one, provide trustworthy authority figures to deliver the message, and not repeat the false information when making the correction. You can also emphasize the importance of accurate information to make the best decisions moving forward and highlight how those changes might benefit the audience/consumer. We also say, ‘correct early and correct often’ to try and get ahead of the temporal advantage misinformation has and to counter the repetition of the false information,” she wrote. 

A Battle That Tesla Doesn’t Need To Lose

If there is something highlighted by Professor Benning and Assistant Professor Bloomfield, it is that misinformation is hard to battle once it has settled in. And for a lie to settle in, it has to be repeated. The Texas crash demonstrated this. It did not start with a lie; it started with a premature, careless statement that could easily be twisted into one.

The Constable’s certainty that there was no one in the driver’s seat was premature, and reports that the incident was an Autopilot crash were premature at best and a lie at worst. Reports about an uncontrollable blaze burning for four hours were false as well. Yet the narrative was hammered down so hard, and so unchallenged, that even when the NTSB preliminary report came out, the needle barely moved. 

Elon Musk’s reservations about maintaining a relationship with the media are understandable. Years of inaccurate reports tend to do that to a person. However, Tesla could also adopt a much more assertive anti-misinformation strategy. Tesla China has been doing this as of late, with great results. Anyone following the Tesla China story would know that the company was embroiled in a PR storm involving alleged “brake failure” incidents with the company’s vehicles. But after an assertive legal campaign from Tesla China, media outlets have issued apologies for misreporting on the company, and social media personalities have admitted to making up incidents that painted the company’s vehicles in a negative light. Granted, such strategies may not be as effective in the United States, but something has to be done. What that something is remains an open question. 

Do you have anything to share with the Teslarati Team? We’d love to hear from you, email us at tips@teslarati.com.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.
