Elon Musk roasts WSJ coverage of fatal Tesla crash, defends Autopilot

Credit: Matt Dougherty/Twitter

Tesla CEO Elon Musk roasted the Wall Street Journal over its coverage of a crash involving a 2019 Model S that killed two men. Mainstream media reports claimed the vehicle was “driverless,” casting Tesla’s Autopilot and Full Self-Driving systems in a negative light. Musk shared new details about the vehicle involved, revealing that the car had not purchased the Full Self-Driving package, an optional add-on for Tesla owners, nor could it have been operating on Autopilot due to a lack of lane lines on the road.

Musk, in a reply to a skeptical Twitter user who didn’t believe a media outlet’s coverage of the accident, said:

“Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”

Tesla’s alleged “driverless” crash in Texas: What is known so far

The Wall Street Journal reported the story with the headline “Fatal Tesla Crash in Texas Believed to Be Driverless.” The headline plays on the automatic association between Tesla’s electric vehicles and self-driving technology. However, Tesla does not have, nor has it ever claimed to have, a self-driving vehicle or software that would let a car drive without the driver paying attention. Tesla offers a suite called Full Self-Driving (FSD) but has maintained that it remains the driver’s responsibility to watch the road and abide by all traffic rules. The FSD suite is available for $10,000 and can be purchased at any time.

However, the vehicle involved in the accident had not purchased FSD. Tesla keeps records of its cars and can verify whether any vehicle has FSD. Musk claims that the data logs recovered thus far show the involved vehicle did not have the suite.

Additionally, Tesla’s basic Autopilot suite, which now comes standard on every vehicle, would not have been able to function under the road conditions present in the area of the accident. Standard Autopilot requires lane lines in order to activate. Because this road was unmarked, Tesla’s basic Autopilot feature could not have been engaged.

Unfortunately, Tesla’s Autopilot and Full Self-Driving suites are usually the first points of blame when an accident occurs. When Teslas are involved in violent or fatal accidents, they receive extensive media coverage, which only adds fuel to skeptics’ fire against the FSD and self-driving programs Tesla is currently working to complete. In the past, Elon Musk has stated that Tesla may achieve Level 5 autonomy by the end of 2021, but neither the CEO nor the company has ever claimed that a Tesla vehicle can drive itself. The company has also put in place several safeguards that prevent a driver from letting the vehicle operate independently. If a driver does not keep their hands on the steering wheel while the car is in motion, the vehicle will automatically pull over, and Autopilot will be deactivated for the remainder of the drive.

The company has also revoked FSD access from some drivers who abused the software’s capabilities and failed to handle it responsibly.

Reports indicate that the NHTSA has launched an investigation into the Texas crash to determine the cause of the fatal collision.

Tesla recently released its Q1 2021 Safety Report, which showed that cars operating on Autopilot are nearly ten times safer than cars operated by humans alone.

Joey Klender: Transportation Writer | Penn State Alum | Future World Series of Poker Bracelet Holder 🚀 🛰 ☀️ 🚘 🧠 🕳