News
Tesla Model S on Autopilot crashes into stalled van on highway
A Tesla Model S crashes into the back of a stalled vehicle on a highway. Who is responsible: Autopilot, TACC, or the driver? Ultimately, we know the answer, but not everyone wants to admit it.

A Tesla Model S on Autopilot crashed into the back of a stalled van in the high-speed lane of a highway this week. The owner, Chris Thomann, who captured the accident on his dash cam, believes the footage shows that the Traffic-Aware Cruise Control (TACC)/Autopilot feature of his car malfunctioned. In the description of his YouTube video, Thomann claims Autopilot and TACC have worked flawlessly many times before, but this time “the forward collision warning turned on way too late, it was set to normal warning distance.”
Update: The original YouTube video has been marked as private, so we have added an animated GIF via CNET showing what happened.
There have been several instances lately in which Tesla drivers claim their cars malfunctioned, leading to collisions. Is there something wrong with these systems that people should be aware of?
The answer appears to be “No.” On Reddit, Tesla owner Ricodic took the time to post this language from page 69 of the Model S owner’s manual:
Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.
The problem is not with the software; it is with human drivers. It’s not that we don’t trust the technology; it’s that we trust it too much. We assume it means we can read the paper on the way to work or fall asleep at the wheel. We get lulled into a false sense of security by how well Autopilot and TACC work most of the time. The failure is in the human brain, which needs a moment or two to recognize that an emergency is developing and that it is time to re-assert control over the vehicle.
Tesla owner Jarrod Overson spoke about this candidly in a post on Medium after his car was involved in a collision in April. “Once I recognized the car was stopped in front of me, I explicitly remember panicking with the following thoughts going through my head: ‘Does my car see this? Is it going to do anything? NO. NO IT ISN’T. EMERGENCY.’ In retrospect, the actions I needed to take were obvious. I should have regained control immediately. That half of a second or more probably would have made a lot of difference. The problem is that my brain wasn’t primed to have that conversation with itself. Now it is.”
Overson knew some would take him to task for his error in judgment. “I’m not looking forward to the comments calling me stupid for not doing this automatically, but I felt like it’s an important topic to be open about. I’d wager we all had a time in our lives where we didn’t know the extent of some technology, trusted it too far, and had to recalibrate after we understood the limits. Now we might just have to be a little bit luckier to get to that recalibration stage.”
It’s what autonomous driving experts refer to as “the handoff,” that brief period between when everything is going along serenely and when it is not. It’s when the computer suddenly finds itself in one of what Elon Musk calls “corner cases,” instances that require human input. Often, drivers have less than a second to react.
As good as Autopilot is — and it is getting better all the time — Tesla drivers still must be aware that the company and the software expect them to step in when necessary. Many put too much faith in the technology and are willing to abdicate ultimate responsibility for the operation of the car to machines.
The glowing praise we often hear from Elon makes it easy to do. Perhaps Musk and Tesla could dial their statements about the wonders they have created back a notch. Not everyone reads every page of the owner’s manual, and even fewer commit everything in it to memory.
News
Tesla rolls out new life-saving feature for kids in Europe
On average, 37 children die every year from being left in vehicles unattended.

Tesla is rolling out a new life-saving feature in the European market, one that has been available in the United States for some time and could prove invaluable.
One of the most preventable causes of death for children is being left unattended in cars. On average, 37 children die every year after being left in hot vehicles. The cause of death is usually heatstroke, and it is almost entirely avoidable.
However, there are still instances where kids are left in vehicles and lose their lives, something many automakers have tried to combat with alerts and features of their own.
Tesla is one of them, as it has rolled out features like ultrasonic sensors to detect heartbeats, interior cameras to detect movement, and alerts to notify parents if they leave someone in the car.
A few months ago, Tesla rolled out a new feature called “Child Left Alone Detection” in the United States. It was described as:
“If an unattended child is detected, the vehicle will flash the exterior indicator lights, play an alert tone, and send a notification to your Tesla app. This will repeat at regular intervals until you return to your vehicle. Cabin data is processed locally and is not transmitted to Tesla.
This feature is enabled by default. To disable, go to Controls > Safety > Child Left Alone Detection.”
This feature was only rolled out in the U.S. at the time. It is now making its way to the European market, according to Not a Tesla App, which detected the rollout in the 2025.32.6 software update.
The rollout of this feature could prevent many of these tragedies. For many of us, it seems hard to imagine leaving something as precious as another human life in a hot car; most of us won’t even leave our vehicles without our cell phones. Yet it still happens, which is what makes this feature so valuable.
News
Tesla gets another NHTSA probe, this time related to door handles
“Although Tesla vehicles have manual door releases inside of the cabin, in these situations, a child may not be able to access or operate the releases even if the vehicle’s driver is aware of them.”

Tesla is facing another investigation into its vehicles by the National Highway Traffic Safety Administration (NHTSA), this time related to an issue with its door handles.
In a new Open Investigation named “Electronic door handles become inoperative,” the NHTSA says that it has received nine complaints from owners of the 2021 Tesla Model Y stemming from “an inability to open doors.”
These issues were reported after “parents exited their vehicle after a drive cycle in order to remove a child from the back seat or placing a child in the back seat before starting a drive cycle.” Parents said they were “unable to reopen a door to regain access to the vehicle.”
In four of the nine complaints, owners had to break a window to regain access to the cabin.
🚨 Model Year 2021 Tesla Model Y vehicles are under a preliminary investigation by the NHTSA due to a potential issue with door handles, with nine owners reporting an inability to open doors from the outside
“The most commonly reported scenarios involved parents exiting the… pic.twitter.com/u0qBBiu9LT
— TESLARATI (@Teslarati) September 16, 2025
The NHTSA goes on to explain that, while Teslas do have a manual door release inside the cabin, a child may not be able to access it:
“Although Tesla vehicles have manual door releases inside of the cabin, in these situations, a child may not be able to access or operate the releases even if the vehicle’s driver is aware of them. As a result, in these instances, an occupant who remains inside a vehicle in this condition may be unable to be rapidly retrieved by persons outside of the vehicle.”
The agency appears to be attributing the issue to low voltage in the vehicle’s 12V battery. If that is the cause, drivers would need some form of notification that the battery is running low and should be replaced before the door handles fail.
The NHTSA estimates that 174,290 vehicles are potentially affected by this issue. The agency says it plans to assess the scope and severity of the condition, as well as the approach Tesla uses to supply power to the door locks and the reliability of those power supplies.
News
Tesla won’t implement strange Grok character as Musk dispels rumor
It is nice to see that Tesla is not forcing this kind of character upon owners of their vehicles, especially considering that many people had a real problem with it.

Tesla is not going to implement a strange character as a Grok assistant in its vehicles. CEO Elon Musk has dispelled the rumor, which provoked some sharply polarizing reactions.
Yesterday, there was some controversy within the Tesla community as rumors circulated that a Grok assistant named Mūn (pronounced like “Moon”) was being implemented into the vehicles.
The rumor had some legitimacy: it was initially posted by an employee, which made it appear to be a confirmed development.
However, it really did rub some people the wrong way. Mūn was an anime-style female character dressed in revealing clothing, so it was not everyone’s style, and surely not everyone’s significant other’s cup of tea. Adding it seemed a very strange decision, especially considering that, at the time, there was no official word to dispel the rumor of the Grok assistant’s arrival.
That was until Tesla CEO Elon Musk stepped in to put the speculation to bed once and for all.
🚨 Elon has confirmed the Grok assistant rumor with the character Mūn is untrue https://t.co/EC7absBZSj pic.twitter.com/1Skhvy9USQ
— TESLARATI (@Teslarati) September 16, 2025
It was somewhat strange that this issue arose in the first place, but given that the image was initially posted by an employee, it is easy to see how the rumor gained traction.
It is nice to see that Tesla will not force this kind of character on owners of its vehicles, especially considering that many people had a real problem with it. Many owners made it clear that they would like the option to opt out:
I want something family friendly…like an Optimus avatar or something.
— FSD (Unsupervised) Test Pilot (@j32pmxr) September 16, 2025
For now, Grok remains a part of Tesla vehicles, and personally, it is very nice to have in my Model Y to answer quick questions or to entertain passengers.
Nevertheless, I am relieved I won’t have this character forced upon me in my vehicle.