News
Tesla with sleeping driver proves there’s still misunderstanding and irresponsibility surrounding autonomy
Update, 11:06 AM EST: Paragraph 7 added to note the possibility that the driver had a medical emergency. California Highway Patrol caught up to the vehicle and noted that the driver was awake.
A Tesla Model Y with a sleeping driver was recently spotted on the I-15 Freeway near Temecula, California, proving that both the public and the media still vastly misunderstand, and speak irresponsibly about, the capabilities of semi-autonomous vehicles.
According to a report from KTLA 5, a woman in a Tesla Model Y was followed for more than fifteen minutes on the California interstate by another driver, who was attempting to wake her as she slept while the vehicle ran on the automaker’s semi-autonomous driving functions.
The report and the incident show there are still huge misunderstandings of the capabilities of semi-autonomous driving suites, including Tesla’s Full Self-Driving and Autopilot, which require users to remain vigilant and prepared to take over the vehicle at any point.
Drivers often use semi-autonomous vehicle features irresponsibly, and social media has proven time and time again that people take advantage of the capabilities, even though they are not fully autonomous.
It is no secret that people have used whatever they can to relieve themselves of the responsibility of paying attention while the car handles some tasks on its own. With the introduction of advanced driver assistance systems (ADAS) over the past several years, drivers have taken advantage of these functions to instead play on their phones, read books, eat food, or even catch up on sleep.
However, the risks that come with this behavior are potentially catastrophic. For one, those who use these functions irresponsibly put themselves and every other driver on the road at risk: if the vehicle needs assistance or encounters a situation it cannot handle safely, the driver is responsible for taking over. Additionally, when an accident occurs, it can be framed as the fault of Tesla, or of whichever manufacturer built the vehicle, and incidents like this can set the future of semi-autonomous and autonomous driving back years by fueling skepticism.
There is the possibility that the driver had some type of medical emergency or accidentally fell asleep, in which case the Tesla’s driver-assistance features kept the operator and others safe. Police stated that officers caught up to the vehicle within two minutes of receiving calls about it, and the driver was attentive at that time.
However, the media’s portrayal of the situation also suggests that many remain uninformed about the capabilities of Teslas. While Tesla’s Full Self-Driving suite has caused controversy over its name, the automaker continues to remind those who use it to remain vigilant, as the cars cannot truly drive themselves.
In the FAQ section of Tesla’s Autopilot and Full Self-Driving page, the company answers the question, “Do I still need to pay attention while using Autopilot?”:
“Yes. Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. It does not turn a Tesla into a self-driving car nor does it make a car autonomous.
Before enabling Autopilot, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your car.” Once engaged, Autopilot will also deliver an escalating series of visual and audio warnings, reminding you to place your hands on the wheel if insufficient torque is applied. If you repeatedly ignore these warnings, you will be locked out from using Autopilot during that trip.
You can override any of Autopilot’s features at any time by steering, applying the brakes, or using the cruise control stalk to deactivate.”
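The escalating-warning behavior Tesla’s FAQ describes can be illustrated with a toy state machine. Everything here is an assumption for illustration, including the stage names, torque threshold, and timing, and it is not Tesla’s actual implementation:

```python
# Hypothetical sketch of an escalating driver-attention warning system,
# loosely modeled on the behavior Tesla's FAQ describes. All thresholds,
# stage names, and timings are illustrative assumptions.

WARNING_STAGES = ["visual", "visual+chime", "loud_alarm", "lockout"]

class AttentionMonitor:
    def __init__(self, torque_threshold=0.1, seconds_per_stage=5):
        self.torque_threshold = torque_threshold  # minimum wheel torque counted as "hands on"
        self.seconds_per_stage = seconds_per_stage
        self.inattentive_seconds = 0
        self.locked_out = False

    def update(self, wheel_torque, dt=1):
        """Advance the monitor by dt seconds; return the current warning stage or None."""
        if self.locked_out:
            return "lockout"  # feature disabled for the rest of the trip
        if wheel_torque >= self.torque_threshold:
            self.inattentive_seconds = 0  # driver applied torque: reset escalation
            return None
        self.inattentive_seconds += dt
        stage = min(self.inattentive_seconds // self.seconds_per_stage,
                    len(WARNING_STAGES) - 1)
        if WARNING_STAGES[stage] == "lockout":
            self.locked_out = True
        return WARNING_STAGES[stage]
```

The key property this sketch captures is the one in Tesla’s own description: warnings escalate the longer insufficient torque is detected, and repeated failures end in a lockout that persists even if the driver later grabs the wheel.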
Media outlets labeling the vehicle “a self-driving Tesla” do a disservice to both the public and the company. Teslas do not drive themselves: the vehicles are classified as Level 2 under the Society of Automotive Engineers’ Levels of Driving Automation, which makes clear that the driver is still responsible for driving the car even when these systems are activated. “You must constantly supervise these support features,” the SAE says. At Levels 3 through 5, the operator is not driving when the features are engaged, but only Level 5 systems are explicitly described as able to drive themselves anywhere.
“This feature can drive the vehicle under all conditions,” the SAE table states.

Credit: Society of Automotive Engineers
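For reference, the SAE J3016 taxonomy the article cites can be summarized in a small lookup. The glosses below are informal paraphrases for illustration, not the official SAE text:

```python
# Informal paraphrase of SAE J3016 driving-automation levels.
# Consult SAE J3016 itself for the authoritative definitions.
SAE_LEVELS = {
    0: ("No Driving Automation", "driver performs all driving tasks"),
    1: ("Driver Assistance", "steering OR speed support; driver supervises constantly"),
    2: ("Partial Driving Automation", "steering AND speed support; driver supervises constantly"),
    3: ("Conditional Driving Automation", "system drives in limited conditions; driver must take over on request"),
    4: ("High Driving Automation", "system drives in limited conditions; no takeover expected within them"),
    5: ("Full Driving Automation", "system can drive under all conditions"),
}

def driver_must_supervise(level):
    """Constant driver supervision is required at Levels 0-2 (Teslas are Level 2)."""
    return level <= 2
```

The boundary this lookup highlights is the one the article turns on: a Level 2 car like a Tesla always requires a supervising driver, no matter how capable it appears on a highway.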
Recent ratings by Consumer Reports showed that Tesla’s biggest flaw was driver monitoring. Many systems use cabin-facing cameras to track eye behavior and ensure the operator is watching the road. Teslas use a series of audible and visual cues to alert drivers to inattentiveness, and steering wheel sensors confirm the driver’s hands are on the wheel.
However, various cheat devices have been marketed across the internet, and in this instance, the driver appears to have had her hands on the wheel while dozing.
Tesla activated camera-based driver monitoring in May 2021. “The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged,” Tesla said in the release notes. Tests of Tesla’s driver monitoring showed the system was effective in some instances, especially at catching drivers looking at cell phones, with alerts coming within 15 seconds.
The potential irresponsibility of users poses major risks to those on the road and to the companies that develop these driver assistance programs. While workarounds exist through the previously mentioned cheat devices, people have to know their irresponsibility could cost them, or others, their lives.
I’d love to hear from you! If you have any comments, concerns, or questions, please email me at joey@teslarati.com. You can also reach me on Twitter @KlenderJoey, or if you have news tips, you can email us at tips@teslarati.com.
Elon Musk
Elon Musk reveals date of Tesla Full Self-Driving’s next massive release
Initially planned for a January or February release, v14.3 aims to add some reasoning and logic to the decisions that Full Self-Driving makes, which could improve a lot of things, including Navigation, which is a major complaint of many owners currently.
Tesla CEO Elon Musk has revealed the timing of Full Self-Driving’s next massive release: v14.3.
For months, Tesla owners with Hardware 4 have been using Full Self-Driving v14.2 and its subsequent releases. Currently, the most up-to-date FSD version is v14.2.2.5, which has drawn decidedly mixed reviews: with each release, some things get better while others regress slightly.
For the most part, things are better in terms of overall behavior.
However, many owners have been looking forward to the next release, which is v14.3, about which Musk has said many great things. Back in November, Musk said that v14.3 “is where the last big piece of the puzzle lands.”
He added:
“We’re gonna add a lot of reasoning and RL (reinforcement learning). To get to serious scale, Tesla will probably need to build a giant chip fab. To have a few hundred gigawatts of AI chips per year, I don’t see that capability coming online fast enough, so we will probably have to build a fab.”
Tesla Full Self-Driving v14.2 is a considerable improvement from early versions of the suite, but we have written about the somewhat confusing updates that have come with recent versions.
The recent updates have been incredibly difficult to gauge in terms of progress: some things have gotten better, but there appears to be real regression on a handful of fronts, especially confidence and assertiveness.
Musk confirmed today on X that Tesla is already testing v14.3 internally. It will see a wide release “in a few weeks,” so we should probably expect it by late April.
It’s in testing right now. Wide release in a few weeks.
— Elon Musk (@elonmusk) March 19, 2026
Overall, there are high hopes that v14.3 could be a true game changer for Tesla Full Self-Driving, as many believe it could be the version running on the Robotaxis in Austin, Texas, some of which operate driverless and unsupervised.
It could also include some major additions, including “Banish,” also referred to as “Reverse Summon,” which would have the vehicle go find a parking spot on its own after dropping occupants off at their destination.
What Tesla will roll out, and when exactly it arrives, all remain to be seen, but fans have been ready for a new version as v14.2.2.5 has definitely run its course. We have had a lot of readers tell us their biggest request is to fix Navigation errors, which seem to be one of the most universal complaints among daily FSD users.
Cybertruck
Chattanooga Charge: Tesla and EV fans ready for the Southeast’s wildest Tesla party
From Cybertruck Convoys to Kid-Friendly Fun Zones: The Chattanooga Charge Has Something for Everyone
Hundreds of like-minded Tesla and EV enthusiasts are descending on Chattanooga Charge this weekend for the largest Tesla meet in the Southeast, taking place March 20–22, 2026, at the stunning Tennessee Riverpark.
If you were there last year, you’ll know it’s the ultimate experience: see the wildest Teslas in action, check out the best in EV tech, and, arguably the most fun of all, finally put a name to the face and connect with those social media buddies IRL! Oh, and that epic nighttime Tesla light show is a once-in-a-lifetime experience that transforms the Riverpark into something out of a sci-fi film, and it must be seen in person.
This year’s event takes everything up a notch, with over 100 Cybertrucks expected to be on display, many sporting jaw-dropping modifications and custom wraps that push the boundaries of what these stainless steel beasts can look like.
Whether you’re a diehard Tesla fan, EV supporter, or just EV-mod-curious, the sheer spectacle is worth the drive.
The Chattanooga Charge doesn’t wait until Saturday morning to get started. The weekend technically kicks off Friday, March 20th, and the venue sets the tone immediately. Come share roadtrip stories over drinks at the W-XYZ Rooftop Bar on the top floor of the Aloft Chattanooga Hamilton Place Hotel, with sunset views over the city.
Come morning, nurse your hangover with some good coffee and convoy with hundreds of other Tesla and EV drivers through Chattanooga to the event for some morning meet and greets before the speaker panel starts and the food trucks fire up.
Tesla owner clubs travel from across the country to be here, not just to show off their vehicles, but to connect, share, and celebrate a shared passion for the future of driving.
Sounds like a plan to me. See you there, guys. Don’t miss it. Get your tickets at ChattanoogaCharge.com and join the charge. 🔋⚡
Chattanooga Charge is a premier Tesla and EV gathering inspired by the X Takeover, known as one of the largest Tesla events. What began as a bold idea from the team at DIY Wraps/TESBROS, hosted in their hometown of Chattanooga, Tennessee, quickly became a movement across social media. The first annual Chattanooga Charge united over 16 Tesla clubs from 16 states, proof that the EV community was hungry for something big in the South. Year after year, the event has grown in scale, ambition, and heart.
News
Tesla Full Self-Driving gets latest bit of scrutiny from NHTSA
The analysis impacts roughly 3.2 million vehicles across the company’s entire lineup, and aims to identify how the suite’s degradation detection systems work and how effective they are when the cars encounter difficult visibility conditions.
The National Highway Traffic Safety Administration (NHTSA) has elevated its probe into Tesla’s Full Self-Driving (Supervised) suite to an Engineering Analysis.
The step up to an Engineering Analysis is typically required before the NHTSA asks an automaker to issue a recall. However, it is no guarantee that a recall will follow.
🚨 The NHTSA said it was upgrading a probe into Tesla’s Full Self-Driving (Supervised) platform to an “engineering analysis”
It will examine 3.2 million vehicles and aims to determine its effectiveness in evaluating degraded road conditions pic.twitter.com/2dkrv1mR8o
— TESLARATI (@Teslarati) March 19, 2026
The NHTSA wants to examine Tesla FSD’s ability to assess road conditions with reduced visibility, as well as to detect degradation and alert the driver with sufficient time to respond.
The Office of Defects Investigation (ODI) will evaluate the performance of FSD in degraded roadway conditions and the updates or modifications Tesla makes to the degradation detection system, including the timing, purpose, and capabilities of the updates.
Tesla routinely ships software updates to improve the capabilities of the FSD suite, so it will be interesting to see if various versions of FSD are tested. Interestingly, you can find many examples from real-world users of FSD handling snow-covered roads, heavy rain, and single-lane backroads.
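The kind of “degradation detection” the probe centers on can be illustrated with a toy example. The check below, a simplistic frame-contrast metric, stands in for whatever Tesla actually uses; the metric, the threshold, and the function names are pure assumptions for illustration:

```python
# Toy camera-degradation check of the sort a vision system might run.
# Real systems are far more sophisticated; the metric and threshold
# here are illustrative assumptions only.

def frame_contrast(pixels):
    """Mean absolute deviation of grayscale pixel values (0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(abs(p - mean) for p in flat) / len(flat)

def degraded(pixels, threshold=10.0):
    """Flag a frame as degraded (e.g., glare or an obscured lens)
    when its contrast collapses below the threshold."""
    return frame_contrast(pixels) < threshold

# A washed-out frame (uniform glare) has near-zero contrast:
glare = [[250] * 4 for _ in range(4)]
normal = [[0, 128, 255, 64], [32, 200, 90, 10]]
degraded(glare)   # True
degraded(normal)  # False
```

The question the NHTSA is asking maps onto this sketch directly: does the real system raise its equivalent of this flag early enough, and does it alert the driver with enough time left to react?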
However, there are incidents that the NHTSA has used to determine the need for this probe, at least for now. The agency said:
“Available incident data raise concerns that Tesla’s degradation detection system, both as originally deployed and later updated, fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants. In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”
The report goes on to say that a review of Tesla’s responses revealed additional crashes in similar environments in which FSD “did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react. In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.”
The next steps of the NHTSA Engineering Analysis require the agency to gather further information on Tesla’s attempts to upgrade the degradation detection system. It will also analyze six recent potentially related incidents.
The investigation is listed as EA26002.