Tesla is currently being investigated by the National Highway Traffic Safety Administration (NHTSA) after several of its electric cars crashed into stationary emergency vehicles while Autopilot was engaged. The premise of the investigation alone is enough to whet the appetite of every Tesla skeptic, since the idea of Autopilot consistently crashing into parked emergency vehicles makes for a compelling narrative. Tesla later released an update enabling Autopilot to detect and slow down for stationary emergency vehicles. The NHTSA responded by calling out the company for not issuing a recall when it released its proactive over-the-air software update.
What was lost amid the spread of the Tesla NHTSA investigation story was the fact that the relatively minor Autopilot update, which simply allows vehicles to slow down when they detect something like a police car or a firetruck parked on the side of the road, is already saving lives. That is because there is a deadly problem on America’s roads, and very few seem to be acknowledging it. Emergency personnel are dying on the job with frightening frequency. They are dying because cars crash into them while they work beside vehicles parked on the side of the road. And disturbingly enough, very little is being done about it.
The Flaws of HumanPilot
*Author’s Note and Trigger Warning: The following sections of this article contain links to footage and other online references that may cause distress to readers. Discretion is advised.
One thing that truly stuck out while writing this piece was the sheer frequency of accidents involving emergency personnel as they respond to someone in need. This is despite the fact that all 50 states in the USA have a “Slow Down Move Over” (SDMO) law in place. The premise of the SDMO law is simple: upon noticing an emergency vehicle’s sirens or flashing lights on the side of the road, drivers are required to move away from the emergency vehicle by changing into the next lane. If that is not possible, drivers must slow down to reduce the chances of an accident. It is a very simple rule, but it is one that gets violated on a consistent basis.
This is partly due to states interpreting the law differently, with some adopting a “Slow Down and Move Over” model while others follow a “Slow Down or Move Over” system. But ultimately, there have been zero fatalities in incidents where a driver actually slowed down and moved over upon spotting a stationary emergency vehicle. This suggests that the law works, provided that it is followed.
But when the Move Over Law gets violated, the human toll becomes disturbingly real. A report from the Government Accountability Office (GAO) indicates that about 8,000 injuries involving a stationary emergency vehicle were reported in a single year. This year alone, a total of 57 emergency responders have been killed while addressing a roadside issue. Posts from the National Struck-By Heroes Facebook group, which highlight the aftermath of struck-by injuries (SBIs), are heartbreaking, and videos and posts shared by companies whose staff are killed while on the job are harrowing. This is something that was highlighted by James D. Garcia, the creator of the Move Over Law and an SBI survivor, who shared some of his insights with Teslarati.
“This year is the 25th anniversary of the first Slow Down Move Over Law, passed in South Carolina in 1996. Every state in the US has had an SDMO Law since 2012, and yet this year, we have already reached a record 56 responder deaths (This number has since risen to 57 as of this writing). Since 2018, there have been over 45,000 collisions with stationary roadside objects. Every seven seconds, an object is struck. Every other day, a responder is struck and injured. Every five days, a responder is killed.”
“If you ask the general public the most dangerous risk to a police officer, most would say the chance of being shot in pursuit. If you ask the biggest danger to a firefighter, most envision being trapped in a burning or collapsing building. But statistics prove the real story. Across all agencies, responders are twice more likely to die in an SBI than any other category of work-related injury. It is by far the most dangerous aspect of our job,” Garcia noted.
A DIY Solution
Perhaps the most heart-wrenching thing about the whole situation is that SBIs are not even formally collected, tracked, and analyzed by an official government agency, despite being the leading cause of death and permanent injury for public safety and roadway responders. The situation has become so prevalent that James W. Law, a 32-year veteran of the emergency roadside response industry and a specialist researcher in the Move Over Law, opted to develop a light sequence he fondly dubs “E-Modes” to help responders warn oncoming drivers that a parked emergency vehicle is nearby. Simply put, the problem of drivers not following SDMO laws is so real and deadly that emergency responders are DIY-ing a solution themselves, because they cannot count on anyone else.
Responding to problems on America’s roads for the past 32 years is no joke, and over that time, Law has encountered the worst drivers imaginable. Law shared with Teslarati that over the course of his career, he has personally been involved in an accident four times, the first of which happened when he was just 18 years old. In what could very well prove the point that humans are bad drivers, one of those incidents involved a driver intentionally crashing into him because he was upset that traffic had been disrupted by an incident. The impact was hard enough that Law’s legs broke the irate driver’s headlights, and the driver then tried to accuse the roadside responder of damaging his car. The police were fortunately reasonable, and Law was not charged. The irate driver, on the other hand, received a $500 ticket for using his vehicle as a weapon.
Speaking with Teslarati, Law admitted that he is a pretty notable Tesla supporter, and that he tried his best to emulate CEO Elon Musk’s first-principles thinking when he developed E-Modes’ custom light sequence. He aims to donate the light sequence protocols he developed to Tesla, partly because the company is really the only carmaker that seems to be actively doing something to address the deadly issue plaguing emergency roadside personnel today. This became quite evident when the company updated its vehicles to detect and respond to traffic cones on the road. This small update, Law noted, may seem minor or even marginal to the layman, but for roadside personnel, it was a godsend.
“Tesla’s traffic cone recognition is a crucial safety feature that I take full advantage of on any and all incidents. Properly setting up cones to define the ‘Kill Zone’ offers a quick way to communicate directly to any Tesla vehicle. Unlike humans, Tesla Vision is always aware. It’s one of the ways I communicate with oncoming Teslas. If Elon adopts E-Modes, a Tesla could communicate back to me that it is situation-aware. As a safety advocate, I strongly insist that every emergency responder use cones on every scene, every time, because it’s the right thing to do to protect everyone,” Law said.
The Lone Problem Solver
As substantial as the mainstream media coverage of the NHTSA’s probe into Autopilot’s incidents with emergency vehicles has been, the fact is that Tesla vehicles accounted for only nine injuries from crashes involving first responder vehicles in the past 12 months. That’s a tiny fraction of the ~8,000 injuries the GAO indicated in its report. The company has also steadily rolled out features to make its vehicles safer. With every update of Autopilot and FSD, features like traffic cone recognition get more refined, and the more refined they get, the more emergency responders they protect. Tesla’s recent Autopilot update, which allows vehicles to slow down when they detect a parked emergency vehicle, is further proof of this.
Law noted that he had been involved in thousands of close calls in his 32-year career, but the one that truly stuck out to him involved a Tesla driver from late 2019, just after the company rolled out Autopilot’s capability to recognize and avoid traffic cones. While he was defining a “Kill Zone” on the road after responding to an incident, he saw an approaching Tesla whose driver appeared to be looking down and not paying attention to the road. Law was unsure if the Tesla was on Autopilot, but the vehicle moved over to the other lane seemingly as soon as it detected the traffic cones that he set up. The veteran emergency responder noted that the Tesla driver seemed surprised as the electric vehicle avoided the cones on its own.
Such an incident, ultimately, is what makes Tesla stand apart, at least for now. It may be an inconvenient truth, especially to those who salivate at the thought of FSD or Autopilot going berserk and hunting down emergency responders, but the fact remains that Tesla is doing far more to protect both its drivers and other people on the road than any other carmaker out there. Emergency responder deaths are preventable, and as the creator of the Move Over Law noted, the lion’s share of these incidents is due to human error. It is this human error that technologies such as Autopilot and FSD are trying to solve, NHTSA probe notwithstanding.
“Ninety percent of all struck-by deaths are a direct result of poor driver behavior. That means that nine out of ten responder deaths could have been prevented if the driver had maintained control of their vehicle at a reasonable speed and reacted in a considerate and attentive manner. Twenty-three percent of lethal struck-by violators were impaired. Five percent were distracted, and another three percent were drowsy. It is important we continue to support efforts to reduce drunk driving and speak out about the rapid rise of distracted driving resulting in responder deaths. Multiple agencies have ongoing PR campaigns to address these aspects, but none are taking on the most dominant category — angry, aggressive, entitled, and selfish drivers.
“The remaining 69% of drivers that crashed into and killed a responder were completely sober. They saw the lights, they recognized the situation, yet they still felt the need to speed up and pass just a few more cars before they moved over. They were in too big of a hurry to slow down to a controllable speed and killed a responder. These drivers consciously made an intentional personal decision to carelessly disregard the life of a responder. Self-absorbed drivers have become the norm. Stronger laws, higher fines, bigger signs, and brighter lights have no effect once they get behind the wheel. We need to face this reality and develop a strategy that confronts this disregard. We must reinforce the value of a responder’s life over whatever current personal priorities are influencing these drivers’ behavior,” Garcia noted.
A (Potentially) Safer Future
One can only hope that agencies such as the NHTSA can see the bigger picture with regard to vehicles and the advantages of technologies such as Autopilot and Full Self-Driving. It takes an immense amount of short-sightedness, after all, to remain fixated on whether a recall was filed for a proactive Autopilot update, or on 11 incidents that involved a Tesla crashing into a stationary emergency vehicle, all while an emergency responder is killed every five days. Focusing on Tesla while ignoring the larger problem at hand seems counterproductive at best.
In an ideal scenario, technologies such as Autopilot’s capability to identify a stationary emergency vehicle, slow down, and potentially even move over to another lane would become mandatory for all cars on the road. As noted by esteemed auto teardown expert Sandy Munro, advanced driver-assist systems such as Autopilot and FSD have the potential to save lives on the same level as seatbelts, perhaps even more. And in this light, John Gardella, a shareholder at CMBG3 Law in Boston, MA, told Teslarati that if the NHTSA really wishes to help roll out new safety features, doing so would actually be a lot easier than one might imagine.
“Implementing the safety feature in Tesla’s vehicles will be easier than one might imagine. The National Highway Traffic Safety Administration (NHTSA) showed earlier in 2021 through its final rule for safety features for automated driving systems that it does not wish to set onerous standards prior to many features for automated driving system (ADS) vehicles coming to market. In fact, the desire of the NHTSA was to reduce barriers to having ADS safety features come to market more rapidly, and thereby accelerate autonomous vehicles coming to mass markets. The NHTSA received some criticism for its approach. However, the NHTSA does still have the authority to interpret the Federal Motor Vehicle Safety Standards (FMVSS), investigate perceived defects or unreasonably unsafe vehicle features, and carry out its enforcement authority, including recall power,” Gardella said.
Don’t hesitate to contact us with news tips. Just send a message to tips@teslarati.com to give us a heads up.
Tesla Full Self-Driving v14.2.2.5 might be the most confusing release ever
Tesla Full Self-Driving v14.2.2.5 hit my car back on Valentine’s Day, February 14, and since I’ve had it, it has become, in my opinion, the most confusing release I’ve ever had.
With each Full Self-Driving release, I am realistic. I know some things are going to get better, and I know some things will regress slightly. However, those improvements are typically relatively mild, as are the regressions. Yet this version has shown me that it contains extremes of both.
It has been about three weeks of driving on v14.2.2.5; I’ve used it for nearly every mile traveled since it hit my car. I’ve taken short trips of 10 minutes or less, I’ve taken medium trips of an hour or less, and I’ve taken longer trips that are over 100 miles per leg and are over two hours of driving time one way.
These are my thoughts on it thus far:
Speed Profiles Are a Mixed Bag
Speed Profiles are something Tesla seems to tinker with quite frequently, and each version tends to behave drastically differently from the previous one.
I do the vast majority of my FSD travel using the Standard and Hurry modes, although in bad weather, I will scale back to Chill, and when I’m in a congested city on a weekend or during rush hour, I’ll throw it into Mad Max so it takes what it needs.
Early on, Speed Profiles really felt great. This is one of those really subjective parts of FSD where someone might think one mode travels too quickly, whereas another person might see the identical performance as too slow or just right.
I would like to see more consistency on them from release to release, but overall, things are pretty good. I have no real complaints on my end this time, unlike with previous releases.
In a past release, Mad Max traveled under the speed limit quite frequently, and I only had that experience because Hurry was acting the same way. I’ve had no instances of that with v14.2.2.5.
Strange Turn Signal Behavior
This is the first Full Self-Driving version where I’ve had so many weird things happen with the turn signals.
Two things come to mind: Using a turn signal on a sharp turn, and ignoring the navigation while putting the wrong turn signal on. I’ve encountered both things on v14.2.2.5.
On my way to the Supercharger, I take a road that has one semi-sharp right-hand turn with a driveway entrance right at the beginning of the turn.
Only recently, with the introduction of v14.2.2.5, have I had FSD put on the right turn signal when going around this turn. It’s obviously a minor issue, but it still happens, and it’s not standard practice:
How can we get Full Self-Driving to stop these turn signals?
There’s no need to use one here; the straight path is a driveway, not a public road. The right turn signal here is unnecessary pic.twitter.com/7uLDHnqCfv
— TESLARATI (@Teslarati) February 28, 2026
When sharing this on X, I had Tesla fans (the ones who refuse to acknowledge that the company can make mistakes) tell me that it’s a “valid” behavior that would be taught to anyone who has been “professionally trained” to drive.
Apparently, if you complain about this turn signal, you are also claiming you know more than Tesla engineers…okay.
Nobody in their right mind has ever gone around a sharp turn when driving their car and put on a signal when continuing on the same road. You would put a left turn signal on to indicate you were turning into that driveway if that’s what your intention was.
Like I said, it’s a totally minor issue. However, the signal is not really needed, nor is it normal. If I were in the car with someone who was taking a simple turn on a road they were traveling, and they signaled just because the turn was sharp, I’d be scratching my head.
I’ve also had three separate instances of the car completely ignoring the navigation and putting on a signal that is opposite to what the routing says. Really quite strange.
Parking Performance is Still Underwhelming
Parking has been a complaint of mine with FSD for a long time, so much so that it is pretty rare that I allow the vehicle to park itself. More often than not, it is because I want to pick a spot that is relatively isolated.
However, in the times I allow it to pull into a spot, it still does some pretty head-scratching things.
Recently, it tried to back into a spot that was ~60% covered in plowed snow. The snow was piled about six feet high in a Target parking lot.
A few days later, it tried backing into a spot where someone failed the universal litmus test of returning their shopping cart. Both choices were baffling and required me to manually move the car to a different portion of the lot.
I used Autopark on both occasions, and it did a great job of getting into the spot. I notice that parking performance is much better when I manually choose the spot than when the car handles the entire process, meaning choosing the spot and parking in it.
It’s Doing Things (For Me) It’s Never Done Before
Two things that FSD had never done before, at least for me, are slowing down in school zones and avoiding deer. The first is something I usually take over manually, and the second I surprisingly had not had to deal with until now.
I had my Tesla slow down at a school zone yesterday for the first time. It traveled at 20 mph, not the 15 mph the sign suggested, but at the speed of the other cars in the school zone. It was impressive to see.
I would like to see this happen more consistently, and I think school zones should be one of those areas where, no matter what, FSD only travels at the posted speed limit.
Last night, FSD v14.2.2.5 recognized a deer in a roadside field and slowed down for it:
🚨 Cruising home on a rainy, foggy evening and my Tesla on Full Self-Driving begins to slow down suddenly
FSD just wanted Mr. Deer to make it home to his deer family ❤️ pic.twitter.com/cAeqVDgXo5
— TESLARATI (@Teslarati) March 4, 2026
Navigation Still SUCKS
Navigation will be a complaint until Tesla proves it can fix it. For now, it’s just terrible.
It still has not figured out how to leave my neighborhood. I give it the opportunity to prove me wrong each time I leave my house, and it just can’t do it.
It always tries to leave through the neighborhood’s primary entrance/exit when the route needs to take me left, even though that exit is right-turn-only. I always leave a voice prompt for Tesla about it.
It still picks incredibly baffling routes for simple navigation. It’s the one thing I still really want Tesla to fix.
Tesla gets tip of the hat from major Wall Street firm on self-driving prowess
Tesla received a tip of the hat from major Wall Street firm Bank of America on Wednesday, as the bank reinitiated coverage of Tesla shares with a bullish stance, a ‘Buy’ rating, and a $460 price target.
In a new note that marks a sharp reversal from its neutral position earlier in 2025, the bank declared Tesla’s Full Self-Driving (FSD) technology the “leading consumer autonomy solution.”
Analysts highlighted Tesla’s camera-only architecture, known as Tesla Vision, as a strategic masterstroke. While technically more challenging than the multi-sensor setups favored by rivals, the vision-based approach is dramatically cheaper to produce and maintain.
This cost edge, combined with Tesla’s rapidly expanding real-world data engine, positions the company to scale robotaxis far more profitably than competitors, BofA argues in the new note:
“Tesla is at the forefront of autonomous driving, supported by a camera-only approach that is technically harder but much cheaper than the multi-sensor systems widely used in the industry. This strategy should allow Tesla to scale more profitably compared to Robotaxi competitors, helped by a growing data engine from its existing fleet.”
The bank now attributes roughly 52% of Tesla’s total valuation to its Robotaxi ambitions. It also flagged meaningful upside from the Optimus humanoid robot program and the fast-growing energy storage business, suggesting the auto segment’s recent headwinds, including expired incentives, are being eclipsed by these higher-margin opportunities.
Tesla’s own data underscores exactly why Wall Street is waking up to FSD’s potential. According to Tesla’s official safety reporting page, the FSD Supervised fleet has now surpassed 8.4 billion cumulative miles driven.
That total ballooned from just 6 million miles in 2021 to 80 million in 2022, 670 million in 2023, 2.25 billion in 2024, and a staggering 4.25 billion in 2025 alone. In the first 50 days of 2026, owners added another 1 billion miles — averaging more than 20 million miles per day.
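For readers who want to sanity-check those numbers, here is a minimal back-of-the-envelope sketch in Python using only the approximate figures quoted above; the yearly values are rounded, so the sum lands a touch below the 8.4 billion headline total.

```python
# Back-of-the-envelope check of the FSD fleet-mileage figures quoted above.
# All numbers are the approximate, rounded values from the article, not an
# official Tesla data feed.

annual_fsd_miles = {
    2021: 6e6,     # ~6 million miles
    2022: 80e6,    # ~80 million
    2023: 670e6,   # ~670 million
    2024: 2.25e9,  # ~2.25 billion
    2025: 4.25e9,  # ~4.25 billion
}

miles_2026_so_far = 1e9  # ~1 billion miles in the first 50 days of 2026
days_2026_so_far = 50

cumulative_miles = sum(annual_fsd_miles.values()) + miles_2026_so_far
daily_average_2026 = miles_2026_so_far / days_2026_so_far

print(f"Approximate cumulative FSD miles: {cumulative_miles / 1e9:.2f} billion")
print(f"2026 daily average so far: {daily_average_2026 / 1e6:.0f} million miles/day")
# Prints roughly 8.26 billion cumulative miles and 20 million miles per day,
# consistent with the totals cited above once rounding is accounted for.
```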
This avalanche of real-world, camera-captured footage, much of it on complex city streets, gives Tesla an unmatched training dataset. Every mile feeds its neural networks, accelerating improvement cycles that lidar-dependent rivals simply cannot match at scale.
Tesla owners themselves will tell you the suite gets better with every release, with each update bringing new features and improvements to the self-driving effort.
The $460 target implies roughly 15 percent upside from recent trading levels around $400. While regulatory and safety hurdles remain, BofA’s endorsement signals growing institutional conviction that Tesla’s data advantage is not hype; it’s a tangible moat already delivering billions of miles of proof.
Tesla to discuss expansion of Samsung AI6 production plans: report
Tesla is reportedly discussing an expansion of its next-generation AI chip supply deal with Samsung Electronics.
According to a report from Korean industry outlet The Elec, Tesla purchasing executives are scheduled to meet Samsung officials this week to negotiate additional production volume for the company’s upcoming AI6 chip.
Industry sources cited in the report stated that Tesla is pushing to increase the production volume of its AI6 chip, which will be manufactured using Samsung’s 2-nanometer process.
Tesla previously signed a long-term foundry agreement with Samsung covering AI6 production through December 31, 2033. The deal was reportedly valued at about 22.8 trillion won (roughly $16–17 billion).
Under the existing agreement, Tesla secured approximately 16,000 wafers per month from the facility. The company has reportedly requested an additional 24,000 wafers per month, which would bring total production capacity to around 40,000 wafers if finalized.
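To put that request in perspective, here is a quick arithmetic sketch, assuming the wafer figures from The Elec’s report are accurate, showing what the expanded allocation would represent relative to the current one.

```python
# Quick arithmetic on the reported Samsung AI6 wafer allocations.
# The figures below come from The Elec report cited above and are not
# confirmed by Tesla or Samsung.

current_wafers_per_month = 16_000   # reportedly secured under the existing deal
requested_additional = 24_000       # extra monthly volume Tesla is said to want

total_if_finalized = current_wafers_per_month + requested_additional
growth_factor = total_if_finalized / current_wafers_per_month

print(f"Total capacity if finalized: {total_if_finalized:,} wafers/month")  # 40,000
print(f"Relative to current allocation: {growth_factor:.1f}x")              # 2.5x
```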
Tesla purchasing executives are expected to discuss detailed supply terms during their visit to Samsung this week.
The AI6 chip is expected to support several Tesla technologies. Industry sources stated that the chip could be used for the company’s Full Self-Driving system, the Optimus humanoid robot, and Tesla’s internal AI data centers.
The report also indicated that AI6 clusters could replace the role previously planned for Tesla’s Dojo AI supercomputer. Instead of a single system, multiple AI6 chips would be combined into server-level clusters.
Tesla’s semiconductor collaboration with Samsung dates back several years. Samsung participated in the design of Tesla’s HW3 (AI3) chip and manufactured it using a 14-nanometer process. The HW4 chip currently used in Tesla vehicles was also produced by Samsung using a 5-nanometer node.
Tesla previously planned to split production of its AI5 chip between Samsung and TSMC. However, the company reportedly chose Samsung as the primary partner for the newer AI6 chip.