Tesla Autopilot’s emergency vehicle response feature is addressing a deadly problem no one wants to talk about

(Credit: James W Law, Andres GE)

Tesla is currently being investigated by the National Highway Traffic Safety Administration (NHTSA) after several of its electric cars crashed into stationary emergency vehicles while Autopilot was engaged. The premise of the investigation itself is enough to whet the appetite of every Tesla skeptic, since the idea of Autopilot consistently crashing into parked emergency vehicles makes for a compelling narrative. Tesla later released an update enabling Autopilot to detect and slow down for stationary emergency vehicles. The NHTSA responded by calling out the company for not issuing a recall when it released its proactive over-the-air software update.

What was lost amid the spread of the Tesla NHTSA investigation story was the fact that the relatively minor Autopilot update, which simply allows vehicles to slow down when they detect something like a police car or a firetruck parked on the side of the road, is already saving numerous lives. This is because there is a deadly problem on America’s roads, and it is one that very few seem to be acknowledging. Emergency personnel are dying on the job with frightening frequency. They are dying because cars crash into them while their vehicles are parked on the side of the road. And disturbingly enough, very little is being done about it.

The Flaws of HumanPilot

*Author’s Note and Trigger Warning: The succeeding sections of this article contain links to footage and other online references that may cause distress to readers. Discretion is advised.

One thing that truly stuck out while writing this piece was the sheer frequency of the accidents that happen to emergency personnel while they are responding to someone in need. This is despite the fact that all 50 states in the USA have a “Slow Down Move Over” (SDMO) law in place. The premise of the SDMO law is simple: upon noticing an emergency vehicle’s sirens or flashing lights on the side of the road, drivers are required to move away from the emergency vehicle by going into the next lane. If that is not possible, drivers must slow down to reduce the chances of an accident happening. It is a very simple rule, but it is one that gets violated on a consistent basis.

This is partly due to states interpreting the law differently, with some adopting a “Slow Down and Move Over” model while others follow a “Slow Down or Move Over” system. But ultimately, there have been zero fatalities involving a driver who actually slowed down and moved over after spotting a stationary emergency vehicle. This suggests that the law works, provided it is actually followed.

But when the Move Over Law gets violated, the human toll becomes disturbingly real. A report from the Government Accountability Office (GAO) indicates that about 8,000 injuries involving a stationary emergency vehicle have been reported in one year. This year alone, a total of 57 emergency responders have been killed while addressing a roadside issue. Posts from the National Struck-By Heroes Facebook group, which highlight the aftermath of struck-by injuries (SBIs), are heartbreaking, and videos and posts shared by companies whose staff are killed while on the job are harrowing. This is something that was highlighted by James D. Garcia, the creator of the Move Over Law and an SBI survivor, who shared some of his insights with Teslarati.

“This year is the 25th anniversary of the first Slow Down Move Over Law, passed in South Carolina in 1996. Every state in the US has had an SDMO Law since 2012, and yet this year, we have already reached a record 56 responder deaths (This number has since risen to 57 as of this writing). Since 2018, there have been over 45,000 collisions with stationary roadside objects. Every seven seconds, an object is struck. Every other day, a responder is struck and injured. Every five days, a responder is killed.”

“If you ask the general public the most dangerous risk to a police officer, most would say the chance of being shot in pursuit. If you ask the biggest danger to a firefighter, most envision being trapped in a burning or collapsing building. But statistics prove the real story. Across all agencies, responders are twice more likely to die in an SBI than any other category of work-related injury. It is by far the most dangerous aspect of our job,” Garcia noted. 

A DIY Solution

Perhaps the most heart-wrenching thing about the whole situation is the fact that SBIs are not even formally collected, considered, and analyzed by an official government agency, despite being the leading cause of death and permanent injury for public safety and roadway responders. The situation has become so prevalent that James W. Law, a 32-year veteran of the emergency roadside response industry and a specialist researcher in the Move Over Law, opted to develop a light sequence he fondly dubs “E-Modes” to help inform approaching drivers that a parked emergency vehicle is nearby. Simply put, the problem of drivers not following SDMO laws is so real and deadly that emergency responders are DIY-ing a solution themselves — because they cannot count on anyone else.

Responding to roadside problems on America’s roads for the past 32 years is no joke, and over this time, Law has encountered the worst drivers possible. Law shared with Teslarati that over the course of his career, he has been personally involved in an accident four times, the first of which happened when he was just 18 years old. In what could very well prove the point that humans are bad drivers, one of Law’s experiences actually involved a driver intentionally crashing into him because the driver was upset that traffic had been disrupted by an incident. Law’s legs broke the irate driver’s headlights in the crash, and the driver then tried to accuse the roadside responder of damaging his car. The police were fortunately reasonable, and Law was not charged. The irate driver, on the other hand, received a $500 ticket for using his vehicle as a weapon.

Speaking with Teslarati, Law admitted that he is a pretty notable Tesla supporter, and he tried his best to emulate CEO Elon Musk’s first-principles thinking when he developed E-Modes’ custom light sequence. He aims to donate the light sequence protocols he developed to Tesla, partly because the company is really the only carmaker out there that seems to be actively doing something to address the deadly issue plaguing emergency roadside personnel today. This became quite evident when the company updated its vehicles to detect and respond to traffic cones on the road. The update may seem minor — even marginal — to the layman, Law noted, but for roadside personnel, it was a godsend.

“Tesla’s traffic cone recognition is a crucial safety feature that I take full advantage of on any and all incidents. Properly setting up cones to define the ‘Kill Zone’ offers a quick way to communicate directly to any Tesla vehicle. Unlike humans, Tesla Vision is always aware. It’s one of the ways I communicate with oncoming Teslas. If Elon adopts E-Modes, a Tesla could communicate back to me that it is situation-aware. As a safety advocate, I strongly insist that every emergency responder use cones on every scene every time because it’s the right thing to do to protect everyone,” Law said.

The Lone Problem Solver

For all the mainstream media coverage of the NHTSA’s probe into Autopilot’s incidents with emergency vehicles, the fact is that Tesla accounted for only nine crash injuries involving first responder vehicles in the past 12 months. That’s a tiny fraction of the ~8,000 injuries the GAO indicated in its report. The company has also steadily rolled out features to make its vehicles safer. With every update of Autopilot and FSD, features like traffic cone recognition get more refined, and the more refined they get, the more emergency responders they protect. Tesla’s recent Autopilot update, which allows vehicles to slow down when they detect a parked emergency vehicle, is further proof of this.

Law noted that he had been involved in thousands of close calls in his 32-year career, but the one that truly stuck out to him involved a Tesla driver in late 2019, just after the company rolled out Autopilot’s capability to recognize and avoid traffic cones. While he was defining a “Kill Zone” on the road after responding to an incident, he saw an approaching Tesla whose driver appeared to be looking down and not paying attention to the road. Law was unsure if the Tesla was on Autopilot, but the vehicle moved over to the other lane seemingly as soon as it detected the traffic cones that he had set up. The veteran emergency responder noted that the Tesla driver seemed surprised as the electric vehicle avoided the cones on its own.

Such an incident, ultimately, is what makes Tesla stand apart, at least for now. It may be an inconvenient truth, especially to those who salivate at the thought of FSD or Autopilot going berserk and hunting down emergency responders, but the fact remains that Tesla is doing far more to protect both its drivers and other people on the road than any other carmaker out there. Emergency responder deaths are preventable, and as the creator of the Move Over Law noted, the lion’s share of these incidents is due to human error. It is this human error that technologies such as Autopilot and FSD are trying to solve, NHTSA probe notwithstanding. 

“Ninety percent of all struck-by deaths are a direct result of poor driver behavior. That means that nine out of ten responder deaths could have been prevented if the driver had maintained control of their vehicle at a reasonable speed and reacted in a considerate and attentive manner. Twenty-three percent of lethal struck-by violators were impaired. Five percent were distracted, and another three percent were drowsy. It is important we continue to support efforts to reduce drunk driving and speak out about the rapid rise of distracted driving resulting in responder deaths. Multiple agencies have ongoing PR campaigns to address these aspects, but none are taking on the most dominant category — angry, aggressive, entitled, and selfish drivers. 

“The remaining 69% of drivers that crashed into and killed a responder were completely sober. They saw the lights, they recognized the situation, yet they still felt the need to speed up and pass just a few more cars before they moved over. They were in too big of a hurry to slow down to a controllable speed and killed a responder. These drivers consciously made an intentional personal decision to carelessly disregard the life of a responder. Self-absorbed drivers have become the norm. Stronger laws, higher fines, bigger signs, and brighter lights have no effect once they get behind the wheel. We need to face this reality and develop a strategy that confronts this disregard. We must reinforce the value of a responder’s life over whatever current personal priorities are influencing these drivers’ behavior,” Garcia noted. 

A (Potentially) Safer Future

One can only hope that agencies such as the NHTSA can see the bigger picture with regard to vehicles and the advantages of technologies such as Autopilot and Full Self-Driving. It takes an immense amount of short-sightedness, after all, to remain fixated on whether a recall was filed for a proactive Autopilot update, or on 11 incidents that involved a Tesla crashing into a stationary emergency vehicle, all while an emergency responder is killed every five days. Focusing on Tesla and ignoring the larger problem at hand seems counterproductive at best.

In an ideal scenario, technologies such as Autopilot’s capability to identify a parked emergency vehicle, slow down, and potentially even move over to another lane would become mandatory for all cars on the road. As noted by esteemed auto teardown expert Sandy Munro, advanced driver-assist systems such as Autopilot and FSD have the potential to save lives on the same level as seatbelts, perhaps even more. And in this light, John Gardella, a shareholder at CMBG3 Law in Boston, MA, told Teslarati that if the NHTSA really wishes to help roll out new safety features, doing so would actually be a lot easier than one might imagine.

“Implementing the safety feature in Tesla’s vehicles will be easier than one might imagine. The National Highway Traffic Safety Administration (NHTSA) showed earlier in 2021 through its final rule for safety features for automated driving systems that it does not wish to set onerous standards prior to many features for automated driving system (ADS) vehicles coming to market. In fact, the desire of the NHTSA was to reduce barriers to having ADS safety features come to market more rapidly, and thereby accelerate autonomous vehicles coming to mass markets. The NHTSA received some criticism for its approach. However, the NHTSA does still have the authority to interpret the Federal Motor Vehicle Safety Standards (FMVSS), investigate perceived defects or unreasonably unsafe vehicle features, and carry out its enforcement authority, including recall power,” Gardella said.

Don’t hesitate to contact us with news tips. Just send a message to tips@teslarati.com to give us a heads up.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.

Elon Musk proposes Grok 5 vs world’s best League of Legends team match

Musk’s proposal has received positive reception from professional players and Riot Games alike.

UK Government, CC BY 2.0, via Wikimedia Commons

Elon Musk has proposed a high-profile gaming challenge for xAI’s upcoming Grok 5. As per Musk, it would be interesting to see if the large language model could beat the world’s best human League of Legends team under specific constraints.

Musk’s proposal has received positive reception from professional players and Riot Games alike, suggesting that the exciting exhibition match might indeed happen. 

Musk outlines restrictions for Grok

In his post on X, Musk detailed constraints to keep the match competitive, including limiting Grok to human-level reaction times, human-speed clicking, and viewing the game only through a camera feed with standard 20/20 vision. The idea quickly circulated across the esports community, drawing commentary from former pros and AI researchers, as noted in a Dexerto report.

Former League pro Eugene “Pobelter” Park expressed enthusiasm, offering to help Musk’s team and noting the unique comparison to past AI-versus-human breakthroughs, such as OpenAI’s Dota 2 bots. AI researcher Oriol Vinyals, who previously reached Grandmaster rank in StarCraft, suggested testing Grok in RTS gameplay as well. 

Musk welcomed the idea, even responding positively to Vinyals’ comment that it would be nice to see Optimus operate the mouse and keyboard.

Pros debate Grok’s chances, T1 and Riot show interest

Reactions weren’t universally optimistic. Former professional mid-laner Joedat “Voyboy” Esfahani argued that even with Grok’s rapid learning capabilities, League of Legends requires deep synergy, game-state interpretation, and team coordination that may be difficult for AI to master at top competitive levels. Yiliang “Doublelift” Peng was similarly skeptical, publicly stating he doubted Grok could beat T1, or even himself, and jokingly promised to shave his head if Grok managed to win.

T1, however, embraced the proposal, responding with a GIF of Faker and the message “We are ready,” signaling their willingness to participate. Riot Games itself also reacted, with co-founder Marc Merrill replying to Musk with “let’s discuss.” Needless to say, it appears that Riot Games is on board with the idea.

Though no match has been confirmed, interest from players, teams, and Riot suggests the concept could materialize into a landmark AI-versus-human matchup, potentially becoming one of the most viewed League of Legends events in history. The fact that Grok 5 would be constrained to human limits adds an interesting dimension to the matchup, as it could truly demonstrate how human-like the large language model can be in real-time scenarios.

Tesla has passed a key milestone, and it was one that CEO Elon Musk initially mentioned more than nine years ago when he published Master Plan, Part Deux. 

As per Tesla China in a post on its official Weibo account, the company’s Autopilot system has accumulated over 10 billion kilometers of real-world driving experience.

Tesla China’s subtle but huge announcement

In its Weibo post, Tesla China announced that the company’s Autopilot system has accumulated 10 billion kilometers of driving experience. “In this respect, Tesla vehicles equipped with Autopilot technology can be considered to have the world’s most experienced and seasoned driver.” 

Tesla AI’s handle on Weibo also highlighted a key advantage of the company’s self-driving system. “It will never drive under the influence of alcohol, be distracted, or be fatigued,” the team wrote. “We believe that advancements in Autopilot technology will save more lives.”

Tesla China did not clarify exactly what it meant by “Autopilot” in its Weibo post, though the company’s intense focus on FSD over the past years suggests that the figure includes distance driven with FSD (Beta) and Full Self-Driving (Supervised). Either way, 10 billion cumulative kilometers of real-world data is something that few, if any, competitors can match.

Credit: Tesla China/Weibo

Elon Musk’s 10-billion-km estimate, way back in 2016

When Elon Musk published Master Plan, Part Deux, he outlined his vision for the company’s autonomous driving system. At the time, Autopilot was still very new, though Musk was already envisioning how the system could get regulatory approval worldwide. He estimated that worldwide regulatory approval would probably require around 6 billion miles (10 billion kilometers) of real-world driving data, which was an impossible-sounding amount at the time.

“Even once the software is highly refined and far better than the average human driver, there will still be a significant time gap, varying widely by jurisdiction, before true self-driving is approved by regulators. We expect that worldwide regulatory approval will require something on the order of 6 billion miles (10 billion km). Current fleet learning is happening at just over 3 million miles (5 million km) per day,” Musk wrote. 
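
For a sense of what Musk’s 2016 figures implied, here is a quick back-of-envelope sketch in Python, using only the numbers from the quote above; the constant-rate assumption and the 365-day year are simplifications for illustration, not anything Tesla has published.

```python
# Rough check of the 2016 Master Plan, Part Deux figures quoted above.
# Assumption: the quoted fleet-learning rate of ~5 million km/day stays constant.

TARGET_KM = 10_000_000_000   # ~6 billion miles, Musk's regulatory-approval estimate
RATE_KM_PER_DAY = 5_000_000  # fleet learning rate cited in the 2016 quote

days_needed = TARGET_KM / RATE_KM_PER_DAY
years_needed = days_needed / 365

print(f"{days_needed:,.0f} days, or roughly {years_needed:.1f} years, at the 2016 rate")
# Prints: 2,000 days, or roughly 5.5 years, at the 2016 rate
# In practice, per the article above, the milestone was announced more than
# nine years after the plan was published.
```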

Interestingly enough, Tesla is indeed getting regulatory approval for FSD (Supervised) at a steady pace today, at a time when the 10-billion-kilometer milestone has been achieved. The system has been active in the United States and has since been rolled out to other countries such as Australia, New Zealand, China, and, more recently, South Korea. Expectations are high that Tesla could secure FSD approval in Europe sometime next year as well.

Elon Musk’s Boring Company reveals Prufrock TBM’s most disruptive feature

As it turns out, the tunneling startup, similar to other Elon Musk-backed ventures, is also dead serious about pursuing reusability.

The Boring Company has quietly revealed one of its tunnel boring machines’ (TBMs) most underrated features. As it turns out, the tunneling startup, similar to other Elon Musk-backed ventures, is also dead serious about pursuing reusability.

Prufrock-5 leaves the factory

The Boring Company is arguably the quietest venture currently backed by Elon Musk, inspiring far fewer headlines than his other, more high-profile companies such as Tesla, SpaceX, and xAI. Still, the Boring Company’s mission is ambitious, as it is a company designed to solve the problem of congestion in cities.

To accomplish this, the Boring Company would need to develop tunnel boring machines that can dig incredibly quickly. To this end, the startup has designed Prufrock, an all-electric TBM that is designed to eventually be as fast as an everyday garden snail. Among TBMs, such a speed would be revolutionary.

The startup took a step towards this goal recently when it posted a photo of Prufrock-5 coming out of its Bastrop, Texas facility. “On a rainy day in Bastrop, Prufrock-5 has left the factory. Will begin tunneling by December 1. Hoping for a step function increase in speed,” the Boring Company wrote.

Prufrock’s quiet disruption

Interestingly enough, the Boring Company also mentioned a key feature of its Prufrock machines that makes them significantly more sustainable and reusable than conventional TBMs. As per a user on X, standard tunnel boring machines are often left underground at the conclusion of a project because retrieving them is usually more expensive and impractical than simply abandoning them in place.

As per the Boring Company, however, this is not the case for its Prufrock machines, as they are retrieved, upgraded, and deployed again with improvements. “All Prufrocks are reused, usually with upgrades between launches. Prufrock-1 has now dug six tunnels,” the Boring Company wrote in its reply on X.

The Boring Company’s reply is quite exciting as it suggests that the TBMs from the tunneling startup could eventually be as reusable as SpaceX’s boosters. This is on brand for an Elon Musk-backed venture, of course, though the Boring Company’s disruption is a bit more underground. 

Tesla accused of infringing robotics patents in new lawsuit

Tesla store in New York City
Credit: Tesla

Tesla is being accused of infringing robotics patents by a company called Perrone Robotics, which is based out of Charlottesville, Virginia.

The suit was filed in Alexandria, Virginia, and accuses Tesla of knowingly infringing upon five patents related to robotics systems for self-driving vehicles.

The company said its founder, Paul Perrone, developed general-purpose robotics operating systems for individual robots and automated devices.

Perrone Robotics claims that all Tesla vehicles utilizing Tesla’s Autopilot suite within the last six years infringe the five patents, according to a report from Reuters.

The five patents cover a “General Purpose Operating System for Robotics,” otherwise known as GPROS.

The GPROS suite includes extensions for autonomous vehicle controls, path planning, and sensor fusion. One key patent, U.S. 10,331,136, was explicitly offered to Tesla by Perrone back in 2017, but the company rejected it.

The suit aims to halt any further infringements and seeks unspecified damages.

This is far from the first suit Tesla has been involved in, including one from this year with Perceptive Automata LLC, which accused Tesla of infringing patents on AI models that interpret pedestrian and cyclist intent via cameras, without licensing them. Tesla appeared in court in August, but its motion to dismiss was partially denied earlier this month.

Tesla also settled a suit with Arsus LLC, which accused Autopilot’s electronic stability features of infringing on rollover prevention tech. Tesla won via an inter partes review in September.

Most of these cases involve non-practicing entities or startups asserting broad autonomous vehicle patents against Tesla’s rapid iteration.

Tesla typically counters with such inter partes reviews, claiming the patents are invalid. Tesla has successfully defended about 70 percent of the autonomous vehicle lawsuits it has been involved in since 2020, but settlements are common to avoid discovery costs.

The case is Perrone Robotics Inc v Tesla Inc, U.S. District Court, Eastern District of Virginia, No. 25-02156. Tesla has not yet listed an attorney for the case, according to the report.
