News
IIHS announces new ratings program for the safeguards of semi-autonomous vehicles
The Insurance Institute for Highway Safety (IIHS) has announced that it is developing a new ratings program that evaluates the safeguards that vehicles with partial automation employ to help drivers stay attentive.
The IIHS will use four levels for rating the safeguards: good, acceptable, marginal, or poor. Vehicles with “good” safeguard system ratings will need to ensure that the driver’s eyes are directed at the road and their hands are either on the wheel or ready to grab it at any point. The IIHS said vehicles will also be required to have escalating alert systems and appropriate emergency procedures for when a driver does not meet those conditions.
The IIHS expects to release the first set of ratings in 2022. The precise timing is not yet solidified, as supply chain bottlenecks have affected the IIHS’ ability to obtain test vehicles from manufacturers.
IIHS President David Harkey believes a rating system for these “driver monitoring” systems could determine their effectiveness and whether safeguards actually hold drivers accountable. “Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” Harkey said. “In fact, the opposite may be the case if systems lack adequate safeguards.”
Self-driving cars are not yet available to consumers, the IIHS emphasizes in its press release. While some advertising or product names can be misleading, the IIHS acknowledges that some vehicles do offer partial automation. However, the human driver is still required to handle many routine driving tasks that the systems simply cannot perform. The driver always needs to be attentive and monitor the vehicle’s behavior, especially in an emergency where the driver must take over control of the car. The numerous semi-autonomous or partially automated systems on the market, like Tesla Autopilot, Volvo Pilot Assist, and GM’s Super Cruise, to name a few, all have safeguards in place to help ensure drivers are focused and ready. However, the IIHS says that “none of them meet all the pending IIHS criteria.”
The previously named partially automated driving systems all use cameras, radar, or other sensors to “see” the road. Systems currently offered on the market combine Adaptive Cruise Control (ACC) and lane centering with other driver assistance features. Automated lane changing, one example of these additional features, is becoming common as well.
Regardless of how many features a semi-autonomous driving program has, all of them still require the driver to remain attentive and vigilant during operation. This does not mean that all drivers maintain attention, as some may use cheat devices or other loopholes to operate a vehicle with semi-autonomous features in a fully autonomous way. Additionally, the IIHS mentions in its press release that some manufacturers “have oversold the capabilities of their systems, prompting drivers to treat the systems as if they can drive the car on their own.”
RELATED:
Level 2 systems like Tesla Autopilot can improve drivers’ attentiveness: IIHS study
The main issue is the fact that many operators deliberately misuse the systems. IIHS Research Scientist Alexandra Mueller is spearheading the new ratings program, and she says that abuse of the systems is one of many problems with semi-autonomous vehicle features.
“The way many of these systems operate gives people the impression that they’re capable of doing more than they really are,” Mueller said regarding the features. “But even when drivers understand the limitations of partial automation, their minds can still wander. As humans, it’s harder for us to remain vigilant when we’re watching and waiting for a problem to occur than it is when we’re doing all the driving ourselves.”
There is no way to monitor a driver’s thoughts or their level of focus on driving. However, there are ways to monitor gaze, head and hand position, posture, and other indicators that, taken together, can be consistent with someone who is actively engaged in driving.
The IIHS’ new ratings program aims to encourage the introduction of safeguards that can help reduce intentional and unintentional misuse. It will not address the functional aspects of the systems themselves, such as whether they activate properly, which could also contribute to crashes. It will only judge the features that monitor human behavior while driving.
“To earn a good rating, systems should use multiple types of alerts to quickly remind the driver to look at the road and return their hands to the wheel when they’ve looked elsewhere or left the steering unattended for too long. Evidence shows that the more types of alerts a driver receives, the more likely they will notice them and respond. These alerts must begin and escalate quickly. Alerts might include chimes, vibrations, pulsing the brakes, or tugging on the driver’s seat belt. The important thing is that the alerts are delivered through more channels and with greater urgency as time passes,” the IIHS says. Systems that work effectively would perform necessary maneuvers, like bringing the vehicle to a crawl or a stop, if the driver fails to respond to the repeated alerts. If an escalation of this nature occurs, the driver should be locked out of the system for the remainder of the drive, or until the vehicle is turned off and back on.
The rating criteria may also include certain requirements for automated lane changes, ACC, and lane centering. Automated lane changes should be initiated, or at least confirmed, by the driver before they are performed. If a vehicle comes to a complete stop when an ACC system is activated, the system “should not automatically resume if the driver is not looking at the road or the vehicle has been stopped for too long.” Lane centering features should also encourage the driver to share in steering, rather than switching off automatically when the driver adjusts the wheel, which could discourage some drivers from participating in driving, the IIHS said. Systems should also not operate if a seat belt is unfastened, or when automatic emergency braking (AEB) or lane departure prevention is disabled.
“Nobody knows when we’ll have true self-driving cars, if ever. As automakers add partial automation to more and more vehicles, it’s imperative that they include effective safeguards that help drivers keep their heads in the game,” Harkey said.
I’d love to hear from you! If you have any comments, concerns, or questions, please email me at joey@teslarati.com. You can also reach me on Twitter @KlenderJoey, or if you have news tips, you can email us at tips@teslarati.com.
Elon Musk
Musk forces Judge’s exit from shareholder battles over viral social media slip-up
McCormick insisted in a court filing that she harbors no actual bias against Musk or the defendants. She claimed she either never clicked the “support” button, LinkedIn’s version of a “like,” or did so accidentally.
Many Tesla fans are familiar with the name Kathaleen McCormick, especially if they are investors in the company.
McCormick is a Delaware Chancery Court Judge who presided over Tesla CEO Elon Musk’s pay package lawsuit over the past few years, as well as his purchase of Twitter. However, she will no longer preside over any cases related to Musk.
Elon Musk demands Delaware Judge recuse herself after ‘support’ post celebrating $2B court loss
In a rare admission of potential optics issues in one of America’s most powerful corporate courts, Delaware Chancery Court Chancellor Kathaleen McCormick stepped aside Monday from a cluster of shareholder lawsuits targeting Elon Musk and Tesla’s board.
The move came just days after Musk’s legal team highlighted her apparent “support” on LinkedIn for a post that mocked the billionaire over his 2022 tweets about the $44 billion Twitter acquisition.
She wrote in a newly published memo from the Delaware Chancery Court:
“The motion for recusal rests on a false premise — that I support a LinkedIn post about Mr. Musk, which I do not in fact support. I am not biased against the defendants in these actions.”
Yet she granted the reassignment anyway, acknowledging that the intense media scrutiny surrounding her involvement had become “detrimental to the administration of justice.”
The consolidated cases will now be handled by three of her colleagues on the Delaware Court of Chancery, the nation’s go-to venue for high-stakes corporate disputes. The lawsuits accuse Musk and Tesla directors of breaching fiduciary duties through lavish executive compensation and lax governance oversight.
One prominent claim, filed by a Detroit pension fund, challenges massive stock awards granted to board members, alleging the payouts harmed the company. The litigation also overlaps with issues stemming from Musk’s turbulent 2022 Twitter purchase.
McCormick’s history with Musk made her a lightning rod. In 2022, she presided over the fast-tracked lawsuit that ultimately forced Musk to complete the Twitter deal after he tried to back out.
Then in 2024, she struck down his record $56 billion Tesla compensation package, ruling the approval process was flawed and overly CEO-friendly. The Delaware Supreme Court later reinstated the pay on technical grounds, but the ruling fueled Musk’s long-standing criticism of the state’s judiciary.
Musk has repeatedly urged companies to reincorporate elsewhere, arguing Delaware courts have grown hostile to visionary leaders. Monday’s recusal hands him a symbolic victory and underscores how personal social-media activity can collide with judicial impartiality standards.
Delaware law requires judges to step aside if there’s even a “reasonable basis” to question their neutrality.
Court watchers say the episode highlights growing tensions in corporate America’s legal epicenter. While McCormick maintained her impartiality, the appearance of bias proved too costly to ignore. The cases will proceed without her, but the broader debate over Delaware’s dominance in business litigation is far from over.
Elon Musk
Elon Musk has generous TSA offer denied by the White House: here’s why
Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”
Tesla and SpaceX CEO Elon Musk made a generous offer to pay the salaries of Transportation Security Administration (TSA) employees last week, but the offer was denied by the White House.
In a striking display of private-sector initiative clashing with federal bureaucracy, the White House has turned down an offer from Elon Musk to personally cover the salaries of TSA officers amid an ongoing partial government shutdown. The rejection, reported last Wednesday by multiple outlets, highlights the legal and political hurdles facing unconventional solutions to Washington’s funding gridlock.
The impasse began weeks ago when Congress failed to pass funding for the Department of Homeland Security (DHS), leaving TSA employees, essential workers who screen millions of travelers daily, without paychecks while still required to report for duty.
Frustrated travelers have endured record-long security lines at major airports, with reports of chaos and delays rippling across the country.
Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”
The rejection, however, was not without reason.
White House spokesperson Abigail Jackson responded on behalf of the Trump administration, expressing appreciation for Musk’s gesture.
However, insurmountable legal obstacles would prevent Musk from doing so. Jackson said:
“We greatly appreciate Elon’s generous offer. This would pose great legal challenges due to his involvement with federal government contracts.”
Musk’s companies hold significant federal contracts, including NASA launches through SpaceX and potential Defense Department work, raising concerns about conflicts of interest, ethics rules, and anti-bribery statutes that prohibit private payments to government employees. Administration officials also indicated they expect the shutdown to end soon, making external funding unnecessary.
The episode underscores deeper tensions in Washington. Musk, who has advised on government efficiency efforts and maintains a close relationship with President Trump, has frequently criticized wasteful spending and bureaucratic delays.
His offer came as airport security lines ballooned, drawing public frustration toward both parties. TSA officers, many of whom rely on paychecks to cover mortgages and family expenses, have continued working without compensation, a situation that has drawn bipartisan concern but little immediate resolution.
Critics of the rejection argue it prioritizes red tape over practical relief for frontline workers and travelers. Supporters of the White House position counter that allowing private funding sets a dangerous precedent and could undermine congressional authority over the budget.
The White House eventually came to terms with the TSA on Friday and resumed paying its employees, and lines at airports quickly shrank. The DHS said that TSA staff would begin receiving paychecks “as early as” today.
Elon Musk
Tesla FSD mocks BMW human driver: Saves pedestrian from near miss
Tesla FSD anticipated a BMW driver’s lane drift before the human behind the wheel could react.
A video posted to r/TeslaFSD this week put a sharp spotlight on Tesla’s Full Self-Driving (FSD) software reacting to pedestrian intent faster than an actual human driver behind the wheel. In the Reddit clip, a BMW driver can be seen rolling through a neighborhood street, completely unaware of a pedestrian stepping in to cross. At the same time, a Tesla driving on FSD had already begun slowing down before the pedestrian even began their attempt to cross the street. The BMW kept moving, prompting the pedestrian to hop back, while the Tesla came to a stop and provided right-of-way for the pedestrian to cross safely.
That gap between what the BMW driver saw and what FSD had already processed is the story. Tesla FSD wasn’t reacting to a person in the street; rather, it was reading the signals that a person was about to enter it, using the pedestrian’s movement and trajectory to infer intent.
Tesla’s FSD is now built on an end-to-end neural network trained on billions of real-world miles, learning to interpret subtle human behavioral cues the same way an experienced human driver does instinctively. The difference is consistency. A human driver distracted for two seconds misses what FSD does not.
Tesla sues California DMV over Autopilot and FSD advertising ruling
Reddit commenters in the thread were blunt about the BMW driver’s failure, with several pointing out that the pedestrian was visible well before the crossing. One response put it plainly that the car on FSD saw the situation developing before the human in the other car had registered there was a situation at all.
Tesla has published data showing FSD (Supervised) is 54% safer than a human driver, accumulated across billions of miles driven on the system. Elon Musk has said FSD v14 will outperform human drivers by a factor of two to three, and that v15 has “a shot” at a 10x improvement. Pedestrian safety is where the stakes are highest, and where intent prediction closes the gap fastest. At 30 mph, a car covers roughly 44 feet per second. An extra second of awareness from reading a person’s body language rather than waiting for them to step out is often the difference between a near miss and a fatality.
Video and community discussion: r/TeslaFSD on Reddit
“FSD saves man from becoming a pancake. BMW driver nearly flattens him.” — posted by u/Qwertygolol in r/TeslaFSD