Tesla Model 3 gets penalized in Europe despite top scores in vehicle assistance and safety

(Credit: Thatcham Research/YouTube)

In collaboration with Thatcham Research, the Euro NCAP has launched the world’s first Assisted Driving Grading system, a new set of metrics that are specifically designed to evaluate the driver-assist systems of cars available on the market today. For its first batch of vehicles, the firms evaluated 10 cars, from premium SUVs like the Mercedes-Benz GLE to affordable hatchbacks like the Renault Clio to all-electric vehicles like the Tesla Model 3. 

As noted by Thatcham Research Director of Insurance Research Matthew Avery in a video outlining the results of the Assisted Driving Grading system’s first tests, vehicles were graded on three metrics: the level of vehicle assistance they provide, the level of driver engagement they offer, and the effectiveness of their safety backup systems. The results of these tests, particularly for the Tesla Model 3, were rather peculiar, to say the least. 

Out of 10 vehicles that were evaluated, the Tesla Model 3 ranked 6th with a “Moderate” grade, falling behind the Mercedes-Benz GLE, BMW 3-Series, and Audi Q8, which were graded as “Very Good,” and the Ford Kuga, which received a “Good” rating. This was despite the Tesla Model 3 receiving the top scores in the “Vehicle Assistance” and “Safety Backup” metrics. 

(Credit: Thatcham Research)

The study, for example, rated the Model 3 as outstanding in terms of steering assistance, with the vehicle steering itself exceptionally well through an S-shaped curve at speeds of 80, 100, and 120 km/h. Tesla’s lane change systems were also satisfactory, despite the system’s limitations in Europe. The Model 3 dominated distance control as well, with the evaluators stating that Tesla’s adaptive cruise control featured a “high level of technical maturity.” Out of a possible 100 points, Tesla’s vehicle assistance scored 87, the highest among the cars tested. 

The Model 3’s safety backup systems were also a league above its competition. As noted in a post from the Allgemeiner Deutscher Automobil-Club e.V. (ADAC), Tesla demonstrated its strengths with the Model 3’s collision avoidance systems. The all-electric sedan earned a perfect score in the firms’ tests, outperforming its premium German competition. Overall, the Model 3 received an impressive score of 95 in the Assisted Driving Grading system’s “Safety Backup” metric. 

Considering these scores, one might wonder why the Model 3 ended up ranked 6th among the 10 vehicles tested by the Euro NCAP and Thatcham Research. As it turned out, this was because of the Model 3’s poor performance in the “Driver Engagement” metric, where the vehicle earned a score of only 35 out of 100. So poor was the Model 3’s score in this metric that the vehicle ranked last among the 10 cars evaluated. 

(Credit: ADAS)

A look at the reasons behind the Model 3’s poor “Driver Engagement” scores reveals a number of interesting insights from Thatcham Research and the Euro NCAP. When testing the vehicles’ steering override functions, for example, the evaluators found that the Model 3 resisted steering overrides from its driver. These issues were explained in the ADAC’s post. 

“Should the driver make a steering movement in order to avoid an object or a pothole in the roadway, the steering assistant should allow this without resistance. In the Tesla Model 3, for example, this is not the case. Apparently, Tesla trusts the system more than its driver. The necessary cooperative assistance is not given. Instead, the Tesla system prevents its driver from attempting to intervene – it mustn’t be,” the ADAC remarked in its post. 

Even more interesting is that part of the Model 3’s poor “Driver Engagement” score was due to the term “Autopilot,” which Tesla uses to describe its driver-assist suite. The evaluators argued that the term “Autopilot” was misleading and irresponsible on Tesla’s part, and this weighed heavily against the Model 3 in the Assisted Driving Grading system’s rankings. 

(Credit: ADAS)

“When it comes to the first test criterion – consumer information – the Tesla Model 3 in particular fails. The assistance systems are referred to as “Autopilot” in the operating instructions for the Model 3 as well as in the sales brochures and in marketing. However, the term suggests capabilities that the system does not have in sufficient measure. It tempts the driver to rely on the capabilities of the system – which is currently not allowed by the legislature anyway. Due to its good quick-start operating aid, the Tesla Model 3 still receives 10 points,” the evaluators noted. 

Ultimately, these complaints about Autopilot’s branding ended up pulling down the Model 3’s scores to the point where the all-electric sedan was ranked below the Ford Kuga. Thatcham Research Director of Insurance Research Matthew Avery explained this in a video released about the evaluation. “The Tesla Model 3 was the best for safety backup and vehicle assistance but lost ground for misleading consumers about the capability of its Autopilot system and actively discouraging drivers from engaging when behind the wheel,” Avery said. 

As noted by Avery, vehicles must exhibit a balance across all three metrics to score well in the Assisted Driving Grading system. The Model 3 failed to achieve this despite its industry-leading safety backup systems and vehicle assistance tech. The ADAC explained it best when outlining why the Tesla Model 3 lost to four other vehicles despite being equipped with arguably the most advanced driver-assist system. 

“When analyzing the test results, it is noticeable that the Tesla Model 3 has the most advanced assistance systems. With 95 points for emergency assistance (Safety Backup) and 91 points for technical assistance, it beats the Mercedes GLE not by a wide margin, but by at least 11 points… But Euro NCAP deducts many points from the Tesla in the area of driver support, on the one hand because it does not sufficiently comply with the driver’s request for a steering correction, and on the other hand – an even more serious reason – because Tesla is irresponsible with the term Autopilot. With only 36 points in the Driver Engagement test area, the Tesla falls back to sixth place in the final tally,” the ADAC noted. 
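To make the scoring dynamic concrete, here is a minimal Python sketch of a balance-limited grading scheme. This is an illustration only: the assumption that the overall result is capped by the weakest of the three pillars, and the band thresholds used below, are invented for this sketch and are not Euro NCAP’s published methodology. The pillar scores are the ones cited in this article (87 for Vehicle Assistance, 35 for Driver Engagement, 95 for Safety Backup).

```python
# Illustrative sketch of a balance-limited grading scheme.
# ASSUMPTION: the overall result is capped by the weakest pillar,
# which is how the article describes the Model 3 landing at
# "Moderate" despite two top pillar scores. The band thresholds
# below are invented for illustration and are NOT Euro NCAP's.

def overall_score(vehicle_assistance: int,
                  driver_engagement: int,
                  safety_backup: int) -> int:
    """Overall score limited by the weakest pillar (assumption)."""
    return min(vehicle_assistance, driver_engagement, safety_backup)

def band(score: int) -> str:
    """Map a 0-100 score to a grade band (illustrative thresholds)."""
    if score >= 75:
        return "Very Good"
    if score >= 55:
        return "Good"
    if score >= 35:
        return "Moderate"
    return "Entry"

# Pillar scores for the Model 3 as cited in the article.
score = overall_score(87, 35, 95)
print(score, band(score))  # 35 Moderate
```

Under this assumption, the Model 3’s 35-point Driver Engagement result drags the whole grade down to “Moderate” no matter how strong the other two pillars are, which mirrors the outcome the ADAC describes.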

Thatcham Research’s overall findings can be viewed in the video below. 

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.


Elon Musk reveals unfortunate truth of Tesla Full Self-Driving development

In a candid reply to a dramatic video of Tesla’s Full Self-Driving (FSD) system averting disaster, Elon Musk laid bare a harsh reality facing autonomous vehicle technology.

Tesla’s Full Self-Driving suite is one of the most significant technological developments in terms of passenger travel in decades, but it is not all sunshine and rainbows, even with major strides in safety, CEO Elon Musk revealed.

The clip shows a Model 3 traveling at over 65 mph on a foggy, rain-soaked highway when a pedestrian suddenly steps into traffic.

Full Self-Driving instantly detects the threat and swerves safely, preventing what could have been a fatal collision for both the pedestrian and the driver’s cousin.

Musk’s response was unequivocal:

“Tesla self-driving saves a lot of lives – the statistics are unequivocal. That doesn’t mean it’s perfect, of course.” Even with a projected 10x safety improvement over human drivers, FSD would still prevent roughly 90% of the world’s approximately one million annual auto fatalities. The remaining 10%—roughly 100,000 deaths—would expose Tesla to relentless lawsuits. Meanwhile, the vast majority of lives saved would go unnoticed. “The 90% who are still alive mostly won’t even know that Tesla saved them. Nonetheless, it is the right thing to do.”
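The arithmetic behind the claim is simple enough to spell out. Using the figures cited above (roughly one million annual auto fatalities worldwide and a projected 10x safety improvement over human drivers), a short Python check reproduces the article’s numbers:

```python
# Working through the arithmetic cited in the article:
# ~1,000,000 annual auto fatalities worldwide, and a projected
# 10x safety improvement over human drivers.

annual_fatalities = 1_000_000
safety_multiple = 10  # "10x safer" implies 1/10 of fatalities remain

remaining = annual_fatalities // safety_multiple   # deaths not prevented
prevented = annual_fatalities - remaining          # lives saved

print(f"Prevented: {prevented:,}")  # Prevented: 900,000
print(f"Remaining: {remaining:,}")  # Remaining: 100,000
```

Those roughly 100,000 remaining deaths are the liability exposure Musk refers to, while the 900,000 prevented ones are, by definition, invisible.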

This “unfortunate truth,” as Musk implicitly framed it, highlights a fundamental asymmetry in how society perceives safety technology. Human drivers cause the overwhelming majority of crashes through distraction, fatigue, or error.

Yet when FSD errs, the incident becomes headline news and a courtroom target. Prevented tragedies, by contrast, leave no trace.

Survivors simply continue their journeys, unaware of the split-second intervention that kept them alive. The result is a distorted public narrative that amplifies failures while rendering successes invisible.

We have seen this through various headlines over the years, including the mainstream media’s tendency to name the vehicle manufacturer in crash coverage only when that manufacturer is Tesla.

The video’s real-world example underscores FSD’s current capabilities. In near-zero visibility, the system’s cameras and neural network reacted faster than any human could, demonstrating the life-saving potential Musk cites.

Tesla’s latest safety data already shows FSD (Supervised) performing significantly better than the U.S. average, with crashes occurring far less frequently per mile driven.

Still, regulatory scrutiny, liability concerns, and media focus on edge-case failures continue to slow widespread adoption. Musk’s frank admission suggests Tesla is prepared to push forward despite the legal and perceptual headwinds.

As FSD edges closer to unsupervised autonomy, Musk’s post serves as both a progress report and a reality check. The technology is already saving lives today.

The unfortunate truth is that proving it and scaling it responsibly will require society to value statistical lives saved as much as dramatic stories of those lost. In the race toward safer roads, perception may prove as formidable an obstacle as the fog and rain in that viral video.

Tesla Full Self-Driving v14.3: First Impressions

Tesla started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members earlier today, and I had the opportunity to see some of the improvements that were made from v14.2.2.5.

While a lot of things got better, and I truly enjoyed using Full Self-Driving again after being stuck with the wildly confusing and frustrating v14.2.2.5, Tesla still has one major problem on its hands: Navigation and Routing. I truly believe those issues will be the biggest challenges Tesla faces with autonomy: the car simply going the correct way, not conflicting with what the navigation says, and taking the simplest and most ideal route to a destination.

Here’s what I noticed as an improvement with my first hour with v14.3. This is not a full review, nor is it reflective of everything I will likely experience with this new version. This is simply what I saw as a noticeable improvement from the past version, v14.2.2.5.

There is also a more streamlined version on X, available at the thread below:

Yellow Light Behavior is Significantly Better

On v14.2.2.5, I had so many instances of the car slamming on the brakes to stop at a yellow light when it was clearly safer to proceed through. There were several times when the car would be about 20 feet from the line, traveling at 15-20 MPH, the light would turn yellow, and it would slam on the brakes to stop. Because of this, I was constantly nudging it through yellow lights by putting my foot on the accelerator.

The instances I’m talking about here would not have been close calls — the car would have likely moved through the intersection completely before the light would turn red.

On multiple occasions this evening, FSD proceeded through yellow lights safely, without hesitation or any brake stabbing. It was refreshing:

This was a huge complaint with v14.2.2.5. Sometimes it’s safer to go through a yellow light, especially when you have traffic behind you. Slamming the brakes at the last second is a great way to get rear-ended.

Parking Performance

I had four instances of parking, and FSD v14.3 really did a flawless job. I was very impressed with how solid it was, but also with how efficiently it moved into the spot. When there was traffic around with past versions, I usually chose to park manually just because FSD took its time getting into a spot. I don’t see that being an issue anymore.

I complained about parking a lot and shared several images on X and Facebook of those examples:

No issues with it this evening. 4/4. Here are two looks:

Highway Performance

FSD v14.3 passed the five cars shown in this image:

The sixth was 200-300 yards ahead of the fifth. In v14.2.2.5, FSD would usually stay in the left lane, especially on Hurry and Mad Max. It did not do that here; instead, it chose to get back over into the right lane after passing the final car.

Speed was not much of a concern here, even though it was going 21 MPH over. Although it was fast, I did have a line of cars behind me traveling at the same speed, and FSD had just merged about a half mile prior, so I chose to let it continue.

There were no instances of camping in the left lane for extended periods of time. I do want to do more testing with the Speed Profiles because they were in need of some work with the previous version. I am starting to side with those who want a Max Speed setting, which was removed last year.

Navigation and Routing Still Need Work

I was heading back toward where I came from, so I turned “Avoid Highways” on to take a different way. This confused the Routing system, and instead of turning left, then right, as the Routing said, the car turned right, then indicated for another right, basically going in a big rectangle. The car ignored the second right-hand turn and continued straight. I ended up turning “Avoid Highways” off and letting the car pick the same routing option as what took me here.

I have truly complained so much about Navigation and Routing that I’m starting to feel sort of bad. It is obviously such a massive challenge for some reason, but I am confident it will improve. I recall seeing Tesla hiring someone for this role a few months back, so perhaps there is hope for it to get better.

Smarter Behavior When Approaching Exits/Routing

This probably should be grouped in with Highway Behavior, but I wanted to highlight it on its own.

The highway exit pictured was always frustrating for v14.2.2.5. In the Hurry speed profile, I have seen it try to execute passes on multiple cars with as little as 0.6 miles to spare before taking the exit.

With three cars ahead of it, it chose to reduce speed and just wait until the exit. It was refreshing to see an improvement here, so I hope this behavior persists. Sometimes there’s just no reason to pass when you’re less than a mile from getting off the highway anyway.

Larger Visibility Warnings

Tesla seems to have increased the size of these “Camera Visibility Limited” warnings. Previously, they were just small thumbnails:

Stop Sign Behavior

This is probably the biggest improvement of all, because its behavior at Stop Signs in v14.2.2.5 was so incredibly terrible and disruptive to the flow of a busy intersection.

There are several four-way, all-stop intersections near me. In the past, FSD would stop well behind the Stop Sign or the white-painted line on the road. It would then inch forward, stopping again at this line, essentially making two stops at a single intersection.

If there is visibility, I don’t truly care where FSD stops, as long as it stops once. Stopping twice just isn’t ideal or logical. I can’t imagine many humans would do it; I know I wouldn’t.

I didn’t have that issue this evening:

This was pretty tight, too, in the sense that both my car and the other one got to the intersection at the same time. FSD may have stopped first, but the other vehicle was probably around the same point that I was when FSD decided to stop. I was happy to see the assertiveness to proceed; it felt like it was ideal to just go through. I was happy it didn’t stop a second time up at the line. I’d be fine if it stopped at the line, as long as that was the only stop it made.

Tesla Full Self-Driving v14.3 rolls out: here’s what’s new

We are in EAP and will be on the road with v14.3 in the coming hours, so we’ll have a lot of things to discuss over the next few days, especially coming from v14.2.2.5, which I called the most “confusing” FSD release of all time.

Tesla has officially started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members, and there are a lot of new improvements.

Tesla brought out a lot of improvements, according to the v14.3 release notes, which list a vast number of fixes, new features, and new capabilities.

Here’s what Tesla’s release notes for the v14.3 release state:

  • Improved parking location pin prediction, now shown on a map with a P icon.
  • Increased decisiveness of parking spot selection and maneuvering.
  • Rewrote the AI compiler and runtime from the ground up with MLIR, resulting in 20% faster reaction time and improving model iteration speed.
  • Enhanced response to emergency vehicles, school buses, right-of-way violators, and other rare vehicles.
  • Mitigated unnecessary lane biasing and minor tailgating behaviors.
  • Improved handling of small animals by focusing RL training on harder examples and adding rewards for better proactive safety.
  • Improved traffic light handling at complex intersections with compound lights, curved roads, and yellow light stopping – driven by training on hard RL examples sourced from the Tesla fleet.
  • Upgraded the Reinforcement Learning (RL) stage of training the FSD neural network, resulting in improvements in a wide variety of driving scenarios.
  • Upgraded the neural network vision encoder, improving understanding in rare and low-visibility scenarios, strengthening 3D geometry understanding, and expanding traffic sign understanding.
  • Improved handling for rare and unusual objects extending, hanging, or leaning into the vehicle path by sourcing infrequent events from the fleet.
  • Improved handling of temporary system degradations by maintaining control and automatically recovering without driver intervention, reducing unnecessary disengagements.

Tesla also listed a handful of future improvements as well:

  • Expand reasoning to all behaviors beyond destination handling
  • Add pothole avoidance
  • Improve driver monitoring system sensitivity with better eye gaze tracking, eye wear handling, and higher accuracy in variable lighting situations

CEO Elon Musk has said that v14.3 could be “where the last big piece of the puzzle finally lands.” We have high expectations for this release because, in a lot of ways, v14.2’s final version was extremely disappointing and seemed to be a regression more than anything.

Nevertheless, Full Self-Driving v14.3 is going to be quite an interesting test, considering this is also the first release that Musk has said will make the car feel “sentient.”

Reasoning will be a bigger piece of the puzzle with this release, although there were some elements of it in v14.2.

We plan to travel plenty of miles with it over the next few days, so we’ll keep you posted on what our thoughts are.
