News

Do autonomous cars make us worse drivers?

Autonomous cars are coming. So is the first fatality associated with them. Statistically, that milestone should occur in the next 18 months. What will happen then?

Tesla in autonomous mode

On May 31, 2009, an Airbus A330 on its way from Rio de Janeiro to Paris plunged from an altitude of 35,000 feet into the Atlantic, killing all 228 people on board. Just prior to the crash, the airplane was operating in autopilot mode. A reconstruction of the disaster revealed that input from several sensors had been compromised by ice, which caused them to give false readings. Updated sensors that were less susceptible to ice accumulation were waiting to be installed after the plane arrived in Paris.

Because of the false readings, the autopilot system disengaged, returning control to the pilots. However, the senior pilot was sleeping at the time. The two junior pilots were not as highly trained in high-altitude flight as they might have been, partly because the use of machines to control aircraft under those conditions was the norm.

Faced with the unexpected, the pilots behaved poorly. At one point they are heard to say on the cockpit recorder, “We completely lost control of the airplane, and we don’t understand anything! We tried everything!” While they tried to rouse the sleeping senior pilot, the nose of the aircraft climbed until a stall was induced. Stall is the point at which the wings become barn doors instead of airfoils. The Airbus 330 dropped from the sky like a rock.

In his excellent story about the crash published in Vanity Fair, William Langewiesche offered this conclusion: “Automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.”

The Tesla community has seen similar instances lately. The driver in Salt Lake City who accidentally activated Summon, causing his car to drive into the back of a truck. The woman on a freeway in California who rear-ended a car that suddenly slowed in front of her. The man in Europe who crashed into the back of a van that had stalled in the high-speed lane of a highway. He at least had the courage to admit his error. “Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time too. My bad.”

After each of these incidents, the tendency has been for many to defend the machine and blame the human. But in a recent article for The Guardian, author Martin Robbins says, “Combine an autopilot with a good driver, and you get an autopilot with, if not a bad driver, at least not such a good one.” He says that statistically, the time when a car operating in autonomous mode causes a fatality is rapidly approaching.

Tesla Model S owner crashes into the back of a stalled van

On average, a person is killed in a traffic accident in the United States once every 100 million miles. Elon Musk says Tesla’s Autopilot is half as likely to be involved in a collision as a human driver. That would suggest that somewhere around the 200 million mile mark someone will die as a result of an automobile driven by a machine.

Tesla has already passed the 100 million mile mark for cars driving in Autopilot mode and continues to log 2.6 million miles driven per day. Statistically speaking, the time when a self-driving car kills somebody is rapidly approaching. And since most autonomous cars on the road are Teslas, the odds are excellent it will be a Tesla that is involved in that first fatality.
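As a back-of-envelope check, the arithmetic behind that 200 million mile estimate can be sketched in a few lines of Python. The figures used are the article's numbers and Musk's claim, not official safety statistics:

```python
# Rough sketch of the article's arithmetic. Both inputs are claims
# cited in the article, not official safety statistics.

us_fatality_rate = 1 / 100_000_000   # about one US traffic death per 100 million miles
autopilot_relative_risk = 0.5        # Musk's claim: Autopilot is half as likely to crash

# Expected miles of machine driving before the first fatality
expected_miles = 1 / (us_fatality_rate * autopilot_relative_risk)
print(f"{expected_miles:,.0f} miles")  # 200,000,000 miles
```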

What will happen then? Robbins goes back in history to look for an answer to that question. In 1896, Bridget Driscoll became the first person in England to be killed by a motor car. The reaction among the public and the press was a fatalistic acceptance that progress has a price. Within a few years, the speed limit in England was raised from 8 mph — which it was when Ms. Driscoll was killed — to 20 mph. This despite the fact that thousands of road deaths were being recorded on English roads by then.

Regulators around the world are racing to catch up with the explosion of new autonomous driving technology. But Robbins concludes, “By the time they do, it’s likely that the technology will already be an accepted fact of life, its safety taken for granted by consumers, its failures written off as the fault of its error-prone human masters.”

The point is that injuries and fatalities will continue to occur as cars come to rely more and more on machines for routine driving chores. But in that transition period between now and the time when Level 4 autonomy becomes the norm — the day when cars come from the factory with no way for humans to control them directly — we need to accept that complacency and an inflated belief in the power of machines to protect us from harm may actually render us less competent behind the wheel.

We will need to remain vigilant, if for no other reason than telling a jury “It’s not my fault! The machine failed!” is not going to insulate us from the legal requirement to operate our private automobiles in a safe and prudent manner.

"I write about technology and the coming zero emissions revolution."

News

Tesla wins big as NHTSA drops three-year, 120k unit probe against Model Y

In all, 120,089 Model Ys were impacted, but in two cases, drivers reported the complete detachment of the steering wheel from the steering column while the vehicle was in motion. NHTSA’s initial review revealed that the vehicles had been delivered without the critical retaining bolt that secures the steering wheel to the splined steering column.

Credit: Tesla Asia | X

The National Highway Traffic Safety Administration (NHTSA) has closed its probe into over 120,000 units of the 2023 Tesla Model Y, ending the investigation without requiring any action from Tesla.

The probe, designated PE23-003, opened in March 2023 and stemmed from just two consumer complaints involving low-mileage Model Y SUVs.

In all, 120,089 Model Ys were impacted, but in two cases, drivers reported the complete detachment of the steering wheel from the steering column while the vehicle was in motion. NHTSA’s initial review revealed that the vehicles had been delivered without the critical retaining bolt that secures the steering wheel to the splined steering column.

Factory records showed each car had undergone an “end-of-line” repair at Tesla’s facility, during which the steering wheel was removed and reinstalled. The bolt was apparently omitted after the repair, leaving only a friction fit between the wheel and column to hold it in place temporarily.

According to NHTSA documents, this friction fit maintained the connection during initial low-mileage driving until forces from normal operation caused the wheel to detach. Both affected vehicles were repaired under warranty with no injuries reported, and no additional incidents surfaced during the agency’s three-year review.

After analyzing manufacturing processes, complaint data, and field reports, NHTSA concluded the issue was isolated to those two post-repair vehicles rather than indicative of a systemic defect in Tesla’s production or quality control.

The closure means the agency has determined no recall or further enforcement is warranted for this specific missing-bolt condition.

This outcome marks the second NHTSA investigation into Tesla closed without action this month, as a recent probe into the company’s “Actually Smart Summon” feature was also resolved in April.

The two resolutions provide some relief for Tesla amid the continuous and somewhat unfair regulatory scrutiny of its vehicles, including open inquiries into driver assistance systems.

Importantly, the closed probe does not involve or affect Tesla’s separate May 2023 voluntary recall of certain 2022-2023 Model Y vehicles. That recall addressed a different issue—steering-wheel fasteners that were installed but not torqued to specification—prompted by a service technician’s observation of a loose wheel during unrelated repairs.

Tesla identified a small number of related warranty claims and proactively addressed the matter without NHTSA mandate.

The Model Y remains one of the world’s best-selling vehicles, and Tesla continues to refine its lineup, including the recent “Juniper” refresh. While federal oversight of the electric vehicle pioneer remains intense, this decision underscores that isolated manufacturing anomalies do not always translate into broader safety defects requiring recalls.

News

Tesla Model Y L gets biggest hint yet that it’s coming to the U.S.

Over the past week, a noticeable wave of American Tesla influencers descended on China and Australia, each posting in-depth YouTube reviews of the Model Y L within days of one another.

Credit: Tesla China

The Tesla Model Y L is perhaps the most wanted vehicle in the company’s lineup in the United States, especially now that the lineup lacks a true family vehicle following the removal of the Model X.

In China, Tesla currently offers a longer, more family-friendly version of the Model Y, known as the Model Y L. It has a longer wheelbase and more interior space, making it a perfect option for those who need a bit more room than the all-electric crossover offers in its Standard, Premium, and Performance trims.

However, there seems to be a hint that the Model Y L could be on its way to the United States. Over the past week, a noticeable wave of American Tesla influencers descended on China and Australia, each posting in-depth YouTube reviews of the Model Y L within days of one another.

The timing has sparked some intense speculation as to whether Tesla is quietly preparing to bring the long-wheelbase, three-row family SUV to North America after months of requests from fans.

The Model Y L stretches the wheelbase by about five inches compared to the standard Model Y.

This delivers dramatically more rear legroom, optional captain’s chairs in the second row, and a true six- or seven-seat configuration ideal for growing families. Reviewers praise its refined ride, upgraded interior features like a rear touchscreen and premium audio, and competitive range—up to roughly 466 miles in some configurations.

Many observers see the coordinated influencer trip as more than a coincidence. Tesla China appears to have hosted the group, possibly tied to the Beijing Auto Show, giving U.S.-focused creators early access to hands-on footage aimed squarely at North American audiences.

Tesla watchers are quick to point out this isn’t the first time such a pattern has emerged.

Just months earlier, American influencers were similarly invited to China to test-drive the refreshed Model Y Performance. Those videos dropped in the lead-up to the variant’s U.S. rollout, generating exactly the kind of pre-launch hype that helped smooth its September arrival in American showrooms.

The parallel is hard to ignore: Tesla has used overseas influencer trips before as a low-key way to build anticipation without formal announcements. With the Model Y L potentially hitting the U.S. market late this year, according to CEO Elon Musk, the timing would make sense.

Of course, it could still be coincidental. Tesla regularly invites creators to its Shanghai factory and events for broader promotional purposes, and the Model Y L has been on sale in China for some time. No official word has come from Tesla or Elon Musk about U.S. availability, pricing, or timing.

Import tariffs, regulatory hurdles, and production priorities at Fremont or the new Mexican Gigafactory could still delay or alter any stateside plans.

Even so, the buzz is real. U.S. families have long asked for a more spacious, three-row Tesla SUV that doesn’t require stepping up to the larger Model X.

If the influencer campaign is any indication, the Model Y L—or a close North American cousin—could finally answer that call. For now, American Tesla fans are watching closely and wondering whether this latest China trip is just good content… or the opening act for something much bigger stateside.

News

Tesla begins probing owners on FSD’s navigation errors with small but mighty change

Previously lumped under “Other,” these incidents made it harder for Tesla’s AI team to isolate and prioritize map-related issues in their reinforcement learning models. There was a lot of disagreement on how certain interventions should be reported.

Tesla has started probing owners on how often its Full Self-Driving suite makes navigation errors, via a small but mighty change rolled out last night.

In its latest software update, Version 2026.2.9.9, which features Full Self-Driving (Supervised) v14.3.2, Tesla has introduced a targeted improvement to how owners report interventions.

With the initial rollout of v14.3.2, Tesla introduced a new Intervention Menu that appears when a disengagement occurs. It allowed owners to choose from four different categories: Preference, Comfort, Critical, or Other.

Tesla has removed the “Other” option and replaced it with a new “Navigation” choice, which seems far more useful given the complaints owners have had about navigation. This seemingly minor UI tweak, rolled out widely in recent days, marks another step in Tesla’s ongoing effort to refine its autonomous driving stack through precise, crowdsourced data.

Tesla made this change in direct response to longstanding community feedback. For years, FSD users have noted that navigation errors—such as incorrect speed limits, suboptimal routes, or directing the vehicle to a building’s rear entrance instead of the main one—frequently force interventions.

Previously lumped under “Other,” these incidents made it harder for Tesla’s AI team to isolate and prioritize map-related issues in their reinforcement learning models, and there was considerable disagreement among owners about how certain interventions should be reported.

By adding a dedicated “Navigation” label, the company can now tag disengagements more accurately, feeding cleaner data into its neural networks. This supports faster iteration on routing algorithms, map accuracy, and intent-aware navigation.

Community consensus around Tesla’s navigation system has been consistent and candid. While the end-to-end AI driving behavior in v14.x earns widespread acclaim for smoothness and safety, navigation remains FSD’s clearest Achilles’ heel.

Owners frequently cite outdated map data, failure to learn from repeated corrections, and routing decisions that feel less intuitive than Google Maps or Apple Maps. Common complaints include phantom speed-limit changes, inefficient local roads, and poor point-of-interest handling.

Many drivers report intervening on navigation far more often than on core driving maneuvers, with some estimating it accounts for the majority of disengagements outside of edge cases.

Long-term users note that the same mapping glitches persist across years and software versions, despite thousands of collective miles of feedback. Yet the addition of the “Navigation” option has been met with optimism. It signals Tesla’s commitment to data-driven progress and suggests navigation improvements could arrive sooner.

For a community that already logs millions of FSD miles monthly, this small change could unlock meaningful gains in reliability and user trust—potentially accelerating the path to unsupervised autonomy.
