Do autonomous cars make us worse drivers?

Autonomous cars are coming. So is the first fatality associated with them. Statistically, that milestone should occur in the next 18 months. What will happen then?

Tesla in autonomous mode

On May 31, 2009, an Airbus 330 on its way from Rio de Janeiro to Paris plunged from an altitude of 35,000 feet into the Atlantic, killing all 228 people on board. Just prior to the crash, the airplane was operating in autopilot mode. A reconstruction of the disaster revealed that input from several sensors had been compromised by ice, which caused them to give false readings. Updated sensors less susceptible to ice accumulation were waiting to be installed after the plane arrived in Paris.

Because of the false readings, the autopilot system disengaged, returning control to the pilots. The senior pilot, however, was sleeping at the time. The two junior pilots were not as highly trained in high-altitude flight as they might have been, partly because using machines to control aircraft under those conditions was the norm.

Faced with the unexpected, the pilots behaved poorly. At one point they can be heard on the cockpit recorder saying, “We completely lost control of the airplane, and we don’t understand anything! We tried everything!” While they tried to rouse the sleeping senior pilot, the nose of the aircraft climbed until a stall was induced. A stall is the point at which the wings become barn doors instead of airfoils. The Airbus 330 dropped from the sky like a rock.

In his excellent story about the crash published in Vanity Fair, William Langewiesche offered this conclusion: “Automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.”


The Tesla community has seen similar incidents lately: the driver in Salt Lake City who accidentally activated Summon, causing his car to drive into the back of a truck; the woman on a California freeway who rear-ended a car that suddenly slowed in front of her; the man in Europe who crashed into the back of a van that had stalled in the high-speed lane of a highway. He at least had the courage to admit his error: “Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time too. My bad.”

After each of these incidents, the tendency has been for many to defend the machine and blame the human. But in a recent article for The Guardian, author Martin Robbins says, “Combine an autopilot with a good driver, and you get an autopilot with, if not a bad driver, at least not such a good one.” He says that statistically, the time when a car operating in autonomous mode causes a fatality is rapidly approaching.


Tesla Model S owner crashes into the back of a stalled van

On average, a person is killed in a traffic accident in the United States once every 100 million miles. Elon Musk says Tesla’s Autopilot is half as likely to be involved in a collision as a human driver. That would suggest that somewhere around the 200 million mile mark someone will die as a result of an automobile driven by a machine.

Tesla has already passed the 100 million mile mark for cars driving in Autopilot mode and continues to log 2.6 million miles driven per day. Statistically speaking, the time when a self-driving car kills somebody is rapidly approaching. And since most autonomous cars on the road are Teslas, the odds are excellent that it will be a Tesla involved in that first fatality.
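The arithmetic behind that projection can be sketched in a few lines. This is a rough back-of-the-envelope calculation using only the figures quoted above, and it assumes the daily Autopilot mileage stays flat (in reality it is growing):

```python
# Back-of-the-envelope sketch of the article's figures. Real crash statistics
# are far noisier than a single point estimate like this.

US_MILES_PER_FATALITY = 100_000_000  # one US traffic death per ~100M miles
AUTOPILOT_RISK_FACTOR = 0.5          # Musk's claim: half as likely to crash
MILES_LOGGED = 100_000_000           # Autopilot miles already driven
MILES_PER_DAY = 2_600_000            # current daily Autopilot mileage

# Half the risk means twice the expected miles between fatalities.
expected_miles = US_MILES_PER_FATALITY / AUTOPILOT_RISK_FACTOR

# How long until the fleet reaches that mark at the current rate.
days_remaining = (expected_miles - MILES_LOGGED) / MILES_PER_DAY

print(f"Expected first fatality around mile {expected_miles:,.0f}")
print(f"At current rates, roughly {days_remaining:.0f} days away")
```

On these assumptions the 200-million-mile mark arrives in a matter of weeks, which is why "rapidly approaching" is not an exaggeration.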

What will happen then? Robbins goes back in history for an answer. In 1896, Bridget Driscoll became the first person in England to be killed by a motor car. The reaction among the public and the press was a fatalistic acceptance that progress has a price. Within a few years, the speed limit in England was raised from the 8 mph in force when Ms. Driscoll was killed to 20 mph, despite the thousands of deaths being recorded on English roads by then.


Regulators around the world are racing to catch up with the explosion of new autonomous driving technology. But Robbins concludes, “By the time they do, it’s likely that the technology will already be an accepted fact of life, its safety taken for granted by consumers, its failures written off as the fault of its error-prone human masters.”

The point is that injuries and fatalities will continue to occur as cars come to rely more and more on machines for routine driving chores. But in that transition period between now and the time when Level 4 autonomy becomes the norm — the day when cars come from the factory with no way for humans to control them directly — we need to accept that complacency and an inflated belief in the power of machines to protect us from harm may actually render us less competent behind the wheel.

We will need to remain vigilant, if for no other reason than telling a jury “It’s not my fault! The machine failed!” is not going to insulate us from the legal requirement to operate our private automobiles in a safe and prudent manner.





Musk forces Judge’s exit from shareholder battles over viral social media slip-up

McCormick insisted in a court filing that she harbors no actual bias against Musk or the defendants. She claimed she either never clicked the “support” button, LinkedIn’s version of a “like,” or did so accidentally.



Many Tesla fans are familiar with the name Kathaleen McCormick, especially if they are investors in the company.

McCormick is a Delaware Chancery Court Judge who presided over Tesla CEO Elon Musk’s pay package lawsuit over the past few years, as well as his purchase of Twitter. However, she will no longer be sitting in on any issues related to Musk.


In a rare admission of potential optics issues in one of America’s most powerful corporate courts, Delaware Chancery Court Chancellor Kathaleen McCormick stepped aside Monday from a cluster of shareholder lawsuits targeting Elon Musk and Tesla’s board.


The move came just days after Musk’s legal team highlighted her apparent “support” on LinkedIn for a post that mocked the billionaire over his 2022 tweets about the $44 billion Twitter acquisition.

McCormick insisted in a court filing that she harbors no actual bias against Musk or the defendants. She claimed she either never clicked the “support” button, LinkedIn’s version of a “like,” or did so accidentally.

She wrote in a newly published memo from the Delaware Chancery Court:

“The motion for recusal rests on a false premise — that I support a LinkedIn post about Mr. Musk, which I do not in fact support. I am not biased against the defendants in these actions.”


Yet she granted the reassignment anyway, acknowledging that the intense media scrutiny surrounding her involvement had become “detrimental to the administration of justice.”

The consolidated cases will now be handled by three of her colleagues on the Delaware Court of Chancery, the nation’s go-to venue for high-stakes corporate disputes. The lawsuits accuse Musk and Tesla directors of breaching fiduciary duties through lavish executive compensation and lax governance oversight.

One prominent claim, filed by a Detroit pension fund, challenges massive stock awards granted to board members, alleging the payouts harmed the company. The litigation also overlaps with issues stemming from Musk’s turbulent 2022 Twitter purchase.

McCormick’s history with Musk made her a lightning rod. In 2022, she presided over the fast-tracked lawsuit that ultimately forced Musk to complete the Twitter deal after he tried to back out.


Then in 2024, she struck down his record $56 billion Tesla compensation package, ruling the approval process was flawed and overly CEO-friendly. The Delaware Supreme Court later reinstated the pay on technical grounds, but the ruling fueled Musk’s long-standing criticism of the state’s judiciary.

Musk has repeatedly urged companies to reincorporate elsewhere, arguing Delaware courts have grown hostile to visionary leaders. Monday’s recusal hands him a symbolic victory and underscores how personal social-media activity can collide with judicial impartiality standards.

Delaware law requires judges to step aside if there’s even a “reasonable basis” to question their neutrality.

Court watchers say the episode highlights growing tensions in corporate America’s legal epicenter. While McCormick maintained her impartiality, the appearance of bias proved too costly to ignore. The cases will proceed without her, but the broader debate over Delaware’s dominance in business litigation is far from over.


Elon Musk has generous TSA offer denied by the White House: here’s why

Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”



Tesla and SpaceX CEO Elon Musk made a generous offer to pay the salaries of Transportation Security Administration (TSA) employees last week, but the offer was denied by the White House.

In a striking display of private-sector initiative clashing with federal bureaucracy, the White House has turned down an offer from Elon Musk to personally cover the salaries of TSA officers amid an ongoing partial government shutdown. The rejection, reported last Wednesday by multiple outlets, highlights the legal and political hurdles facing unconventional solutions to Washington’s funding gridlock.

The impasse began weeks ago when Congress failed to pass funding for the Department of Homeland Security (DHS), leaving TSA employees, essential workers who screen millions of travelers daily, without paychecks while still required to report for duty.

Frustrated travelers have endured record-long security lines at major airports, with reports of chaos and delays rippling across the country.


Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”

The refusal, however, was not without reason.


White House spokesperson Abigail Jackson responded on behalf of the Trump administration, expressing appreciation for Musk’s gesture.

However, legal obstacles the administration deemed insurmountable would prevent Musk from doing so. Jackson said:

“We greatly appreciate Elon’s generous offer. This would pose great legal challenges due to his involvement with federal government contracts.”

Musk’s companies hold significant federal contracts, including NASA launches through SpaceX and potential Defense Department work, raising concerns about conflicts of interest, ethics rules, and anti-bribery statutes that prohibit private payments to government employees. Administration officials also indicated they expect the shutdown to end soon, making external funding unnecessary.


The episode underscores deeper tensions in Washington. Musk, who has advised on government efficiency efforts and maintains a close relationship with President Trump, has frequently criticized wasteful spending and bureaucratic delays.

His offer came as airport security lines ballooned, drawing public frustration toward both parties. TSA officers, many of whom rely on paychecks to cover mortgages and family expenses, have continued working without compensation, a situation that has drawn bipartisan concern but little immediate resolution.

Critics of the rejection argue it prioritizes red tape over practical relief for frontline workers and travelers. Supporters of the White House position counter that allowing private funding sets a dangerous precedent and could undermine congressional authority over the budget.

The White House eventually came to terms with the TSA on Friday and started paying employees once again, and lines at airports instantly shrank. The Department of Homeland Security said that TSA staff would begin receiving paychecks “as early as” today.


Tesla FSD mocks BMW human driver: Saves pedestrian from near miss

Tesla FSD anticipated a BMW driver’s lane drift before the human behind the wheel could react.


A video posted to r/TeslaFSD this week put a sharp spotlight on Tesla’s Full Self-Driving (FSD) software reacting to pedestrian intent faster than an actual human driver behind the wheel. In the Reddit clip, a BMW driver can be seen rolling through a neighborhood street, completely unaware of a pedestrian stepping in to cross. At the same time, a Tesla driving on FSD had already begun slowing down before the pedestrian even attempted to cross the street. The BMW kept moving, prompting the pedestrian to hop back, while the Tesla came to a stop and yielded the right-of-way so the person could cross safely.

That gap between what the BMW driver saw and what FSD had already processed is the story. Tesla FSD wasn’t reacting to a person in the street; it was reading the signals that a person was about to enter it, the pedestrian’s movement and trajectory telegraphing intent.

Tesla’s FSD is now built on an end-to-end neural network trained on billions of real-world miles, learning to interpret subtle human behavioral cues the same way an experienced human driver does instinctively. The difference is consistency. A human driver distracted for two seconds misses what FSD does not.



Reddit commenters in the thread were blunt about the BMW driver’s failure, with several pointing out that the pedestrian was visible well before the crossing. One response put it plainly that the car on FSD saw the situation developing before the human in the other car had registered there was a situation at all.

Tesla has published data showing FSD (Supervised) is 54% safer than a human driver, accumulated across billions of miles driven on the system. Elon Musk has said FSD v14 will outperform human drivers by a factor of two to three, and that v15 has “a shot” at a 10x improvement. Pedestrian safety is where the stakes are highest, and where intent prediction closes the gap fastest. At 30 mph, a car covers roughly 44 feet per second. An extra second of awareness from reading a person’s body language rather than waiting for them to step out is often the difference between a near miss and a fatality.
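The speed figure above is straightforward unit conversion. A short sketch, with the one-second anticipation window being an illustrative assumption rather than a measured reaction time:

```python
# Unit-conversion check: feet covered per second at 30 mph, and the distance
# margin one extra second of anticipation buys (assumed value).

MPH_TO_FPS = 5280 / 3600  # feet in a mile / seconds in an hour

speed_fps = 30 * MPH_TO_FPS      # 44 ft covered each second at 30 mph
anticipation_s = 1.0             # assumed extra second of awareness
margin_ft = speed_fps * anticipation_s

print(f"{speed_fps:.0f} ft/s at 30 mph; ~{margin_ft:.0f} ft of extra margin")
```

Forty-four feet is roughly three car lengths, which is why a single second of earlier braking can turn a fatality into a near miss.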

Video and community discussion: “FSD saves man from becoming a pancake. BMW driver nearly flattens him.” posted by u/Qwertygolol in r/TeslaFSD on Reddit.
