News

Tesla crash leads NTSB to begin probe of autonomous driving technology

The echoes of the Autopilot crash that killed Joshua Brown on May 7 are still reverberating. Not only is the National Highway Traffic Safety Administration (NHTSA) conducting an investigation; now the National Transportation Safety Board (NTSB) is getting into the act. NHTSA chief Mark Rosekind and Transportation Secretary Anthony Foxx have expressed approval of autonomous driving technology. With more than 80% of the roughly 34,000 highway deaths in the US every year attributed to human error, they recognize the power of new safety systems to reduce the carnage on America’s roadways.

The NTSB on the other hand has warned that such systems can lead to danger because they lull drivers into complacency behind the wheel. Missy Cummings, an engineering professor and human factors expert at Duke University, says humans tend to show “automation bias,” a trust that automated systems can handle all situations when they work 80% of the time.

According to Automotive News, the NTSB will send a team of five investigators to Florida to probe the death of Joshua Brown, whose Tesla Model S crashed into a tractor trailer while driving on Autopilot. “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” says NTSB spokesman Christopher O’Neil.

“It’s very significant,” says Clarence Ditlow, executive director of the Center for Auto Safety, an advocacy group in Washington, DC. “The NTSB only investigates crashes with broader implications. They’re not looking at just this crash. They’re looking at the broader aspects. Are these driverless vehicles safe? Are there enough regulations in place to ensure their safety?” Ditlow adds, “And one thing in this crash I’m certain they’re going to look at is using the American public as test drivers for beta systems in vehicles. That is simply unheard of in auto safety.”

That is the crux of the situation. Tesla would obviously have the right to conduct a beta test of its Autopilot system if only Tesla drivers were involved. The question is whether it has the same right to do so on public roads shared with other drivers who are not part of the beta test and who are unaware that an experiment is taking place around them as they drive.

For its part, Tesla steadfastly maintains that “Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”

That’s all well and good, but are such pronouncements from the company enough to overcome that “automation bias” Professor Cummings refers to? Clearly, Ditlow thinks not. What the NTSB decides to do, if anything, after it completes its investigation could have a dramatic impact on self-driving technology both in the US and around the world. Regulators in other countries will place a lot of weight on what the Board decides.

From Tesla’s perspective, the question is whether the death of one person is reason enough to delay implementation of technology that could save 100,000 or more lives each year worldwide. The company is racing ahead with improvements to its Autopilot system. Just the other week, a Tesla Model S was spotted testing near the company’s Silicon Valley headquarters with a lidar unit mounted on its roof.

The NTSB has a lot of clout when it comes to promulgating safety regulations. It will be hard-pressed to fairly balance all of the competing interests involved in the aftermath of the Joshua Brown fatality.

Elon Musk

Musk forces Judge’s exit from shareholder battles over viral social media slip-up

McCormick insisted in a court filing that she harbors no actual bias against Musk or the defendants. She claimed she either never clicked the “support” button, LinkedIn’s version of a “like,” or did so accidentally.


(Credit: Tesla)

Many Tesla fans are familiar with the name Kathaleen McCormick, especially if they are investors in the company.

McCormick is a Delaware Court of Chancery judge who presided over Tesla CEO Elon Musk’s pay package lawsuit over the past few years, as well as litigation over his purchase of Twitter. However, she will no longer preside over matters related to Musk.

In a rare admission of potential optics issues in one of America’s most powerful corporate courts, Delaware Chancery Court Chancellor Kathaleen McCormick stepped aside Monday from a cluster of shareholder lawsuits targeting Elon Musk and Tesla’s board.

The move came just days after Musk’s legal team highlighted her apparent “support” on LinkedIn for a post that mocked the billionaire over his 2022 tweets about the $44 billion Twitter acquisition.

McCormick insisted in a court filing that she harbors no actual bias against Musk or the defendants. She claimed she either never clicked the “support” button, LinkedIn’s version of a “like,” or did so accidentally.

She wrote in a newly published memo from the Delaware Chancery Court:

“The motion for recusal rests on a false premise — that I support a LinkedIn post about Mr. Musk, which I do not in fact support. I am not biased against the defendants in these actions.”

Yet she granted the reassignment anyway, acknowledging that the intense media scrutiny surrounding her involvement had become “detrimental to the administration of justice.”

The consolidated cases will now be handled by three of her colleagues on the Delaware Court of Chancery, the nation’s go-to venue for high-stakes corporate disputes. The lawsuits accuse Musk and Tesla directors of breaching fiduciary duties through lavish executive compensation and lax governance oversight.

One prominent claim, filed by a Detroit pension fund, challenges massive stock awards granted to board members, alleging the payouts harmed the company. The litigation also overlaps with issues stemming from Musk’s turbulent 2022 Twitter purchase.

McCormick’s history with Musk made her a lightning rod. In 2022, she presided over the fast-tracked lawsuit that ultimately forced Musk to complete the Twitter deal after he tried to back out.

Then in 2024, she struck down his record $56 billion Tesla compensation package, ruling the approval process was flawed and overly CEO-friendly. The Delaware Supreme Court later reinstated the pay on technical grounds, but the ruling fueled Musk’s long-standing criticism of the state’s judiciary.

Musk has repeatedly urged companies to reincorporate elsewhere, arguing Delaware courts have grown hostile to visionary leaders. Monday’s recusal hands him a symbolic victory and underscores how personal social-media activity can collide with judicial impartiality standards.

Delaware law requires judges to step aside if there’s even a “reasonable basis” to question their neutrality.

Court watchers say the episode highlights growing tensions in corporate America’s legal epicenter. While McCormick maintained her impartiality, the appearance of bias proved too costly to ignore. The cases will proceed without her, but the broader debate over Delaware’s dominance in business litigation is far from over.

Elon Musk

Elon Musk has generous TSA offer denied by the White House: here’s why

Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”

Gage Skidmore, CC BY-SA 4.0, via Wikimedia Commons

Tesla and SpaceX CEO Elon Musk made a generous offer to pay the salaries of Transportation Security Administration (TSA) employees last week, but the offer was denied by the White House.

In a striking display of private-sector initiative clashing with federal bureaucracy, the White House has turned down an offer from Elon Musk to personally cover the salaries of TSA officers amid an ongoing partial government shutdown. The rejection, reported last Wednesday by multiple outlets, highlights the legal and political hurdles facing unconventional solutions to Washington’s funding gridlock.

The impasse began weeks ago when Congress failed to pass funding for the Department of Homeland Security (DHS), leaving TSA employees, essential workers who screen millions of travelers daily, without paychecks while still required to report for duty.

Frustrated travelers have endured record-long security lines at major airports, with reports of chaos and delays rippling across the country.

Musk stepped in on March 21 via a post on X, writing: “I would like to offer to pay the salaries of TSA personnel during this funding impasse that is negatively affecting the lives of so many Americans at airports throughout the country.”

The rejection, however, was not arbitrary.

White House spokesperson Abigail Jackson responded on behalf of the Trump administration, expressing appreciation for Musk’s gesture.

However, insurmountable legal obstacles would prevent Musk from doing so. Jackson said:

“We greatly appreciate Elon’s generous offer. This would pose great legal challenges due to his involvement with federal government contracts.”

Musk’s companies hold significant federal contracts, including NASA launches through SpaceX and potential Defense Department work, raising concerns about conflicts of interest, ethics rules, and anti-bribery statutes that prohibit private payments to government employees. Administration officials also indicated they expect the shutdown to end soon, making external funding unnecessary.

The episode underscores deeper tensions in Washington. Musk, who has advised on government efficiency efforts and maintains a close relationship with President Trump, has frequently criticized wasteful spending and bureaucratic delays.

His offer came as airport security lines ballooned, drawing public frustration toward both parties. TSA officers, many of whom rely on paychecks to cover mortgages and family expenses, have continued working without compensation, a situation that has drawn bipartisan concern but little immediate resolution.

Critics of the rejection argue it prioritizes red tape over practical relief for frontline workers and travelers. Supporters of the White House position counter that allowing private funding sets a dangerous precedent and could undermine congressional authority over the budget.

The White House eventually came to terms with the TSA on Friday and started paying employees once again, and lines at airports quickly shrank. The Department of Homeland Security said that TSA staff would begin receiving paychecks “as early as” today.

Elon Musk

Tesla FSD mocks BMW human driver: Saves pedestrian from near miss

Tesla FSD anticipated a pedestrian’s intent to cross before the human behind the wheel of a BMW could react.

A video posted to r/TeslaFSD this week put a sharp spotlight on Tesla’s Full Self-Driving (FSD) software being able to react to pedestrian intent faster than an actual human driver behind the wheel. In the Reddit clip, a BMW driver can be seen rolling through a neighborhood street, completely unaware of a pedestrian stepping out to cross. At the same time, a Tesla driving on FSD had already begun slowing down before the pedestrian even began their attempt to cross the street. The BMW kept moving, prompting the pedestrian to hop back, while the Tesla came to a stop and provided right-of-way for the pedestrian to safely cross.

That gap between what the BMW driver saw and what FSD had already processed is the story. Tesla FSD wasn’t reacting to a person in the street; rather, it was reading the signals that a person was about to enter it, based on the pedestrian’s movement and trajectory telegraphing intent.

Tesla’s FSD is now built on an end-to-end neural network trained on billions of real-world miles, learning to interpret subtle human behavioral cues the same way an experienced human driver does instinctively. The difference is consistency. A human driver distracted for two seconds misses what FSD does not.

Reddit commenters in the thread were blunt about the BMW driver’s failure, with several pointing out that the pedestrian was visible well before the crossing. One response put it plainly that the car on FSD saw the situation developing before the human in the other car had registered there was a situation at all.

Tesla has published data showing FSD (Supervised) is 54% safer than a human driver, accumulated across billions of miles driven on the system. Elon Musk has said FSD v14 will outperform human drivers by a factor of two to three, and that v15 has “a shot” at a 10x improvement. Pedestrian safety is where the stakes are highest, and where intent prediction closes the gap fastest. At 30 mph, a car covers roughly 44 feet per second. An extra second of awareness from reading a person’s body language rather than waiting for them to step out is often the difference between a near miss and a fatality.

Video and community discussion: “FSD saves man from becoming a pancake. BMW driver nearly flattens him.” by u/Qwertygolol in r/TeslaFSD
