Tesla reportedly dropped by NTSB from fatal Model X investigation [Updated]

The aftermath of a fatal Tesla Model X accident. [Credit: Mercury News/Twitter]

Tesla has opted to step back from the ongoing NTSB investigation into the fatal Model X accident last month near Mountain View, CA.

According to Tesla, it decided to withdraw from its party agreement with the NTSB because the agreement could require it to withhold information that affects public safety. In an emailed statement to Bloomberg, the Elon Musk-led electric car maker stated that it believes in transparency.

“Tesla withdrew from the party agreement with the NTSB because it requires that we not release information about Autopilot to the public, a requirement which we believe fundamentally affects public safety negatively. We believe in transparency, so an agreement that prevents public release of information for over a year is unacceptable,” Tesla stated.

Despite no longer being a formal party to the NTSB investigation, Tesla stated that it would continue to provide technical assistance to the agency during its probe into the tragic accident.

Citing a person familiar with the matter, Bloomberg reported that the NTSB is actually the one removing Tesla from the investigation. Contrary to the “very constructive conversation” between Tesla CEO Elon Musk and NTSB Chairman Robert Sumwalt that an NTSB spokesman described last weekend, the anonymous source said the call involved Sumwalt informing Musk that his company was being taken off the investigation. The source further claimed that the conversation was “tense” due to the Tesla CEO’s reaction to the agency’s decision.

Tesla’s decision to release information related to the NTSB’s ongoing probe prompted the agency to state that it was “unhappy” with the electric car maker. Responding to the NTSB, Musk stated on Twitter that Tesla would immediately release any information that directly affects public safety.

“Lot of respect for NTSB, but NHTSA regulates cars, not NTSB, which is an advisory body. Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe,” Musk tweeted.

Updated: Tesla has issued the following response to a statement made by the NTSB.

“Last week, in a conversation with the NTSB, we were told that if we made additional statements before their 12-24 month investigative process is complete, we would no longer be a party to the investigation agreement. On Tuesday, we chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot — claims which made it seem as though Autopilot creates safety problems when the opposite is true. In the US, there is one automotive fatality every 86 million miles across all vehicles. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident and this continues to improve.

It’s been clear in our conversations with the NTSB that they’re more concerned with press headlines than actually promoting safety. Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts. We don’t believe this is right and we will be making an official complaint to Congress. We will also be issuing a Freedom Of Information Act request to understand the reasoning behind their focus on the safest cars in America while they ignore the cars that are the least safe. Perhaps there is a sound rationale for this, but we cannot imagine what that could possibly be.

Something the public may not be aware of is that the NTSB is not a regulatory body, it is an advisory body. The regulatory body for the automotive industry in the US is the National Highway Traffic Safety Administration (NHTSA) with whom we have a strong and positive relationship. After doing a comprehensive study, NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes. Autopilot has improved substantially since then.

When tested by NHTSA, Model S and Model X each received five stars not only overall but in every sub-category. This was the only time an SUV had ever scored that well. Moreover, of all the cars that NHTSA has ever tested, Model S and Model X scored as the two cars with the lowest probability of injury. There is no company that cares more about safety and the evidence speaks for itself.”
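
For reference, the “3.7 times” figure in Tesla’s statement follows directly from the two fatality rates it cites. A minimal sanity check, using only the numbers quoted above:

```python
# Ratio implied by the fatality rates quoted in Tesla's statement.
us_miles_per_fatality = 86_000_000      # one US automotive fatality per 86 million miles
tesla_miles_per_fatality = 320_000_000  # one fatality per 320 million miles with Autopilot hardware

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"{ratio:.1f}x")  # prints "3.7x", matching the figure in the statement
```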

Just recently, the wife of the ill-fated Model X driver appeared on local news station ABC7 News to say that her husband had complained about Autopilot multiple times before the March 23 accident. According to Mike Fong, the Huang family’s lawyer, the collision would not have happened had Autopilot not been activated. Fong noted that he would not file a complaint against Tesla while the NTSB investigation is ongoing, though he did state that the carmaker’s responses so far have amounted to blaming the Model X’s driver.

While Tesla could be facing a lawsuit from the Huang family over the fatal incident, Will Huang, the Model X driver’s brother, previously stated to ABC7 News that his brother could have survived the accident had his car collided with a working crash attenuator. During its initial update about the fatal collision, Tesla stated that the crash attenuator that the Model X smashed into had been left unrepaired, causing extensive damage to the vehicle.

“That (the crash attenuator) ultimately should’ve saved my brother’s life. We’ve seen videos of similar crash(es) with cushion, and the driver walked out of it unharmed,” Will said.

Later statements from Caltrans eventually revealed that the highway safety device had been damaged in a collision 11 days before the Model X accident. According to Caltrans, crash attenuators are usually repaired within five business days (about seven calendar days), but storms in the area prevented any repair work.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.

Tesla confirms Cybercab with no steering wheel enters production

Tesla confirmed today that the Cybercab, its vehicle with no steering wheel or pedals that is geared toward launching the company’s autonomous ride-hailing hopes, has officially entered production at the Giga Texas facility outside of Austin.

The Cybercab is a sleek two-door, two-passenger coupe engineered from the ground up as an electric self-driving vehicle. It features no steering wheel or pedals, relying instead on Tesla’s advanced vision-only Full Self-Driving system powered by multiple cameras and artificial intelligence.

The minimalist cabin centers on a large display screen that serves as the primary interface for passengers, creating an open, futuristic space optimized for comfort during unsupervised rides. A compact 35-kilowatt-hour battery pack delivers exceptional efficiency at 5.5 miles per kilowatt-hour, providing an estimated 200-mile range.
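
Those figures are internally consistent; a quick back-of-the-envelope check using only the battery and efficiency numbers reported above:

```python
# Range implied by the reported Cybercab pack size and efficiency.
battery_kwh = 35.0           # reported battery pack capacity (kWh)
efficiency_mi_per_kwh = 5.5  # reported efficiency (miles per kWh)

range_miles = battery_kwh * efficiency_mi_per_kwh
print(f"{range_miles:.1f} miles")  # prints "192.5 miles", in line with the ~200-mile estimate
```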

Additional innovations include inductive charging compatibility and a lightweight design that enhances aerodynamics and performance.

Production at Giga Texas builds on prototypes and initial units completed earlier in 2026. The facility, already a hub for Model Y and Cybertruck assembly, is now ramping up dedicated lines for the Cybercab.

This shift to volume manufacturing reflects Tesla’s strategy to scale affordable autonomous vehicles rapidly.

By focusing on a dedicated platform rather than adapting existing models, the company aims to keep costs low while prioritizing safety and reliability through continuous AI improvements.

The Cybercab’s debut in production carries broad implications for urban mobility. As the cornerstone of Tesla’s Robotaxi network, it promises on-demand, driverless rides that could slash transportation expenses, reduce traffic accidents caused by human error, and lower emissions through its all-electric powertrain.

Accessibility features, such as space for service animals or assistive devices, further broaden its appeal. Regulators and cities worldwide will soon evaluate its deployment, but the vehicle’s design already addresses key hurdles in scaling unsupervised autonomy.

Challenges persist, including full regulatory clearance and building charging infrastructure. Yet this production launch signals momentum. With Cybercabs poised to roll out in increasing numbers, Tesla edges closer to a future where personal ownership meets shared fleets of intelligent vehicles.

The start of Cybercab production is more than just a new vehicle entering mass manufacturing for Tesla; it is a signal that autonomy is near. Developing the vehicle without manual controls is a massive sign that Tesla trusts its progress on Full Self-Driving.

While development of that software suite continues, Tesla is making a clear-cut statement that it is prepared to put its fully autonomous vehicle on public roads as it aims to revolutionize passenger travel once and for all.

Tesla Summon got insanely good in FSD v14.3.2 — Navigation? Not so much

(Photo: Hector Perez/YouTube)

Tesla Full Self-Driving v14.3.2 began rolling out to some owners earlier this week, and there are some notable improvements that came with this update.

There were two new lines of improvements in the release notes: one addressing Actually Smart Summon (ASS), and another that now allows drivers to choose a reason for an intervention via a small menu during disengagement.

Overall operation saw a handful of slight improvements, especially with parking performance, which has been the most notable difference since the arrival of FSD v14.3. However, some very real shortcomings remain, particularly with region-specific signage and navigation.

Tesla Actually Smart Summon (ASS) improvements

There are noticeable improvements to ASS operation, which has definitely been inconsistent in terms of performance. Tesla wrote in the release notes for v14.3.2:

“Unified the model between Actually Smart Summon, FSD, and Robotaxi for more capable and reliable behavior.”

As recently as this month, I used Summon with no success: the car pulled around the parking lot incorrectly, left the range at which Summon can be operated, and lost signal in the middle of the lot.

This caused me to sprint across the lot to retrieve the vehicle:

Unfortunately, Summon was not dependable or accurate enough to use regularly. It appears Tesla might have bridged that gap and made it an effective feature, as two tests in parking lots showed Summon to be more responsive and faster to navigate to the chosen location.

It also did so without hesitation, confidently, and at a comfortable speed. I was able to test it twice at different distances:

I plan to test this more thoroughly and regularly over the next few weeks. I initially avoided using it in a congested parking lot because I have not had overwhelming success with Summon in the past; I wanted to set a low baseline to see if it could simply pull up to the spot I pinned in the Tesla app.

It was two for two, which is a big improvement because I don’t think I ever had successful Summon attempts back-to-back. It just seems more confident than ever before.

New Disengagement Categories

This is a really good idea from Tesla, but there are some issues with it. The categories you can select are Critical, Comfort, Preference, and Other.

I think prompting drivers with the actual reasons they choose to take over, such as “Traveling Too Fast,” “Incorrect Maneuver,” or “Navigation Error,” would be more beneficial.

I say this because it seems that how we each categorize things might be different. For example, I shared a video of an intervention because the car had navigated to an exit to a parking lot and put its left blinker on, despite left turns not being allowed there.

I disengaged and chose Critical as the reason; it’s not a comfort issue, it’s not a preference, it’s quite literally an illegal turn, and it’s also dangerous because it cuts across several lanes of traffic and involves a 180-degree turn.

Some said I should not have labeled this as Critical, but that is the category that best characterized the disengagement for me.

Categorizing interventions is a good thing, but it’s kind of hard to determine how to label them correctly.

Inconsistency with Regional Traffic Patterns

Tesla Full Self-Driving is pretty inconsistent with how it handles regional or local traffic patterns and road rules. The most frequent example I like to use is that of the “Except Right Turn” stop sign, which has become a notorious sighting on our social media platforms.

In the initial rollout of v14.3, my Model Y successfully navigated through one of these stop signs with no issues. However, testing at two of these stop signs yesterday showed that it is still not sure how to read these signs and navigate through them properly.

Off camera, I approached another one of these signs and felt the car coming to a stop, so I nudged it forward with the accelerator pedal.

This helped the car go through the sign without stopping, but I could feel the bucking of the vehicle as the car really wanted to stop.

Musk said on the earnings call earlier this week that unsupervised FSD would probably be available in some regions before others, including on a state-by-state basis in the U.S.

“It’s difficult to release this like to everyone everywhere all at once because we do want to make sure that they’re not unique situations in a city that particularly complex intersection or — actually, they tend to be places where people get into accidents a lot because they’re just — perhaps there’s — and like I said, an unsafe intersection or bad road markings or a lot of weather challenges. So I think we would release unsupervised gradually to the customer fleet as we feel like a particular geography is confirmed to be safe.”

This could be one of those examples that Tesla just has to figure out.

Highway Operation

Full Self-Driving is already pretty good at routine roadway navigation, so I don’t have too much to report here.

However, I was happy with FSD’s decision-making at several points, including its choice not to pass a slightly slower car and to remain in the right lane as we approached the off-ramp:

Better Maneuvering at Stop Signs

Many FSD users report some strange behavior at stop signs, especially at four-way intersections where there is a stop sign and a line on the road and the two are not even with one another.

I experienced this quite frequently and found that FSD would actually double stop: once at the stop sign and again at the line.

This created some interesting scenarios for me, and many cars honked at me when the second stop happened. Drivers who had waved me on to proceed through the intersection would become frustrated at the second stop.

FSD seems to have worked through this particular maneuver:

FSD should know to stop at the more appropriate location (whichever provides better visibility) and proceed when it is the car’s turn to move. The double stop really ruined the flow of traffic at times and generally caused some frustration among other drivers.

Tesla plans to make things right with its angriest bunch of owners: here’s how

Credit: Tesla Asia/Twitter

Tesla has a plan to make Hardware 3 owners whole after CEO Elon Musk admitted that those with that self-driving chip in their cars will not have access to unsupervised Full Self-Driving.

The company’s strategy is so crazy that it is sort of hard to believe.

Since the rollout of the AI4 chip in Tesla vehicles, owners with the last generation self-driving chip, known as Hardware 3, have been persistent in their quest for a solution to their issue: they were told their cars were capable of unsupervised Full Self-Driving. It turns out the cars are not.

During the Tesla Q1 earnings call on Wednesday, Musk finally clarified what the company’s plans are for Hardware 3 owners, what they will be offered, and what Tesla will have to do internally to prepare for it.

The answer was somewhat mind-boggling.

Musk said:

“Unfortunately, Hardware 3 — I wish it were otherwise, but Hardware 3 simply does not have the capability to achieve unsupervised FSD. We did think at one point it would have that, but relative to Hardware 4, it has only 1/8 of the memory bandwidth of Hardware 4. And memory bandwidth is one of the key elements needed for unsupervised FSD.”

He continued, stating that HW3 owners would have the opportunity to trade their cars in at a discounted rate in order to get the AI4 chip:

“So for customers that have bought FSD, what we’re offering is essentially a trade-in — like a discounted trade-in for cars that have AI4 hardware, and we’ll also be offering the ability to upgrade the car, to replace the computer. And you also need to replace the cameras, unfortunately, to go to Hardware 4.”

Obviously, Tesla has a lot of owners to work with to make this whole thing right. Musk was adamant that HW3 would be capable of FSD, and now that the company has finally admitted that it is not, there are several ways this could play out.

There has been open talk about some sort of class action lawsuit against Tesla. The promises that Tesla made previously could be considered a breach of contract or even false advertising, and that’s according to Grok, Musk’s own AI program.

Musk went on to say that Tesla would likely have to establish new microfactories to effectively and efficiently replace HW3 computers and cameras:

“…So to do this efficiently, we’re going to have to set up, like kind of micro factories or small factories in major metropolitan areas in order to do it efficiently. Because if it’s done just at the service center, it is extremely slow to do so and inefficient. So we basically need like many production lines to make the change.”

This is going to be an extremely costly process, especially if Tesla has to buy real estate and equipment to complete the work. Additionally, there was no word on pricing, but Musk never said the upgrade would be free. It will likely come with some kind of price tag, and HW3 owners, after being left hanging for so long, will have something to say about that.
