Tesla owner explains Autopilot behavior at Model X accident scene

[Credit: Privater/YouTube]

A Tesla owner recently shared a theory on the factors that might have led up to the fatal Model X accident near Mountain View, CA, on March 23. Driving the same stretch of road on Autopilot, the Tesla owner observed deviations in the road markings and repair cuts, both of which might have caused the electric car's sensors to misread the highway's lanes.

The 36-second clip was uploaded and shared on YouTube by Privater, who added annotations to the video highlighting his observations. At the 0:05 mark, the Tesla owner noted that the markings on the road deviated from their original line at the beginning of a repair cut. Farther down the road (0:12 into the clip), Privater noted that the repair cuts became very prominent. These cuts could have confused Autopilot into treating them as a lane, especially under the direct glare of the sun.

As the barrier where the fatal Model X accident took place came into view (0:23 into the video), Privater noted that the section of the road leading up to the crash cushion was marked by solid white lines. As seen in the Tesla owner's clip, the lines were almost wide enough apart to form a lane, which could also have been misread by Autopilot.

The Tesla owner noted that he had been driving the same stretch of road on Autopilot for almost two years. During that time, Privater stated, his car had misread the road markings and nearly collided with the crash cushion once or twice. He described his experiences in a response to a comment on his YouTube video.

A Tesla owner suggests a possible explanation for the fatal Model X accident on March 23, 2018. [Credit: Privater/YouTube]

“On the video, my car is on Autopilot. I drive the same section for nearly two years, (and) 99.9% of (the) time, I’m on Autopilot. However, this kind of error only happened to me once or twice. It’s scary enough for me to keep high alert on this intersection,” he wrote.

In an update to its first statement about the fatal Model X accident, Tesla confirmed that the ill-fated electric SUV was on Autopilot when it collided with the highway barrier. According to Tesla, the Model X driver had received several visual warnings and one audible hands-on warning earlier in the drive. The driver's hands had also not been detected on the steering wheel for the 6 seconds before the crash. Overall, the driver had about 5 seconds and 150 meters of unobstructed view to steer the car away from the highway divider before the collision occurred.

In a statement to Reuters, NTSB spokesman Chris O'Neil expressed the agency's displeasure with the Elon Musk-led company's decision to release information about the investigation to the public.

“The agency needs the cooperation of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla,” O’Neil said.

As we noted in a previous report, the Model X crash was so severe because a crash attenuator, a highway safety device designed to absorb the impact of a colliding vehicle, had not been repaired by Caltrans since a 2010 Toyota Prius smashed into the device 11 days before the Tesla accident. In a statement to ABC7 News, Caltrans stated that the standard timeline for a crash attenuator's repair is 7 days or 5 business days after an accident. The safety device's repairs were delayed, however, due to storms in the area.

Watch Privater’s Autopilot drive-by in the video below.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips, or even to just say a simple hello, send a message to his email, simon@teslarati.com, or his handle on X, @ResidentSponge.

Tesla is not sparing any expense in ensuring the Cybercab is safe

Images shared by the longtime watcher showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility.

Credit: @JoeTegtmeyer/X

The Tesla Cybercab could very well be the safest taxi on the road when it is released and deployed for public use. This was, at least, hinted at by the intensive safety tests that Tesla seems to be putting the autonomous two-seater through at its Giga Texas crash test facility. 

Intensive crash tests

As per recent images from longtime Giga Texas watcher and drone operator Joe Tegtmeyer, Tesla seems to be very busy crash testing Cybercab units. Images shared by the longtime watcher showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility just before the holidays. 

Tegtmeyer’s aerial photos showed the prototypes clustered outside the factory’s testing building. Some uncovered Cybercabs showed notable damage, and one even had its airbags deployed. With Cybercab production expected to start in about 130 days, it appears that Tesla is very busy ensuring that its autonomous two-seater ends up becoming the safest taxi on public roads.

Prioritizing safety

With no human driver controls, the Cybercab demands exceptional active and passive safety systems to protect occupants in any scenario. Considering Tesla’s reputation, it is then understandable that the company seems to be sparing no expense in ensuring that the Cybercab is as safe as possible.

Tesla’s focus on safety was recently highlighted when the Cybertruck achieved a Top Safety Pick+ rating from the Insurance Institute for Highway Safety (IIHS). This was a notable victory for the Cybertruck, as critics have long claimed that the vehicle would be one of, if not the, most unsafe trucks on the road due to its appearance. The vehicle’s Top Safety Pick+ rating, if anything, simply proved that Tesla never neglects to make its cars as safe as possible, and that definitely includes the Cybercab.

Tesla’s Elon Musk gives timeframe for FSD’s release in UAE

Provided that Musk’s timeframe proves accurate, FSD would be able to start saturating the Middle East, starting with the UAE, next year. 

Tesla CEO Elon Musk stated on Monday that Full Self-Driving (Supervised) could launch in the United Arab Emirates (UAE) as soon as January 2026. 

Provided that Musk’s timeframe proves accurate, FSD would be able to start saturating the Middle East, starting with the UAE, next year. 

Musk’s estimate

In a post on X, UAE-based political analyst Ahmed Sharif Al Amiri asked Musk when FSD would arrive in the country, quoting an earlier post where the CEO encouraged users to try out FSD for themselves. Musk responded directly to the analyst’s inquiry. 

“Hopefully, next month,” Musk wrote. The exchange attracted a lot of attention, with numerous X users sharing their excitement at the idea of FSD being brought to a new country. FSD (Supervised), after all, would likely allow hands-off highway driving, urban navigation, and parking under driver oversight in traffic-heavy cities such as Dubai and Abu Dhabi.

Musk’s comments about FSD’s arrival in the UAE were posted following his visit to the Middle Eastern country. Over the weekend, images were shared online of Musk meeting with UAE Defense Minister, Deputy Prime Minister, and Dubai Crown Prince HH Sheikh Hamdan bin Mohammed. Musk also posted a supportive message about the country, posting “UAE rocks!” on X.

FSD recognition

FSD has been getting quite a lot of support from foreign media outlets. FSD (Supervised) earned high marks from Germany’s largest car magazine, Auto Bild, during a test in Berlin’s challenging urban environment. The demonstration highlighted the system’s ability to handle dense traffic, construction sites, pedestrian crossings, and narrow streets with smooth, confident decision-making.

Journalist Robin Hornig was particularly struck by FSD’s superior perception and tireless attention, stating: “Tesla FSD Supervised sees more than I do. It doesn’t get distracted and never gets tired. I like to think I’m a good driver, but I can’t match this system’s all-around vision. It’s at its best when both work together: my experience and the Tesla’s constant attention.” Only one intervention was needed when the system misread a route, showcasing its maturity while relying on vision-only sensors and over-the-air learning.

Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco

“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.

Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle in FSD navigating pitch-black San Francisco streets during the city’s widespread power outage. 

While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only approach kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.

Tesla FSD handles total darkness

The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As could be seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, but the vehicle drove confidently and cautiously, just like a proficient human driver.

Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post. 

Musk and the Tesla AI team’s posts highlight the idea that FSD operates much like an experienced human driver. Since the system relies on vision alone rather than a complex array of sensors, its vehicles can adapt to challenging circumstances as they emerge. This definitely seemed to be the case in San Francisco.

Waymo’s blackout struggles

Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed fleets of Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do when the lights went out.

In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”

A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”
