News
Tesla with sleeping driver proves there’s still misunderstanding and irresponsibility surrounding autonomy
Update (11:06 AM EST): Paragraph 7 added to note the possibility that the driver had a medical emergency. California Highway Patrol caught up to the vehicle and noted that the driver was awake.
A Tesla Model Y with a sleeping driver was recently spotted on the I-15 Freeway near Temecula, California, showing that both the public and the media still harbor vast misunderstandings, and an irresponsible attitude, regarding the capabilities of semi-autonomous vehicles.
According to a report from KTLA 5, another driver followed a woman in a Tesla Model Y for more than fifteen minutes on the California interstate, attempting to wake her as she apparently relied on the automaker’s semi-autonomous driving functions.
The report and the incident show there are still huge misunderstandings of the capabilities of semi-autonomous driving suites, including Tesla’s Full Self-Driving and Autopilot, which require users to remain vigilant and be prepared to take over the vehicle at any point.
Drivers often use semi-autonomous vehicle functions irresponsibly, and social media has proven time and time again that people abuse these capabilities, even though they are not fully autonomous.
It is no secret that people and companies have used whatever they can to shirk the responsibility of paying attention while the car handles some tasks on its own. With the introduction of advanced driver assistance systems (ADAS) over the past several years, drivers have taken advantage of these functions to instead play on their phones, read books, eat food, or even catch up on sleep.
However, the risks that come with this behavior are potentially catastrophic. Those who use these functions irresponsibly put themselves and every other driver on the road at risk, because if the vehicle needs assistance or encounters a situation it cannot handle safely, the driver is responsible for taking over. Additionally, if an accident occurs, it can be framed as the fault of Tesla, or of whichever manufacturer built the vehicle, and instances like this can set the future of semi-autonomous and autonomous driving back years due to skepticism.
There is the possibility that the driver had some type of medical emergency or accidentally fell asleep, in which case the Tesla’s functionality kept the operator and others safe. Police stated that they caught up to the vehicle within two minutes of receiving calls about it, and the driver was attentive at that time.
However, the media’s portrayal of the situation also shows that many remain uninformed regarding the capabilities of Teslas. While Tesla’s Full Self-Driving suite has caused controversy over its name, the automaker continues to remind those who use it to remain vigilant, as the cars cannot truly drive themselves.
In Tesla’s FAQ section of the Autopilot and Full Self-Driving page, the company answers the question, “Do I still need to pay attention while using Autopilot?”:
“Yes. Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. It does not turn a Tesla into a self-driving car nor does it make a car autonomous.
Before enabling Autopilot, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your car.” Once engaged, Autopilot will also deliver an escalating series of visual and audio warnings, reminding you to place your hands on the wheel if insufficient torque is applied. If you repeatedly ignore these warnings, you will be locked out from using Autopilot during that trip.
You can override any of Autopilot’s features at any time by steering, applying the brakes, or using the cruise control stalk to deactivate.”
Media outlets labeling the vehicle as “a self-driving Tesla” do a disservice to both the public and the company. Teslas do not drive themselves; the vehicles are classified as Level 2 under the Society of Automotive Engineers’ (SAE) Levels of Driving Automation. At Level 2, the driver is still responsible for driving the car even when these systems are activated. “You must constantly supervise these support features,” the SAE says. At Levels 3 through 5, the operator is not driving the car when the features are engaged, but Level 5 systems are the only ones explicitly labeled as “self-driving.”
“This feature can drive the vehicle under all conditions,” the SAE table states.
(Chart: SAE Levels of Driving Automation. Credit: Society of Automotive Engineers)
Recent ratings by Consumer Reports showed that Tesla’s biggest flaw was driver monitoring. Many systems use cabin-facing cameras to monitor eye behavior and ensure the operator is keeping their eyes on the road. Teslas use a series of audible and visual cues to alert drivers to their inattentiveness, and steering wheel sensors make sure the driver keeps their hands on the wheel.
However, various cheat devices have been marketed across the internet, and in this instance, the driver appeared to have her hands on the wheel while she dozed.
Tesla activated camera-based driver monitoring in May 2021. “The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged,” Tesla said in the release notes. Tests of Tesla’s driver monitoring showed the system was effective in some instances, especially at detecting drivers looking at cell phones, with alerts coming within 15 seconds.
The potential irresponsibility of users poses major risks to those on the road and to the companies that develop these driver assistance programs. While workarounds exist through the previously mentioned cheat devices, people have to know their irresponsibility could cost them, or others, their lives.
I’d love to hear from you! If you have any comments, concerns, or questions, please email me at joey@teslarati.com. You can also reach me on Twitter @KlenderJoey, or if you have news tips, you can email us at tips@teslarati.com.
News
Tesla is not sparing any expense in ensuring the Cybercab is safe
Images shared by a longtime Giga Texas watcher showed 16 Cybercab prototypes parked near the factory’s dedicated crash test facility.
The Tesla Cybercab could very well be the safest taxi on the road when it is released and deployed for public use. This was, at least, hinted at by the intensive safety tests that Tesla seems to be putting the autonomous two-seater through at its Giga Texas crash test facility.
Intensive crash tests
As per recent images from longtime Giga Texas watcher and drone operator Joe Tegtmeyer, Tesla seems to be very busy crash testing Cybercab units. Images shared by Tegtmeyer just before the holidays showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility.
Tegtmeyer’s aerial photos showed the prototypes clustered outside the factory’s testing building. Some uncovered Cybercabs showed notable damage, and one even had its airbags deployed. With Cybercab production expected to start in about 130 days, it appears that Tesla is working hard to ensure that its autonomous two-seater ends up becoming the safest taxi on public roads.
Prioritizing safety
With no human driver controls, the Cybercab demands exceptional active and passive safety systems to protect occupants in any scenario. Considering Tesla’s reputation, it is then understandable that the company seems to be sparing no expense in ensuring that the Cybercab is as safe as possible.
Tesla’s focus on safety was recently highlighted when the Cybertruck achieved a Top Safety Pick+ rating from the Insurance Institute for Highway Safety (IIHS). This was a notable victory for the Cybertruck, as critics have long claimed that the vehicle would be one of the most unsafe trucks on the road, if not the most unsafe, due to its appearance. The vehicle’s Top Safety Pick+ rating, if anything, simply proved that Tesla never neglects to make its cars as safe as possible, and that definitely includes the Cybercab.
Elon Musk
Tesla’s Elon Musk gives timeframe for FSD’s release in UAE
Provided that Musk’s timeframe proves accurate, FSD could begin rolling out across the Middle East, starting with the UAE, next year.
Tesla CEO Elon Musk stated on Monday that Full Self-Driving (Supervised) could launch in the United Arab Emirates (UAE) as soon as January 2026.
Musk’s estimate
In a post on X, UAE-based political analyst Ahmed Sharif Al Amiri asked Musk when FSD would arrive in the country, quoting an earlier post where the CEO encouraged users to try out FSD for themselves. Musk responded directly to the analyst’s inquiry.
“Hopefully, next month,” Musk wrote. The exchange attracted a lot of attention, with numerous X users sharing their excitement at the idea of FSD being brought to a new country. FSD (Supervised), after all, would likely allow hands-off highway driving, urban navigation, and parking under driver oversight in traffic-heavy cities such as Dubai and Abu Dhabi.
Musk’s comments about FSD’s arrival in the UAE came following his visit to the Middle Eastern country. Over the weekend, images were shared online of Musk meeting with UAE Defense Minister, Deputy Prime Minister, and Dubai Crown Prince HH Sheikh Hamdan bin Mohammed. Musk also shared a supportive message about the country, writing “UAE rocks!” on X.
FSD recognition
FSD has been getting quite a lot of support from foreign media outlets. FSD (Supervised) earned high marks from Germany’s largest car magazine, Auto Bild, during a test in Berlin’s challenging urban environment. The demonstration highlighted the system’s ability to handle dense traffic, construction sites, pedestrian crossings, and narrow streets with smooth, confident decision-making.
Journalist Robin Hornig was particularly struck by FSD’s superior perception and tireless attention, stating: “Tesla FSD Supervised sees more than I do. It doesn’t get distracted and never gets tired. I like to think I’m a good driver, but I can’t match this system’s all-around vision. It’s at its best when both work together: my experience and the Tesla’s constant attention.” Only one intervention was needed, when the system misread a route, showcasing the system’s maturity while relying on vision-only sensing and over-the-air learning.
News
Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco
“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle on FSD navigating pitch-black San Francisco streets during the city’s widespread power outage.
While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only approach kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.
Tesla FSD handles total darkness
The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, but the vehicle drove confidently and cautiously, just like a proficient human driver.
Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Musk’s and the Tesla AI team’s posts highlight the idea that FSD operates much like an experienced human driver. Since the system relies on vision rather than a variety of sensors and a complicated symphony of external factors, its vehicles can navigate challenging circumstances as they emerge. This certainly seemed to be the case in San Francisco.
Waymo’s blackout struggles
Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed fleets of Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do when the lights went out.
In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”
A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”