News

MIT study concludes that Tesla drivers maintain vigilance when using Autopilot

[Credit: LivingTesla/YouTube]


Tesla owners using Autopilot are highly engaged when driving with the feature despite fears to the contrary, according to a study recently published by scientists at MIT titled “Human Side of Tesla Autopilot: Exploration of Functional Vigilance in Real-World Human-Machine Collaboration.”

The data used in the study was generated from the over 1 billion miles driven by Tesla owners since the feature’s activation in 2015, about 35% of which were determined to be assisted by Autopilot. Of these, 18,928 Autopilot disengagements were annotated, marking instances when drivers took over in challenging driving situations. Overall, the numbers demonstrate a high rate of driver vigilance.
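For a rough sense of scale, the headline figures can be combined in a back-of-the-envelope sketch (a naive reading of the numbers quoted in this article; the study’s own dataset and per-mile methodology are more nuanced):

```python
# Figures as quoted in the article (illustrative only).
total_miles = 1_000_000_000        # miles driven by Tesla owners since 2015
autopilot_share = 0.35             # fraction estimated to be Autopilot-assisted
annotated_disengagements = 18_928  # tricky takeovers annotated in the study

autopilot_miles = total_miles * autopilot_share
miles_per_disengagement = autopilot_miles / annotated_disengagements
print(f"~{autopilot_miles / 1e6:.0f}M Autopilot miles, "
      f"one annotated disengagement per ~{miles_per_disengagement:,.0f} miles")
```

On this naive reading, an annotated takeover occurs only once every ~18,500 Autopilot miles, which is consistent with the study’s picture of drivers intervening selectively rather than constantly.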

Tesla has provided a unique opportunity to form a baseline for objective, representative analysis of real-world use of Autopilot, as stated in the study:

“Due to its scale of deployment and individual utilization, [Tesla’s] Autopilot serves as perhaps the currently best available opportunity to study and understand human interaction with AI assisted vehicles ‘in the wild’…naturalistic driving research can now begin investigating and identify both promising and concerning trends in drivers’ behavioral patterns in the context of Autopilot.”

Results graph from “Human Side of Tesla Autopilot” Study. | Credit: MIT

As automation has expanded over the last several decades, human-behavior research has repeatedly documented a pattern of overtrust in reliable automated systems. In driving scenarios, where property damage, injury, or death are possible consequences, this concern is especially significant for semi-autonomous systems that rely on driver input to function safely. The results of the MIT study are therefore promising, suggesting that drivers approach automation behind the wheel more carefully than people do in other domains.

“The two main results of this work are that (1) drivers elect to use Autopilot for a significant percent of their driven miles and (2) drivers do not appear to over-trust the system to a degree that results in significant functional vigilance degradation in their supervisory role of system operation,” the MIT scientists concluded.


The study further notes that more research will be needed as more data becomes available and as drivers grow more familiar with Autopilot’s features.

Tesla has received a fair amount of criticism and attention whenever an accident involves one of its cars, especially if Autopilot was engaged around the time of the event. However, Tesla consistently maintains that the feature is not yet fully autonomous and requires drivers to pay attention and intervene when necessary while Autopilot is in operation. The system also issues audio and visual alerts when a driver’s hands are not detected on the steering wheel; these warnings were found to have been ignored in some prior crashes, playing into the concerns the MIT study sought to address.

The Tesla Model 3’s ratings from the National Highway Traffic Safety Administration. [Credit: NHTSA]

Beginning in Q3 2018, Tesla has been releasing quarterly Vehicle Safety Reports providing updated numbers for vehicle incidents occurring both when Autopilot was engaged and when the driver-assist feature was deactivated. For Q3, the company reported one accident or crash-like event for every 3.34 million miles driven with Autopilot active and one event for every 1.92 million miles driven with Autopilot disengaged. In Q4 2018, those numbers dropped slightly, possibly due to winter conditions, to one accident for every 2.91 million miles driven with Autopilot engaged and one accident for every 1.58 million miles driven without.

By comparison, the National Highway Traffic Safety Administration’s (NHTSA) most recent data at the time showed a crash event every 436,000 miles, a figure that includes all vehicles in the US whether or not they are equipped with driver-assistance software. Note, however, that Tesla’s numbers include both accidents and “near-misses,” while the NHTSA’s figures count only accidents that actually occurred.
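The comparison above reduces to simple ratios (a back-of-the-envelope illustration using the figures quoted in this article; as noted, Tesla’s counts include near-misses while the NHTSA baseline counts only actual crashes, so the ratios are not strictly apples-to-apples):

```python
# Miles per reported event, as quoted in the article (Q3/Q4 2018).
rates = {
    "Q3 2018, Autopilot engaged": 3.34e6,
    "Q3 2018, Autopilot off":     1.92e6,
    "Q4 2018, Autopilot engaged": 2.91e6,
    "Q4 2018, Autopilot off":     1.58e6,
}
nhtsa_miles_per_crash = 436_000  # all US vehicles, actual crashes only

for label, miles in rates.items():
    ratio = miles / nhtsa_miles_per_crash
    print(f"{label}: one event per {miles / 1e6:.2f}M miles "
          f"(~{ratio:.1f}x the NHTSA baseline)")
```

Even with the near-miss caveat, the Autopilot-engaged figures come out several multiples above the national baseline, which is the correlation Tesla touts.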

Along with touting a correlation between lower accident rates and Autopilot being engaged, Tesla also maintains its title of producing the safest cars in the world based on NHTSA test results.



Tesla Robotaxi Safety Monitor seems to doze off during Bay Area ride

We won’t try to blame the camera person for the incident, because it clearly is not their fault. But it seems somewhat interesting that they did not try to wake the driver up and potentially contact Tesla immediately to alert them of the situation.


Credit: u/ohmichael on Reddit

A Tesla Robotaxi Safety Monitor appeared to doze off during a ride in the California Bay Area, almost ironically proving the need for autonomous vehicles.

The incident was captured on camera and posted to the r/sanfrancisco subreddit by u/ohmichael, who wrote that they had used Tesla’s ride-hailing service in the Bay Area before and had pleasant experiences.

However, this one was slightly different. They wrote:

“I took a Tesla Robotaxi in SF just over a week ago. I have used the service a few times before and it has always been great. I actually felt safer than in a regular rideshare.

This time was different. The safety driver literally fell asleep at least three times during the ride. Each time the car’s pay attention safety alert went off and the beeping is what woke him back up.

I reported it through the app to the Robotaxi support team and told them I had videos, but I never got a response.

I held off on posting anything because I wanted to give Tesla a chance to respond privately. It has been more than a week now and this feels like a serious issue for other riders too.

Has anyone else seen this happen?”


“My Tesla Robotaxi ‘safety’ driver fell asleep” — posted by u/ohmichael in r/sanfrancisco

The driver eventually woke up after prompts from the vehicle, but it is alarming to see the person who is ultimately responsible for the ride dozing off mid-trip.

The rider is not to blame for the incident, but it is notable that they did not try to wake the driver or contact Tesla immediately to report the situation. Arguably, they should have left the vehicle right away.

Tesla’s ride-hailing service in the Bay Area differs from the one that is currently active in Austin, Texas, due to local regulations. In Austin, there is no Safety Monitor in the driver’s seat unless the route requires the highway.

Tesla plans to remove the Safety Monitors in Austin by the end of the year.



Tesla opens Robotaxi access to everyone — but there’s one catch


Credit: Tesla

Tesla has officially opened Robotaxi access to everyone, but there is one catch: you have to have an iPhone.

Tesla’s Robotaxi service in Austin and its ride-hailing service in the Bay Area were both officially launched to the public today, giving anyone using the iOS platform the ability to simply download the app and utilize it for a ride in either of those locations.

The service has been in operation for several months: it launched in Austin in late June and in the Bay Area about a month later. In Austin, there is nobody in the driver’s seat unless the route takes the vehicle on the freeway.

In the Bay Area, there is someone in the driver’s seat at all times.

The platform was initially launched to those who were specifically invited to Austin to try it out.


Slowly, Tesla launched the platform to more people, hoping to expand the number of rides and get more valuable data on its performance in both regions to help local regulatory agencies relax some of the constraints that were placed on it.

Additionally, Tesla had in-house restrictions of its own, like the presence of Safety Monitors in the vehicles. CEO Elon Musk has maintained that the monitors were there specifically for safety, but revealed that the plan was to remove them by the end of the year.

Now, Tesla is opening up Robotaxi to anyone who wants to try it, as many people reported today that they were able to access the app and immediately fetch a ride if they were in the area.

We also confirmed it ourselves: the app showed that we could hail a ride in the Bay Area if we wanted to.

Opening the Robotaxi network to the general public appears to be a serious show of confidence by Tesla, as it is no longer confining the service to influencers handpicked by the company.

In the coming weeks, we expect Tesla to remove the Safety Monitors from these vehicles, as Musk predicted. If it can come through on that by the end of the year, going from launching Robotaxi to enabling fully driverless rides in six months would be incredibly impressive.



Tesla analyst sees Full Self-Driving adoption rates skyrocketing: here’s why

“You’ll see increased adoption as people are exposed to it. I’ve been behind the wheel of several of these and the different iterations of FSD, and it is getting better and better. It’s something when people experience it, they will be much more comfortable utilizing FSD and paying for it.”


Credit: TESLARATI

Tesla analyst Stephen Gengaro of Stifel sees Full Self-Driving adoption rates skyrocketing, and he believes more and more people will commit to paying for the full suite or the subscription service after they try it.

Full Self-Driving is Tesla’s Level 2 advanced driver assistance system (ADAS), and one of the most robust on the market. Over time, the suite improves as the company accumulates data from every mile driven by its fleet, which has swelled to over five million vehicles sold.

The suite offers a range of advanced driving capabilities that many competing systems lack. It is not a typical Traffic-Aware Cruise Control (TACC) and lane-keeping ADAS; instead, it can handle nearly every driving scenario out there.

It still requires the driver to pay attention and ultimately assume responsibility for the vehicle, but their hands are not required to be on the steering wheel.

It is overwhelmingly impressive. As a daily user of the FSD suite, I have my complaints, but overall there are very few things it does incorrectly.


Gengaro, who increased his Tesla price target to $508 yesterday, said in an interview with CNBC that adoption rates of FSD will increase over the coming years as more people try it for themselves.

At first, it is tough to feel comfortable with your car literally driving you around. Then, it becomes second nature.

Gengaro said:

“You’ll see increased adoption as people are exposed to it. I’ve been behind the wheel of several of these and the different iterations of FSD, and it is getting better and better. It’s something when people experience it, they will be much more comfortable utilizing FSD and paying for it.”

Tesla Full Self-Driving take rates also need to increase as part of CEO Elon Musk’s recently approved compensation package, as one tranche requires ten million active subscriptions to unlock that portion of the award.

The company also said in the Q3 2025 Earnings Call in October that only 12 percent of the current ownership fleet are paid customers of Full Self-Driving, something the company wants to increase considerably moving forward.
