Stanford studies human impact when self-driving car returns control to driver


Researchers involved with the Stanford University Dynamic Design Lab have completed a study that examines how human drivers respond when an autonomous driving system returns control of a car to them. The Lab’s mission, according to its website, is to “study the design and control of motion, especially as it relates to cars and vehicle safety. Our research blends analytical approaches to vehicle dynamics and control together with experiments in a variety of test vehicles and a healthy appreciation for the talents and demands of human drivers.” The results of the study were published on December 6 in the first edition of the journal Science Robotics.

Holly Russell, lead author of the study and a former graduate student at the Dynamic Design Lab, says, “Many people have been doing research on paying attention and situation awareness. That’s very important. But, in addition, there is this physical change and we need to acknowledge that people’s performance might not be at its peak if they haven’t actively been participating in the driving.”

The report emphasizes that the DDL’s autonomous driving program is its own proprietary system and is not intended to mimic any particular autonomous driving system currently available from any automobile manufacturer, such as Tesla’s Autopilot.

The study found that the period known as “the handoff” — when the computer returns control of a car to a human driver — can be particularly risky, especially if the speed of the vehicle has changed since the person last had direct control of the car. The amount of steering input required to accurately control a vehicle varies with speed: greater input is needed at slower speeds, while less movement of the wheel is required at higher speeds.

People learn over time how to steer accurately at all speeds based on experience. But when some time elapses during which the driver is not directly involved in steering the car, the researchers found that drivers require a brief period of adjustment before they can accurately steer the car again. The greater the speed change while the computer is in control, the more erratic the human drivers were in their steering inputs upon resuming control.
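The effect can be made concrete with a toy model (my own illustrative sketch, not the Lab’s actual vehicle model): treat steering as a speed-dependent gain between wheel angle and the curvature of the car’s path. A driver who plans wheel movements using the gain learned at one speed will over- or under-steer when the true gain has changed while the computer was in control.

```python
# Toy illustration (not the Stanford lab's model): "gain" is how much
# path curvature the car produces per unit of wheel angle. At higher
# speeds the same wheel angle yields more curvature, so less wheel
# movement is needed.

def steering_error(learned_gain, true_gain, desired_curvature):
    """Driver plans a wheel angle using the gain learned earlier;
    the car responds with the current true gain. Returns the
    over-steer (positive) or under-steer (negative) in curvature."""
    wheel_angle = desired_curvature / learned_gain   # driver's command
    actual_curvature = wheel_angle * true_gain       # car's response
    return actual_curvature - desired_curvature

# Driver calibrated at low speed, car now at high speed: oversteer.
print(steering_error(1.0, 2.0, 0.1))   # positive error
# Driver's internal gain matches the car: accurate steering.
print(steering_error(2.0, 2.0, 0.1))   # zero error
```

The bigger the gap between the learned gain and the true gain — that is, the bigger the speed change during autonomous driving — the larger the initial steering error, which mirrors the researchers’ finding that larger speed changes produced more erratic steering on resumption of control.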


“Even knowing about the change, being able to make a plan and do some explicit motor planning for how to compensate, you still saw a very different steering behavior and compromised performance,” said Lene Harbott, co-author of the research and a research associate in the Revs Program at Stanford.

Handoff From Computer to Human

The testing was done on a closed course. The participants drove for 15 seconds on a course that included a straightaway and a lane change. Then they took their hands off the wheel and the car took over, bringing them back to the start. After the drivers had familiarized themselves with the course over four laps, the researchers altered the steering ratio of the cars at the beginning of the next lap. The changes were designed to mimic the different steering inputs required at different speeds. The drivers then went around the course 10 more times.

Even though they were notified of the changes to the steering ratio, the drivers’ steering maneuvers during those 10 laps differed significantly from their paths before the modifications. At the end, the steering ratios were returned to their original settings and the drivers drove six more laps around the course. Again the researchers found the drivers needed a period of adjustment to accurately steer the cars.

The DDL experiment is very similar to a classic neuroscience experiment that assesses motor adaptation. In one version, participants use a hand control to move a cursor on a screen to specific points. The way the cursor moves in response to their control is adjusted during the experiment and they, in turn, change their movements to make the cursor go where they want it to go.

Just as in the driving test, people who take part in the experiment have to adjust to changes in how the controller moves the cursor. They also must adjust a second time if the original response relationship is restored. People can perform a version of this experiment themselves by adjusting the speed of the cursor on their personal computers.
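The trial-by-trial pattern — gradual adaptation to a changed gain, followed by an opposite-signed after-effect when the gain is restored — can be sketched with a simple error-driven learning rule. This is a standard textbook-style model of motor adaptation, not the study’s own analysis; the trial counts below simply echo the article’s 10 altered laps and six restored laps.

```python
# Toy trial-by-trial adaptation model (an illustration, not the study's
# analysis): the participant's implicit internal estimate of the control
# gain is nudged toward the true gain after each trial.

def simulate(trials_perturbed=10, trials_washout=6, learning_rate=0.3,
             baseline_gain=1.0, perturbed_gain=2.0):
    estimate = baseline_gain
    errors = []
    schedule = ([perturbed_gain] * trials_perturbed +
                [baseline_gain] * trials_washout)
    for true_gain in schedule:
        error = true_gain - estimate        # movement error on this trial
        errors.append(error)
        estimate += learning_rate * error   # implicit motor update
    return errors

errors = simulate()
# Trial 1 after the gain change: large error, shrinking with practice.
# First trial after the gain is restored: error of the opposite sign —
# the "after-effect" the researchers observed in the final laps.
```

Note that the model adapts purely from experienced error, never from being told about the change, which matches Nisky’s distinction between explicit knowledge and implicit motor learning quoted below.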


“Even though there are really substantial differences between these classic experiments and the car trials, you can see this basic phenomena of adaptation and then after-effect of adaptation,” says Ilana Nisky, another co-author of the study and a senior lecturer at Ben-Gurion University in Israel. “What we learn in the laboratory studies of adaptation in neuroscience actually extends to real life.”

In neuroscience this is explained as a difference between explicit and implicit learning, Nisky explains. Even when a person is aware of a change, their implicit motor control is unaware of what that change means and can only figure out how to react through experience.

Federal and state regulators are currently working on guidelines that will apply to Level 5 autonomous cars. What the Stanford research shows is that until full autonomy becomes a reality, the “handoff” moment will represent a period of special risk, not because of any failing on the part of computers but because of limitations inherent in the brains of human drivers.

The best way to protect ourselves from that period of risk is to eliminate the “handoff” entirely by ceding total control of driving to computers as soon as possible.



Tesla rolls out new life-saving feature for kids in Europe

On average, 37 children die every year from being left in vehicles unattended.



Tesla is rolling out a new life-saving feature in the European market, one that has been available in the United States for some time and could prove invaluable.

One of the most preventable causes of death for children is being left unattended in cars. On average, 37 children die every year after being left in hot vehicles. The cause of death is usually heatstroke, and it is almost entirely avoidable.


However, there are instances where kids are left in vehicles and lose their lives, something that many companies have tried to fight with alerts and features of their own.

Tesla is one of them, as it has rolled out features like ultrasonic sensors to detect heartbeats, interior cameras to detect movement, and alerts to notify parents if they leave someone in the car.


A few months ago, Tesla rolled out a new feature called “Child Left Alone Detection” in the United States. It was described as:

“If an unattended child is detected, the vehicle will flash the exterior indicator lights, play an alert tone, and send a notification to your Tesla app. This will repeat at regular intervals until you return to your vehicle. Cabin data is processed locally and is not transmitted to Tesla.

This feature is enabled by default. To disable, go to Controls > Safety > Child Left Alone Detection.”

This feature was only rolled out in the U.S. at the time. It is now making its way to the European market, according to Not a Tesla App, which detected the rollout in the 2025.32.6 software update.

The rollout of this feature could prevent many of these unfortunate situations. For many of us, it is hard to imagine leaving something as precious as another human life in a hot car. Many of us won’t leave our vehicles without our cell phones, so it seems unlikely that anyone would forget a child, and yet it happens.



Tesla gets another NHTSA probe, this time related to door handles

“Although Tesla vehicles have manual door releases inside of the cabin, in these situations, a child may not be able to access or operate the releases even if the vehicle’s driver is aware of them.”



Tesla is facing another investigation into its vehicles by the National Highway Traffic Safety Administration (NHTSA), this time related to an issue with its door handles.

In a new Open Investigation named “Electronic door handles become inoperative,” the NHTSA says that it has received nine complaints from owners of the 2021 Tesla Model Y stemming from “an inability to open doors.”

These issues were reported after “parents exited their vehicle after a drive cycle in order to remove a child from the back seat or placing a child in the back seat before starting a drive cycle.” Parents said they were “unable to reopen a door to regain access to the vehicle.”


Four of the nine complaints ended with owners having to break a window to regain access to the cabin.


The NHTSA goes on to explain that, while Teslas do have a manual door release inside the cabin, a child may not be able to access it:

“Although Tesla vehicles have manual door releases inside of the cabin, in these situations, a child may not be able to access or operate the releases even if the vehicle’s driver is aware of them. As a result, in these instances, an occupant who remains inside a vehicle in this condition may be unable to be rapidly retrieved by persons outside of the vehicle.”


It appears that the agency is attributing the issue to low voltage in the vehicle’s 12-volt battery. If so, drivers would need some form of notification that the battery is running low and should be replaced before the door handles lose power.

The NHTSA estimates that 174,290 vehicles are potentially affected by this issue. The agency plans to assess the scope and severity of the condition, as well as the approach Tesla uses to supply power to the door locks and the reliability of the applicable power supplies.



Tesla won’t implement strange Grok character as Musk dispels rumor

It is nice to see that Tesla is not forcing this kind of character upon owners of their vehicles, especially considering that many people had a real problem with it.


Tesla is not going to implement a strange character as a Grok assistant in its vehicles, as CEO Elon Musk dispelled a rumor that had provoked some quite polarizing reactions.

Yesterday, there was some controversy within the Tesla community as rumors of a Grok assistant, named Mūn (pronounced like Moon), being implemented into the vehicles started to circulate.

The rumor had some legitimacy: it was initially posted by an employee and appeared to be a relatively confirmed development.

However, it really did rub some people the wrong way. Mūn was an anime-style female character dressed in revealing clothing, so it was not everyone’s style, and surely not everyone’s significant other’s cup of tea. Adding it seemed a very strange decision, especially since, at the time, there was no official word to dispel the rumor of the Grok assistant’s arrival.

That was until Tesla CEO Elon Musk stepped in to put the speculation to bed once and for all.


It was somewhat strange that this type of issue arose in the first place, but given that the rumor originated with an employee’s post, it is easy to see how it gained traction.

It is nice to see that Tesla is not forcing this kind of character upon owners of their vehicles, especially considering that many people had a real problem with it. Many owners did not shy away from the fact that they would like the option to opt out.

For now, Grok remains a part of Tesla vehicles, and, personally, I find it very nice to have in my Model Y to answer quick questions or even to entertain passengers.

Nevertheless, I am relieved I won’t have this character forced upon me in my vehicle.
