During the Q3 earnings call on Tuesday, Elon Musk acknowledged that some drivers are misusing the Autosteer function introduced in the latest Version 7.0 software. Despite Tesla’s warning that drivers must keep at least one hand on the steering wheel at all times, videos of Autopilot-enabled drivers depict a different story, showing drivers hopping into the back seat, shaving, reading the newspaper, or eating breakfast without ever touching the wheel.
Musk says his company is looking into putting “some additional constraints” on the Autopilot system in order to “minimize the possibility of people doing crazy things with it.” We know that Autopilot 1.01 will have improved lane holding, but Musk didn’t say what additional driver constraints would be introduced. It’s easy to imagine the software might be amended to require drivers to actually maintain contact with the wheel.
Those Autopilot videos are troubling to more than just Elon Musk and Tesla management. They have also captured the attention of federal regulators who are now on notice that Tesla automobiles are capable of doing things that are not subject to any federal oversight. Speaking to The Verge on Wednesday, Jeffrey Miller, associate professor of engineering practice at the University of Southern California and a member of the Institute of Electrical and Electronics Engineers, said Tesla’s beta software raises a host of questions for regulators.
“Beta software typically means that a company is not fully releasing this to the public,” Miller said. “They are releasing it to people who are willing to test it with the understanding there will be bugs in it.” Which is why Musk and company are so insistent that drivers remain actively engaged in controlling the car at all times. In its most recent monthly report on self-driving cars, Google told readers that it is pursuing fully autonomous systems precisely because it thinks human drivers will react too slowly to situations that require them to retake control of their cars.
For its part, the National Highway Traffic Safety Administration (NHTSA) says its mission is to “save lives, prevent injuries, [and] reduce vehicle-related crashes.” The agency “applies performance requirements to the regulated system as a whole, and typically does not develop requirements for specific elements within the regulated system such as its software,” a spokesman told The Verge. “As with any new vehicle feature, manufacturers are free to offer it. If defective however, [the] agency can pursue a recall.”
The NHTSA is not blind to the availability of new computer-aided systems that help reduce the risk of injury or death. A year ago, it updated its 5-Star vehicle safety ratings to include automatic emergency braking as a recommended safety technology. And since 2011, it has also added electronic stability control, forward collision warning, lane departure warning, and rearview camera systems to its list of “recommended” technologies. An NHTSA spokesperson told The Verge, “We are currently assessing the need for additional standards as it relates to software and vehicle electronics in general.”
Miller argues that regulators should lead the way in developing safety standards for beta updates and fully self-driving cars as they start to come online. But he says they probably won’t. “Very rarely do we get proactive laws. They’re always reactive. Right now we have an opportunity to get in front of the technology. We’ve had these small, incremental releases to the public, and we’ll continue to see small, incremental releases until we see a completely driverless vehicle around 2019, 2020.” He adds, “But this is the time that we need these regulatory agencies to say, ‘We’ve four years before we’re projecting one of these vehicles are released to the consumer market. Let’s come up with the laws first before that happens.'”
In the meantime, Tesla is not waiting around for regulators to act. It currently has 40,000 Autopilot-equipped cars on the road that interact with one another via its “fleet learning” technology: when one car learns something, all of them learn the same thing. With almost a million miles a day being recorded by Autopilot-enabled Teslas, the pace of feedback and improvement through machine learning will make keeping up an ongoing uphill challenge for regulators.