The echoes of the Autopilot crash that killed Joshua Brown on May 7 are still reverberating. Not only is the National Highway Traffic Safety Administration (NHTSA) conducting an investigation, now the National Transportation Safety Board (NTSB) is getting into the act. NHTSA chief Mark Rosekind and Transportation Secretary Anthony Foxx have expressed approval of autonomous driving technology. With more than 80% of the 34,000 highway deaths in the US every year attributed to human error, they recognize the power of new safety systems to reduce the carnage on America’s roadways.
The NTSB, on the other hand, has warned that such systems can be dangerous because they lull drivers into complacency behind the wheel. Missy Cummings, an engineering professor and human factors expert at Duke University, says humans tend to show “automation bias,” a tendency to trust that automated systems can handle every situation, even when those systems only work 80% of the time.
According to Automotive News, the NTSB will send a team of five investigators to Florida to probe the death of Joshua Brown, whose Tesla Model S crashed into a tractor trailer while driving on Autopilot. “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” says NTSB spokesman Christopher O’Neil.
“It’s very significant,” says Clarence Ditlow, executive director of the Center for Auto Safety, an advocacy group in Washington, DC. “The NTSB only investigates crashes with broader implications. They’re not looking at just this crash. They’re looking at the broader aspects. Are these driverless vehicles safe? Are there enough regulations in place to ensure their safety?” Ditlow adds, “And one thing in this crash I’m certain they’re going to look at is using the American public as test drivers for beta systems in vehicles. That is simply unheard of in auto safety.”
That is the crux of the situation. Tesla would clearly have the right to conduct a beta test of its Autopilot system if only Tesla drivers were involved. The question is whether it has the same right to do so on public roads, where drivers who are not part of the beta test are unaware that an experiment is taking place around them.
For its part, Tesla steadfastly maintains that “Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”
That’s all well and good, but are such pronouncements from the company enough to overcome the “automation bias” Professor Cummings refers to? Clearly, Ditlow thinks not. What the NTSB decides to do, if anything, after it completes its investigation could have a dramatic impact on self-driving technology both in the US and around the world. Regulators in other countries will place a lot of weight on what the Board decides.
From Tesla’s perspective, the question is whether the death of one person is reason enough to delay implementation of technology that could save 100,000 or more lives each year worldwide. The company is racing ahead with improvements to its Autopilot system. Just the other week, a Tesla Model S was spotted testing near the company’s Silicon Valley headquarters with a lidar unit mounted on its roof.
The NTSB has a lot of clout when it comes to shaping safety regulations, even though it can only issue recommendations rather than write rules itself. It will be hard-pressed to fairly balance all of the competing interests involved in the aftermath of the Joshua Brown fatality.