The National Highway Traffic Safety Administration has sent Tesla Motors a nine-page letter asking for specifics about any modifications to the Autopilot system, including software changes, that Tesla has made since Autopilot was first rolled out in the fall of 2015.
Updated: We have received a copy of the nine-page letter the NHTSA sent to Tesla outlining the information the company must hand over.
At issue is how the system detects and reacts to cross traffic. The probe is related specifically to the crash that killed Joshua Brown on a Florida highway on May 7. His car was operating on Autopilot when it struck the side of a tractor trailer that was crossing the road in front of it. Tesla says the cameras on Brown’s Model S failed to distinguish the white side of the tractor trailer from a brightly lit sky. As a result, the car did not brake automatically.
According to ABC News, the letter asks for information about how Autopilot reacts when proceeding through intersections where there is cross traffic. It also asks the company to describe how the system determines whether the input from its cameras and other sensors is “compromised or degraded” in any way, and how such problems are communicated to the driver.
Lastly, the NHTSA has given Tesla an August 26 deadline to turn over details of all known crashes, consumer complaints, and lawsuits filed or settled because the Autopilot system didn’t brake as expected. Agency spokesperson Bryan Thomas says the information requested is a routine part of an investigation and that the agency has not yet determined whether Autopilot constitutes a safety risk.
Elon Musk told the Wall Street Journal on Tuesday that Tesla has no plans to disable its Autopilot feature in the wake of the fatal crash in Florida. Instead, he said, the company is planning an explanatory blog post to educate customers on how Autopilot works. “A lot of people don’t understand what it is and how you turn it on,” he said. Musk is a fierce supporter of the Autopilot system and claims it has already proven to be safer than a human driver.
All of this comes as more details are emerging about a crash in Montana involving a Model X that reportedly veered off a country road. The driver told the Montana state police he had initiated Autopilot, but Tesla says the vehicle log shows only Autosteer was activated. The car’s computer detected no force on the steering wheel for more than two minutes. If no force is detected on the wheel, or a sharp turn is detected, the vehicle is programmed to gradually reduce speed, stop, and turn on the emergency lights, Tesla said in a statement.
The company said the Model X alerted the driver to put his hands on the wheel, but he did not do so. “As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway,” the statement said. It isn’t clear whether the Model X had begun its programmed stop at the time of the crash.
The car traveled around a right-hand curve, then went off the road. It traveled about 200 feet on the narrow shoulder, taking out 13 posts, said trooper Jade Shope. No citation was issued to the driver because the trooper believed any citation would be voided if the car was operating on Autopilot, as the driver claimed. That is a fairly curious position for a law enforcement officer to take, since there is no way for authorities to determine at the scene of an accident whether Autopilot actually was or was not activated.
The National Transportation Safety Board has also opened an investigation into the Florida crash. It is entirely possible that new regulations governing autonomous driving systems in general will result from the combined NHTSA/NTSB review.