A second Model S owner has come out to blame Tesla Autopilot for causing an accident. Arianna Simpson was driving her Model S on the I-5 north of Los Angeles with Autopilot engaged when “All of a sudden, the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn’t brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car,” she says. Simpson blames her car for the accident. Fortunately, there were no injuries.
Not so, says Tesla. The moment Simpson hit the brakes, she disabled Autopilot and Traffic-Aware Cruise Control. Braking manually also overrode the automatic emergency braking system, which rolled out with firmware update 6.2 last year.
In a statement to Ars Technica, Tesla said,
“Safety is the top priority at Tesla, and we engineer and build our cars with this foremost in mind. We also ask our customers to exercise safe behavior when using our vehicles. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.
Tesla Autopilot is designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable. Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”
Simpson says she has always been “super pro-Tesla and self-driving things in general.” But the way Tesla has responded to her plight has left her distraught. She believes Tesla lacks empathy and has been “pretty awful throughout the process.”
Jared Overton of Salt Lake City would likely agree. His recent claim that his Model S malfunctioned and crashed itself into the back of a parked trailer was refuted by Tesla, which said Overton activated Summon, causing the vehicle to move on its own. Tesla says data retrieved from Overton’s car proves his account of what happened is simply wrong. “They’re just assuming that I sat there and watched it happen, and I was OK with that,” Overton says.
Avid Tesla fan KManAuto posted a video showing how Overton may have inadvertently activated Summon by, in effect, making a butt dial to the car with his remote. His explanation may be accurate, but if so, shouldn’t Tesla be concerned that such an error can occur and take steps to keep it from happening?
According to Ars Technica, the message is that until fully autonomous cars arrive in a few years, “If behind the wheel before then, you — not your car, not the company that built it — are in charge of where it goes. That also means you’re liable for anything it hits.”
Is that an accurate statement of how most Tesla owners feel when driving with Autopilot engaged?
Photo credit: Arianna Simpson