Do autonomous cars make us worse drivers?

Tesla in autonomous mode

Autonomous cars are coming. So is the first fatality associated with them. Statistically, that milestone should occur in the next 18 months. What will happen then?

On May 31, 2009, an Airbus A330 on its way from Rio de Janeiro to Paris plunged from an altitude of 35,000 feet into the Atlantic, killing all 228 people on board. Just prior to the crash, the airplane was operating in autopilot mode. A reconstruction of the disaster revealed that input from several sensors had been compromised by ice, which caused them to give false readings. Updated sensors less susceptible to ice accumulation were waiting to be installed once the plane arrived in Paris.

Because of the false readings, the autopilot system disengaged, returning control to the pilots. The senior pilot, however, was sleeping at the time. The two junior pilots were not as highly trained in high-altitude flight as they might have been, partly because letting machines control the aircraft under those conditions had become the norm.

Faced with the unexpected, the pilots behaved poorly. At one point they can be heard on the cockpit voice recorder saying, “We completely lost control of the airplane, and we don’t understand anything! We tried everything!” While they tried to rouse the sleeping senior pilot, the nose of the aircraft climbed until a stall was induced. A stall is the point at which the wings become barn doors instead of airfoils. The Airbus A330 dropped from the sky like a rock.

In his excellent story about the crash published in Vanity Fair, William Langewiesche offered this conclusion: “Automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.”

The Tesla community has seen similar incidents lately. The driver in Salt Lake City who accidentally activated Summon, causing his car to drive into the back of a truck. The woman on a freeway in California who rear-ended a car that suddenly slowed in front of her. The man in Europe who crashed into the back of a van that had stalled in the high-speed lane of a highway. He, at least, had the courage to admit his error: “Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time too. My bad.”

After each of these incidents, the tendency among many has been to defend the machine and blame the human. But in a recent article for The Guardian, Martin Robbins writes, “Combine an autopilot with a good driver, and you get an autopilot with, if not a bad driver, at least not such a good one.” Statistically, he says, the moment when a car operating in autonomous mode causes a fatality is rapidly approaching.

Tesla Model S owner crashes into the back of a stalled van

On average, a person is killed in a traffic accident in the United States once every 100 million miles driven. Elon Musk says Tesla’s Autopilot is half as likely to be involved in a collision as a human driver. That would suggest that somewhere around the 200-million-mile mark, someone will die as a result of an automobile driven by a machine.

Tesla has already passed the 100 million mile mark for cars driving in Autopilot mode and continues to log 2.6 million Autopilot miles per day. Statistically speaking, the moment when a self-driving car kills somebody is rapidly approaching. And since most autonomous cars on the road are Teslas, the odds are excellent that a Tesla will be involved in that first fatality.
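
For readers who want to check the math, here is that back-of-the-envelope estimate as a short Python sketch. It uses only the figures quoted above (the one-death-per-100-million-miles US average, Musk’s claimed two-to-one safety factor, and Tesla’s reported mileage); the variable names are purely illustrative.

```python
# Back-of-the-envelope estimate using the figures quoted in this article.

US_FATALITY_RATE = 1 / 100_000_000  # one death per 100 million miles (US average)
AUTOPILOT_FACTOR = 0.5              # Musk's claim: half as likely to crash
MILES_LOGGED = 100_000_000          # Autopilot miles Tesla has already logged
MILES_PER_DAY = 2_600_000           # Autopilot miles currently added per day

# At half the human fatality rate, one death is expected roughly every
# 1 / (rate * factor) = 200 million miles.
expected_miles = 1 / (US_FATALITY_RATE * AUTOPILOT_FACTOR)

# Days of driving at the current pace until the fleet reaches that mileage.
days_to_go = (expected_miles - MILES_LOGGED) / MILES_PER_DAY

print(f"Expected miles per fatality: {expected_miles:,.0f}")  # 200,000,000
print(f"Days until that mileage:     {days_to_go:,.0f}")      # about 38
```

On these figures alone, the fleet would reach that mark within weeks, which is why “rapidly approaching” is no exaggeration.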

What will happen then? Robbins goes back in history to look for an answer to that question. In 1896, Bridget Driscoll became the first person in England to be killed by a motor car. The reaction among the public and the press was a fatalistic acceptance that progress has a price. Within a few years, the speed limit in England was raised from 8 mph, the limit in force when Ms. Driscoll was killed, to 20 mph, despite the fact that thousands of deaths were by then being recorded on English roads.

Regulators around the world are racing to catch up with the explosion of new autonomous driving technology. But Robbins concludes, “By the time they do, it’s likely that the technology will already be an accepted fact of life, its safety taken for granted by consumers, its failures written off as the fault of its error-prone human masters.”

The point is that injuries and fatalities will continue to occur as drivers come to rely more and more on machines for routine driving chores. But in the transition period between now and the day Level 4 autonomy becomes the norm, when cars come from the factory with no way for humans to control them directly, we need to accept that complacency and an inflated belief in the power of machines to protect us from harm may actually make us less competent behind the wheel.

We will need to remain vigilant, if for no other reason than that telling a jury, “It’s not my fault! The machine failed!” will not insulate us from the legal requirement to operate our private automobiles in a safe and prudent manner.

 
