
Research group demos why Tesla Autopilot could crash into a stationary vehicle

[Credit: Thatcham Research/YouTube]

Thatcham Research, a UK-based agency that works for the Association of British Insurers, has concluded that manufacturers such as Tesla need to be clearer and more specific about the difference between driver-assistance systems like Autopilot and full self-driving suites like those employed by Waymo and GM Cruise. In an interview with BBC News, Matthew Avery of Thatcham Research stated that features with names such as Tesla’s Autopilot give the impression that the vehicles can drive themselves under all circumstances, when they actually cannot.

“There’s a problem with the manufacturers trying to introduce technology and consumers not being ready for it, not being sure if it’s automated or ‘Do I need to keep watching?’ We want it very clear. Either you are driving – assisted – or you’re not driving – automated,” Avery said.

Avery conducted a demonstration of the risks drivers face if they become over-reliant on the capabilities of a driver-assist system. Engaging a Tesla Model S’ Autopilot, Avery ran two tests. In the first test, which involved the electric car following a vehicle that slowed to a halt, the Model S performed well. In the second test, however, which involved the lead vehicle changing lanes without warning to reveal a stopped car ahead, the Model S was not able to stop in time, colliding with the stationary car, represented by an inflatable dummy, on Thatcham’s test track.

Tesla uses an array of 8 cameras, 12 ultrasonic sensors, and a forward-facing radar to detect objects and other vehicles on the road. Enhanced Autopilot, the company’s driver-assist system built on Tesla Vision, combines Traffic-Aware Cruise Control (TACC) and Autosteer. Traffic-Aware Cruise Control lets drivers set a speed and follow a lead vehicle while keeping a set following distance. Autosteer, on the other hand, detects lane markings and the presence of vehicles and objects, steering the Model S based on the road’s markings.
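
To make the division of labor concrete, the following is a minimal, purely illustrative sketch of the kind of target-speed logic a traffic-aware cruise controller uses: cruise at the driver’s set speed when no lead vehicle is detected, otherwise track the lead while nudging the gap toward a desired following distance. The function name, gains, and distances here are hypothetical assumptions, not Tesla’s implementation.

```python
# Illustrative sketch of traffic-aware cruise control target-speed logic.
# NOT Tesla's implementation; names, gains, and distances are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class LeadVehicle:
    gap_m: float      # distance to the lead vehicle, in meters
    speed_mps: float  # lead vehicle speed, in meters per second


def tacc_target_speed(set_speed_mps: float,
                      lead: Optional[LeadVehicle],
                      follow_gap_m: float = 30.0,
                      gap_gain: float = 0.25) -> float:
    """Return a commanded speed for the host vehicle.

    With no lead vehicle detected, cruise at the driver's set speed.
    With a lead vehicle, track its speed plus a correction that nudges
    the gap toward the desired following distance, never exceeding the
    driver's set speed.
    """
    if lead is None:
        return set_speed_mps
    gap_error = lead.gap_m - follow_gap_m
    desired = lead.speed_mps + gap_gain * gap_error
    return max(0.0, min(desired, set_speed_mps))
```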

There are some instances, however, where TACC proves inadequate, such as when a lead vehicle traveling below the driver’s set speed suddenly switches lanes. In such a case, the Tesla would match the lead vehicle’s slower speed to prevent a collision, and once the lead vehicle leaves the lane, the electric car would accelerate back toward its set speed. Based on the NTSB’s preliminary report, this scenario played out in the fatal Model X crash in Mountain View, CA earlier this year, with the electric SUV accelerating after the car directly in front of it departed from the lane.
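
Building on the hypothetical sketch above, the snippet below walks through this cut-out scenario under the assumed simplification that a stationary car ahead is not reported to the controller as a trackable lead, which is the crux of the failure mode described here.

```python
# Continuation of the hypothetical sketch above: the lead-vehicle cut-out case.
set_speed = 31.0                                   # roughly 70 mph, in m/s
slow_lead = LeadVehicle(gap_m=30.0, speed_mps=20.0)

# While following the slower lead, the commanded speed matches it.
print(tacc_target_speed(set_speed, slow_lead))     # -> 20.0

# The lead changes lanes and is no longer reported; the controller
# accelerates back toward the set speed even if a stationary vehicle
# sits in the lane ahead.
print(tacc_target_speed(set_speed, None))          # -> 31.0
```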

Such instances, however, only happen when drivers become too reliant on Autopilot. If drivers are paying attention to the road, a simple flick of the Tesla’s stalk to disable TACC is enough to stop the car from accelerating into a stationary vehicle. An alert driver would also be able to apply the brakes and take over the steering in time to avoid a collision. Tesla notes as much in its vehicles’ Owner’s Manual.

“Traffic-Aware Cruise Control does not eliminate the need to watch the road in front of you and to manually apply the brakes when needed. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model S. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.”

A Tesla Model S crashes into an inflatable dummy. [Credit: Thatcham Research/YouTube]

As noted in a Jalopnik report, Tesla has responded to the results of Thatcham Research’s test, stating that the Model S’ collision would not have happened had the driver been using Autopilot correctly. The California-based electric car maker further noted that its customers understand that keeping their attention on the road is paramount when Autopilot is engaged.

“Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly. The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” Tesla noted in a statement to the publication.

To further improve driver safety, Tesla has been steadily rolling out new safeguards for its fleet. The latest update to Autopilot, v8.1 (2018.21.9), included a more assertive hands-on alert system that drew complaints from drivers, some of whom stated that the system’s reminders had become too intrusive. Musk, for his part, has pledged to adjust Autopilot’s screen alerts to strike a balance between safety and driver convenience. If Musk’s recent Twitter updates are any indication, however, it could be only a matter of months before the company starts rolling out the first full self-driving features to its fleet with the release of Software Version 9 in August.

Watch Thatcham Research’s segment on Tesla’s Autopilot in the video below.
