Consumer Reports claims to have shown that Tesla Autopilot can be “easily tricked” into driving without anyone in the driver’s seat. The test procedure, however, was convoluted and relied on items that most drivers would never have in their vehicles.
CR released a report on April 22nd titled “CR Engineers Show a Tesla Will Drive With No One in the Driver’s Seat.” The test was a response to the recent and very public Tesla Model S crash in Texas, in which two men tragically died after their all-electric sedan crashed into a tree at high speed. Investigators are attempting to determine whether the vehicle was “driverless,” as several mainstream media outlets have claimed. CEO Elon Musk chimed in just days after the crash and the extensive coverage of it to say that it would have been impossible for Autopilot to function on the road where the crash occurred, because the road lacks the painted lane lines required to engage Basic Autopilot.
The CR test required the vehicle, a Tesla Model Y, to be in motion. Engineers engaged Autopilot and set the speed dial to 0, bringing the car to a stop. Next, Jake Fisher, CR’s Senior Director of Auto Testing, placed a “small, weighted chain on the steering wheel, to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, because that would disengage Autopilot.” The Autopilot speed was then dialed back up so that the vehicle accelerated from a standstill. The car drove up and down the half-mile lane of the CR test track with nobody in the driver’s seat or controlling the vehicle. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient,” Fisher said. The engineers urged nobody to try the experiment at home, though few drivers have a custom weighted chain sitting around to experiment with anyway.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” Fisher added, and he wasn’t done throwing shade at Tesla: “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.” GM’s Super Cruise and Ford’s recently released BlueCruise are what Fisher is referencing, but the comparisons don’t really add up.
Tesla Autopilot has logged over 23 billion real-world miles, data that feeds the neural networks behind the system to improve its performance. With every mile driven, Tesla’s semi-autonomous driving features become more robust, more precise, and more adaptable to human behavior. Ford and GM have accumulated only a fraction of that mileage. Tesla, meanwhile, recently released its Q1 2021 Safety Report, which found that driving with Autopilot engaged is nearly 10 times safer than human driving.
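The “nearly 10 times” figure comes from comparing miles driven per accident. A minimal sketch of that arithmetic in Python, using the miles-per-accident numbers widely reported alongside Tesla’s Q1 2021 Safety Report and NHTSA’s national average (treat both figures as illustrative rather than authoritative):

```python
# Illustrative sketch: deriving a "nearly 10x safer" ratio from
# miles-per-accident figures. The numbers below are the ones widely
# reported with Tesla's Q1 2021 Safety Report and NHTSA data; they
# are assumptions for illustration, not verified values.

AUTOPILOT_MILES_PER_ACCIDENT = 4_190_000   # with Autopilot engaged (Tesla)
US_AVERAGE_MILES_PER_ACCIDENT = 484_000    # U.S. national average (NHTSA)

ratio = AUTOPILOT_MILES_PER_ACCIDENT / US_AVERAGE_MILES_PER_ACCIDENT
print(f"Autopilot logs {ratio:.1f}x more miles per accident than average")
```

On these figures the ratio works out to roughly 8.7, which is where headline claims of “nearly 10 times safer” originate.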
The test performed by CR is bizarre because people would not normally have these items in their vehicle, or even in their possession, to begin with. Tesla maintains that drivers are responsible for remaining attentive throughout the entire driving experience. The company has never claimed to have released a program capable of Level 5 autonomy, where a driver need pay no attention to the road or the vehicle’s surroundings. Yet Tesla’s highly publicized crash raises questions from those with a historical distaste for the company and its products. Consumer Reports has not been keen on Tesla in the past. It has indicated that GM’s Super Cruise, despite being less effective or safe than Autopilot based on the data, holds a commanding lead over Tesla’s semi-autonomous driving program.
It is worth noting that Tesla has several safeguards intended to prevent anyone from letting the vehicle drive itself. These include a steering wheel monitoring system, which will bring the car to a complete stop if the driver is not applying pressure to the wheel. The system also requires a driver to be in the seat to function, and the company recently revoked FSD software access from several drivers who abused the program by being inattentive. Additional safety features, such as an in-cabin camera that recognizes facial features, can monitor the driver’s eyes and face to ensure they are paying attention to the road.