A team of researchers has found a way to exploit Tesla’s Autopilot sensors, fooling the system into believing objects are absent when they are present, and vice versa. Engineers from Chinese security firm Qihoo 360, which was previously awarded $10k by Tesla after winning a competition to uncover security holes in the Model S, working with researchers from the University of South Carolina and China’s Zhejiang University, demonstrated jamming devices capable of altering signals detected by Tesla’s Autopilot suite of radar and ultrasonic sensors.
In a series of videos published on Wired via The Scene, researchers placed purpose-built radio equipment on a cart directly in front of a Tesla Model S in order to simulate another vehicle’s presence. With the signal-generating “jamming device” switched off, the familiar blue vehicle icon appeared on the instrument cluster. Once the researchers turned the jamming device on, the virtual vehicle previously detected by the Autopilot sensors suddenly disappeared from the screen.
Though this attack is unlikely to occur in a real-world environment, given that the jamming device used in the experiment costs $90,000, it still highlights the rare edge cases Tesla needs to account for as it continues to develop its “narrow AI” and fully autonomous vehicles. This is especially true following the global media frenzy that ensued after Joshua Brown died behind the wheel of a Model S while on Autopilot.
University of South Carolina computer science professor Wenyuan Xu, who led the research, said that an attack on Tesla’s radar sensors is technically possible but “would require some effort,” as the jamming signal would need to be strategically aimed and strike Tesla’s radars at the correct angle.
The researchers also devised a far less expensive attack using a $40 setup consisting of a tiny Arduino computer and an ultrasonic transducer. The setup emits sound waves at Tesla’s ultrasonic sensors, tricking them into detecting obstacles that aren’t in the vehicle’s way, or missing ones that are. The result of the ‘hack’ would be a Tesla stopping prematurely while attempting to self-park or, in the opposite case, striking an obstacle after being fooled into thinking nothing was in its path of travel.
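The principle behind this attack is straightforward: ultrasonic sensors estimate distance from the round-trip time of an echo, so an attacker who injects an earlier echo creates a phantom obstacle, and one who drowns out the genuine echo makes a real obstacle vanish. The sketch below is a simplified, illustrative model of that logic; the speed of sound is real physics, but the ranging behavior (a sensor that trusts the first echo it hears) is an assumption, not Tesla’s actual firmware.

```python
# Toy model of ultrasonic time-of-flight ranging and echo spoofing.
# Assumption: the ranger naively reports the first echo it receives.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_delay(distance_m):
    """Round-trip time (s) for a ping to reach an obstacle and return."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def perceived_distance(echo_delays):
    """Report the obstacle distance implied by the earliest echo.
    An attacker who injects an earlier echo, or suppresses all echoes,
    controls this reading."""
    if not echo_delays:
        return None  # no echo heard: the sensor believes the path is clear
    return SPEED_OF_SOUND * min(echo_delays) / 2.0

# Honest reading: a real obstacle 2 m away
real_echo = echo_delay(2.0)
print(perceived_distance([real_echo]))  # ~2.0 m

# Spoof: attacker's transducer fires a pulse that arrives as if from 0.3 m,
# so the car "sees" a phantom obstacle and brakes prematurely
print(perceived_distance([real_echo, echo_delay(0.3)]))  # ~0.3 m

# Jam: the genuine echo is drowned out and no valid echo is registered,
# so the real obstacle disappears entirely
print(perceived_distance([]))  # None
```

Real sensors apply filtering and plausibility checks, which is why the researchers needed a tuned transducer rather than an arbitrary noise source, but the underlying trust in echo timing is what both variants of the attack exploit.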
Tesla owners shouldn’t be overly concerned by these tests, but they do surface the need for further improvements to Autopilot, especially as testing equipment becomes cheaper and more readily available. In the wrong hands, a hacker could technically disrupt a Tesla driving on Autopilot.
Recent sightings of a Model S testing with LIDAR equipment near Tesla’s HQ suggest that the automaker may be exploring alternatives to its predominantly radar- and sonar-based sensors. However, Tesla CEO Elon Musk has reaffirmed his stance that the existing Tesla radar by itself, decoupled from the onboard cameras, will be able to operate with similar precision, “like lidar.”