Why Tesla Autopilot will ultimately prove to be the self-driving industry leader

A Tesla Model 3 on Autopilot. [Credit: LivingTesla/YouTube]

Tesla took an early lead in the race to develop vehicle autonomy, and its Autopilot system remains the state of the art. However, the technology is advancing more slowly than the company predicted – Elon Musk promised a coast-to-coast driverless demo run for 2018, and we’re still waiting. Meanwhile, competitors are hard at work on their own autonomy tech – GM’s Super Cruise is now available on the Cadillac CT6 luxury sedan.

Is Tesla in danger of falling behind in the self-driving race? Trent Eady, writing on Medium, takes a detailed look at the company’s Autopilot technology and argues that the California automaker will continue to set the pace.

Every Tesla vehicle produced since October 2016 is equipped with a hardware suite designed for Full Self-Driving, including cameras, radar, ultrasonic sensors and an upgradable onboard computer. Around 150,000 of these “Hardware 2” Teslas are currently on the road, and could theoretically be upgraded to self-driving vehicles via an over-the-air software update.

Above: In its current state, Tesla’s Autopilot requires a hands-on approach (YouTube: Tesla)

Tesla disagrees with most of the other players in the self-driving game on the subject of lidar, a technology that calculates distances using pulses of infrared laser light. Waymo, Uber and others seem to regard lidar as a necessary component of any self-driving system. However, Tesla’s Hardware 2 sensor suite doesn’t include it, relying instead on radar and optical cameras.

Lidar’s strength is its high spatial precision – it can measure distances much more precisely than current camera technology can (Eady believes that better software could enable cameras to close the gap). Lidar’s weakness is that it functions poorly in bad weather. Heavy rain, snow or fog causes lidar’s laser pulses to refract and scatter. Radar works much better in challenging weather conditions.
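To make that comparison concrete, here is a minimal sketch (illustrative only – not from the article, and not any vendor’s actual code) of the time-of-flight calculation at the heart of lidar ranging: the sensor times an infrared pulse’s round trip and converts that time into a distance.

    # Minimal time-of-flight sketch (illustrative only): a lidar unit measures how long
    # an infrared pulse takes to bounce back, then converts that time into a distance.
    C = 299_792_458.0  # speed of light, metres per second

    def lidar_range_m(round_trip_time_s: float) -> float:
        # The pulse travels to the target and back, so halve the round-trip distance.
        return C * round_trip_time_s / 2.0

    # A pulse returning after 200 nanoseconds implies a target roughly 30 m away.
    print(f"{lidar_range_m(200e-9):.2f} m")  # ~29.98 m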

According to Eady, the reason that Tesla eschews lidar may be the cost: “Autonomy-grade lidar is prohibitively expensive, so it’s not possible for Tesla to include it in its production cars. As far as I’m aware, no affordable autonomy-grade lidar product has yet been announced. It looks like that is still years away.”

If Elon Musk and his autonomy team are convinced that lidar isn’t necessary, why does everyone else seem so sure that it is? “Lidar has accrued an aura of magic in the popular imagination,” opines Mr. Eady. “It is easier to swallow the new and hard-to-believe idea of self-driving cars if you tell the story that they are largely enabled by a cool, futuristic laser technology…It is harder to swallow the idea that if you plug some regular ol’ cameras into a bunch of deep neural networks, somehow that makes a car capable of driving itself through complicated city streets.”

Those deep neural networks are the real reason that Eady believes Tesla will stay ahead of its competitors in the autonomy field. The flood of data that Tesla is gathering through the sensors of the 150,000 or so existing Hardware 2 vehicles “offers a scale of real-world testing and training that is new in the history of computer science.”

Competitor Waymo has a computer simulation that contains 25,000 virtual cars and generates data from 8 million miles of simulated driving per day. Tesla’s real-world data is arguably far more valuable than any simulation data, and the company uses it to train deep neural networks, allowing it to continuously improve Autopilot’s capabilities.
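For scale, here is a quick back-of-the-envelope calculation using only the figures quoted above (an illustrative sketch, not a claim from the article):

    # Rough arithmetic on the Waymo simulation figures quoted above (illustrative only).
    virtual_cars = 25_000
    simulated_miles_per_day = 8_000_000
    print(simulated_miles_per_day / virtual_cars)  # 320.0 simulated miles per virtual car per day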

A deep neural network is a type of computing system that’s loosely based on the way the human brain is organized (sounds like the kind of AI that Elon Musk is worried about, but we’ll have to trust that Tesla has this under control). Deep neural networks are good at modeling complex non-linear relationships. The more data that’s available to train the network, the better its performance will be.
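For readers who want a concrete feel for the idea, here is a toy sketch (purely illustrative, and unrelated to Tesla’s actual models): a “deep” network is simply an input passed through several stacked layers, each a learned linear map followed by a simple non-linearity, and it is that stacking which lets the network model complex non-linear relationships.

    # Toy forward pass through a small "deep" network (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, in_dim, out_dim):
        W = rng.normal(scale=0.1, size=(in_dim, out_dim))  # weights (learned during training)
        b = np.zeros(out_dim)                               # biases
        return np.maximum(0.0, x @ W + b)                   # ReLU non-linearity

    x = rng.normal(size=(1, 64))   # stand-in input, e.g. features from a camera frame
    h = layer(x, 64, 128)          # hidden layer 1
    h = layer(h, 128, 128)         # hidden layer 2 -- "deep" means several such layers
    scores = h @ rng.normal(scale=0.1, size=(128, 10))  # raw scores for 10 classes
    print(int(scores.argmax()))    # index of the highest-scoring class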

“Deep neural networks started to gain popularity in 2012, after a deep neural network won the ImageNet Challenge, a computer vision contest focused on image classification,” Eady explains. “For the first time in 2015, a deep neural network slightly outperformed the human benchmark for the ImageNet Challenge…The fact that computers can outperform humans on even some visual tasks is exciting for anyone who wants computers to do things better than humans can. Things like driving.”

By the way, who set the human benchmark that was bested by a machine in the ImageNet Challenge? Andrej Karpathy, who is now Director of AI at Tesla.

===

Note: Article originally published on evannex.com by Charles Morris; Source: Medium
