General Motors issued a press release on Monday announcing that it will acquire Strobe, a California-based technology startup that makes affordable chip-scale LIDAR technology for self-driving cars. An 11-person team from Strobe will be joining GM’s Cruise Automation unit as part of the acquisition.
With more affordable, higher-accuracy LIDAR sensors coming to market, automakers looking to transition to all-electric fleets are assessing the strategic value of investing in self-driving technology. GM’s purchase of Strobe can be seen as just that: acquiring a small, nimble startup focused on developing the key sensor used in autonomous vehicles lets the Detroit-based auto giant speed its path to market with a self-driving car.
Kyle Vogt, founder and CEO of GM’s Cruise Automation, said in the press release, “Strobe’s LIDAR technology will significantly improve the cost and capabilities of our vehicles so that we can more quickly accomplish our mission to deploy driverless vehicles at scale.”
While GM charges forward with integrating LIDAR into its self-driving program, the company also pairs the technology with radar sensors to create a fault-tolerant sensing suite. Tesla CEO Elon Musk has famously dismissed LIDAR as ‘unnecessary’ for an autonomous car, due in large part to its high cost. Instead, Tesla has opted to use a combination of cameras, radar, and ultrasonic sensors as the foundation of its Autopilot system. But as LIDAR pricing continues to drop, could we see a change in the core design of future versions of Autopilot?
Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust
— Elon Musk (@elonmusk) July 15, 2016
Vogt acknowledges that radar can operate in more challenging weather conditions, but argues that it lacks the angular resolution needed for critical maneuvers at speed. “Strobe’s LIDAR sensors provide both accurate distance and velocity information, which can be checked against similar information from a RADAR sensor for redundancy. RADARs typically also provide distance and velocity information and operate under more challenging weather conditions, but they lack the angular resolution needed to make certain critical maneuvers at speed. When used together, cameras, LIDARs, and RADARs can complement each other to create a robust and fault-tolerant sensing suite that operates in a wide range of environmental and lighting conditions,” Vogt wrote in a blog post on Medium.
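To make the redundancy idea concrete, here is a minimal illustrative sketch (not GM/Cruise code) of cross-checking a LIDAR reading against a radar reading for the same target, as Vogt describes. The data structure, field names, and agreement thresholds are all hypothetical, chosen only to show the principle.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    distance_m: float    # range to the target, in meters
    velocity_mps: float  # closing velocity, in meters per second

def measurements_agree(lidar: Measurement, radar: Measurement,
                       max_dist_err_m: float = 0.5,
                       max_vel_err_mps: float = 1.0) -> bool:
    """Return True if the two sensors roughly agree on the same target.

    A disagreement beyond the (hypothetical) thresholds would flag the
    reading as suspect rather than feeding it to the planner.
    """
    return (abs(lidar.distance_m - radar.distance_m) <= max_dist_err_m
            and abs(lidar.velocity_mps - radar.velocity_mps) <= max_vel_err_mps)

# Example: the two sensors agree, so the reading can be trusted.
lidar = Measurement(distance_m=42.3, velocity_mps=-3.1)
radar = Measurement(distance_m=42.1, velocity_mps=-3.4)
print(measurements_agree(lidar, radar))  # True
```

Real perception stacks fuse sensors probabilistically (e.g., with Kalman filters) rather than with fixed thresholds, but the cross-check captures why having two independent distance-and-velocity sources matters.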
LIDAR, on the other hand, uses pulsed laser light to map a high-resolution 3D view of the world, arguably giving a self-driving car a more precise picture of its surroundings. Scott Miller, GM’s director of autonomous vehicle integration, has recently spoken out against Musk’s narrative that Tesla Autopilot will be fully autonomous and capable of piloting a car from California to New York on its own by the end of the year.
“The level of technology and knowing what it takes to do the mission, to say you can be a full level five with just cameras and radars is not physically possible,” said Miller about Tesla’s Autopilot suite. “Could you do it with what’s in a current Tesla Model S? I don’t think so.”
As the race to produce a fully autonomous car heats up between Tesla, GM, Uber, and Google, and hardware prices decline, it’s only a matter of time before a tried-and-true combination of hardware becomes the de facto self-driving sensor suite. What will it be?