News
GM buys LIDAR company for self-driving car program to take Tesla head-on
General Motors issued a press release on Monday announcing that it will acquire Strobe, a California-based technology startup that makes affordable chip-scale LIDAR technology for self-driving cars. An 11-person team from Strobe will be joining GM’s Cruise Automation unit as part of the acquisition.
With more affordable and higher-accuracy LIDAR sensors coming to market, automakers looking to transition to all-electric fleets are assessing the strategic value of investing in self-driving technology. GM’s purchase of Strobe can be seen as just that. Acquiring a small and nimble startup with a core focus on developing the key sensor used in autonomous vehicles allows the Detroit-based auto giant to speed its path to market with a self-driving car.
Kyle Vogt, founder and CEO of GM’s Cruise Automation, said in the press release, “Strobe’s LIDAR technology will significantly improve the cost and capabilities of our vehicles so that we can more quickly accomplish our mission to deploy driverless vehicles at scale.”
While GM continues to charge forward with implementing LIDAR technology into its self-driving program, the company also complements that technology with radar sensors to create a fault-tolerant sensing suite. Tesla CEO Elon Musk has famously dismissed LIDAR as ‘unnecessary’ for an autonomous car, citing its high cost. Instead, Tesla has opted to use a combination of cameras, radars, and ultrasonic sensors to form the foundation of its Autopilot system. But as pricing for LIDAR technology continues to drop, could we see a change of core design in future versions of Autopilot?
Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust
— Elon Musk (@elonmusk) July 15, 2016
Vogt acknowledges that radar can operate in more challenging weather conditions, but argues it lacks the angular resolution needed for critical maneuvers at speed. “Strobe’s LIDAR sensors provide both accurate distance and velocity information, which can be checked against similar information from a RADAR sensor for redundancy. RADARs typically also provide distance and velocity information and operate under more challenging weather conditions, but they lack the angular resolution needed to make certain critical maneuvers at speed. When used together, cameras, LIDARs, and RADARs can complement each other to create a robust and fault-tolerant sensing suite that operates in a wide range of environmental and lighting conditions,” Vogt wrote in a blog post on Medium.
LIDAR, on the other hand, uses laser light to map a high-resolution 3D view of the world, which arguably provides a more precise picture of a self-driving car’s surroundings. GM’s director of autonomous vehicle integration, Scott Miller, has recently spoken out against Musk’s claim that Tesla Autopilot will be fully autonomous and capable of piloting a car from California to New York on its own by the end of the year.
“The level of technology and knowing what it takes to do the mission, to say you can be a full level five with just cameras and radars is not physically possible,” said Miller about Tesla’s Autopilot suite. “Could you do it with what’s in a current Tesla Model S? I don’t think so.”
As the race to produce a fully autonomous car heats up between Tesla, GM, Uber, and Google, and hardware prices decline, it’s only a matter of time before a tried-and-true combination of hardware becomes the de facto self-driving sensor suite. What will it be?
Elon Musk
Elon Musk and Tesla AI Director share insights after empty driver seat Robotaxi rides
The executives’ unoccupied tests hint at the rapid progress of Tesla’s unsupervised Robotaxi efforts.
Tesla CEO Elon Musk and AI Director Ashok Elluswamy celebrated Christmas Eve by sharing personal experiences with Robotaxi vehicles that had no safety monitor or occupant in the driver’s seat. Musk described the system’s “perfect driving” around Austin, while Elluswamy posted video from the back seat, calling it “an amazing experience.”
Elon and Ashok’s firsthand Robotaxi insights
Prior to Musk and the Tesla AI Director’s posts, sightings of unmanned Teslas navigating public roads were widely shared on social media. One such vehicle was spotted in Austin, Texas, which Elon Musk acknowledged by stating that “Testing is underway with no occupants in the car.”
Based on his Christmas Eve post, Musk seemed to have tested an unmanned Tesla himself. “A Tesla with no safety monitor in the car and me sitting in the passenger seat took me all around Austin on Sunday with perfect driving,” Musk wrote in his post.
Elluswamy responded with a 2-minute video showing himself in the rear of an unmanned Tesla. The video featured the vehicle’s empty front seats, as well as its smooth handling through real-world traffic. He captioned his video with the words, “It’s an amazing experience!”
Towards unsupervised operations
During an xAI Hackathon earlier this month, Elon Musk mentioned that Tesla would be removing safety monitors from its Robotaxis in Austin in just three weeks. “Unsupervised is pretty much solved at this point. So there will be Tesla Robotaxis operating in Austin with no one in them. Not even anyone in the passenger seat in about three weeks,” he said. Musk echoed similar estimates at the 2025 Annual Shareholder Meeting and the Q3 2025 earnings call.
Considering the insights posted by Musk and Elluswamy, it does appear that Tesla is working hard towards operating its Robotaxis with no safety monitors. This is quite impressive considering that the service was launched just earlier this year.
Elon Musk
Starlink passes 9 million active customers just weeks after hitting 8 million
The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.
SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark.
9 million customers
In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or about 21,275 new users on average per day.
“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote.
That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.
Starlink’s momentum
Starlink’s momentum has been building for some time. SpaceX reported 4.6 million Starlink customers in December 2024, followed by 7 million by August 2025, and 8 million in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.
Starlink’s momentum is increasingly tied to SpaceX’s broader financial outlook. Elon Musk has said the satellite network is “by far” the company’s largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also suggested in the past that Starlink could have its own IPO in the future.
News
NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”
After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like a routine.
NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”
After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like a routine. And just like smartphones today, removing it now would “actively hurt.”
Jim Fan’s hands-on FSD v14 impressions
Fan, a leading researcher in embodied AI who works on Physical AI at NVIDIA and spearheads the company’s Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.
“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X.
Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”
The Physical Turing Test
The original Turing Test was conceived by Alan Turing in 1950, and it was aimed at determining whether a machine could exhibit behavior indistinguishable from that of a human. By focusing on text-based conversations, the original Turing Test set a high bar for natural language processing and machine learning.
This test has been passed by today’s large language models. However, the capability to converse in a humanlike manner is a completely different challenge from performing real-world problem-solving or physical interactions. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.
Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.