General Motors (GM) has revealed new details about its upcoming Ultra Cruise autonomous driving system.
With autonomous driving features proliferating, thanks largely to Tesla, more and more companies have begun working on their own systems. That includes GM, which already offers its Super Cruise system and has now shared details about its next iteration, Ultra Cruise.
In the design of autonomous driving systems, two leaders with very different philosophies have emerged. Tesla is the first, relying heavily on AI and on camera-based vision to guide the vehicle. This is clearest in Tesla's upcoming Hardware 4, which eliminates ultrasonic sensors and instead dramatically increases the quality of the visual sensing systems around the vehicle. The second camp is currently headed by Mercedes.
Mercedes has taken the opposite approach to Tesla. While still relying on AI guidance, Mercedes uses a combination of three different sensor types (visual, ultrasonic, and LiDAR) to help guide the vehicle.
That brings us to GM's Ultra Cruise, which was revealed in detail today. Much like Mercedes, GM has chosen a multi-sensor approach, combining cameras, radar, and LiDAR. Further emulating the premium German automaker, GM's system "will have a 360-degree view of the vehicle," according to the company.
According to GM, this architecture provides redundancy and sensor specialization, with each sensor group focused on a particular task. The cameras and short-range radar handle object detection, primarily at low speeds and in urban environments, helping the vehicle spot other vehicles, traffic signals and signs, and pedestrians. At higher speeds, the long-range radar and LiDAR come into play, detecting vehicles and road features from further away.
GM also points out that, because radar and LiDAR continue to work well in poor visibility, the system benefits from better overall uptime. GM's goal is an autonomous driving system that allows hands-free driving in 95% of situations.
As for the Tesla approach, the leader in autonomous driving certainly has credibility in its design. According to Tesla's blog post on removing ultrasonic sensors from its vehicles, "Tesla Vision"-equipped cars perform just as well, if not better, in tests such as pedestrian automatic emergency braking (AEB). It should be noted, though, that dropping the secondary sensors also likely helps reduce manufacturing costs.
Ultra Cruise will first be available on the upcoming Cadillac Celestiq. Still, with a growing number of vehicles offering GM's Super Cruise, it's likely only a matter of time before the more advanced system makes its way to mass-market offerings as well.
“GM’s fundamental strategy for all ADAS features, including Ultra Cruise, is safely deploying these technologies,” said Jason Ditman, GM chief engineer, Ultra Cruise. “A deep knowledge of what Ultra Cruise is capable of, along with the detailed picture provided by its sensors, will help us understand when Ultra Cruise can be engaged and when to hand control back to the driver. We believe consistent, clear operation can help build drivers’ confidence in Ultra Cruise.”
With more and more automakers entering the autonomous driving space every year, it will be interesting to see which architecture they choose to invest in. But what could prove to be the defining trait is which system performs better in the real world. And as of now, it isn’t immediately clear who the victor is.
Elon Musk and Tesla AI Director share insights after empty driver seat Robotaxi rides
The executives’ unoccupied tests hint at the rapid progress of Tesla’s unsupervised Robotaxi efforts.
Tesla CEO Elon Musk and AI Director Ashok Elluswamy celebrated Christmas Eve by sharing personal experiences with Robotaxi vehicles that had no safety monitor or occupant in the driver’s seat. Musk described the system’s “perfect driving” around Austin, while Elluswamy posted video from the back seat, calling it “an amazing experience.”
Elon and Ashok’s firsthand Robotaxi insights
Prior to Musk and the Tesla AI Director's posts, sightings of unmanned Teslas navigating public roads were widely shared on social media. One such vehicle was spotted in Austin, Texas, a sighting Elon Musk acknowledged by stating that "Testing is underway with no occupants in the car."
Based on his Christmas Eve post, Musk appears to have ridden in one of these driverless Teslas himself. "A Tesla with no safety monitor in the car and me sitting in the passenger seat took me all around Austin on Sunday with perfect driving," Musk wrote in his post.
Elluswamy responded with a two-minute video showing himself riding in the back seat of a driverless Tesla. The video featured the vehicle's empty front seats, as well as its smooth handling through real-world traffic. He captioned the video with the words, "It's an amazing experience!"
Towards unsupervised operations
During an xAI Hackathon earlier this month, Elon Musk mentioned that Tesla would be removing safety monitors from its Robotaxis in Austin in just three weeks. "Unsupervised is pretty much solved at this point. So there will be Tesla Robotaxis operating in Austin with no one in them. Not even anyone in the passenger seat in about three weeks," he said. Musk echoed similar estimates at the 2025 Annual Shareholder Meeting and the Q3 2025 earnings call.
Considering the insights posted by Musk and Elluswamy, it does appear that Tesla is working hard towards operating its Robotaxis with no safety monitors. This is quite impressive considering that the service launched just earlier this year.
Starlink passes 9 million active customers just weeks after hitting 8 million
The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.
SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark.
9 million customers
In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or about 21,275 new users on average per day.
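For context, the per-day figure is simple back-of-the-envelope arithmetic. The sketch below assumes a roughly 47-day window between the two milestones (the "under seven weeks" described above); that window length is an assumption for illustration, not a figure SpaceX has published.

```python
# Rough check of the implied Starlink growth rate described above.
# Assumption (not an official figure): ~47 days between the 8 million
# milestone in early November and the 9 million announcement.

subscribers_added = 9_000_000 - 8_000_000   # growth between the two milestones
days_between_milestones = 47                # assumed length of the window

average_daily_adds = subscribers_added / days_between_milestones
print(f"~{average_daily_adds:,.0f} new users per day")  # prints ~21,277
```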
“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote.
That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.
Starlink’s momentum
Starlink's momentum has been building for some time. SpaceX reported 4.6 million Starlink customers in December 2024, followed by 7 million by August 2025 and 8 million in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.
That momentum is increasingly tied to SpaceX's broader financial outlook. Elon Musk has said the satellite network is "by far" the company's largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also previously suggested that Starlink could eventually have its own IPO.
NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”
After testing FSD v14, Fan stated that the experience felt magical at first but soon started to feel routine.
NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”
After testing FSD v14, Fan stated that the experience felt magical at first but soon started to feel routine. And, just like the smartphone today, removing it now would "actively hurt."
Jim Fan’s hands-on FSD v14 impressions
Fan, a leading researcher in embodied AI who currently works on physical AI at NVIDIA and spearheads the company's Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.
“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X.
Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”
The Physical Turing Test
The original Turing Test was conceived by Alan Turing in 1950 and was aimed at determining whether a machine could exhibit behavior equivalent to, or indistinguishable from, that of a human. By focusing on text-based conversation, the original Turing Test set a high bar for natural language processing and machine learning.
Today's large language models have arguably passed this test. However, the capability to converse in a humanlike manner is a very different challenge from real-world problem-solving or physical interaction. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.
Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.