GM buys LIDAR company for self-driving car program to take Tesla head-on

General Motors issued a press release on Monday announcing that it will acquire Strobe, a California-based technology startup that makes affordable chip-scale LIDAR technology for self-driving cars. An 11-person team from Strobe will be joining GM’s Cruise Automation unit as part of the acquisition.

With more affordable and higher-accuracy LIDAR sensors coming to market, automakers looking to transition to all-electric fleets are assessing the strategic value of investing in self-driving technology. GM’s purchase of Strobe can be seen as just that. Acquiring a small and nimble startup with a core focus on developing the key sensor used in autonomous vehicles allows the Detroit-based auto giant to speed its path to market with a self-driving car.

Kyle Vogt, GM’s Cruise Automation Founder and CEO, said through a press release, “Strobe’s LIDAR technology will significantly improve the cost and capabilities of our vehicles so that we can more quickly accomplish our mission to deploy driverless vehicles at scale.”

While GM continues to charge forward with implementing LIDAR technology into its self-driving program, the company also complements that technology with radar sensors to create a fault-tolerant sensing suite. Tesla CEO Elon Musk has famously dismissed LIDAR as ‘unnecessary’ in the context of an autonomous car due to its high cost. Instead, Tesla has opted to use a combination of cameras, radars, and ultrasonic sensors to form the foundation for its Autopilot system. But as pricing for LIDAR technology continues to drop, could we see a change to the core design in future versions of Autopilot?

Vogt asserts that radar can operate under more challenging weather conditions; however, it lacks the precision needed when making critical maneuvers at speed. “Strobe’s LIDAR sensors provide both accurate distance and velocity information, which can be checked against similar information from a RADAR sensor for redundancy. RADARs typically also provide distance and velocity information and operate under more challenging weather conditions, but they lack the angular resolution needed to make certain critical maneuvers at speed. When used together, cameras, LIDARs, and RADARs can complement each other to create a robust and fault-tolerant sensing suite that operates in a wide range of environmental and lighting conditions,” said Vogt in a blog post on Medium.

LIDAR, on the other hand, uses laser or concentrated light to map a high-resolution 3D view of the world, which arguably provides a more precise view of a self-driving car’s surroundings. Scott Miller, GM’s director of autonomous vehicle integration, has recently spoken up against Musk’s narrative that Tesla Autopilot will be fully autonomous and capable of piloting a car from California to New York on its own by the end of the year.

“The level of technology and knowing what it takes to do the mission, to say you can be a full level five with just cameras and radars is not physically possible,” said Miller about Tesla’s Autopilot suite. “Could you do it with what’s in a current Tesla Model S? I don’t think so.”

As the race to produce a fully autonomous car heats up between Tesla, GM, Uber, and Google, and hardware prices decline, it’s only a matter of time before a tried-and-true combination of hardware becomes the de facto self-driving sensor suite. What will it be?

Starlink passes 9 million active customers just weeks after hitting 8 million

The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.

Credit: Starlink/X

SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark. 

9 million customers

In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or about 21,275 new users on average per day. 

“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote. 

That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.

Starlink’s momentum

Starlink’s momentum has been building for some time. SpaceX reported 4.6 million Starlink customers in December 2024, followed by 7 million by August 2025, and 8 million customers in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.

Starlink’s momentum is increasingly tied to SpaceX’s broader financial outlook. Elon Musk has said the satellite network is “by far” the company’s largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also suggested in the past that Starlink could have its own IPO in the future. 

NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like a routine.

Credit: Grok Imagine

NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like a routine. And just like smartphones today, removing it now would “actively hurt.”

Jim Fan’s hands-on FSD v14 impressions

Fan, a leading researcher in embodied AI who works on Physical AI at NVIDIA and spearheads the company’s Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.

“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X. 

Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”

The Physical Turing Test

The original Turing Test was conceived by Alan Turing in 1950, and it was aimed at determining if a machine could exhibit behavior that is equivalent to or indistinguishable from a human. By focusing on text-based conversations, the original Turing Test set a high bar for natural language processing and machine learning. 

This test has been passed by today’s large language models. However, the capability to converse in a humanlike manner is a completely different challenge from performing real-world problem-solving or physical interactions. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.

Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.

Tesla AI team burns the Christmas midnight oil by releasing FSD v14.2.2.1

The update was released just a day after FSD v14.2.2 started rolling out to customers. 

Credit: Grok

Tesla is burning the midnight oil this Christmas, with the Tesla AI team quietly rolling out Full Self-Driving (Supervised) v14.2.2.1 just a day after FSD v14.2.2 started rolling out to customers. 

Tesla owner shares insights on FSD v14.2.2.1

Longtime Tesla owner and FSD tester @BLKMDL3 shared some insights following several drives with FSD v14.2.2.1 in rainy Los Angeles conditions with standing water and faded lane lines. He reported zero steering hesitation or stutter, confident lane changes, and maneuvers executed with precision that evoked the performance of Tesla’s driverless Robotaxis in Austin.

Parking performance impressed as well, with most spots nailed perfectly in a single attempt, including tight spots requiring sharp turns, and without shaky steering. The one minor offset occurred only because another vehicle was parked over the line, which FSD accommodated by shifting a few extra inches. In rain that typically erases road markings, FSD visualized lanes and turn lines better than a human driver could, positioning itself flawlessly when entering new streets.

“Took it up a dark, wet, and twisty canyon road up and down the hill tonight and it went very well as to be expected. Stayed centered in the lane, kept speed well and gives a confidence inspiring steering feel where it handles these curvy roads better than the majority of human drivers,” the Tesla owner wrote in a post on X.

Tesla’s FSD v14.2.2 update

Just a day before FSD v14.2.2.1’s release, Tesla rolled out FSD v14.2.2, which was focused on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing. According to the update’s release notes, FSD v14.2.2 upgrades the vision encoder neural network with higher resolution features, enhancing detection of emergency vehicles, road obstacles, and human gestures.

New Arrival Options also let users select preferred drop-off styles, such as Parking Lot, Street, Driveway, Parking Garage, or Curbside, with the navigation pin automatically adjusting to the ideal spot. Other refinements include pulling over for emergency vehicles, real-time vision-based detours for blocked roads, improved gate and debris handling, and Speed Profiles for customized driving styles.
