News
Tesla Autopilot veterans launch company to accelerate self-driving development
After working on Tesla’s Autopilot team for 2.5 years, Andrew Kouri and Erik Reed decided to start their own self-driving AI company, aptly named lvl5. Together with former iRobot engineer George Tall, lvl5 aims to develop advanced vision software and HD maps for self-driving cars.
Founded in 2016, lvl5 was incubated at the renowned Silicon Valley incubator Y Combinator and later raised $2 million in seed funding from Paul Buchheit, a Y Combinator partner and the creator of Gmail, and Max Altman’s 9Point Ventures.

In just 3 months, lvl5 racked up almost 500,000 miles of US roadway coverage with Payver. (Photo: lvl5)
“Working with lvl5’s founders while they were at Y Combinator, it was clear they have unmatched expertise in computer vision, which is the secret sauce of their solution,” said Buchheit. “I have no doubt this is the team to make self-driving a reality in the near term.”
At the center of lvl5’s technology are its computer vision algorithms. Founder and CTO George Tall previously specialized in computer vision at iRobot, while Kouri and Reed’s time on Tesla’s Autopilot team gave them deep expertise of their own.
Instead of turning to expensive LiDAR sensors, lvl5’s computer vision analyzes the environment for stoplights, signs, potholes, and other objects. The system can be accurate to 10 cm, a notable level of precision considering the data comes from simple cameras and smartphones. By comparison, LiDAR systems can cost over $80,000 but are accurate to 3 cm.
- Each purple trace through the intersection contributes to building the 3D map from a 2D image. For each frame, lvl5’s computer vision technology computes the position of the vehicle relative to other objects in the intersection and creates a point cloud that resembles the output from LiDAR. Each white sideways “pyramid” represents the location of a captured frame in the video trace. (Photo: lvl5)
- This image is taken from one of lvl5’s neural nets, which is designed to draw a box around the position of traffic lights in an image. (Photo: lvl5)
- With only two trips through this intersection, lvl5 can start to extract semantic features such as a stop sign. (Photo: lvl5)
- The three founders of lvl5 in front of their SF home. Left to right: Erik Reed, Andrew Kouri, George Tall (Photo: lvl5)
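The captions above describe how lvl5 turns overlapping 2D video frames into a LiDAR-like point cloud: each landmark is sighted from multiple known camera positions, and its 3D location falls where the viewing rays converge. A minimal sketch of that idea, triangulating one landmark from two rays (the camera positions, direction vectors, and landmark coordinates below are illustrative, not lvl5’s actual pipeline):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def triangulate(p1, d1, p2, d2):
    """Estimate a 3D landmark as the midpoint of closest approach
    between two viewing rays p1 + t1*d1 and p2 + t2*d2."""
    r = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b  # near zero when the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = [p + t1 * v for p, v in zip(p1, d1)]  # closest point on ray 1
    q2 = [p + t2 * v for p, v in zip(p2, d2)]  # closest point on ray 2
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two camera poses a few meters apart, each sighting the same stop sign.
cam1, cam2 = [0.0, 0.0, 1.5], [4.0, 0.0, 1.5]
sign = [10.0, 5.0, 2.0]  # ground-truth landmark position (meters)
est = triangulate(cam1, sub(sign, cam1), cam2, sub(sign, cam2))
print(est)  # -> [10.0, 5.0, 2.0]
```

In practice the direction vectors come from detected image features and the camera poses from GPS plus visual odometry, and many noisy rays per landmark are averaged, which is how camera-only data can approach the 10 cm accuracy described above.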
So how will lvl5 map the world’s roadways using its computer vision technology? Smartphones. Well, for now at least. The company has released an app called Payver that lets anyone collect driving data with their smartphone and get paid between $0.01 and $0.05 per mile, depending on a number of factors. Users place their phone in a mount on their dashboard and let the app gather driving data.
The data is sent to lvl5’s central hub and processed by their computer vision technology. “Lvl5 is solving one of the biggest obstacles to widespread availability of self-driving technology,” said Max Altman, one of lvl5’s seed round investors and partner at 9Point Ventures. “Without accurate and efficient HD mapping, as well as the computer vision software that enables it, self-driving vehicles will take much longer to reach mass-market. This will delay everything from safer roads to efficient delivery services.”
“We have to make self-driving available worldwide – not just in California,” Co-Founder and CEO Andrew Kouri said in a company statement. “Our approach, which combines computer vision software, crowdsourcing and widely available, affordable hardware, means our technology is accessible and will make self-driving a reality today, rather than five years from now.”
The company has already established pilot programs with major automakers and both Uber and Lyft. Companies will pay lvl5 an initial fee to use the maps, along with a monthly subscription to keep the maps continuously updated. “Through its OEM-agnostic approach, lvl5 will be able to collect significant amounts of mapping data from millions of cars in order to scale the technology for the benefit of drivers and pedestrians around the world,” the company’s press release states.
Starlink passes 9 million active customers just weeks after hitting 8 million
The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.
SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark.
9 million customers
In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or an average of about 21,275 new users per day.
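The per-day figure is simple arithmetic; working backwards from the reported average (the exact milestone dates are not stated, so the window length here is derived, not official):

```python
added = 9_000_000 - 8_000_000  # subscribers gained between milestones
rate = 21_275                  # reported average new users per day
days = added / rate
# Roughly 47 days, i.e. just under seven weeks, matching the article.
print(f"{days:.1f} days ~ {days / 7:.1f} weeks")  # -> 47.0 days ~ 6.7 weeks
```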
“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote.
That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.
Starlink’s momentum
Starlink’s momentum has been building steadily. SpaceX reported 4.6 million Starlink customers in December 2024, 7 million by August 2025, and 8 million in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.
Starlink’s momentum is increasingly tied to SpaceX’s broader financial outlook. Elon Musk has said the satellite network is “by far” the company’s largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also suggested in the past that Starlink could have its own IPO in the future.
NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”
After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel routine.
NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”
After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel routine. And just like smartphones today, removing it now would “actively hurt.”
Jim Fan’s hands-on FSD v14 impressions
Fan, a leading researcher in embodied AI who heads Physical AI work at NVIDIA and spearheads the company’s Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.
“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X.
Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”
The Physical Turing Test
The original Turing Test, conceived by Alan Turing in 1950, was aimed at determining whether a machine could exhibit behavior indistinguishable from a human’s. By focusing on text-based conversation, it set a high bar for natural language processing and machine learning.
Today’s large language models can arguably pass this test. However, conversing in a humanlike manner is a completely different challenge from real-world problem-solving or physical interaction. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.
Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.
Tesla AI team burns the Christmas midnight oil by releasing FSD v14.2.2.1
The update was released just a day after FSD v14.2.2 started rolling out to customers.
Tesla is burning the midnight oil this Christmas, with the Tesla AI team quietly rolling out Full Self-Driving (Supervised) v14.2.2.1 just a day after FSD v14.2.2 started rolling out to customers.
Tesla owner shares insights on FSD v14.2.2.1
Longtime Tesla owner and FSD tester @BLKMDL3 shared some insights following several drives with FSD v14.2.2.1 in rainy Los Angeles conditions with standing water and faded lane lines. He reported zero steering hesitation or stutter, confident lane changes, and maneuvers executed with precision that evoked the performance of Tesla’s driverless Robotaxis in Austin.
Parking performance impressed as well: most spots were nailed in a single attempt without shaky steering, even those requiring tight, sharp turns. The one minor offset occurred because another vehicle was parked over the line, which FSD accommodated by shifting a few extra inches. In rain that typically erases road markings, FSD visualized lanes and turn lines better than humans, positioning itself flawlessly when entering new streets as well.
“Took it up a dark, wet, and twisty canyon road up and down the hill tonight and it went very well as to be expected. Stayed centered in the lane, kept speed well and gives a confidence inspiring steering feel where it handles these curvy roads better than the majority of human drivers,” the Tesla owner wrote in a post on X.
Tesla’s FSD v14.2.2 update
Just a day before FSD v14.2.2.1’s release, Tesla rolled out FSD v14.2.2, which was focused on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing. According to the update’s release notes, FSD v14.2.2 upgrades the vision encoder neural network with higher resolution features, enhancing detection of emergency vehicles, road obstacles, and human gestures.
New Arrival Options also let users select preferred drop-off styles, such as Parking Lot, Street, Driveway, Parking Garage, or Curbside, with the navigation pin automatically adjusting to the ideal spot. Other refinements include pulling over for emergency vehicles, real-time vision-based detours for blocked roads, improved gate and debris handling, and Speed Profiles for customized driving styles.


