

News
Tesla Autopilot veterans launch company to accelerate self-driving development
After 2.5 years on Tesla’s Autopilot team, Andrew Kouri and Erik Reed decided to start their own AI-based self-driving company, aptly named lvl5. Together with former iRobot engineer George Tall, lvl5 aims to develop advanced vision software and HD maps for self-driving cars.
Founded in 2016, lvl5 was incubated at the renowned Silicon Valley incubator Y Combinator and later raised $2 million in seed funding from Paul Buchheit, a partner at Y Combinator and the creator of Gmail, and Max Altman’s 9Point Ventures.

In just 3 months, lvl5 racked up almost 500,000 miles of US roadway coverage with Payver. (Photo: lvl5)
“Working with lvl5’s founders while they were at Y Combinator, it was clear they have unmatched expertise in computer vision, which is the secret sauce of their solution,” said Buchheit. “I have no doubt this is the team to make self-driving a reality in the near term.”
At the center of lvl5’s technology are its computer vision algorithms. Co-founder and CTO George Tall previously specialized in computer vision at iRobot, and Kouri and Reed’s work on Autopilot at Tesla deepened the team’s expertise in the field.
Instead of relying on expensive LiDAR, lvl5’s computer vision system analyzes its environment for stoplights, signs, potholes, and other objects. The system can be accurate to within 10cm, a notable figure considering the data comes from simple cameras and smartphones. By comparison, LiDAR systems can cost over $80,000 but are accurate to within 3cm.
- Each purple trace through the intersection contributes to building the 3D map from a 2D image. For each frame, lvl5’s computer vision technology computes the position of the vehicle relative to other objects in the intersection and creates a point cloud that resembles the output from LiDAR. Each white sideways “pyramid” represents the location of a captured frame in the video trace. (Photo: lvl5)
- This image is taken from one of lvl5’s neural nets, which is designed to draw a box around the position of traffic lights in an image. (Photo: lvl5)
- With only two trips through this intersection, lvl5 can start to extract semantic features such as a stop sign. (Photo: lvl5)
- The three founders of lvl5 in front of their SF home. Left to right: Erik Reed, Andrew Kouri, George Tall (Photo: lvl5)
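The mapping process the captions describe, estimating the vehicle’s position each frame and locating the objects it sees, rests on a classic idea: two camera observations of the same landmark from different positions pin down its location. The following toy triangulation sketch illustrates that general technique in 2D; it is an illustration only, not lvl5’s actual pipeline, and all names are hypothetical.

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing rays from known vehicle positions (2D toy).

    p1, p2: (x, y) vehicle positions at two frames.
    theta1, theta2: bearing angles (radians) to the same landmark,
    e.g. a traffic light detected in both frames.
    Returns the landmark's (x, y); repeating this over many detections
    is the intuition behind building a point cloud from camera passes.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 = p2 + t2*d2 as a 2x2 linear system (Cramer's rule).
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        raise ValueError("rays are parallel; a longer baseline is needed")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

In a real system the vehicle poses themselves are estimated (structure from motion), and many noisy observations are fused in a least-squares sense rather than intersected exactly.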
So how will lvl5 map the world’s roadways using its computer vision technology? Smartphones, at least for now. The company has released an app called Payver that lets anyone collect data with a smartphone while driving and earn between $0.01 and $0.05 per mile, depending on a number of factors. Users mount their phone on the dashboard and let the app gather driving data.
The data is sent to lvl5’s central hub and processed by its computer vision technology. “Lvl5 is solving one of the biggest obstacles to widespread availability of self-driving technology,” said Max Altman, one of lvl5’s seed round investors and a partner at 9Point Ventures. “Without accurate and efficient HD mapping, as well as the computer vision software that enables it, self-driving vehicles will take much longer to reach mass-market. This will delay everything from safer roads to efficient delivery services.”
“We have to make self-driving available worldwide – not just in California,” Co-Founder and CEO Andrew Kouri said in a company statement. “Our approach, which combines computer vision software, crowdsourcing and widely available, affordable hardware, means our technology is accessible and will make self-driving a reality today, rather than five years from now.”
The company has already established pilot programs with major automakers and both Uber and Lyft. Companies will pay lvl5 an initial fee to use the maps, along with a monthly subscription to keep the maps continuously updated. “Through its OEM-agnostic approach, lvl5 will be able to collect significant amounts of mapping data from millions of cars in order to scale the technology for the benefit of drivers and pedestrians around the world,” the company’s press release states.
News
Tesla lands regulatory green light for Robotaxi testing in new state
This will be the third state in total where Tesla is operating Robotaxi, following Austin and California.

Tesla has landed a regulatory green light to test its Robotaxi platform in a new state, less than three months after the ride-hailing service launched in Texas.
Tesla first launched its driverless Robotaxi service in Austin, Texas, on June 22. Rides were initially offered to only a small group of people, but that restraint did not last long.
It continued to expand the rider population, the service area, and the vehicle fleet in Austin.
The company has also launched rides in the Bay Area, though there a safety person sits in the driver’s seat. In Austin, the “Safety Monitor” rides in the passenger seat during local trips and moves to the driver’s seat for routes that involve highway driving.
Tesla is also testing the Robotaxi platform in other states. We reported that it was testing in Tempe, Arizona, where validation vehicles are traveling around the city in preparation for Robotaxi.
Tesla is also hoping to launch in Florida and New York, as job postings have shown the company’s intention to operate there.
However, it appears it will launch in Nevada before those states, as the company submitted its application to obtain a Testing Registry certification on September 3. It was processed by the state’s Department of Motor Vehicles Office of Business Licensing on September 10.
NEWS: Tesla has officially received approval from the Nevada DMV to start testing autonomous vehicles (robotaxis) on public roads.
Today, I confirmed directly with the Nevada DMV that @Tesla‘s application to obtain a Testing Registry certification was approved by the DMV Office… pic.twitter.com/hx5JhHBFiD
— Sawyer Merritt (@SawyerMerritt) September 11, 2025
Tesla will then need to self-certify for operations, which essentially means complying with various state requirements.
CEO Elon Musk has stated that he believes Robotaxi will be available to at least half of the U.S. population by the end of the year. Geographically, Tesla will need to make incredible strides over the final four months of the year to achieve this.
News
Tesla is improving this critical feature in older vehicles

With a new software update, Tesla is improving a critical feature that had been missing from older vehicles.
Tesla vehicles offer a comprehensive suite of driver assistance features, some of which assist with driving itself, while others visualize the vehicle’s surroundings.
One of those features is Driver Visualization, and with Software Update 2025.32.2, owners of Intel-based Tesla vehicles are receiving an upgrade to it.
The update provides new visualizations while Intel-based vehicles are in reverse, a view that was not previously available on those cars.
The improvement was spotted by Not a Tesla App via TheBeatYT_evil:
Noticed something new in 2025.32.2 on my Intel MCU + USS car with FSD.
When shifting into reverse, the full FSD visualization now stays on instead of switching to the old plain autopilot visuals.
Might be small, but it makes backing up feel more seamless. pic.twitter.com/o44levkdtM
— Beat (@TheBeatYT_evil) September 5, 2025
Older Tesla vehicles were equipped with Intel-based infotainment processors, while newer cars use an AMD chip capable of rendering these visualizations in real time. Intel-based cars could display the visualizations when driving forward, but not in reverse; that is the gap this change closes.
It is a good sign for owners of Intel-based vehicles, as Tesla appears to be paying attention to the gaps in those cars’ capabilities and closing them.
This was an undocumented improvement associated with this particular update, so you will not find any mention of it in the release notes that Tesla distributes with each update.
News
Tesla looks to make a big splash with Robotaxi in a new market
Tesla has been transparent that it is prioritizing safety, but it believes it can expand to basically any geographical location within the United States and find success with its Robotaxi suite. CEO Elon Musk said it could be available to half of the U.S. population by the end of the year.

Tesla is looking to make a big splash with Robotaxi in a new market, as the company was spotted testing validation vehicles in one region where it has not yet launched its ride-hailing service.
After launching Robotaxi in Austin in late June, Tesla followed up with a relatively quick expansion to the Bay Area of California. Both service areas are operating with a geofence that is expansive: In Texas, it is 173 square miles, while in the Bay Area, it is roughly 400 square miles.
There have been plenty of reports speculating about where Tesla will test Robotaxi next, with Nevada, Florida, Arizona, and New York all in the realm of possibility. Each of these states will need to grant regulatory approval before Robotaxi can officially operate there.
Tesla is still testing and performing validation in several regions, and in Tempe, Arizona, things are moving forward: a Model Y with a LiDAR rig was spotted collecting ground-truth data for the platform:
🚨 BREAKING: Just caught Tesla Robotaxi test vehicles cruising in Tempe, AZ! Rollout coming soon! pic.twitter.com/Oanw0Zx5pP
— Adub08 (@adub0808) September 10, 2025
Given Tesla’s well-known vision-based approach, many followers of the self-driving and autonomy space might wonder why the company mounts LiDAR rigs on its validation vehicles.
LiDAR is used for “ground truth”: an independent measurement that confirms what the cameras on the car are seeing. It is an effective way to verify the accuracy of the vision-based suite, and the rigs will not appear on Robotaxi units used within the ride-hailing service.
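The ground-truth idea can be made concrete with a small sketch: score a camera-derived map by measuring how far each of its points lies from the nearest LiDAR return. This is a generic illustration of the validation concept, not Tesla’s actual tooling, and the function name is hypothetical.

```python
import math

def validation_error(camera_pts, lidar_pts):
    """Mean distance from each camera-derived map point to its nearest
    LiDAR ("ground truth") point. Points are (x, y) tuples; the result
    is in the same units as the inputs. A low mean error indicates the
    vision-based map agrees with the LiDAR reference.
    """
    def nearest(p, cloud):
        # Brute-force nearest neighbor; real pipelines would use a k-d tree.
        return min(math.dist(p, q) for q in cloud)
    return sum(nearest(p, lidar_pts) for p in camera_pts) / len(camera_pts)
```

Production validation works on dense 3D point clouds and uses spatial indexes for the nearest-neighbor search, but the comparison principle is the same.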
The Robotaxi platform was made available to the public earlier this month, as Tesla launched its app for iOS users.
Downloading the app lets you join a waitlist for the opportunity to use and test the Robotaxi platform in either Austin or the Bay Area.