
Tesla Autopilot now enables the car to perceive space around it

Credit: Ashok Elluswamy


Tesla Autopilot now enables the car to perceive the space around it thanks to the development of its Occupancy Networks. Tesla’s Autopilot Software Director, Ashok Elluswamy, shared a detailed Twitter thread, along with a video, about a recent workshop the Autopilot team held.

In the video and Twitter thread, Ashok explained how Tesla developed Occupancy Networks to give the car a sense of its surroundings. Humans can understand the objects around them at any given time: Is that car down the road moving slowly or quickly? Do I, a pedestrian, have enough time to cross the street before being hit? What is that in the middle of the road? What is that falling from the sky? I should move out of the way.


These split-second reactions to scenarios come naturally to humans. Tesla’s Autopilot team is working to program its vehicles to do the same, and this will save lives: imagine the car correctly detecting its surroundings even while the driver isn’t paying attention. One example is preventing sudden unintended acceleration (SUA); Ashok pointed out that Autopilot prevents around 40 such accidents daily.

The workshop was held in June at this year’s Conference on Computer Vision and Pattern Recognition (CVPR) in New Orleans. Ashok explained that the team developed Occupancy Networks, which enable the car to predict the volumetric occupancy of everything around it.

Ashok explained that typical approaches, such as image-space segmentation of free space or pixel-wise depth estimation, have many issues, and that Occupancy Networks are the solution to those issues.

In other words, Occupancy Networks enable the car to perceive the space around it and determine whether or not it can drive in that space. For example, if a UFO were to suddenly crash in front of you while you’re driving, you would react quickly in the safest way possible. This is what the Autopilot Team is training the software to do.
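As a rough illustration of the idea (not Tesla’s actual implementation; all names and numbers here are hypothetical), an occupancy network can be thought of as a function that maps a 3D point to the probability that the point is occupied, with the planner treating low-occupancy space as drivable:

```python
def toy_occupancy_network(x, y, z):
    """Stand-in for a learned occupancy model: returns the probability
    in [0, 1] that the 3D point (x, y, z) is occupied. For illustration
    we hard-code a single 1 m-wide box obstacle centered 2 m ahead."""
    inside = abs(x - 2.0) < 0.5 and abs(y) < 0.5 and z < 1.5
    return 0.95 if inside else 0.05

def is_drivable(x, y, z, threshold=0.5):
    """Treat a cell as drivable when predicted occupancy is below threshold."""
    return toy_occupancy_network(x, y, z) < threshold

print(is_drivable(2.0, 0.0, 0.5))  # False: inside the obstacle
print(is_drivable(5.0, 0.0, 0.5))  # True: free space
```

The key point of the approach is that the car never has to classify the obstacle, only to predict that the volume is occupied.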


Ashok shared details of how Occupancy Networks use Neural Radiance Fields (NeRFs). “The occupancy representation of these networks allows for differentiable rendering of images (based on the Neural Radiance Fields work). However, unlike typical NeRFs, which are per scene, these occupancy nets generalize across scenes.”
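To give a sense of the NeRF-style rendering Ashok refers to (a simplified sketch under my own assumptions, not Tesla’s code), occupancy values sampled along a camera ray can be alpha-composited into an expected depth; because every step is differentiable, image-space losses can be backpropagated into the occupancy network:

```python
def render_depth_along_ray(occupancies, depths):
    """NeRF-style volume rendering along one camera ray:
    weight_i = alpha_i * prod_{j<i}(1 - alpha_j), using the predicted
    occupancy of each sample as alpha. The rendered depth is the
    weight-averaged depth of the samples."""
    transmittance = 1.0  # fraction of the ray that survives to this sample
    weighted_depth, total_weight = 0.0, 0.0
    for alpha, depth in zip(occupancies, depths):
        weight = alpha * transmittance
        weighted_depth += weight * depth
        total_weight += weight
        transmittance *= (1.0 - alpha)
    return weighted_depth / (total_weight + 1e-8)

# A ray whose samples hit a solid surface around 4 m out:
print(render_depth_along_ray([0.0, 0.0, 0.1, 0.9, 0.9],
                             [1.0, 2.0, 3.0, 4.0, 5.0]))  # ≈ 3.98
```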

You can read Ashok’s full Twitter thread here and watch his presentation here. We are a little over a month away from Tesla’s AI Day, and I’m sure Tesla will share more about the life-saving technology it is working on, as well as the Optimus Bot.


Dr. Know It All recently published a video about the new 10.69 update and shared his thoughts about Occupancy Networks.

In a message on Twitter, he told me, “The beauty of Occupancy Networks is that the car doesn’t have to know what the objects it sees are, it just has to know that they are there in order to avoid them!”

Note: Johnna is a Tesla shareholder and supports its mission. 

Your feedback is important. If you have any comments, concerns, or see a typo, you can email me at johnna@teslarati.com. You can also reach me on Twitter @JohnnaCrider1


Johnna Crider is a Baton Rouge writer covering Tesla, Elon Musk, EVs, and clean energy, and supports Tesla’s mission. Johnna also interviewed Elon Musk, and you can listen here.


Tesla is sending its humanoid Optimus robot to the Boston Marathon

Tesla’s Optimus robot is heading to the Boston Marathon finish line


Tesla’s Optimus humanoid robot will be stationed at the Tesla showroom at 888 Boylston Street in Boston, right along the final stretch of the Boston Marathon today, ready to cheer on runners and pose for photos with spectators.

According to a Tesla email shared by content creator Sawyer Merritt on X, Optimus will be at the Boston Boylston Street showroom on April 20, coinciding with Marathon Monday weekend. The Boston Marathon finishes on Boylston Street, and the surrounding area draws hundreds of thousands of spectators along with international broadcast coverage. Placing Optimus there puts it in front of a massive public audience at zero advertising cost.

The Tesla showroom is at 888 Boylston Street, between Gloucester Street and Fairfield Street. The final mile of the marathon runs directly along Boylston Street, with runners passing the big stores before reaching the finish line at Copley Square.

Optimus was first announced at Tesla’s AI Day event on August 19, 2021, when Elon Musk presented a vision for a general-purpose robot designed to take on dangerous, repetitive, and unwanted tasks. In March 2026, Optimus appeared at the Appliance and Electronics World Expo in Shanghai, where on-site staff stated that mass production of the robot could begin by the end of 2026. Before that, it showed up at the Tesla Hollywood Diner opening in July 2025 and at a Miami showroom event in December 2025.

Tesla’s well-calculated display of Optimus gives the public a low-pressure first encounter with a robot that Tesla is preparing to deploy at scale. The company has previously indicated plans to manufacture up to 1 million Optimus robots annually at its Fremont facility, with an Optimus production line at Gigafactory Texas targeting 10 million units per year.



Musk has said that Optimus “has the potential to be more significant than the vehicle business over time,” and separately that roughly 80 percent of Tesla’s future value will come from the robot program. Whether that holds depends on production execution. For now, Boston gets a preview of what that future looks like, standing at the finish line on Boylston Street while 32,000 runners pass by.


Tesla expands Unsupervised Robotaxi service to two new cities

This expansion builds directly on Tesla’s existing operations. Robotaxi has been ramping unsupervised rides in Austin for months and maintains activity in the San Francisco Bay Area.


Credit: Tesla

Tesla has taken a major step forward in its autonomous ride-hailing ambitions.

On April 18, the company’s official Robotaxi account announced that Robotaxi service is now rolling out in Dallas and Houston, Texas. The update signals the rapid scaling of unsupervised autonomous operations in the Lone Star State.

The announcement includes a compelling 14-second video captured from inside a Model Y. Shot from the passenger’s perspective, the footage shows the vehicle navigating suburban roads in both cities with zero driver intervention and no Safety Monitor in sight.

Tesla also shared geofence maps highlighting the initial service areas: a compact zone in Houston covering parts of Willowbrook and Jersey Village, and a similarly defined area in Dallas near Highland Park and central neighborhoods.



With Dallas and Houston now live, Texas hosts three active hubs—an impressive concentration that triples the company’s Lone Star footprint in just weeks. The move aligns with Tesla’s Q4 2025 earnings guidance, which outlined a broader H1 2026 rollout across seven U.S. cities, including Phoenix, Miami, Orlando, Tampa, and Las Vegas.

Texas offers favorable regulations, high ride-share demand, and relatively straightforward suburban-to-urban driving patterns ideal for early autonomous scaling. While initial geofences appear modest—roughly 25 square miles per city—Tesla has historically expanded these zones quickly as it gathers real-world data.



Unsupervised operation marks a critical milestone: passengers can summon, ride, and exit without safety drivers, a leap beyond many competitors still requiring human oversight.

For Tesla, the implications are significant. Successful scaling in major metros could accelerate the transition to a fully driverless fleet, unlocking new revenue streams and validating years of Full Self-Driving investment.

Riders gain convenient, potentially lower-cost mobility, while the company edges closer to Elon Musk’s vision of Robotaxis transforming urban transport.

As Tesla pushes into more cities this year, today’s launch in Dallas and Houston underscores its momentum. Hopefully, Tesla will be able to expand unsupervised rides to another U.S. state soon, which would mark yet another chapter in this short but encouraging Robotaxi story.


Tesla is pushing Robotaxi features to owner cars with Spring Update

Tesla has quietly begun rolling out one of its most forward-looking Robotaxi-inspired features to existing customer vehicles.


Tesla is starting to push Robotaxi features to owner cars, and the first instances are coming as the Spring 2026 Update starts to roll out.


With the 2026 Spring Update (version 2026.14+), the rear passenger display now features a fully interactive navigation map that works while the car is driving — a capability previously reserved for Tesla Robotaxi.

Until now, Tesla’s rear displays have been largely limited to media controls, climate settings, and static route overviews. The new interactive map transforms the backseat into an active navigation hub, exactly the kind of passenger-first interface Tesla has been prototyping for its driverless fleet.

In a Robotaxi, where no one sits behind the wheel, every rider will need intuitive, real-time map access. By shipping this UI into thousands of owner cars months ahead of the Cybercab’s planned unveiling, Tesla is stress-testing the software in real-world conditions and giving loyal customers an early taste of the autonomous future.

The rollout is still in its early wave. Only a small number of vehicles have received 2026.14.1 so far, but the feature is expected to expand rapidly in the coming weeks. Owners of Model S, Model X, Model 3, Model Y, and Cybertruck are all eligible.


For buyers of the new Signature Edition Model S and X Plaid vehicles — whose deliveries begin in May — the update will likely arrive shortly after they take delivery, meaning the final chapter of Tesla’s flagship lineup will ship with cutting-edge Robotaxi preview tech baked in.

Elon Musk has long emphasized that Tesla ships supporting infrastructure well before new products launch. This rear-map rollout is a textbook example of that philosophy — quietly preparing both the software and the customer base for a world of fully driverless rides.

While the interactive map may seem like a modest convenience upgrade on the surface, its deeper purpose is unmistakable. Tesla is using its massive installed base of vehicles as a proving ground for the exact passenger experience that will define the Robotaxi era.

For current owners, it’s a free preview of tomorrow’s mobility; for the company, it’s invaluable data and real-world validation before the Cybercab hits the streets.
