News
Google’s DeepMind unit develops AI that predicts 3D layouts from partial images
Google’s DeepMind unit, the same division that created AlphaGo, the AI that defeated the world’s best Go player, has built a neural network capable of rendering an accurate 3D environment from just a few still images, filling in the gaps with an AI form of perceptual intuition.
According to Google’s official DeepMind blog, the goal of its recent AI project is to make neural networks easier and simpler to train. Today’s most advanced AI-powered visual recognition systems are trained on large datasets of human-annotated images. This makes training a tedious, lengthy, and expensive process, as every aspect of every object in each scene has to be labeled by a person.
The DeepMind team’s new AI, dubbed the Generative Query Network (GQN), is designed to remove this dependency on human-annotated data: the GQN infers a space’s three-dimensional layout and features even when given only partial images of it.
Similar to babies and animals, DeepMind’s GQN learns by observing the world around it. In doing so, the new AI learns about plausible scenes and their geometric properties without any human labeling. The GQN consists of two parts: a representation network that produces a vector describing a scene, and a generation network that “imagines” the scene from a previously unobserved viewpoint. So far, the results of DeepMind’s training have been encouraging, with the GQN able to create representations of objects and rooms from just a single image.
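The two-part dataflow described above can be sketched in a few lines of Python. This is a toy illustration with random weights standing in for trained networks, not DeepMind’s actual model; all dimensions and function names here are assumptions. The one faithful idea is that per-observation representations are summed into a single scene vector, so the system works whether it sees one image or several:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not DeepMind's actual sizes)
IMG_DIM, VIEW_DIM, REP_DIM = 64, 7, 16

# Random weights standing in for trained networks
W_rep = rng.standard_normal((IMG_DIM + VIEW_DIM, REP_DIM)) * 0.1
W_gen = rng.standard_normal((REP_DIM + VIEW_DIM, IMG_DIM)) * 0.1

def represent(image, viewpoint):
    """Representation network: encode one (image, viewpoint) pair as a vector."""
    return np.tanh(np.concatenate([image, viewpoint]) @ W_rep)

def generate(scene_rep, query_viewpoint):
    """Generation network: 'imagine' the view from a new, unobserved viewpoint."""
    return np.tanh(np.concatenate([scene_rep, query_viewpoint]) @ W_gen)

# A few observations of the same scene from known viewpoints
observations = [(rng.standard_normal(IMG_DIM), rng.standard_normal(VIEW_DIM))
                for _ in range(3)]

# Key GQN idea: per-observation representations are summed into one scene
# vector, so no human labels are needed -- only images and camera poses
scene_rep = sum(represent(img, vp) for img, vp in observations)

# Render the scene from a previously unobserved viewpoint
predicted = generate(scene_rep, rng.standard_normal(VIEW_DIM))
print(predicted.shape)  # (64,)
```

Because supervision comes only from images and viewpoints, no per-object annotation ever enters the loop, which is precisely the dependency the GQN is meant to remove.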
As the DeepMind team notes, however, the training methods used to develop the GQN are still limited compared to traditional computer vision techniques. The AI’s creators remain optimistic that as new sources of data become available and hardware improves, the GQN framework could extend to higher-resolution images of real-world scenes. Ultimately, the DeepMind team believes the GQN could prove useful in technologies such as augmented reality and self-driving vehicles by giving them a form of perceptual intuition, a capability highly desirable for companies focused on autonomy, like Tesla.

Google DeepMind’s GQN AI in action. [Credit: Google DeepMind]
In a talk at Train AI 2018 last May, Tesla’s head of AI Andrej Karpathy discussed the challenges involved in training the company’s Autopilot system. Tesla trains Autopilot by feeding it massive datasets gathered from the company’s fleet of vehicles. This data is collected through means such as Shadow Mode, which lets the company gather statistics on the false positives and false negatives of Autopilot’s software.
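The false-positive/false-negative bookkeeping that Shadow Mode enables can be illustrated with a simple tally. This is a generic sketch of the statistical idea, not Tesla’s actual pipeline; the event format and function name are invented for illustration:

```python
def tally_shadow_mode(events):
    """Compare what the shadow software WOULD have done against what the
    human driver actually did, and tally agreements and disagreements.

    Each event is a (software_would_act, human_acted) pair of booleans.
    """
    counts = {"true_pos": 0, "false_pos": 0, "false_neg": 0, "true_neg": 0}
    for software, human in events:
        if software and human:
            counts["true_pos"] += 1    # both would act: agreement
        elif software and not human:
            counts["false_pos"] += 1   # software would have acted needlessly
        elif not software and human:
            counts["false_neg"] += 1   # software missed a needed action
        else:
            counts["true_neg"] += 1
    return counts

# Hypothetical logged events from a fleet
events = [(True, True), (True, False), (False, True),
          (False, False), (True, False)]
print(tally_shadow_mode(events))
# {'true_pos': 1, 'false_pos': 2, 'false_neg': 1, 'true_neg': 1}
```

The appeal of this setup is that the human driver serves as free ground truth: no annotator has to label anything for the false-positive and false-negative rates to be measured at fleet scale.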
During his talk, Karpathy explained why features such as blinker detection are challenging for Tesla’s neural network to learn: vehicles on the road have their turn signals off most of the time, and blinkers vary widely from one car brand to another. Karpathy also noted that Tesla has transitioned a huge portion of its AI team to labeling roles, doing exactly the human annotation that Google DeepMind wants to avoid with the GQN.
Musk has also mentioned that Tesla’s upcoming all-electric supercar, the next-generation Tesla Roadster, will feature an “Augmented Mode” designed to enhance drivers’ ability to operate the high-performance vehicle. With Tesla’s flagship supercar seemingly set to embrace AR technology, new AI training techniques such as Google DeepMind’s GQN would be a perfect fit for the next generation of vehicles about to enter the automotive market.
News
Tesla revises new Intervention Reporting system with Full Self-Driving
It is the second revision to the program as Tesla is trying to make it easier to decipher driver and owner complaints, but also to make it easier to report issues within the suite for them.
Tesla has revised its new Intervention Reporting system within the Full Self-Driving suite, which now categorizes the reasons drivers take over while the semi-autonomous driving functionality is active.
With the initial rollout of Full Self-Driving v14.3.2, Tesla included a new reporting menu that gave four options for an intervention: Preference, Comfort, Critical, and Other. A slightly revised version of Full Self-Driving with the same ID number then came out a few days later, changing the “Other” option to “Navigation” after numerous complaints from owners.
It appears Tesla has listened to those owners once again, making the menu smaller and more compact, and making it easier to report issues than before.
The new menu is now embedded within Tesla’s request for a Voice Memo and no longer blocks the entire screen, as the second rollout of the menu did:
Thank you Tesla! The new intervention screen is much better! @Tesla_AI pic.twitter.com/1lea9G27N1
— Dirty Tesla (@DirtyTesLa) May 1, 2026
There will likely be one additional revision to the Interventions Menu, as we have coined it here at Teslarati.
Unfortunately, there is sometimes no reason for an intervention at all, yet the menu offers no way to simply skip the report and forces the driver to choose one of the options. We, along with other notable Tesla influencers, have pointed out that there is not always a reason for an intervention.
For example, I choose to back into my parking spot in my neighborhood at least some of the time so that I can charge. I usually tap “Preference” for this, but that sends Tesla a false positive, implying I took over because of something I was unhappy with.
Tesla begins probing owners on FSD’s navigation errors with small but mighty change
Instead, I’m simply performing a maneuver that is not yet available to us. When Tesla allows drivers to choose the orientation at which their car enters a parking spot, I and many others won’t have to deal with this menu.
Others are still skeptical that it will help resolve any issues whatsoever and prefer to disregard the menu altogether. It does seem as if Tesla will issue another revision in the coming days to allow this to happen.
Lifestyle
California hits Tesla Cybercab and Robotaxi driverless cars with new law
California just gave police power to ticket driverless cars, including Tesla’s Cybercab fleet.
The California DMV formally adopted new rules on April 29, 2026, that allow law enforcement to issue “notices of noncompliance,” in other words, to ticket autonomous vehicle companies when their cars commit moving violations. The rules take effect July 1, 2026, and officially close a regulatory gap that previously let driverless cars operate on public roads with almost no traffic enforcement consequences.
Until now, state traffic laws applied only to human “drivers,” meaning that when no person was behind the wheel, police had no mechanism to issue a ticket. Officers were limited to citing driverless vehicles for parking violations. A well-known example came in September 2025, when a San Bruno officer watched a Waymo robotaxi execute an illegal U-turn and could do nothing but notify the company.
Under the new framework, when an officer observes a violation, the autonomous vehicle company is effectively treated as the driver. Companies must report each incident to the DMV within 72 hours, or 24 hours if a collision is involved. Repeated violations can result in fleet size restrictions, operational suspensions, or full permit revocation. Local officials also gained new authority to geofence driverless vehicles out of active emergency zones within two minutes and require a live emergency response line answered within 30 seconds.
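The reporting deadlines in the new framework reduce to simple clock arithmetic. The windows below come from the rules as described above, while the function name and datetime handling are illustrative assumptions, not part of any official DMV tooling:

```python
from datetime import datetime, timedelta

# Reporting windows under the new rules: 72 hours for a moving
# violation, tightened to 24 hours if a collision is involved
REPORTING_WINDOWS = {
    "violation": timedelta(hours=72),
    "collision": timedelta(hours=24),
}

def dmv_report_deadline(incident_time: datetime, incident_type: str) -> datetime:
    """Return the latest time an AV company may report the incident to the DMV."""
    return incident_time + REPORTING_WINDOWS[incident_type]

incident = datetime(2026, 7, 2, 9, 30)
print(dmv_report_deadline(incident, "collision"))  # 2026-07-03 09:30:00
print(dmv_report_deadline(incident, "violation"))  # 2026-07-05 09:30:00
```

The collision window being a third of the violation window underlines how much faster regulators expect companies to move when safety, rather than mere compliance, is at stake.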
Tesla Cybercab ramps Robotaxi public street testing as vehicle enters mass production queue
California’s new enforcement rules arrive at a pivotal moment for Tesla. The company is ramping Cybercab production at Giga Texas toward hundreds of units per week, targeting at least 2 million units annually at full capacity, while simultaneously pushing to expand its Robotaxi service to dozens of U.S. cities by the end of 2026. Unsupervised FSD for consumer vehicles is currently targeted for Q4 2026, and when it arrives, under the July 1 rules Tesla’s fleet may no longer have a human driver to absorb legal accountability.
Tesla has confirmed plans to expand its Robotaxi service to seven new cities in the first half of 2026: Dallas, Houston, Phoenix, Miami, Orlando, Tampa, and Las Vegas. The service is already running without safety drivers in Austin, and Musk has said he expects robotaxis to cover between a quarter and half of the United States by the end of the year.
News
Tesla Model X shocks everyone by crushing every other used car in America
The Model X is one of Tesla’s flagship models, the other being the Model S. Earlier this year, Tesla confirmed it would discontinue production of both the Model S and Model X to make way for Optimus robot production at the Fremont Factory in Northern California.
The Tesla Model X was the fastest-selling used vehicle in the United States in the first quarter of the year, crushing every other used car in America.
iSeeCars data for the first quarter shows that the Model X was the fastest-selling used car, spending an average of just 25.6 days on the market, roughly two days fewer than the second-place Lexus RX 350h. The Cybertruck, Model Y, and Model S also made the list, in seventh, ninth, and thirteenth place, respectively.
Tesla brings closure to flagship ‘sentimental’ models, Musk confirms
Bringing closure to these two vehicles marked the end of the road for the cars that effectively built Tesla’s reputation for luxury, high-end passenger vehicles. Tesla has instead chosen to rely on sales of its mass-market Model Y and Model 3 while leaning on the success of future products like the Cybercab.
Teslas are also performing extremely well as a whole on the resale market. iSeeCars data shows that, “while the average price of a 1- to 5-year-old non-Tesla EV fell 10.3% in Q1 2026 year-over-year, the average price of a used Tesla was essentially flat at 0.1% lower across the same period. Traditional gas car prices dropped 2.8% during this same period.”
Additionally, market share for gas cars has dropped nearly 3 percent since the same quarter last year. Tesla’s share has held level, while non-Tesla EV market share has grown 30 percent, mostly because more models are now available.
Nevertheless, those non-Tesla EVs have seen their value drop by over 10 percent, while Tesla’s values have remained level.
Executive Analyst Karl Brauer said:
“Used electric vehicles without a Tesla badge have lost more than 10% of their value in the past year. This compares to stable values for Teslas and hybrids, and a modest 2.8% drop for traditional gasoline vehicles.”
Teslas, along with non-luxury hybrids, are showing the strongest price resilience in the face of faltering demand, the publication says. But the more impressive performance is that of the Model X alone.
Tesla’s decision to end Model X production may have played some part in the vehicle’s standout Q1 performance. With the car positioned at a premium price point, used models become more appealing to consumers. Perhaps second-hand versions were more than enough for those who wanted a Model X, and only a Model X.