News
Google’s DeepMind unit develops AI that predicts 3D layouts from partial images
Google’s DeepMind unit, the same division that created AlphaGo, the AI that defeated the world’s best Go player, has built a neural network capable of rendering an accurate 3D environment from just a few still images, filling in the gaps with an AI form of perceptual intuition.
According to Google’s official DeepMind blog, the goal of its recent AI project is to make neural networks simpler to train. Today’s most advanced AI-powered visual recognition systems are trained on large datasets of human-annotated images. This makes training a tedious, lengthy, and expensive process, as every aspect of every object in each scene has to be labeled by a person.
The DeepMind team’s new AI, dubbed the Generative Query Network (GQN), is designed to remove this dependency on human-annotated data: it infers a space’s three-dimensional layout and features despite being provided with only partial images of that space.
Like babies and animals, DeepMind’s GQN learns by observing the world around it. In doing so, it learns about plausible scenes and their geometrical properties without human labeling. The GQN consists of two parts: a representation network that produces a vector describing a scene, and a generation network that “imagines” the scene from a previously unobserved viewpoint. So far, the results of DeepMind’s training have been encouraging, with the GQN able to create representations of objects and rooms from just a single image.
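The two-part design can be sketched in miniature. The toy networks below are hypothetical single-layer stand-ins with random weights (the real GQN uses convolutional encoders and a recurrent generator), but they show the shape of the idea: each (image, viewpoint) observation is encoded and summed into a single scene vector, which a second network decodes from any query viewpoint.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; real GQN inputs are full camera frames.
IMG_DIM, VIEW_DIM, REP_DIM = 64, 7, 16

# Representation network: encodes each observation and sums the
# results, so the scene vector is order-invariant across observations.
W_enc = rng.normal(size=(REP_DIM, IMG_DIM + VIEW_DIM)) * 0.1

def represent(observations):
    """Aggregate (image, viewpoint) pairs into one scene vector."""
    r = np.zeros(REP_DIM)
    for image, viewpoint in observations:
        r += np.tanh(W_enc @ np.concatenate([image, viewpoint]))
    return r

# Generation network: "imagines" an image from a query viewpoint,
# conditioned on the scene vector.
W_gen = rng.normal(size=(IMG_DIM, REP_DIM + VIEW_DIM)) * 0.1

def generate(scene_vector, query_viewpoint):
    return np.tanh(W_gen @ np.concatenate([scene_vector, query_viewpoint]))

# Even a single observation yields a (crude) scene representation.
obs = [(rng.normal(size=IMG_DIM), rng.normal(size=VIEW_DIM))]
scene = represent(obs)
predicted = generate(scene, rng.normal(size=VIEW_DIM))
print(predicted.shape)  # (64,)
```

Summing the per-observation encodings is what lets the network accept any number of input images, from one to many, without retraining.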
As the DeepMind team notes, however, the training methods used to develop the GQN are still limited compared to traditional computer vision techniques. Its creators remain optimistic that as new sources of data become available and hardware improves, the GQN framework could extend to higher-resolution images of real-world scenes. Ultimately, the DeepMind team believes the GQN could prove useful in technologies such as augmented reality and self-driving vehicles by giving them a form of perceptual intuition, something extremely desirable for companies focused on autonomy, like Tesla.

Google DeepMind’s GQN AI in action. [Credit: Google DeepMind]
In a talk at Train AI 2018 last May, Tesla’s head of AI Andrej Karpathy discussed the challenges involved in training the company’s Autopilot system. Tesla trains Autopilot by feeding the system with massive data sets from the company’s fleet of vehicles. This data is collected through means such as Shadow Mode, which allows the company to gather statistical data to show false positives and false negatives of Autopilot software.
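As a rough illustration of the idea behind Shadow Mode (the `Frame` type and `triage` helper below are hypothetical, not Tesla's actual pipeline), the candidate model runs silently alongside the driver, and disagreements between its proposed action and the driver's actual action are tallied as candidate false positives and false negatives:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    model_braked: bool   # what the shadow model would have done
    driver_braked: bool  # what the human driver actually did

def triage(frames):
    """Count disagreements between the silent model and the driver.

    Model acted but driver didn't -> candidate false positive.
    Driver acted but model didn't -> candidate false negative.
    """
    false_positives = sum(f.model_braked and not f.driver_braked for f in frames)
    false_negatives = sum(f.driver_braked and not f.model_braked for f in frames)
    return false_positives, false_negatives

log = [Frame(True, False), Frame(False, True), Frame(True, True)]
print(triage(log))  # (1, 1)
```

The point of this statistical comparison is that only the disagreement cases need to be pulled from the fleet for human review, rather than every frame of driving.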
During his talk, Karpathy discussed how features such as blinker detection are challenging for Tesla’s neural network to learn, considering that vehicles on the road have their turn signals off most of the time and blinker designs vary widely from one car brand to another. Karpathy also noted that Tesla has transitioned a huge portion of its AI team to labeling roles, doing exactly the kind of human annotation that Google DeepMind wants to avoid with the GQN.
Tesla CEO Elon Musk has also mentioned that the company’s upcoming all-electric supercar, the next-generation Tesla Roadster, would feature an “Augmented Mode” to enhance drivers’ capability to operate the high-performance vehicle. With Tesla’s flagship supercar seemingly set on embracing AR technology, the emergence of new AI training techniques such as Google DeepMind’s GQN would be a perfect fit for the next generation of vehicles about to enter the automotive market.
Tesla winter weather test: How long does it take to melt 8 inches of snow?
In Pennsylvania, we got between 10 and 12 inches of snow over the weekend as a nasty winter storm ripped through a large portion of the country, bringing snow to some areas and ice storms to others.
I have had a Model Y Performance for the week courtesy of Tesla, which got the car to me last Monday. Today was my last full day with it before I take it back to my local showroom, and with all the accumulation on it, I decided to run a cool little experiment: How long would it take for Tesla’s Defrost feature to melt 8 inches of snow?
Tesla’s Defrost feature is one of the best and most underrated in the car’s arsenal. While every car out there has a defrost setting, Tesla’s can be activated through the smartphone app and is, in my opinion, one of the better-performing systems.
It has come in handy a lot through the fall and winter, helping clear my windshield more efficiently while also covering more of the front glass than other cars I’ve owned.
The test was simple: don’t touch any of the ice or snow with my ice scraper, and let the car do all the work, no matter how long it took. Of course, it would be quicker to just clear the ice off manually, but I really wanted to see how long it would take.
Observations
I started this test at around 10:30 a.m. It was still pretty cloudy and cold out, and I knew the latter portion of the test would get some help from the sun, which was expected to come out around noon, maybe a little after.
I cranked up the Defrost, set my iPhone on a tripod, and activated the Time Lapse feature in the Camera settings.
The rest of the test was sitting and waiting.
It didn’t take long to see a difference. In fact, by the 20-minute mark, there was notable melting of snow and ice along the sides of the windshield near the A-pillars.
However, this test was by no means efficient; it took about three hours and 40 minutes to get the snow to a point where I would feel comfortable driving in public. In no way would I do this normally; I simply wanted to see how the car would handle a massive accumulation of snow.
It did well, but in the future, I’ll stick to clearing it off manually and using the Defrost setting for clearing up some ice before the gym in the morning.
Check out the video of the test below:
❄️ How long will it take for the Tesla Model Y Performance to defrost and melt ONE FOOT of snow after a blizzard?
Let’s find out: pic.twitter.com/Zmfeveap1x
— TESLARATI (@Teslarati) January 26, 2026
Tesla Robotaxi ride-hailing without a Safety Monitor proves to be difficult
Tesla Robotaxi ride-hailing without a Safety Monitor is proving to be a difficult task, according to some riders who traveled to Austin hoping to ride in one of the company’s vehicles that operates with zero supervision.
Last week, Tesla officially removed Safety Monitors from some — not all — of its Robotaxi vehicles in Austin, Texas, answering skeptics who said the vehicles still needed supervision to operate safely and efficiently.
Tesla aimed to remove Safety Monitors before the end of 2025, and it did, but only for company employees. It made the move last week to open the rides to the public, just a couple of weeks past its original goal, but the accomplishment was impressive nonetheless.
However, the small number of Robotaxis operating without Safety Monitors has made them difficult to hail. David Moss, who has gained notoriety recently as the person who has traveled over 10,000 miles in his Tesla on Full Self-Driving v14 without any interventions, made it to Austin last week.
He has tried to get a ride in a Safety Monitor-less Robotaxi for the better part of four days, and after 38 attempts, he still has yet to grab one:
Wow just wow!
It’s 8:30PM, 29° out ice storm hailing & Tesla Robotaxi service has turned back on!
Waymo is offline & vast majority of humans are home in the storm
Ride 38 was still supervised but by far most impressive yet pic.twitter.com/1aUnJkcYm8
— David Moss (@DavidMoss) January 25, 2026
Tesla said last week that it was rolling out a controlled test of the Safety Monitor-less Robotaxis. Ashok Elluswamy, who heads the AI program at Tesla, confirmed that the company was “starting with a few unsupervised vehicles mixed in with the broader Robotaxi fleet with Safety Monitors,” and that “the ratio will increase over time.”
This is a good strategy, one that prioritizes safety and keeps a controlled, measured expansion at the forefront of the Robotaxi rollout.
However, it will be interesting to see how quickly the company can scale these completely monitor-less rides. Getting one has proven extremely difficult, but that is understandable considering only a handful of cars in the entire Austin fleet are operating with no supervision inside the vehicle.
Tesla gives its biggest hint that Full Self-Driving in Europe is imminent
Tesla has given its biggest hint that Full Self-Driving in Europe is imminent, as a new feature seems to show that the company is preparing for frequent border crossings.
Tesla owner and influencer BLKMDL3, also known as Zack, recently took his Tesla to the border of California and Mexico at Tijuana, and at the international crossing, Full Self-Driving showed an interesting message: “Upcoming country border — FSD (Supervised) will become unavailable.”
FSD now shows a new message when approaching an international border crossing.
Stayed engaged the whole way as we crossed the border and worked great in Mexico! pic.twitter.com/bDzyLnyq0g
— Zack (@BLKMDL3) January 26, 2026
Due to regulatory approvals, once a Tesla operating on Full Self-Driving enters a new country, it must comply with the laws and regulations of that territory. Even where FSD is legal, it seems Tesla will shut off FSD temporarily until the vehicle confirms it is in a location where operation is approved.
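The behavior described above could be sketched as a simple region check. The `APPROVED_REGIONS` set (drawn from the countries where Tesla currently operates FSD) and the `fsd_status` function are illustrative assumptions, not Tesla's implementation:

```python
# Countries where Tesla currently operates FSD, per the article:
# US, China, Canada, Mexico, Puerto Rico, Australia, New Zealand, South Korea.
APPROVED_REGIONS = {"US", "CN", "CA", "MX", "PR", "AU", "NZ", "KR"}

def fsd_status(current_region, upcoming_region=None):
    """Hypothetical availability check at a border crossing.

    When a different country is ahead, warn that FSD will drop out;
    otherwise report availability based on the current territory.
    """
    if upcoming_region and upcoming_region != current_region:
        return "Upcoming country border - FSD (Supervised) will become unavailable"
    if current_region in APPROVED_REGIONS:
        return "FSD (Supervised) available"
    return "FSD (Supervised) unavailable in this region"

print(fsd_status("US", upcoming_region="MX"))
```

In a sketch like this, the availability check simply re-runs after the crossing, which is why the feature restores itself once the car confirms it is in an approved territory like Mexico.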
This will be extremely important in Europe, where crossing a national border is as routine as crossing state lines in the U.S., far more frequent than international crossings are for drivers in America, Canada, and Mexico.
Tesla has been working to get FSD approved in Europe for several years, and it has been getting close to being able to offer it to owners on the continent. However, it is still working through a lot of the red tape required before European regulators will approve use of the system on their roads.
This feature would be especially useful in Europe, given how often drivers cross into other countries and how regulatory approval could differ from one country to the next.
Tesla has been testing FSD in Spain, France, England, and other European countries, and plans to continue expanding this effort. European owners have been fighting for a very long time to use the functionality, but red tape has been the biggest bottleneck in the process.
Tesla operates Full Self-Driving in the United States, China, Canada, Mexico, Puerto Rico, Australia, New Zealand, and South Korea.