Google’s DeepMind unit develops AI that predicts 3D layouts from partial images
Google’s DeepMind unit, the same division that created AlphaGo, an AI that outplayed the best Go player in the world, has created a neural network capable of rendering an accurate 3D environment from just a few still images, filling in the gaps with an AI form of perceptual intuition.
According to Google’s official DeepMind blog, the goal of the recent AI project is to make neural networks simpler to train. Today’s most advanced AI-powered visual recognition systems are trained on large datasets of human-annotated images. This makes training a tedious, lengthy, and expensive process, as every aspect of every object in each scene has to be labeled by a person.
The DeepMind team’s new AI, dubbed the Generative Query Network (GQN), is designed to remove this dependency on human-annotated data: the network infers a space’s three-dimensional layout and features even when it is given only partial images of that space.
Much like babies and animals, DeepMind’s GQN learns by observing the world around it, picking up plausible scenes and their geometrical properties without any human labeling. The GQN consists of two parts: a representation network that produces a vector describing a scene, and a generation network that “imagines” the scene from a previously unobserved viewpoint. So far, the results of DeepMind’s training have been encouraging, with the GQN able to create representations of objects and rooms from just a single image.
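To make that two-part structure more concrete, below is a minimal, hypothetical sketch in Python (PyTorch) of the representation/generation split described above. The layer sizes, class names, and simple feed-forward design are illustrative assumptions only; DeepMind’s actual GQN uses convolutional encoders and a recurrent latent-variable generator, which this toy version does not attempt to reproduce.

# Illustrative sketch only: a toy version of the GQN's two-network split.
# Layer sizes and architectures are assumptions, not DeepMind's implementation.
import torch
import torch.nn as nn

class RepresentationNetwork(nn.Module):
    """Encodes (image, camera viewpoint) pairs into a single scene vector."""
    def __init__(self, image_dim=64, view_dim=7, repr_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(image_dim + view_dim, 512), nn.ReLU(),
            nn.Linear(512, repr_dim),
        )

    def forward(self, images, viewpoints):
        # Encode each observation, then sum so any number of views
        # collapses into one fixed-size representation of the scene.
        per_view = self.encoder(torch.cat([images, viewpoints], dim=-1))
        return per_view.sum(dim=0)

class GenerationNetwork(nn.Module):
    """Renders ("imagines") the scene from a query viewpoint it has never observed."""
    def __init__(self, repr_dim=256, view_dim=7, image_dim=64):
        super().__init__()
        self.decoder = nn.Sequential(
            nn.Linear(repr_dim + view_dim, 512), nn.ReLU(),
            nn.Linear(512, image_dim),
        )

    def forward(self, scene_repr, query_viewpoint):
        return self.decoder(torch.cat([scene_repr, query_viewpoint], dim=-1))

# A few partial observations of one scene: flattened toy "images" plus camera poses.
observations = torch.randn(3, 64)   # 3 views, 64 values each
viewpoints = torch.randn(3, 7)      # 3 camera poses (position + orientation)

scene = RepresentationNetwork()(observations, viewpoints)
predicted_view = GenerationNetwork()(scene, torch.randn(7))  # render an unseen viewpoint
print(predicted_view.shape)  # torch.Size([64])

Because the representation network sums the encodings of however many views are available, a single image is still enough to produce a scene vector for the generator to render from, which mirrors the single-image behavior described above.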
As the DeepMind team notes, however, the training methods used to develop the GQN are still limited compared to traditional computer vision techniques. Its creators remain optimistic that, as new sources of data become available and hardware improves, the GQN framework could be applied to higher-resolution images of real-world scenes. Ultimately, the DeepMind team believes the GQN could prove useful in technologies such as augmented reality and self-driving vehicles by giving them a form of perceptual intuition, something extremely desirable for companies focused on autonomy, like Tesla.

Google DeepMind’s GQN AI in action. [Credit: Google DeepMind]
In a talk at Train AI 2018 last May, Tesla’s head of AI Andrej Karpathy discussed the challenges involved in training the company’s Autopilot system. Tesla trains Autopilot by feeding the system massive datasets gathered from its fleet of vehicles. Some of this data is collected through Shadow Mode, which lets the company gather statistics on false positives and false negatives from the Autopilot software.
During his talk, Karpathy discussed how features such as blinker detection become challenging for Tesla’s neural network to learn, considering that vehicles on the road have their turn signals off most of the time and blinkers have a high variability from one car brand to another. Karpathy also discussed how Tesla has transitioned a huge portion of its AI team to labeling roles, doing the human annotation that Google DeepMind explicitly wants to avoid with the GQN.
Musk has also mentioned that Tesla’s upcoming all-electric supercar, the next-generation Tesla Roadster, would feature an “Augmented Mode” designed to enhance drivers’ ability to operate the high-performance vehicle. With Tesla’s flagship supercar seemingly set to embrace AR technology, new techniques for training AI such as Google DeepMind’s GQN could be a perfect fit for the next generation of vehicles about to enter the automotive market.
Tesla Robotaxi ride-hailing without a Safety Monitor proves to be difficult
Tesla Robotaxi ride-hailing without a Safety Monitor is proving to be a difficult task, according to some riders who made the journey to Austin to try to catch a ride in one of the vehicles operating with zero supervision.
Last week, Tesla officially removed Safety Monitors from some — not all — of its Robotaxi vehicles in Austin, Texas, answering skeptics who said the vehicles still needed supervision to operate safely and efficiently.
Tesla aimed to remove Safety Monitors before the end of 2025, and it did, though initially only for rides given to company employees. Last week, it opened those rides to the public, a couple of weeks behind its original goal, but the accomplishment was impressive nonetheless.
However, the small number of Robotaxis operating without Safety Monitors has made them difficult to hail. David Moss, who has gained attention recently for traveling over 10,000 miles in his Tesla on Full Self-Driving v14 without any interventions, made it to Austin last week.
He has spent the better part of four days trying to get a ride in a Safety Monitor-less Robotaxi, and after 38 attempts, he has yet to catch one:
Wow just wow!
It’s 8:30PM, 29° out ice storm hailing & Tesla Robotaxi service has turned back on!
Waymo is offline & vast majority of humans are home in the storm
Ride 38 was still supervised but by far most impressive yet pic.twitter.com/1aUnJkcYm8
— David Moss (@DavidMoss) January 25, 2026
Tesla said last week that it was rolling out a controlled test of the Safety Monitor-less Robotaxis. Ashok Elluswamy, who heads the AI program at Tesla, confirmed that the company was “starting with a few unsupervised vehicles mixed in with the broader Robotaxi fleet with Safety Monitors,” and that “the ratio will increase over time.”
This is a sensible strategy that prioritizes safety and keeps a controlled, gradual expansion at the forefront of the Robotaxi rollout.
However, it will be interesting to see how quickly the company can scale these completely monitor-less rides. It has proven to be extremely difficult to get one, but that is understandable considering only a handful of the cars in the entire Austin fleet are operating with no supervision within the vehicle.
Tesla gives its biggest hint that Full Self-Driving in Europe is imminent
Tesla has given its biggest hint that Full Self-Driving in Europe is imminent, as a new feature seems to show that the company is preparing for frequent border crossings.
Tesla owner and influencer BLKMDL3, also known as Zack, recently took his Tesla to the border of California and Mexico at Tijuana, and at the international crossing, Full Self-Driving showed an interesting message: “Upcoming country border — FSD (Supervised) will become unavailable.”
FSD now shows a new message when approaching an international border crossing.
Stayed engaged the whole way as we crossed the border and worked great in Mexico! pic.twitter.com/bDzyLnyq0g
— Zack (@BLKMDL3) January 26, 2026
Once a Tesla operating on Full Self-Driving enters a new country, it is required to comply with the laws and regulations applicable to that territory. It appears Tesla will temporarily disable FSD at the border, even where the feature is legal, until it confirms the vehicle is in a location where its operation is approved.
This will be especially important in Europe, where crossing a national border is about as routine as crossing state lines is in the U.S., and far more frequent than international crossings are for drivers in the U.S., Canada, and Mexico.
Tesla has been working for several years to get FSD approved in Europe, and it is getting closer to being able to offer the feature to owners there. However, it is still working through much of the red tape required for European regulators to approve use of the system.
The new border message seems tailor-made for the region, where cross-border trips are far more common than in the U.S. and where approval status can differ from one country to the next.
Tesla has been testing FSD in Spain, France, England, and other European countries, and plans to continue expanding this effort. European owners have been fighting for a very long time to utilize the functionality, but the red tape has been the biggest bottleneck in the process.
Tesla operates Full Self-Driving in the United States, China, Canada, Mexico, Puerto Rico, Australia, New Zealand, and South Korea.
SpaceX Starship V3 gets launch date update from Elon Musk
The first flight of Starship Version 3 and its new Raptor V3 engines could happen as early as March.
Elon Musk has announced that SpaceX’s next Starship launch, Flight 12, is expected in about six weeks. This suggests that the first flight of Starship Version 3 and its new Raptor V3 engines could happen as early as March.
In a post on X, Elon Musk stated that the next Starship launch is in six weeks. He accompanied his announcement with a photo that seemed to have been taken when Starship’s upper stage was just about to separate from the Super Heavy Booster. Musk did not state whether SpaceX will attempt to catch the Super Heavy Booster during the upcoming flight.
The upcoming flight will mark the debut of Starship V3. The upgraded design includes the new Raptor V3 engine, which is expected to have nearly twice the thrust of the original Raptor 1, at a fraction of the cost and with significantly reduced weight. The Starship V3 platform is also expected to be optimized for manufacturability.
The Starship V3 Flight 12 launch timeline comes as SpaceX pursues an aggressive development cadence for the fully reusable launch system. Previous iterations of Starship have racked up a mixed but notable string of test flights, including multiple integrated flight tests in 2025.
Interestingly enough, SpaceX itself has teased an aggressive timeframe for Starship V3’s first flight. Back in late November, SpaceX noted on X that it was aiming to launch Starship V3’s maiden flight in the first quarter of 2026, despite setbacks like a structural anomaly on the first V3 booster during ground testing.
“Starship’s twelfth flight test remains targeted for the first quarter of 2026,” the company wrote in its post on X.