
Google’s DeepMind unit develops AI that predicts 3D layouts from partial images

[Credit: Google DeepMind]


Google’s DeepMind unit, the same division that created AlphaGo, an AI that outplayed the best Go player in the world, has created a neural network capable of rendering an accurate 3D environment from just a few still images, filling in the gaps with an AI form of perceptual intuition.

According to Google’s official DeepMind blog, the goal of its recent AI project is to make neural networks simpler to train. Today’s most advanced AI-powered visual recognition systems are trained on large datasets of human-annotated images. This makes training a tedious, lengthy, and expensive process, as every aspect of every object in each scene in the dataset has to be labeled by a person.

The DeepMind team’s new AI, dubbed the Generative Query Network (GQN), is designed to remove this dependency on human-annotated data: the GQN infers a space’s three-dimensional layout and features despite being provided with only partial images of that space.

Similar to babies and animals, DeepMind’s GQN learns by making observations of the world around it. By doing so, DeepMind’s new AI learns about plausible scenes and their geometrical properties even without human labeling. The GQN consists of two parts — a representation network that produces a vector describing a scene and a generation network that “imagines” the scene from a previously unobserved viewpoint. So far, the results of DeepMind’s training for the AI have been encouraging, with the GQN able to create representations of objects and rooms based on just a single image.
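The two-part structure described above can be sketched in a few lines of toy code. This is purely an illustrative skeleton, not DeepMind’s actual architecture: the networks here are single random linear layers, and the dimensions are made up. What it does show is the key design idea — per-view representations are summed into one fixed-size scene code, so any number of partial observations can condition the generator, which then renders a viewpoint it has never seen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions, chosen for illustration only.
IMG_DIM, VIEW_DIM, REP_DIM = 64, 7, 16

# Representation network: maps one (image, viewpoint) observation to a vector.
W_rep = rng.standard_normal((IMG_DIM + VIEW_DIM, REP_DIM)) * 0.1

def represent(image, viewpoint):
    x = np.concatenate([image, viewpoint])
    return np.tanh(x @ W_rep)

# Generation network: "imagines" an image from a query viewpoint,
# conditioned on the aggregated scene representation.
W_gen = rng.standard_normal((REP_DIM + VIEW_DIM, IMG_DIM)) * 0.1

def generate(scene_rep, query_viewpoint):
    x = np.concatenate([scene_rep, query_viewpoint])
    return np.tanh(x @ W_gen)

# A scene observed from two viewpoints; the per-view vectors are summed,
# so any number of partial observations yields one fixed-size scene code.
observations = [(rng.standard_normal(IMG_DIM), rng.standard_normal(VIEW_DIM))
                for _ in range(2)]
scene_rep = sum(represent(img, vp) for img, vp in observations)

# Render the scene from a viewpoint that was never observed.
novel_view = generate(scene_rep, rng.standard_normal(VIEW_DIM))
print(novel_view.shape)  # one "imagined" image vector
```

In the real system both networks are deep and trained jointly end to end, but the order-invariant sum over observations is what lets the model accept one image or many.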


As noted by the DeepMind team, however, the training methods used to develop the GQN are still limited compared to traditional computer vision techniques. The AI’s creators remain optimistic that as new sources of data become available and hardware improves, the GQN framework could be applied to higher-resolution images of real-world scenes. Ultimately, the DeepMind team believes that the GQN could be a useful system in technologies such as augmented reality and self-driving vehicles by giving them a form of perceptual intuition – extremely desirable for companies focused on autonomy, like Tesla.

Google DeepMind’s GQN AI in action. [Credit: Google DeepMind]

In a talk at Train AI 2018 last May, Tesla’s head of AI Andrej Karpathy discussed the challenges involved in training the company’s Autopilot system. Tesla trains Autopilot by feeding the system with massive data sets from the company’s fleet of vehicles. This data is collected through means such as Shadow Mode, which allows the company to gather statistical data to show false positives and false negatives of Autopilot software.

During his talk, Karpathy discussed how features such as blinker detection become challenging for Tesla’s neural network to learn, considering that vehicles on the road have their turn signals off most of the time and blinkers have a high variability from one car brand to another. Karpathy also discussed how Tesla has transitioned a huge portion of its AI team to labeling roles, doing the human annotation that Google DeepMind explicitly wants to avoid with the GQN. 

Musk also mentioned that Tesla’s upcoming all-electric supercar — the next-generation Tesla Roadster — would feature an “Augmented Mode” that would enhance drivers’ capability to operate the high-performance vehicle. With Tesla’s flagship supercar seemingly set on embracing AR technology, the emergence of new techniques for training AI such as Google DeepMind’s GQN would be a perfect fit for the next generation of vehicles about to enter the automotive market.


Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips, or even to just say a simple hello, send a message to his email, simon@teslarati.com, or his handle on X, @ResidentSponge.



Tesla FSD outperforms BMW human driver: Saves pedestrian from near miss

Tesla FSD anticipated a pedestrian’s crossing before the BMW driver behind the wheel could react.


A video posted to r/TeslaFSD this week put a sharp spotlight on Tesla’s Full Self-Driving (FSD) software reacting to pedestrian intent faster than an actual human driver behind the wheel. In the Reddit clip, a BMW driver can be seen rolling through a neighborhood street, completely unaware of a pedestrian stepping out to cross. At the same time, a Tesla driving on FSD had already begun slowing down before the pedestrian even began their attempt to cross the street. The BMW kept moving, prompting the pedestrian to hop back, while the Tesla came to a stop and provided right-of-way for the pedestrian to cross safely.

That gap between what the BMW driver saw and what FSD had already processed is the story. Tesla FSD wasn’t reacting to a person in the street; it was reading the signals that a person was about to enter it, using the pedestrian’s movement and trajectory to infer intent.

Tesla’s FSD is now built on an end-to-end neural network trained on billions of real-world miles, learning to interpret subtle human behavioral cues the same way an experienced human driver does instinctively. The difference is consistency. A human driver distracted for two seconds misses what FSD does not.



Reddit commenters in the thread were blunt about the BMW driver’s failure, with several pointing out that the pedestrian was visible well before the crossing. One response put it plainly that the car on FSD saw the situation developing before the human in the other car had registered there was a situation at all.

Tesla has published data showing FSD (Supervised) is 54% safer than a human driver, based on billions of miles driven on the system. Elon Musk has said FSD v14 will outperform human drivers by a factor of two to three, and that v15 has “a shot” at a 10x improvement. Pedestrian safety is where the stakes are highest, and where intent prediction closes the gap fastest. At 30 mph, a car covers roughly 44 feet per second. An extra second of awareness from reading a person’s body language rather than waiting for them to step out is often the difference between a near miss and a fatality.
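The 44-feet-per-second figure is straight unit conversion, and it is worth seeing why an extra second matters so much. A quick check (the speeds other than 30 mph are just illustrative):

```python
# Convert road speed in mph to feet covered per second:
# miles -> feet (x 5280), hours -> seconds (/ 3600).
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def mph_to_fps(mph):
    """Feet traveled per second at a given speed in mph."""
    return mph * FEET_PER_MILE / SECONDS_PER_HOUR

# Distance gained by one extra second of anticipation at typical city speeds.
for speed in (25, 30, 45):
    print(f"{speed} mph -> {mph_to_fps(speed):.1f} ft per second of warning")
```

At 30 mph this works out to exactly 44.0 ft — roughly three car lengths of stopping room gained for every second of earlier reaction.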

Video and community discussion: “FSD saves man from becoming a pancake. BMW driver nearly flattens him,” posted by u/Qwertygolol in r/TeslaFSD on Reddit.



Tesla Robotaxi gets a small but significant change

In the world of Tesla, where billion-dollar battery breakthroughs and autonomy milestones dominate headlines, a quiet design update can still pack a punch.


Credit: David Moss | X


Last week in downtown Austin, sharp-eyed observers spotted a subtle but telling evolution on the Cybercab: a new “ROBOTAXI” logo graphic now graces the vehicle’s doors at Tesla’s Autonomy Popup.

What looks at first glance like a minor stylistic choice is, in fact, a deliberate rebranding move that hints at how the company envisions its robotaxi fleet fitting into everyday life.

The updated lettering is bold, graffiti-inspired, and unapologetically street-smart. Rendered in black with dripping white accents and a glowing yellow outline, the font evokes urban energy and playful irreverence.


Gone is the sleek, minimalist typography that defined earlier Cybercab prototypes. In its place is something more human, almost rebellious.


The new logo pops against the Cybercab’s smooth, metallic body, turning the autonomous pod into a rolling piece of public art rather than just another futuristic taxi.

Designers know that fonts are silent brand ambassadors. They shape perception before a single ride is taken. Tesla’s classic sans-serif aesthetic screams precision engineering and Silicon Valley cool.

The new Robotaxi script leans into accessibility and fun, suggesting the vehicle is approachable, not intimidating. For a product meant to ferry strangers through city streets 24/7, that matters. It signals that the robotaxi isn’t reserved for tech elites; it’s for everyone.



The timing is no accident. With regulatory approvals for unsupervised autonomy advancing and Tesla preparing to scale Cybercab production, the company is shifting from prototype showcase to fleet deployment.

A fresh logo helps differentiate the vehicles visually in dense urban environments—crucial for rider recognition and brand recall. It also aligns with Elon Musk’s long-standing ethos: make the future feel exciting, not sterile.

Small changes like this often foreshadow a larger strategy. Tesla has always obsessed over details—door handles, screen interfaces, even the curvature of a steering wheel.

Updating the Robotaxi font reflects the same meticulous care now applied to consumer-facing autonomy. It’s not just paint on metal; it’s a statement that the ride of the future should feel personal, memorable, and undeniably cool.


In an industry racing toward self-driving fleets, Tesla’s willingness to evolve even the smallest visual cues shows confidence. A font won’t launch the robotaxi network, but it might just help millions climb aboard with a smile.


Tesla makes latest announcement on Model S and Model X

The announcement follows Tesla CEO Elon Musk’s statement on the Q4 2025 earnings call in late January. Musk described the decision as an “honorable discharge” for the two vehicles, noting that production would wind down in Q2 2026.


Credit: Tesla

Tesla has officially begun winding down production of its flagship Model S and Model X in the United States, notifying owners via email that the long-running models will soon reach the end of the line.

The email, sent to U.S. customers on March 27, opens with gratitude. “Model S and Model X marked the beginning of the world’s transition to electric transportation,” it reads. “These vehicles also made it possible for Tesla to develop the technology that would move our world toward autonomy.”


It then delivers the news directly: “As we make way for this autonomous future, Model S and Model X production will be ending. If you’d like to bring home a new Model S or Model X, order yours soon from our limited inventory.”


The message closes with a simple thank-you: “Thank you for being part of our journey.”


The move frees factory floor space at Fremont, California, for next-generation manufacturing, including Optimus humanoid robots and the upcoming Robotaxi platform.

Introduced in 2012 and 2015, respectively, the Model S and Model X were Tesla’s original halo cars. They proved EVs could outperform gasoline luxury vehicles in acceleration, range, and tech features while pioneering over-the-air updates and early autonomy hardware.

Although they never matched the volume of the Model 3 and Model Y, their engineering breakthroughs laid the foundation for the company’s current lineup and full self-driving development.


Early adopters highlighted how the cars convinced them to invest in Tesla stock and the EV movement. Some U.S. owners who had not yet received the note voiced mild frustration, and international customers confirmed the outreach remains U.S.-only for now.

Tesla has not detailed an exact final production date beyond the Q2 2026 target or confirmed immediate replacements. Speculation continues about a possible Cybertruck-derived SUV, but the company’s public focus has shifted squarely to autonomy and robotics.

For buyers still interested in the S or X, the window is closing. Inventory is described as limited, and Tesla’s Korean division has already set a March 31 cutoff for new orders in that market. The email serves as both a farewell and final sales push, an elegant close to a chapter that helped define modern electric driving.
