Tesla’s AI Day is here. In a few minutes, Tesla watchers will see executives like Elon Musk provide an in-depth discussion of the company’s AI efforts, covering not just its automotive business but its energy business and beyond. AI Day promises to be yet another tour-de-force of technical information from the electric car manufacturer. Thus, it is no surprise that there is a lot of excitement from the EV community heading into the event.
Tesla has kept the details of AI Day behind closed doors, so the specifics of the actual event are scarce. That being said, an AI Day agenda sent to attendees indicated that they could expect to hear Elon Musk speak during a live keynote, speak with Andrej Karpathy and the rest of Tesla’s AI engineers, and participate in breakout sessions with the teams behind Tesla’s AI development.
Similar to Autonomy Day and Battery Day, Teslarati will be following along with AI Day’s discussions to provide you with an updated account of the highly-anticipated event. Please refresh this page from time to time, as notes, details, and quotes from Elon Musk’s keynote and the discussions that follow will be posted here.
Simon 19:40 PT – A question about the use cases for the Tesla Bot was asked. Musk notes that the Tesla Bot would start with boring, repetitive work, or work that people would least like to do.
Simon 19:25 PT – A question is asked about AI and manufacturing, and how it potentially relates to the “Alien Dreadnaught” concept. Musk notes that most of Tesla’s manufacturing today is already automated. He also notes that humanoid robots will be built either way, so it makes sense for Tesla to take on this project and do it safely. “We’re making the pieces that would be useful for a humanoid robot, so we should probably make it. If we don’t someone else will — and we want to make sure it’s safe,” Musk said.
Simon 19:15 PT – And the Q&A starts. First question involves open-sourcing Tesla’s innovations. Musk notes that it’s pretty expensive to develop all this tech, so he’s not sure how things could be open-sourced. But if other car companies would like to license the system, that could be done.
Simon 19:11 PT – There will really be a “Tesla Bot.” It would be built by humans, for humans. It would be friendly, and it would eliminate dangerous, repetitive, and boring tasks. This is still pretty darn unreal. It uses the same systems that are currently being developed for the company’s vehicles. “There will be profound applications for the economy,” Musk said.
Simon 19:06 PT – New products! A whole Tesla suit?! After a fun skit, Elon says the “Tesla Bot” would eventually be real.
Simon 19:00 PT – What is crazy is that Dojo is not even done. This is just what it is today. Dojo is still evolving, and it is going to be way more powerful in the future. Now, it’s Elon Musk’s turn to discuss what’s next for Tesla beyond vehicles.
Simon 19:00 PT – Venkataramanan teases the ExaPOD. Yet another revolutionary solution from Tesla. With all this, it is evident that Tesla’s approach to autonomy is on a whole other level. It would not be surprising if it takes Wall Street and the market a few days to fully absorb what is happening here.
Simon 18:55 PT – The specs of Dojo are insane. Behind its beastly specs, it seems that Dojo’s full potential lies in the fact that all this power is being used to do one thing: to make autonomous cars possible. Dojo is a pure learning machine, with more than 500,000 training nodes working together. Nine petaflops of compute per tile, 36 terabytes per second of off-tile bandwidth. But this is just the tip of the iceberg for Dojo.
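For a quick sanity check on those numbers: combining the stated nine-petaflops-per-tile figure with the 120-tile ExaPOD configuration teased on stage (the tile count is our reading of the slide, not a confirmed spec) lands right around the exaflop mark. A back-of-envelope sketch in Python:

```python
# Back-of-envelope math on the Dojo figures from the presentation.
# The 9 PFLOPS-per-tile number was stated on stage; the 120-tile
# ExaPOD configuration is our assumption from the teased slide.
PFLOPS_PER_TILE = 9
TILES_PER_EXAPOD = 120  # assumption, not a confirmed spec

exapod_pflops = PFLOPS_PER_TILE * TILES_PER_EXAPOD
print(f"ExaPOD compute: ~{exapod_pflops / 1000:.2f} exaflops")  # ~1.08
```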
Simon 18:50 PT – Ganesh Venkataramanan, Project Dojo’s lead, takes the stage. He states that Elon Musk wanted a super-fast training computer to train Autopilot, and thus Project Dojo was born. Dojo is a distributed compute architecture connected by a network fabric, featuring a large compute plane, extremely high bandwidth with low latency, and big networks that are partitioned and mapped, to name a few traits.

Simon 18:45 PT – Milan Kovac, Tesla’s Director of Autopilot Engineering, takes the stage. He will discuss how neural networks are run in the company’s cars, and notes that Tesla’s systems require supercomputers.
Simon 18:40 PT – Ashok notes that simulations have helped Tesla a lot already, with pedestrian, bicycle, and vehicle detection and kinematics, for example. The networks in the vehicles were trained on 371 million simulated images and 480 million cuboids.
Simon 18:35 PT – Ashok notes that these strategies ultimately helped Tesla retire radar from its FSD and Autopilot suite and adopt a pure vision model. A comparison between a radar-plus-camera system and pure vision shows just how much more refined the company’s current strategy is. The executive also touched on how simulations help Tesla develop its self-driving systems. He states that simulations help when data is difficult to source, difficult to label, or when a closed loop is needed.
Simon 18:30 PT – Ashok returns to discuss Auto Labeling. Simply put, there is so much labeling that needs to be done that it is impossible to do manually. He shows how roads and other items on the road are “reconstructed” from a single car that’s driving. This effectively allows Tesla to label data much faster, while allowing vehicles to navigate safely and accurately even when occlusions are present.
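To illustrate the core idea behind reconstruction-based auto labeling: many noisy observations of the same static structure can be fused into one cleaner label. The toy sketch below is our illustration of the principle, not Tesla’s actual pipeline; the lane-boundary point and its coordinates are made up.

```python
import numpy as np

# Hypothetical: each row is one frame's noisy estimate of the same
# lane-boundary point (x, y, z) in a shared world frame.
observations = np.array([
    [10.02, 4.98,  0.01],
    [ 9.97, 5.03, -0.02],
    [10.05, 5.01,  0.00],
])

# Fusing across observations beats any single one; a robust median
# stands in here for the real reconstruction machinery.
fused_label = np.median(observations, axis=0)
print(fused_label)  # -> [10.02  5.01  0.  ]
```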
Simon 18:25 PT – Karpathy returns to talk about manual labeling. He notes that manual labeling that’s outsourced to third-party firms is not optimal. Thus, in the spirit of vertical integration, Tesla opted to establish its own labeling team. Karpathy notes that in the beginning, Tesla was using 2D image labeling. Eventually, Tesla transitioned to 4D labeling, where the company could label in vector space. But even this was not enough, and thus, auto labeling was developed.
Simon 18:23 PT – The executive states that traffic behavior is extremely complicated, especially in several parts of the world. Ashok notes that this is partly illustrated by parking lots, which are actually quite complex. Summoning a car from a parking lot, for example, used to require the planner to expand some 400k nodes to find a path, resulting in a system whose performance left much to be desired.
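For context on why a parking-lot summon could chew through 400,000 nodes: classic search-based planners expand candidate positions one at a time, and a weak heuristic in a cluttered space forces enormous numbers of expansions. The generic, textbook A* sketch below makes the mechanism concrete; it is not Tesla’s planner.

```python
import heapq

def a_star(start, goal, passable):
    # Minimal A* on a 4-connected grid. Illustrative only, not
    # Tesla's planner: it shows how expansions pile up in clutter.
    def h(p):  # Manhattan-distance heuristic (admissible here)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start)]
    best_cost = {start: 0}
    expanded = 0
    while frontier:
        _, cost, node = heapq.heappop(frontier)
        expanded += 1  # counts pops, including stale duplicates
        if node == goal:
            return cost, expanded
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if passable(nxt) and cost + 1 < best_cost.get(nxt, 1e18):
                best_cost[nxt] = cost + 1
                heapq.heappush(frontier, (cost + 1 + h(nxt), cost + 1, nxt))
    return None, expanded

# Toy 60x60 "parking lot" with no obstacles; add clutter and the
# expansion count balloons, which is the problem described above.
inside = lambda p: 0 <= p[0] < 60 and 0 <= p[1] < 60
print(a_star((0, 0), (59, 59), inside))  # (118, <expansion count>)
```

Learned guidance of the search, along the lines of the hybrid planning discussion below, is one way to cut those expansions down.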
Simon 18:18 PT – Ashok notes that when driving alongside other cars, Autopilot must not only plan how the Tesla itself will drive, but also anticipate how other cars will behave. He shows a video of a Tesla navigating a road and dealing with multiple vehicles to demonstrate this point.
Simon 18:15 PT – Director of Autopilot Software Ashok Elluswamy takes the stage. He starts off by discussing some key problems in planning in both non-convex and high-dimensional action spaces. He also shows Tesla’s solution to these issues, a “Hybrid Planning System.” He demonstrates this by showing how Autopilot performs a lane change.
Simon 18:10 PT – Karpathy’s discussion notes that today, Tesla’s FSD strategy is a lot more cohesive. This is demonstrated by the fact that the company’s vehicles can effectively draw a map in real-time as they drive. This is a massive difference compared to the pre-mapped strategies employed by rivals in both the automotive and software fields, like Super Cruise and Waymo.
To solve several problems encountered over the last few years with the previous suite, Tesla re-engineered their NN learning from the ground up and utilized a multi-head route, camera calibrations, caching, queues, and optimizations to streamline all tasks.
(heavily simplified) pic.twitter.com/LG2TRgjxip
— Teslascope (@teslascope) August 20, 2021
Simon 18:05 PT – The AI Director discusses how Tesla practically re-engineered its neural network learning from the ground up, utilizing a multi-head route along with camera calibrations, caching, queues, and optimizations to streamline all tasks. Do note that this is an extremely simplified summary of Karpathy’s discussion so far.
Simon 18:00 PT – Karpathy covers more challenges that are involved in even the basics of perception. Needless to say, AI Day is proving to be Tesla’s most technical event yet, right off the bat. That said, multi-camera networks are amazing. They’re just a ton of work, but they may very well be a silver bullet for Tesla’s predictive efforts.
Simon 17:56 PT – Karpathy showcases a video of how Tesla used to process its image data, a popular FSD clip that has been shared before. He notes that while great, such a system proved to be inadequate, and this is something that Tesla learned when it launched Smart Summon. Per-camera detection works well on its own, but the vector space stitched together from those detections proves inadequate.
Simon 17:55 PT – Karpathy notes that when Tesla designs the visual cortex in its cars, the company models it on how biological vision works. He also touches on how Tesla’s visual processing strategies have evolved over the years, and how they are done today. The AI Director also touches on Tesla’s “HydraNets,” so named for their multi-task learning capabilities.
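For the unfamiliar, the multi-head idea behind a name like “HydraNets” is straightforward: one shared backbone computes image features once, and small per-task heads branch off it. Below is a minimal PyTorch sketch of the pattern; it is a generic illustration, not Tesla’s architecture, and the layer sizes and three task heads are made up.

```python
import torch
import torch.nn as nn

class MultiHeadNet(nn.Module):
    # Toy multi-task network: one shared backbone, several heads
    # (e.g., objects, lane lines, traffic lights). Generic sketch only.
    def __init__(self, head_sizes=(10, 4, 2)):
        super().__init__()
        self.backbone = nn.Sequential(  # shared features, computed once
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # one lightweight head per task
        self.heads = nn.ModuleList(nn.Linear(64, n) for n in head_sizes)

    def forward(self, images):
        features = self.backbone(images)                # shared compute
        return [head(features) for head in self.heads]  # per-task outputs

outs = MultiHeadNet()(torch.randn(1, 3, 128, 128))
print([o.shape for o in outs])  # [(1, 10), (1, 4), (1, 2)]
```

The appeal is amortization: the expensive backbone runs once per frame, and tasks can be added or fine-tuned at the heads without retraining everything.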

Simon 17:51 PT – Karpathy starts off by discussing the visual component of Tesla’s AI, as characterized by the eight cameras used in the company’s vehicles. The AI director notes that the system can be thought of as a biological being of sorts, built from the ground up, including its synthetic visual cortex.
Simon 17:48 PT – Elon Musk takes the stage. He apologizes for the event’s delay and jokes that Tesla probably needs AI to solve these “technical difficulties.” The CEO highlights that AI Day is a recruitment event. He then calls Tesla’s head of AI, Andrej Karpathy, to the stage. There’s no better person to discuss AI.
Simon 17:45 PT – We’re here watching the AI Day FSD preview video and we can’t help but notice that… are those Waypoints?!
Simon 17:38 PT – Looks like we’ve got an Elon sighting! And a preview video too! Here we go, folks!
We’ve got an Elon sighting
— Rob Maurer (@TeslaPodcast) August 20, 2021
Simon 17:30 PT – A 30-minute delay. We haven’t seen a delay this long in quite a while.
Simon 17:20 PT – It’s a good thing that Tesla has great taste in music. Did Grimes mix this track?
Simon 17:15 PT – We’re 15 minutes in. “Elon Time” is going strong on AI Day. To be honest, though, this music would fit the “Rave Cave” in Giga Berlin this coming October.
Simon 17:10 PT – A good thing to keep in mind is that AI Day is a recruitment event. Some food for thought just in case the discussions take a turn for the extremely technical. AI Day is designed to attract individuals who speak Tesla’s language in its rawest form. We’re just fortunate enough to come along for the ride.
Tesla Board Member Hiro Mizuno sums it up in this tweet pretty well.
Anybody passionate about real world AI !! https://t.co/ydaWQlkE4O
— HIRO MIZUNO (@hiromichimizuno) August 20, 2021
Simon 17:05 PT – I guess AI Day is starting on “Elon Time?” We’re on to the next track of chill music.
Simon 17:00 PT – And with 5 p.m. PT here, the music is officially live on the AI Day live stream. Looks like we’re in for some wait. Wonder how many minutes it will take before things start? Gotta love this chill music, though.
Simon 16:58 PT – While waiting, I can’t help but think that a ton of TSLA bears and Wall Street analysts will likely not understand the nuances of what Tesla will be discussing today. That was certainly the case with Autonomy Day and Battery Day. Will Tesla go three-for-three?
Made it pic.twitter.com/aAWqxgf0bP
— Johnna (@JohnnaCrider1) August 19, 2021
Simon 16:55 PT – T-minus 5 minutes. Some attendees of AI Day are now posting some photos on Twitter, but it seems like photos and videos are not allowed at the actual venue of the event. Pretty much expected, I guess.
Simon 16:50 PT – Greetings, everyone, and welcome to another Live Blog. This is Tesla’s most technical event yet, so I expect this one to go extremely in-depth on the company’s AI efforts and the technology behind it. We’re pretty excited.
Don’t hesitate to contact us with news tips. Just send a message to tips@teslarati.com to give us a heads up.
Tesla expands crucial Supercharging feature to Google Maps
Tesla has expanded a crucial Supercharging feature that helps owners identify stall availability at nearby locations.
Tesla said on Tuesday night that its “Live Availability” feature, which shows EV owners how many stalls are open at a Supercharger station, has been expanded to Google Maps, a third-party app:
Live availability of Superchargers now in Google Maps pic.twitter.com/DJvS83wVxm
— Tesla Charging (@TeslaCharging) November 11, 2025
Already offered in Tesla’s own vehicles, Live Availability is a helpful feature that lets drivers choose a station with plugs that are immediately available.
A number on an icon where the Supercharger is located lets EV drivers know how many stalls are available.
It is a useful tool, especially during hours of congestion. However, it has not been super effective for those who drive non-Tesla EVs, as other OEMs use UI platforms like Google’s Android Auto or Apple’s iOS.
Essentially, charging at a Supercharger that accepts non-Tesla EVs was a bit of a gamble for those drivers: with no way to see how many plugs were open, there was no guarantee one would be available.
Adding this feature gives non-Tesla EV drivers a more convenient and easier-to-use experience. With the already expansive Supercharger Network available to so many EV owners, there is more congestion than ever.
This new feature makes the entire experience better for all owners, especially as there is more transparency regarding the availability of plugs at Supercharger stalls.
It will be interesting to see if Tesla expands on this move; Apple Maps compatibility, we could imagine, is an obvious future goal for the company. In fact, this is one of the first times a feature has been available to Android users before becoming an option for iOS users.
Apple owners tend to get priority with new features within the Tesla App itself.
Elon Musk’s Boring Co goes extra hard in Nashville with first rock-crushing TBM
The Boring Company’s machine for the project is now in final testing.
The Boring Company is gearing up to tackle one of its toughest projects yet, a new tunnel system beneath Nashville’s notoriously tough limestone terrain. Unlike the soft-soil conditions of Las Vegas and Austin, the Music City Loop will require a “hard-rock” boring machine capable of drilling through dense, erosion-resistant bedrock.
A boring hard-rock tunneling machine
The Boring Company revealed on X that its new hard-rock TBM can generate up to 4 million pounds of grip force and 1.5 million pounds of maximum thrust load. It also features a 15-filter dust removal system designed to keep operations clean and efficient while excavating through hard rock.
Previous Boring Co. projects, including its Loop tunnels in Las Vegas, Austin, and Bastrop, were dug primarily through soft soils. Nashville’s geology, however, poses a different challenge. Boring Company CEO and President Steve Davis mentioned this challenge during the project’s announcement in late July.
“It’s a tough place to tunnel, Nashville. If we were optimizing for the easiest places to tunnel, it would not be here. You have extremely hard rock, like way harder than it should be. It’s an engineering problem that’s fairly easy and straightforward to solve,” Davis said.
Nashville’s limestone terrain
Experts have stated that the city’s subsurface conditions make it one of the more complex tunneling environments in the U.S. The Outer Nashville Basin is composed of cherty Mississippian-age limestone, a strong yet soluble rock that can dissolve over time, creating underground voids and caves, as noted in a report from The Tennessean.
Jakob Walter, the founder and principal engineer of Haushepherd, shared his thoughts on these challenges. “Limestone is generally a stable sedimentary bedrock material with strength parameters that are favorable for tunneling. Limestone is however fairly soluble when compared to other rock materials, and can dissolve over long periods of time when exposed to water.
“Unexpected encounters with these features while tunneling can result in significant construction delays and potential instability of the excavation. In urban locations, structures at the ground surface should also be constantly monitored with robotic total stations or similar surveying equipment to identify any early signs of movement or distress,” he said.
Elon Musk shares ridiculous fact about Optimus’ hand demos
It appears that Optimus’ V3 iteration is still very much under wraps.
Elon Musk recently revealed something quite shocking about the Optimus demonstration hand that was showcased at the 2025 Annual Shareholder Meeting. As per the CEO, the complex robotic hand that impressed the event’s attendees was not a component of Optimus V3 at all.
Optimus’s hand
Even at Tesla’s We, Robot event last year, the company showcased a robotic hand that seemed capable of performing complex tasks. A similar hand was showcased at the recent investor event. It was then no surprise that some attendees and EV community members assumed that the very dexterous robotic component was a preview of Optimus V3’s hand.
As per a recent post from Elon Musk on X, however, this was not the case. While the robotic hand that Tesla showcased at the 2025 Annual Shareholder Meeting was already very impressive, it was still a V2 component. Responding to a quote post from his mother, Maye Musk, who noted that “Elon told me a few times that the hand is the most difficult part of the robot,” the CEO confirmed as much.
“This is just the V2 Optimus hand. The V3 hand is another level beyond this. Exquisite engineering,” Musk wrote in his post on X.
Not like Tesla
Tesla is designing Optimus to be a potential replacement for humans in some of the world’s most delicate tasks, such as surgery. It is extremely important, then, for Optimus’ hand to be dexterous and refined in its movements. This is something that even other companies producing humanoid robots have yet to fully accomplish. Musk highlighted this during the Annual Shareholder Meeting, when he discussed how Tesla is really the only company that can scale humanoid robots properly.
“You will see certainly many companies showing demonstration robots. There’s really three things that are super difficult about robots. One is the engineering of the forearm and hand because the human hand is an incredible thing, actually. It’s super dexterous.
“So, engineering the hand really well, the real-world AI, and then volume manufacturing. Those are generally the things that are missing. One or more of those things are missing from other companies. So Tesla is the only one that has all three of those,” Musk said.