Tesla’s AI Day is here. In a few minutes, Tesla watchers will see executives like Elon Musk give an in-depth discussion of the company’s AI efforts, not just in its automotive business but in its energy business and beyond. AI Day promises to be yet another tour-de-force of technical information from the electric car manufacturer, so it is no surprise that there is a lot of excitement from the EV community heading into the event.
Tesla has kept the details of AI Day behind closed doors, so the specifics of the actual event are scarce. That being said, an AI Day agenda sent to attendees indicated that they could expect to hear Elon Musk speak during a live keynote, speak with Andrej Karpathy and the rest of Tesla’s AI engineers, and participate in breakout sessions with the teams behind Tesla’s AI development.
Similar to Autonomy Day and Battery Day, Teslarati will be following along with AI Day’s discussions to provide you with an updated account of the highly-anticipated event. Please refresh this page from time to time, as notes, details, and quotes from Elon Musk’s keynote and the discussions that follow will be posted here.
Simon 19:40 PT – A question about the use cases for the Tesla Bot was asked. Musk notes that the Tesla Bot would start with boring, repetitive work, or work that people would least like to do.
Simon 19:25 PT – A question is asked about AI and manufacturing and how it potentially relates to the “Alien Dreadnought” concept. Musk notes that most of Tesla’s manufacturing today is already automated. He also noted that humanoid robots will be built by someone either way, so Tesla might as well take on the project and do it safely. “We’re making the pieces that would be useful for a humanoid robot, so we should probably make it. If we don’t someone else will — and we want to make sure it’s safe,” Musk said.
Simon 19:15 PT – And the Q&A starts. First question involves open-sourcing Tesla’s innovations. Musk notes that it’s pretty expensive to develop all this tech, so he’s not sure how things could be open-sourced. But if other car companies would like to license the system, that could be done.
Simon 19:11 PT – There will really be a “Tesla Bot.” It would be built by humans, for humans. It would be friendly, and it would eliminate dangerous, repetitive, and boring tasks. This is still pretty darn unreal. It uses the systems that are currently being developed for the company’s vehicles. “There will be profound applications for the economy,” Musk said.
Simon 19:06 PT – New products! A whole Tesla suit?! After a fun skit, Elon says the “Tesla Bot” would eventually be real.
Simon 19:00 PT – What is crazy is that Dojo is not even done. This is just what it is today. Dojo is still evolving, and it is going to be way more powerful in the future. Now, it’s Elon Musk’s turn. What’s next for Tesla beyond vehicles?
Simon 19:00 PT – Venkataramanan teases the ExaPOD. Yet another revolutionary solution from Tesla. With all this, it is evident that Tesla’s approach to autonomy is on a whole other level. It would not be surprising if it takes Wall Street and the market a few days to fully absorb what is happening here.
Simon 18:55 PT – The specs of Dojo are insane. Behind its beastly specs, it seems that Dojo’s full potential lies in the fact that all this power is being used to do one thing: to make autonomous cars possible. Dojo is a pure learning machine, with more than 500,000 training nodes being built together. Nine petaflops of compute per tile, 36 terabytes per second of off-tile bandwidth. But this is just the tip of the iceberg for Dojo.
Simon 18:50 PT – Ganesh Venkataramanan, Project Dojo’s lead, takes the stage. He states that Elon Musk wanted a super-fast training computer to train Autopilot, and thus Project Dojo was born. Dojo is a distributed compute architecture connected by a network fabric. It also features a large compute plane, extremely high bandwidth with low latencies, and big networks that are partitioned and mapped, to name a few of its characteristics.
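To give a rough sense of why a training machine like Dojo is built around enormous inter-node bandwidth: in data-parallel training, every node computes gradients on its own slice of data, and then all nodes must exchange and average those gradients every step before updating their weights. The single-process Python sketch below only simulates that exchange; the tiny model, the four-node count, and every name in it are illustrative assumptions, not Tesla’s software stack.

```python
# A minimal, single-process sketch (purely illustrative, not Tesla's stack)
# of data-parallel training: each "node" computes gradients locally, then the
# gradients are averaged across nodes before an identical weight update.
import copy
import torch
import torch.nn as nn

def simulated_data_parallel_step(replicas, optimizers, batches, loss_fn):
    # 1) Each simulated node computes gradients on its own shard of the batch
    for model, opt, (x, y) in zip(replicas, optimizers, batches):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
    # 2) "All-reduce": average each parameter's gradient across nodes.
    #    On real hardware this exchange is what consumes fabric bandwidth.
    for params in zip(*(m.parameters() for m in replicas)):
        mean_grad = torch.mean(torch.stack([p.grad for p in params]), dim=0)
        for p in params:
            p.grad = mean_grad.clone()
    # 3) Every node applies the same averaged update, keeping replicas in sync
    for opt in optimizers:
        opt.step()

if __name__ == "__main__":
    base = nn.Linear(16, 1)
    replicas = [copy.deepcopy(base) for _ in range(4)]  # 4 simulated nodes
    optimizers = [torch.optim.SGD(m.parameters(), lr=0.01) for m in replicas]
    batches = [(torch.randn(8, 16), torch.randn(8, 1)) for _ in range(4)]
    simulated_data_parallel_step(replicas, optimizers, batches, nn.MSELoss())
    print("replicas still identical:",
          torch.allclose(replicas[0].weight, replicas[1].weight))
```

On real hardware, step 2 is the part that hammers the interconnect, which is why figures like 36 terabytes per second of off-tile bandwidth matter so much for training throughput.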

Simon 18:45 PT – Milan Kovac, Tesla’s Director of Autopilot Engineering takes the stage. He notes that he would discuss how neural networks are run in the company’s cars. He notes that Tesla’s systems require supercomputers.
Simon 18:40 PT – Ashok notes that simulations have helped Tesla a lot already. They have, for example, helped the company improve pedestrian, bicycle, and vehicle detection and kinematics. The networks in the vehicles were trained on 371 million simulated images and 480 million cuboids.
Simon 18:35 PT – Ashok notes that these strategies ultimately helped Tesla retire radar from its FSD and Autopilot suite and adopt a pure vision model. A comparison between a radar+camera system and pure vision shows just how much more refined the company’s current strategy is. The executive also touched on how simulations help Tesla develop its self-driving systems. He states that simulations help when data is difficult to source, difficult to label, or in a closed loop.
Simon 18:30 PT – Ashok returns to discuss Auto Labeling. Simply put, there is so much labeling that needs to be done that it’s impossible to do it all manually. He shows how the road and the objects on it are “reconstructed” from a single car that’s driving. This effectively allows Tesla to label data much faster, while allowing vehicles to navigate safely and accurately even when occlusions are present.
Simon 18:25 PT – Karpathy returns to talk about manual labeling. He notes that manual labeling that’s outsourced to third-party firms is not optimal. Thus, in the spirit of vertical integration, Tesla opted to establish its own labeling team. Karpathy notes that in the beginning, Tesla was using 2D image labeling. Eventually, Tesla transitioned to 4D labeling, where the company could label in vector space. But even this was not enough, and thus, auto labeling was developed.
Simon 18:23 PT – The executive states that traffic behavior is extremely complicated, especially in several parts of the world. Ashok notes that this is partly illustrated by parking lots and how complex they actually are. Summoning a car from a parking lot, for example, used to require around 400k nodes to navigate, resulting in a system whose performance left much to be desired.
Simon 18:18 PT – Ashok notes that when driving alongside other cars, Autopilot must not only plan how it will drive, it must also anticipate how other cars will operate. He shows a video of a Tesla navigating a road and dealing with multiple vehicles to demonstrate this point.
Simon 18:15 PT – Director of Autopilot Software Ashok Elluswamy takes the stage. He starts off by discussing some key problems in planning in both non-convex and high-dimensional action spaces. He also shows Tesla’s solution to these issues, a “Hybrid Planning System.” He demonstrates this by showing how Autopilot performs a lane change.
Simon 18:10 PT – Karpathy’s discussion notes that today, Tesla’s FSD strategy is a lot more cohesive. This is demonstrated by the fact that the company’s vehicles can effectively draw a map in real time as they drive. This is a massive difference compared to the pre-mapped strategies employed by rivals in both the automotive and software fields, such as Super Cruise and Waymo.
To solve several problems encountered over the last few years with the previous suite, Tesla re-engineered their NN learning from the ground up and utilized a multi-head route, camera calibrations, caching, queues, and optimizations to streamline all tasks.
(heavily simplified) pic.twitter.com/LG2TRgjxip
— Teslascope (@teslascope) August 20, 2021
Simon 18:05 PT – The AI Director discusses how Tesla practically re-engineered its neural network learning from the ground up, utilizing a multi-head route along with camera calibrations, caching, queues, and optimizations to streamline all tasks. Do note that this is an extremely simplified iteration of Karpathy’s discussion so far.
Simon 18:00 PT – Karpathy covers more challenges that are involved in even the basics of perception. Needless to say, AI Day is quickly proving to be Tesla’s most technical event right off the bat. That said, multi-camera networks are amazing. They’re just a ton of work, but they may very well be a silver bullet for Tesla’s predictive efforts.
Simon 17:56 PT – Karpathy showcases a video of how Tesla used to process its image data in the past, a popular FSD clip that has been shared before. He notes that while great, such a system proved to be inadequate, and this is something that Tesla learned when it launched Smart Summon. Per-camera detection works well, but stitching those detections into vector space proved inadequate.
Simon 17:55 PT – Karpathy notes that when Tesla designs the visual cortex in its cars, the company models it on how biological vision is perceived by the eyes. He also touches on how Tesla’s visual processing strategies have evolved over the years, and how processing is done today. The AI Director also touches on Tesla’s “HydraNets,” so named on account of their multi-task learning capabilities.
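To give readers a rough sense of what “multi-task learning” means here: one shared backbone network extracts features from a camera image once, and several small task-specific heads branch off those shared features. The Python sketch below illustrates only that general structure; the layer sizes, head names, and tasks are made-up assumptions for illustration, not Tesla’s actual HydraNet code.

```python
# A minimal sketch (not Tesla's actual code) of the multi-task "HydraNet" idea:
# one shared backbone runs once per image, and several lightweight task heads
# branch off its features. All names and shapes are illustrative.
import torch
import torch.nn as nn

class HydraNetSketch(nn.Module):
    def __init__(self, num_object_classes=10, num_lane_outputs=4):
        super().__init__()
        # Shared backbone: expensive feature extraction, amortized across tasks
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Task-specific heads that reuse the same backbone features
        self.object_head = nn.Linear(64, num_object_classes)  # e.g., object classification
        self.lane_head = nn.Linear(64, num_lane_outputs)       # e.g., lane geometry outputs

    def forward(self, image):
        features = self.backbone(image)  # computed once per frame
        return {
            "objects": self.object_head(features),
            "lanes": self.lane_head(features),
        }

if __name__ == "__main__":
    model = HydraNetSketch()
    outputs = model(torch.randn(1, 3, 224, 224))  # one camera frame, batch of 1
    print({name: out.shape for name, out in outputs.items()})
```

The appeal of this structure is that the expensive feature extraction is shared across every task; the trade-off is that the tasks have to be balanced against each other during training.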

Simon 17:51 PT – Karpathy starts off by discussing the visual component of Tesla’s AI, as characterized by the eight cameras used in the company’s vehicles. The AI director notes that the system can be thought of like a biological being, built from the ground up, including its synthetic visual cortex.
Simon 17:48 PT – Elon Musk takes the stage. He apologizes for the event’s delay and jokes that Tesla probably needs AI to solve these “technical difficulties.” The CEO highlights that AI Day is a recruitment event. He then calls up Tesla’s head of AI, Andrej Karpathy; there’s no better person to discuss AI.
Simon 17:45 PT – We’re here watching the AI Day FSD preview video and we can’t help but notice that… are those Waypoints?!
Simon 17:38 PT – Looks like we’ve got an Elon sighting! And a preview video too! Here we go, folks!
We’ve got an Elon sighting
— Rob Maurer (@TeslaPodcast) August 20, 2021
Simon 17:30 PT – A 30-minute delay. We haven’t seen a delay like this in quite a while.
Simon 17:20 PT – It’s a good thing that Tesla has great taste in music. Did Grimes mix this track?
Simon 17:15 PT – We’re 15 minutes in. “Elon Time” is going strong on AI Day. To be honest, though, this music would fit the “Rave Cave” in Giga Berlin this coming October.
Simon 17:10 PT – A good thing to keep in mind is that AI Day is a recruitment event. Some food for thought just in case the discussions take a turn for the extremely technical. AI Day is designed to attract individuals who speak Tesla’s language in its rawest form. We’re just fortunate enough to come along for the ride.
Tesla Board Member Hiro Mizuno sums it up in this tweet pretty well.
Anybody passionate about real world AI !! https://t.co/ydaWQlkE4O
— HIRO MIZUNO (@hiromichimizuno) August 20, 2021
Simon 17:05 PT – I guess AI Day is starting on “Elon Time”? We’re on to the next track of chill music.
Simon 17:00 PT – And with 5 p.m. PT here, the music is officially live on the AI Day live stream. Looks like we’re in for some wait. Wonder how many minutes it will take before things start? Gotta love this chill music, though.
Simon 16:58 PT – While waiting, I can’t help but think that a ton of TSLA bears and Wall Street would likely not understand the nuances of what Tesla will be discussing today. Will Tesla go three-for-three? The company certainly delivered with Autonomy Day and Battery Day.
Made it pic.twitter.com/aAWqxgf0bP
— Johnna (@JohnnaCrider1) August 19, 2021
Simon 16:55 PT – T-minus 5 minutes. Some attendees of AI Day are now posting photos on Twitter, but it seems like photos and videos are not allowed in the actual venue of the event. Pretty much expected, I guess.
Simon 16:50 PT – Greetings, everyone, and welcome to another Live Blog. This is Tesla’s most technical event yet, so I expect this one to go extremely in-depth on the company’s AI efforts and the technology behind it. We’re pretty excited.
Don’t hesitate to contact us with news tips. Just send a message to tips@teslarati.com to give us a heads up.
Tesla Summon got insanely good in FSD v14.3.2 — Navigation? Not so much
Tesla Full Self-Driving v14.3.2 began rolling out to some owners earlier this week, and there are some notable improvements that came with this update.
There were two new lines of improvements in the release notes: one addressing Actually Smart Summon (ASS), and another that now allows drivers to choose a reason for an intervention via a small menu during disengagement.
Overall operation saw a handful of slight improvements, especially with parking performance, which has been the most noticeable difference since the arrival of FSD v14.3. However, there are still some significant shortcomings, most notably with region-specific signage and navigation.
Tesla Actually Smart Summon (ASS) improvements
There are noticeable improvements to ASS operation, which has definitely been inconsistent in terms of performance. Tesla wrote in the release notes for v14.3.2:
“Unified the model between Actually Smart Summon, FSD, and Robotaxi for more capable and reliable behavior.”
As recently as this month, I used Summon with no success. The car pulled around the parking lot incorrectly, drove outside the range at which Summon can be operated, and lost its signal while moving in the middle of the lot.
This caused me to sprint across the lot to retrieve the vehicle:
It was pouring when I left the gym so I tried to Summon my Model Y
It turned the opposite way and drove out of range, stopping here and forcing me to walk even further across the lot in the rain for it 🤣
One day pic.twitter.com/iD10c8sriB
— TESLARATI (@Teslarati) April 5, 2026
Unfortunately, Summon was not dependable or accurate enough to use regularly. It appears Tesla might have bridged the gap needed to make it an effective feature, as two tests in parking lots proved that Summon was more responsive and faster to navigate to the location chosen.
It also did so without hesitation, confidently, and at a comfortable speed. I was able to test it twice at different distances:
🚨 Tesla FSD v14.3.2 ASS testing part 1
This was a significant improvement than recent tries using ASS. The parking lot was pretty empty but getting it to come to my location in one singular motion and maneuver was encouraging. https://t.co/vF7TS48GGV pic.twitter.com/sYt8tyHgNn
— TESLARATI (@Teslarati) April 23, 2026
Tesla Full Self-Driving v14.3.2 ASS testing part 2 https://t.co/lxfWfnLUxf pic.twitter.com/2R0r3ohI3M
— TESLARATI (@Teslarati) April 23, 2026
I plan to test this more thoroughly and regularly over the next few weeks. I initially avoided using it in a congested parking lot because I have not had overwhelming success with Summon in the past; I wanted to set a low baseline to see if it could simply pull up to the place I pinned in the Tesla app.
It was two for two, which is a big improvement because I don’t think I ever had successful Summon attempts back-to-back. It just seems more confident than ever before.
New Disengagement Categories
This is a really good idea from Tesla, but there are some issues with it. The categories you can select are Critical, Comfort, Preference, and Other.
I think prompting drivers with the actual reasons people take over, such as “Traveling Too Fast,” “Incorrect Maneuver,” or “Navigation Error,” would be more beneficial.
I say this because it seems that how we each categorize things might be different. For example, I shared a video of an intervention because the car had navigated to an exit to a parking lot and put its left blinker on, despite left turns not being allowed there.
I disengaged and chose Critical as the reason; it’s not a comfort issue, it’s not a preference, it’s quite literally an illegal turn, and it’s also dangerous because it cuts across several lanes of traffic and is a 180-degree maneuver.
I chose to label this Navigation error as “Critical” while testing FSD v14.3.2
Here’s why:
✅ This intervention wasn’t “preference,” as the maneuver FSD routed was illegal
✅ If a police officer saw this maneuver, it would result in a ticket https://t.co/znhHb4haAo pic.twitter.com/bZOiLwWmQa
— TESLARATI (@Teslarati) April 23, 2026
Some said I should not have labeled this as Critical, but that’s the category that best characterized the disengagement for me.
Categorizing interventions is a good thing, but it’s kind of hard to determine how to label them correctly.
Inconsistency with Regional Traffic Patterns
Tesla Full Self-Driving is pretty inconsistent with how it handles regional or local traffic patterns and road rules. The most frequent example I like to use is that of the “Except Right Turn” stop sign, which has become a notorious sighting on our social media platforms.
In the initial rollout of v14.3, my Model Y successfully navigated through one of these stop signs with no issues. However, testing at two of these stop signs yesterday proved that FSD is still not sure how to read these signs and navigate through them properly.
🚨 Tesla FSD v14.3.2 attempts the “Except Right Turn” stop sign: https://t.co/W5MjAybaNK pic.twitter.com/P6oeUsk4PN
— TESLARATI (@Teslarati) April 23, 2026
Off camera, I approached another one of these signs and felt the car coming to a stop, so I nudged it forward by pressing the accelerator pedal.
This helped the car go through the sign without stopping, but I could feel the bucking of the vehicle as the car really wanted to stop.
Musk said on the earnings call earlier this week that unsupervised FSD would probably be available in some regions before others, including a state-to-state basis in the U.S.
“It’s difficult to release this like to everyone everywhere all at once because we do want to make sure that they’re not unique situations in a city that particularly complex intersection or — actually, they tend to be places where people get into accidents a lot because they’re just — perhaps there’s — and like I said, an unsafe intersection or bad road markings or a lot of weather challenges. So I think we would release unsupervised gradually to the customer fleet as we feel like a particular geography is confirmed to be safe.”
This could be one of those examples that Tesla just has to figure out.
Highway Operation
Full Self-Driving is already pretty good at routine roadway navigation, so I don’t have too much to report here.
However, I was happy with FSD’s decision-making at several points, including its choice not to pass a slightly slower car and remain in the right lane as we approached the off-ramp:
🚨 Tesla FSD v14.3.2 highway operation: generally happy with the performance here, especially behavior near the exit
Love that the car got over in the right lane after its final pass, and stayed there as the off ramp was approaching https://t.co/qVRVhg6XGR pic.twitter.com/1ELwHf2XKS
— TESLARATI (@Teslarati) April 23, 2026
Better Maneuvering at Stop Signs
Many FSD users report strange operation at stop signs, especially at four-way intersections where the stop sign and the painted stop line are not even with one another.
I experienced this quite frequently and found that FSD would actually double stop: once at the stop sign and again at the line.
This created some interesting scenarios for me, and I had many cars honk at me when the second stop happened. Other drivers who had waved me on to proceed through the intersection would become frustrated at the second stop.
FSD seems to have worked through this particular maneuver:
🚨 Tesla FSD v14.3.2 with a singular stop at the correct spot
No double stopping anymore in my experience https://t.co/Wd0TaNjc1R pic.twitter.com/CdQPvJHaAM
— TESLARATI (@Teslarati) April 23, 2026
FSD should know to go to the more appropriate location (whichever provides better visibility), and proceed when it is the car’s turn to move. The double stop really ruined the flow of traffic at times and generally caused some frustration from other drivers.
Tesla plans to resolve its angriest bunch of owners: here’s how
Tesla has a plan to make Hardware 3 owners whole after CEO Elon Musk admitted that those with that self-driving chip in their cars will not have access to unsupervised Full Self-Driving.
The company’s strategy is so crazy that it is sort of hard to believe.
Since the rollout of the AI4 chip in Tesla vehicles, owners with the last generation self-driving chip, known as Hardware 3, have been persistent in their quest for a solution to their issue: they were told their cars were capable of unsupervised Full Self-Driving. It turns out the cars are not.
Tesla owners with HW3 finally get their answer: https://t.co/CSZTKKkWXx
— TESLARATI (@Teslarati) April 22, 2026
During the Tesla Q1 earnings call on Wednesday, Musk finally clarified what the company’s plans are for Hardware 3 owners, what they will be offered, and what Tesla will have to do internally to prepare for it.
The answer was somewhat mind-boggling.
Musk said:
“Unfortunately, Hardware 3 — I wish it were otherwise, but Hardware 3 simply does not have the capability to achieve unsupervised FSD. We did think at one point it would have that, but relative to Hardware 4, it has only 1/8 of the memory bandwidth of Hardware 4. And memory bandwidth is one of the key elements needed for unsupervised FSD.”
He continued, stating that HW3 owners would have the opportunity to trade their cars in at a discounted rate in order to get the AI4 chip:
“So for customers that have bought FSD, what we’re offering is essentially a trade-in — like a discounted trade-in for cars that have AI4 hardware, and we’ll also be offering the ability to upgrade the car, to replace the computer. And you also need to replace the cameras, unfortunately, to go to Hardware 4.”
Obviously, Tesla has a lot of owners to work with to make this whole thing right. Musk was adamant that HW3 would be capable of FSD, and now that the company has finally admitted that it is not, there are some things that could come of this.
There has been open talk about some sort of class action lawsuit against Tesla. The promises that Tesla made previously could be considered a breach of contract or even false advertising, and that’s according to Grok, Musk’s own AI program.
Musk went on to say that Tesla would likely have to establish new microfactories to effectively and efficiently replace HW3 computers and cameras:
“…So to do this efficiently, we’re going to have to set up, like kind of micro factories or small factories in major metropolitan areas in order to do it efficiently. Because if it’s done just at the service center, it is extremely slow to do so and inefficient. So we basically need like many production lines to make the change.”
This is going to be an extremely costly process, especially if Tesla has to buy real estate, properties, and equipment to complete this work. Additionally, there was no word on pricing, but Musk never said it would be free. It will likely come with some kind of price tag, and HW3 owners, after being left hanging for so long, will have something to say about that.
SpaceX just got pulled into the biggest Weapons Program in U.S. history
SpaceX joins the Golden Dome software group, deepening its role in America’s most expensive defense program.
SpaceX has joined a nine-company group developing the core operating software for the Golden Dome, America’s next-generation missile defense system. According to a Bloomberg report, SpaceX is focused on integrating satellite communications for military operations and is working alongside eight other defense and artificial intelligence companies, including Anduril Industries, Palantir Technologies, and Aalyria Technologies, to build software connecting missile defense capabilities.
The Golden Dome concept dates back to President Trump’s 2024 campaign, and on January 27, 2025, he signed an executive order directing the U.S. Armed Forces to construct the system before the end of his term. The system is planned to employ a constellation of thousands of satellites equipped with interceptors, with data centers in space providing automated control through an AI network.
Space Force Gen. Michael Guetlein, director of the Golden Dome initiative, has described the software layer as a “glue layer” that would enable officers to manage and control radars, sensors, and missile batteries across services. The consortium is aiming to test the platform this summer.
Trump selected a design in May 2025 with a $175 billion price tag, expected to be operational by the end of his term in 2029, though the Congressional Budget Office projected the cost could reach $831 billion over two decades.
The Golden Dome role is only the latest in a string of military wins for SpaceX. As Teslarati reported, the U.S. Space Force awarded SpaceX a $178.5 million task order on April 1, 2026 to launch missile tracking satellites for the Space Development Agency, covering two Falcon 9 launches beginning in Q3 2027. That came on top of more than $22 billion in government contracts held by SpaceX as of 2024, per CEO Gwynne Shotwell, spanning NASA resupply missions, classified intelligence satellites through its Starshield program, and military broadband.
The accumulation of defense contracts, now including a seat at the table on the most expensive weapons program in U.S. history, positions SpaceX as the dominant infrastructure provider for American national security in space. With a SpaceX IPO still on the horizon, each new contract adds weight to what is already one of the most consequential companies in aerospace history, raising real questions about how much of America’s defense architecture will depend on a single private operator before it ever trades publicly.