Tesla’s AI Day is here. In a few minutes, Tesla watchers will see executives like Elon Musk provide an in-depth discussion of the company’s AI efforts, covering not just its automotive business but its energy business and beyond. AI Day promises to be yet another tour de force of technical information from the electric car maker. It is thus no surprise that there is a lot of excitement in the EV community heading into the event.
Tesla has kept the details of AI Day behind closed doors, so the specifics of the actual event are scarce. That being said, an AI Day agenda sent to attendees indicated that they could expect to hear Elon Musk speak during a live keynote, speak with Andrej Karpathy and the rest of Tesla’s AI engineers, and participate in breakout sessions with the teams behind Tesla’s AI development.
Similar to Autonomy Day and Battery Day, Teslarati will be following along with AI Day’s discussions to provide you with an updated account of the highly-anticipated event. Please refresh this page from time to time, as notes, details, and quotes from Elon Musk’s keynote and the discussions that follow will be posted here.
Simon 19:40 PT – A question about the use cases for the Tesla Bot was asked. Musk notes that the Tesla Bot would start with boring, repetitive work, or work that people would least like to do.
Simon 19:25 PT – A question is asked about AI and manufacturing, and how it potentially relates to the “Alien Dreadnought” concept. Musk notes that most of Tesla’s manufacturing today is already automated. Musk also noted that humanoid robots would be built either way, so Tesla might as well take on the project and do it safely. “We’re making the pieces that would be useful for a humanoid robot, so we should probably make it. If we don’t someone else will — and we want to make sure it’s safe,” Musk said.
Simon 19:15 PT – And the Q&A starts. First question involves open-sourcing Tesla’s innovations. Musk notes that it’s pretty expensive to develop all this tech, so he’s not sure how things could be open-sourced. But if other car companies would like to license the system, that could be done.
Simon 19:11 PT – There will really be a “Tesla Bot.” It would be built by humans, for humans. It would be friendly, and it would eliminate dangerous, repetitive, boring tasks. This is still pretty darn unreal. It uses the systems that are currently being developed for the company’s vehicles. “There will be profound applications for the economy,” Musk said.
Simon 19:06 PT – New products! A whole Tesla suit?! After a fun skit, Elon says the “Tesla Bot” would eventually be real.
Simon 19:00 PT – What is crazy is that Dojo is not even done. This is just what it is today. Dojo is still evolving, and it is going to be way more powerful in the future. Now, it’s Elon Musk’s turn. What’s next for Tesla beyond vehicles?
Simon 19:00 PT – Venkataramanan teases the ExaPOD. Yet another revolutionary solution from Tesla. With all this, it is evident that Tesla’s approach to autonomy is on a whole other level. It would not be surprising if it takes Wall Street and the market a few days to fully absorb what is happening here.
Simon 18:55 PT – The specs of Dojo are insane. Behind its beastly specs, it seems that Dojo’s full potential lies in the fact that all this power is being used to do one thing: to make autonomous cars possible. Dojo is a pure learning machine, with more than 500,000 training nodes working together: nine petaflops of compute per tile, 36 terabytes per second of off-tile bandwidth. But this is just the tip of the iceberg for Dojo.
Simon 18:50 PT – Ganesh Venkataramanan, Project Dojo’s lead, takes the stage. He states that Elon Musk wanted a super-fast training computer to train Autopilot. And thus Project Dojo was born. Dojo is a distributed compute architecture connected by network fabric. It also has a large compute plane, extremely high bandwidth with low latencies, and big networks that are partitioned and mapped, to name a few.

Simon 18:45 PT – Milan Kovac, Tesla’s Director of Autopilot Engineering, takes the stage. He notes that he will discuss how neural networks are run in the company’s cars, and points out that Tesla’s systems require supercomputers.
Simon 18:40 PT – Ashok notes that simulations have helped Tesla a lot already. They have, for example, helped the company improve pedestrian, bicycle, and vehicle detection and kinematics. The networks in the vehicles were trained on 371 million simulated images and 480 million cuboids.
Simon 18:35 PT – Ashok notes that these strategies ultimately helped Tesla retire radar from its FSD and Autopilot suite and adopt a pure vision model. A comparison between a radar+camera system and pure vision shows just how much more refined the company’s current strategy is. The executive also touched on how simulations help Tesla develop its self-driving systems. He states that simulations help when data is difficult to source, difficult to label, or in a closed loop.
Simon 18:30 PT – Ashok returns to discuss Auto Labeling. Simply put, there is so much labeling that needs to be done that it’s impossible to be done manually. He shows how roads and other items on the road are “reconstructed” from a single car that’s driving. This effectively allowed Tesla to label data much faster, while allowing vehicles to navigate safely and accurately even when occlusions are present.
Simon 18:25 PT – Karpathy returns to talk about manual labeling. He notes that manual labeling that’s outsourced to third-party firms is not optimal. Thus, in the spirit of vertical integration, Tesla opted to establish its own labeling team. Karpathy notes that in the beginning, Tesla was using 2D image labeling. Eventually, Tesla transitioned to 4D labeling, where the company could label in vector space. But even this was not enough, and thus, auto labeling was developed.
Simon 18:23 PT – The executive states that traffic behavior is extremely complicated, especially in several parts of the world. Ashok notes that this is partly illustrated by parking lots, which are actually quite complex. Summoning a car from a parking lot, for example, used to utilize 400k nodes to navigate, resulting in a system whose performance left much to be desired.
Simon 18:18 PT – Ashok notes that when driving alongside other cars, Autopilot must not only plan its own driving, it must also anticipate how other cars will behave. He shows a video of a Tesla navigating a road and dealing with multiple vehicles to demonstrate this point.
Simon 18:15 PT – Director of Autopilot Software Ashok Elluswamy takes the stage. He starts off by discussing some key problems in planning in both non-convex and high-dimensional action spaces. He also shows Tesla’s solution to these issues, a “Hybrid Planning System.” He demonstrates this by showing how Autopilot performs a lane change.
Simon 18:10 PT – Karpathy’s discussion notes that today, Tesla’s FSD strategy is a lot more cohesive. This is demonstrated by the fact that the company’s vehicles can effectively draw a map in real-time as they drive. This is a massive difference compared to the pre-mapped strategies employed by rivals in both the automotive and software fields, like Super Cruise and Waymo.
To solve several problems encountered over the last few years with the previous suite, Tesla re-engineered their NN learning from the ground up and utilized a multi-head route, camera calibrations, caching, queues, and optimizations to streamline all tasks.
(heavily simplified) pic.twitter.com/LG2TRgjxip
— Teslascope (@teslascope) August 20, 2021
Simon 18:05 PT – The AI Director discusses how Tesla practically re-engineered its neural network learning from the ground up and utilized a multi-head route. The changes include camera calibrations, caching, queues, and optimizations to streamline all tasks. Do note that this is an extremely simplified iteration of Karpathy’s discussion so far.
Simon 18:00 PT – Karpathy covers more challenges that are involved in even the basics of perception. Needless to say, AI Day is quickly proving to be Tesla’s most technical event right off the bat. That said, multi-camera networks are amazing. They’re just a ton of work, but it may very well be a silver bullet for Tesla’s predictive efforts.
Simon 17:56 PT – Karpathy showcases a video of how Tesla used to process its image data in the past. He shows a popular FSD video that has been shared before. He notes that while great, such a system proved to be inadequate, and this is something that Tesla learned when it launched Smart Summon. While per-camera detection works well, projecting those detections into vector space proved inadequate.
Simon 17:55 PT – Karpathy noted that when Tesla designs the visual cortex in its cars, the company models it after how biological vision is perceived by the eyes. He also touches on how Tesla’s visual processing strategies have evolved over the years, and how it is done today. The AI Director also touches on Tesla’s “HydraNets,” so named on account of their multi-task learning capabilities.

Simon 17:51 PT – Karpathy starts off by discussing the visual component of Tesla’s AI, as characterized by the eight cameras used in the company’s vehicles. The AI director notes that AI could be considered like a biological being, and it’s built from the ground up, including its synthetic visual cortex.
Simon 17:48 PT – Elon Musk takes the stage. He apologizes for the event’s delay. He jokes that Tesla probably needs AI to solve these “technical difficulties.” The CEO highlights that AI Day is a recruitment event. He calls Tesla’s head of AI Andrej Karpathy. There’s no better person to discuss AI.
Simon 17:45 PT – We’re here watching the AI Day FSD preview video and we can’t help but notice that… are those Waypoints?!
Simon 17:38 PT – Looks like we’ve got an Elon sighting! And a preview video too! Here we go, folks!
We’ve got an Elon sighting
— Rob Maurer (@TeslaPodcast) August 20, 2021
Simon 17:30 PT – A 30-minute delay. We haven’t seen this much of a delay in quite a while.
Simon 17:20 PT – It’s a good thing that Tesla has great taste in music. Did Grimes mix this track?
Simon 17:15 PT – We’re 15 minutes in. “Elon Time” is going strong on AI Day. To be honest, though, this music would fit the “Rave Cave” in Giga Berlin this coming October.
Simon 17:10 PT – A good thing to keep in mind is that AI Day is a recruitment event. Some food for thought just in case the discussions take a turn for the extremely technical. AI Day is designed to attract individuals who speak Tesla’s language in its rawest form. We’re just fortunate enough to come along for the ride.
Tesla Board Member Hiro Mizuno sums it up in this tweet pretty well.
Anybody passionate about real world AI !! https://t.co/ydaWQlkE4O
— HIRO MIZUNO (@hiromichimizuno) August 20, 2021
Simon 17:05 PT – I guess AI Day is starting on “Elon Time”? We’re on to the next track of chill music.
Simon 17:00 PT – And with 5 p.m. PT here, the music is officially live on the AI Day live stream. Looks like we’re in for some wait. Wonder how many minutes it will take before it starts? Gotta love this chill music, though.
Simon 16:58 PT – While waiting, I can’t help but think that a ton of TSLA bears and Wall Street analysts would likely not understand the nuances of what Tesla will be discussing today. That was certainly the case with Battery Day and Autonomy Day. Will Tesla go three-for-three?
Made it pic.twitter.com/aAWqxgf0bP
— Johnna (@JohnnaCrider1) August 19, 2021
Simon 16:55 PT – T-minus 5 minutes. Some attendees of AI Day are now posting some photos on Twitter, but it seems like photos and videos are not allowed in the actual venue of the event. Pretty much expected, I guess.
Simon 16:50 PT – Greetings, everyone, and welcome to another Live Blog. This is Tesla’s most technical event yet, so I expect this one to go extremely in-depth on the company’s AI efforts and the technology behind it. We’re pretty excited.
Don’t hesitate to contact us with news tips. Just send a message to tips@teslarati.com to give us a heads up.
Elon Musk reveals unfortunate truth of Tesla Full Self-Driving development
In a candid reply to a dramatic video of Tesla’s Full Self-Driving (FSD) system averting disaster, Elon Musk laid bare a harsh reality facing autonomous vehicle technology.
Tesla’s Full Self-Driving suite is one of the most significant technological developments in terms of passenger travel in decades, but it is not all sunshine and rainbows, even with major strides in safety, CEO Elon Musk revealed.
The clip shows a Model 3 traveling at over 65 mph on a foggy, rain-soaked highway when a pedestrian suddenly steps into traffic.
Full Self-Driving instantly detects the threat and swerves safely, preventing what could have been a fatal collision for both the pedestrian and the driver’s cousin.
Musk’s response was unequivocal:
“Tesla self-driving saves a lot of lives – the statistics are unequivocal. That doesn’t mean it’s perfect, of course.” Even with a projected 10x safety improvement over human drivers, FSD would still prevent roughly 90% of the world’s approximately one million annual auto fatalities. The remaining 10%—roughly 100,000 deaths—would expose Tesla to relentless lawsuits. Meanwhile, the vast majority of lives saved would go unnoticed. “The 90% who are still alive mostly won’t even know that Tesla saved them. Nonetheless, it is the right thing to do.”
This “unfortunate truth,” as Musk implicitly framed it, highlights a fundamental asymmetry in how society perceives safety technology. Human drivers cause the overwhelming majority of crashes through distraction, fatigue, or error.
Tesla self-driving saves a lot of lives – the statistics are unequivocal.
That doesn’t mean it’s perfect, of course.
Even when we improve safety 10X, saving 90% of the million lives lost in auto accidents every year, Tesla will still get sued for the 10% who did die. The 90%… https://t.co/OrNB1mO5eF
— Elon Musk (@elonmusk) April 6, 2026
Yet when FSD errs, the incident becomes headline news and a courtroom target. Prevented tragedies, by contrast, leave no trace.
Survivors simply continue their journeys, unaware of the split-second intervention that kept them alive. The result is a distorted public narrative that amplifies failures while rendering successes invisible.
We have seen this through various headlines over the years, including the mainstream media’s habit of naming the manufacturer in accident coverage only when that manufacturer is Tesla.
Opinion: Tesla Autopilot NHTSA investigation headlines are out of control
The video’s real-world example underscores FSD’s current capabilities. In near-zero visibility, the system’s cameras and neural network reacted faster than any human could, demonstrating the life-saving potential Musk cites.
Tesla’s latest safety data already shows FSD (Supervised) performing significantly better than the U.S. average, with crashes occurring far less frequently per mile driven.
Still, regulatory scrutiny, liability concerns, and media focus on edge-case failures continue to slow widespread adoption. Musk’s frank admission suggests Tesla is prepared to push forward despite the legal and perceptual headwinds.
As FSD edges closer to unsupervised autonomy, Musk’s post serves as both a progress report and a reality check. The technology is already saving lives today.
The unfortunate truth is that proving it and scaling it responsibly will require society to value statistical lives saved as much as dramatic stories of those lost. In the race toward safer roads, perception may prove as formidable an obstacle as the fog and rain in that viral video.
Tesla Full Self-Driving v14.3: First Impressions
Tesla started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members earlier today, and I had the opportunity to see some of the improvements that were made from v14.2.2.5.
While a lot of things got better, and I truly enjoyed using Full Self-Driving again after being stuck with the wildly confusing and frustrating v14.2.2.5, Tesla still has one major problem on its hands, and it has to do with Navigation and Routing. I truly believe those issues will be the biggest challenges Tesla will face with autonomy: the car simply going the correct way, not conflicting with what the navigation says, and taking the simplest and most ideal route to a destination.
Here’s what I noticed as an improvement with my first hour with v14.3. This is not a full review, nor is it reflective of everything I will likely experience with this new version. This is simply what I saw as a noticeable improvement from the past version, v14.2.2.5.
There is also a more streamlined version on X, available at the thread below:
Tesla Full Self-Driving v14.3 testing now: pic.twitter.com/9UuP11Fv9f
— TESLARATI (@Teslarati) April 7, 2026
Yellow Light Behavior is Significantly Better
On v14.2.2.5, I had so many instances of the car slamming on the brakes to stop at a yellow light when it was clearly the safer option to proceed through. There were several times when the car would be about 20 feet from the line, traveling at 15-20 MPH, the light would turn yellow, and it would slam the brakes to stop. Because of this, I would constantly nudge it through yellow lights by putting my foot on the accelerator.
The instances I’m talking about here would not have been close calls — the car would have likely moved through the intersection completely before the light would turn red.
On multiple occasions this evening, FSD proceeded through yellow lights safely, without hesitation or any brake stabbing. It was refreshing:
🚨 Here’s an EXCELLENT example:
v14.2.2.5 would have slammed the brakes and stopped at this stop sign. I would have tapped the accelerator to proceed.
You can see the light turns yellow and the car makes — in my opinion — the correct decision to proceed. https://t.co/hHMikimkbp pic.twitter.com/Iesta1OYoV
— TESLARATI (@Teslarati) April 7, 2026
This was a huge complaint with v14.2.2.5. Sometimes, it’s safer to go through a yellow light, especially when you have traffic behind you. Slamming the brakes is a great way to get rear-ended.
Parking Performance
I had four instances of parking, and FSD v14.3 really did a flawless job. I was very impressed with how solid it was, but also with how efficiently it moved into the spot. When there was traffic around with past versions, I usually chose to park manually just because FSD took its time getting into a spot. I don’t see that being an issue anymore.
I complained about parking a lot and shared several images on X and Facebook of those examples:
Still a few issues with parking on FSD v14.2.2.4 pic.twitter.com/BphvVWDPqe
— TESLARATI (@Teslarati) February 5, 2026
No issues with it this evening. 4/4. Here are two looks:
Highway Performance
FSD v14.3 passed the five cars shown in this image:
The sixth was 200-300 yards ahead of the fifth. In v14.2.2.5, FSD would usually stay in the left lane, especially on Hurry and Mad Max. It did not do that, as it instead chose to get back over in the right lane after passing the final car.
Speed was not much of a concern here, even though it was going 21 MPH over. Although it was fast, I did have a line of cars behind me traveling at the same speed, and FSD had just merged about a half mile prior, so I chose to let it continue.
There were no instances of camping in the left lane for extended periods of time. I do want to do more testing with the Speed Profiles because they were in need of some work with the previous version. I am starting to side with those who want a Max Speed setting, which was removed last year.
Navigation and Routing Still Need Work
I was heading back toward where I came from, so I turned “Avoid Highways” on to take a different way. This confused the Routing system, and instead of turning left, then right, as the Routing said, the car turned right, then indicated for another right, basically going in a big rectangle. The car ignored the second right-hand turn and continued straight. I ended up turning “Avoid Highways” off and letting the car pick the same routing option as what took me here.
I have truly complained so much about Navigation and Routing that I’m starting to feel sort of bad. It is obviously such a massive challenge for some reason, but I am confident it will improve. I recall seeing Tesla hiring someone for this role a few months back, so perhaps there is hope for it to get better.
Smarter Behavior When Approaching Exits/Routing
This probably should be grouped in with Highway Behavior, but I wanted to highlight it on its own.
The highway exit pictured was always frustrating for v14.2.2.5. In the Hurry speed profile, I have seen it try to execute passes on multiple cars with as little as 0.6 miles to spare before taking the exit.
With three cars ahead of it, it chose to reduce speed and just wait until the exit. It was refreshing to see an improvement here, so I hope this behavior persists. Sometimes there’s just no reason to pass when you’re less than a mile from getting off the highway anyway.
Larger Visibility Warnings
Tesla seems to have increased the size of these “Camera Visibility Limited” warnings. Previously, they were just small thumbnails:
🚨 The warnings of “Camera Visibility Limited” appear to be larger with v14.3
Previously, it was a small thumbnail. Haven’t seen it this magnified before. https://t.co/iKJLsZ8P4Q pic.twitter.com/qRWwFyIZNd
— TESLARATI (@Teslarati) April 7, 2026
Stop Sign Behavior
This is probably the biggest improvement of all, because how it behaved at Stop Signs in v14.2.2.5 was so incredibly terrible and disruptive to the flow of a busy intersection.
There are several four-way, all-stop intersections near me. In the past, FSD would stop well behind the Stop Sign or the white-painted line on the road. It would then inch forward, stopping again at this line, essentially making two stops at a single intersection.
If there is visibility, I don’t truly care where FSD stops, as long as it stops once. Stopping twice just isn’t ideal or logical. I can’t imagine many humans would do it, I know I wouldn’t.
I didn’t have that issue this evening:
🚨 Here’s a look with some commentary – Previously, FSD would stop where it did in this video, then again at the white line, before proceeding. https://t.co/xwyVGMy28y pic.twitter.com/MObgUa7DoA
— TESLARATI (@Teslarati) April 7, 2026
This was pretty tight, too, in the sense that both my car and the other one got to the intersection at the same time. FSD may have stopped first, but the other vehicle was probably around the same point that I was when FSD decided to stop. I was happy to see the assertiveness to proceed; it felt like it was ideal to just go through. I was happy it didn’t stop a second time up at the line. I’d be fine if it stopped at the line, as long as that was the only stop it made.
Tesla Full Self-Driving v14.3 rolls out: here’s what’s new
We are in EAP and will be on the road with v14.3 in the coming hours, so we’ll have a lot of things to discuss over the next few days, especially coming from v14.2.2.5, which I called the most “confusing” FSD release of all time.
Tesla has officially started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members, and there are a lot of new improvements.
🚨 Tesla Full Self-Driving v14.3 is here and it is coming with so many new features
Looks like there will be some MAJOR improvements to the general performance.
Truly seems like it will be significantly different than v14.2 pic.twitter.com/mhdfBLuDup
— TESLARATI (@Teslarati) April 7, 2026
Tesla brought out a lot of improvements, according to the v14.3 release notes, which list a vast number of fixes, new features, and new capabilities.
Here’s what Tesla’s release notes for the v14.3 release state:
- Improved parking location pin prediction, now shown on a map with a P icon.
- Increased decisiveness of parking spot selection and maneuvering.
- Rewrote the AI compiler and runtime from the ground up with MLIR, resulting in 20% faster reaction time and improving model iteration speed.
- Enhanced response to emergency vehicles, school buses, right-of-way violators, and other rare vehicles.
- Mitigated unnecessary lane biasing and minor tailgating behaviors.
- Improved handling of small animals by focusing RL training on harder examples and adding rewards for better proactive safety.
- Improved traffic light handling at complex intersections with compound lights, curved roads, and yellow light stopping – driven by training on hard RL examples sourced from the Tesla fleet.
- Upgraded the Reinforcement Learning (RL) stage of training the FSD neural network, resulting in improvements in a wide variety of driving scenarios.
- Upgraded the neural network vision encoder, improving understanding in rare and low-visibility scenarios, strengthening 3D geometry understanding, and expanding traffic sign understanding.
- Improved handling for rare and unusual objects extending, hanging, or leaning into the vehicle path by sourcing infrequent events from the fleet.
- Improved handling of temporary system degradations by maintaining control and automatically recovering without driver intervention, reducing unnecessary disengagements.
Tesla also listed a handful of future improvements as well:
- Expand reasoning to all behaviors beyond destination handling
- Add pothole avoidance
- Improve driver monitoring system sensitivity with better eye gaze tracking, eye wear handling, and higher accuracy in variable lighting situations
CEO Elon Musk has said that v14.3 could be “where the last big piece of the puzzle finally lands.” We have high expectations for this release because, in a lot of ways, v14.2’s final version was extremely disappointing and seemed to be a regression more than anything.
Nevertheless, Full Self-Driving v14.3 is going to be quite an interesting test, considering this is also the first time Musk has stated the car will feel “sentient.”
Reasoning will be a bigger piece of the puzzle with this release, although there were some elements of it in v14.2.
Tesla AI Head says future FSD feature has already partially shipped
We plan to travel plenty of miles with it over the next few days, so we’ll keep you posted on what our thoughts are.