Tesla announced that its second AI Day will be held in Palo Alto. Many had speculated it would be held in Austin, given all the hard work employees have put into building Gigafactory Texas. The 10-million-square-foot building is a sight to behold for those of us who have seen it in person, and even more so if you've been inside it.
While most of the U.S. (the East and Gulf Coasts, at least) was sleeping, Tesla dropped a late-night tweet with the caption “AI Day 2022 on Sept 30 🤖.” The image named Palo Alto as the venue, and the little robot emoji hints that Tesla’s plans to reveal a working Optimus prototype are on track.
During my own interview with Tesla CEO Elon Musk, he told me I would be able to attend in person again this year. Last year, Tesla invited me at the last minute, and I scrambled to get there. To be honest, I am not an AI expert, but the Tesla staff I met were accommodating and kind, and did their best to explain things at a level I could understand.
I learned a lot while I was there, and for those who doubt Tesla’s progress with Full Self-Driving, I think you need to not only watch the live streams of this year’s and last year’s presentations but also take notes.
Last year, Tesla had its beautiful Cybertruck on display and provided snacks for all of the attendees.
Last year, I learned more about AI and its role at Tesla than I had in my entire life. There, Tesla showed that it is much more than an EV company: it has deep AI work in its hardware, at the inference level, and at the training level. That day, Tesla established itself as a leader in real-world AI, and its FSD Beta software is just one application of it.
One key takeaway from last year’s AI Day, overshadowed by Tesla’s announcement that it was building a robot, came from then-Director of AI Andrej Karpathy:
“What I find kind of fascinating about this is we are effectively building a synthetic animal from the ground up. So the car can be thought of as an animal. It moves around, it senses the environment, acts autonomously and intelligently, and we are building all of the components from scratch and in-house.”
“When we designed the visual cortex of the car, we also wanted to design the neural architecture of how the information flows in the system.”
Not only will we get to learn more about Tesla’s new Optimus bot, but I think we will also learn quite a bit more about the Occupancy Networks that enable the car to perceive the space around it as a human or animal would. Tesla’s Autopilot Software Director, Ashok Elluswamy, recently shared a deep dive into Occupancy Networks, and you can read more about that here.
Note: Johnna is a Tesla shareholder and supports its mission.