Elon Musk debunks WSJ report about Tesla licensing AI models from xAI

Credit: xAI/X

Elon Musk has debunked the claims of a recent Wall Street Journal report suggesting that Tesla and xAI have discussed a potential deal in which the electric vehicle maker would license the AI startup’s models to help develop technologies such as Full Self-Driving (FSD). According to Musk, Tesla has no need to license anything from xAI.

Citing people familiar with the matter, the WSJ noted that the proposed deal between Tesla and xAI has been described to investors. Under the arrangement, Tesla would reportedly license xAI’s artificial intelligence models to help power FSD, and it would then share some of FSD’s revenue with xAI.

The Journal’s report also claimed that xAI would develop other features for Tesla’s vehicles, such as a Siri-esque voice assistant and the software that would power Optimus, the company’s humanoid robot.

In a post on X, Musk noted that while he has not read the WSJ report in full, its overall claim is inaccurate. Musk acknowledged that Tesla has learned a lot from discussions with xAI’s engineers, but he maintained that there is no need to license anything from the AI startup.

Musk also explained that there are major differences between the AI models being developed by xAI and Tesla. xAI’s models are gigantic and could not possibly run on Tesla’s vehicles, while the EV maker’s models feature incredibly dense intelligence since they are focused on real-world driving.

“Haven’t read the article, but the above is not accurate. Tesla has learned a lot from discussions with engineers at xAI that have helped accelerate achieving unsupervised FSD, but there is no need to license anything from xAI.

“The xAI models are gigantic, containing, in compressed form, most of human knowledge, and couldn’t possibly run on the Tesla vehicle inference computer, nor would we want them to. The Tesla AI models have incredibly “dense” (in a good way, lol) intelligence, as they compress video of reality into driving commands, but must operate on a ~300W computer with memory size and bandwidth far lower than, say, an H100 GPU. 

“Tesla real-world AI also has a vastly larger context size than an LLM, as the combined video history from all cameras is several gigabytes in size,” Musk wrote. 

Don’t hesitate to contact us with news tips. Just send a message to simon@teslarati.com to give us a heads up.

Simon Alvarez: Simon is a reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday.