News
‘Ludicrous+’ Tesla Model S P100D could see 0-60 in 2.1 seconds when ‘stripped down’
Following yesterday’s announcement that a new ‘Ludicrous+’ mode has made Tesla’s flagship P100D even quicker by dropping its 0-60 mph time to 2.4 seconds, a new tweet from Elon Musk today suggests the car could be quicker still if it were “stripped down”.
The first of a series of tweets by Musk today noted that the new P100D Ludicrous+ Easter egg on a Model S will allow it to accelerate to 60 mph in as little as 2.34 seconds. That tweet got a response from the folks at Motor Trend, who kindly offered to validate Tesla’s claim.
Bring it on! We'd be happy to validate the P100D, as we did the P90D's Ludicrous (2.6 sec) 0-60 mph time https://t.co/WzK5OmByfz
— motortrend (@MotorTrend) January 12, 2017
Also responding was Twitter user Trevor Clark, who called out Faraday Future’s impressive 2.39-second 0-60 mph time in the FF 91, albeit through a backhanded compliment. Clark, who mentioned Musk in the tweet, noted that Faraday Future’s 1,050-horsepower electric car has no interior and uses lightweight race seats.
Related: Tech reviewer and YouTube personality MKBHD reviews Faraday Future’s FF 91
Musk would later clarify that the 2.34-second 0-60 mph time applies to a production car; a stripped-down, race-ready Tesla, however, could see a mind-boggling 0-60 mph run in 2.1 seconds. “Good point. 2.34 would be a production Tesla. Stripped down, maybe as low as 2.1.”
Good point. 2.34 would be a production Tesla. Stripped down, maybe as low as 2.1.
— Elon Musk (@elonmusk) January 12, 2017
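For a sense of what those figures imply, here is a quick back-of-the-envelope calculation (not from Tesla or Motor Trend; just the average-acceleration formula a = Δv/Δt, which ignores rollout and traction limits) showing roughly how hard each launch pulls:

# Rough average acceleration implied by the quoted 0-60 mph times.
# Illustrative only: real launches are not constant-acceleration events.
MPH_TO_MS = 0.44704                      # 1 mph in meters per second
G = 9.81                                 # standard gravity, m/s^2

def avg_g(zero_to_sixty_s):
    delta_v = 60 * MPH_TO_MS             # ~26.8 m/s
    return delta_v / zero_to_sixty_s / G

print(f"2.34 s (production):    {avg_g(2.34):.2f} g")   # ~1.17 g
print(f"2.10 s (stripped down): {avg_g(2.10):.2f} g")   # ~1.30 g

In other words, shaving roughly a quarter of a second off the run pushes the average pull from about 1.17 g to about 1.30 g.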
Aside from tweeting about Tesla’s Ludicrous+ performance, the serial tech entrepreneur also noted that the company’s next Autopilot software update is expected to see a worldwide over-the-air rollout in the coming week. First-generation Autopilot vehicles on “hardware 1” are expected to receive the update as early as this weekend, according to Musk’s tweet.
If the results look good from the latest point release, then we are days away from a release to all HW1 and to all HW2 early next week
— Elon Musk (@elonmusk) January 12, 2017
Autopilot continues to be one of Musk’s top priorities, as evidenced by the recent hiring of Chris Lattner as the new Vice President of Autopilot Software. Lattner is an 11-year veteran of Apple, where he is credited with creating Swift, Apple’s programming language for building apps.
Once the software is fully activated, drivers of cars with the Hardware 2 package will be able to enjoy what Tesla now refers to as Enhanced Autopilot. Vehicles equipped with Autopilot 2.0 hardware, also referred to as “hardware 2” by Musk, will be capable of operating in Level 5 fully autonomous mode once regulatory approval for self-driving cars on public roads is obtained.
Until then, Enhanced Autopilot “will match speed to traffic conditions, keep within a lane, automatically change lanes without requiring driver input, transition from one freeway to another, exit the freeway when your destination is near, self-park when near a parking spot and be summoned to and from your garage,” according to the company.
News
Tesla seen as early winner as Canada reopens door to China-made EVs
Tesla had already prepared for Chinese exports to Canada in 2023 by equipping its Shanghai Gigafactory to produce a Canada-specific version of the Model Y.
Tesla seems poised to be an early beneficiary of Canada’s decision to reopen imports of Chinese-made electric vehicles, following the removal of a 100% tariff that halted shipments last year.
Thanks to Giga Shanghai’s capability to produce Canadian-spec vehicles, it might only be a matter of time before Tesla is able to export vehicles to Canada from China once more.
Under the new U.S.–Canada trade agreement, Canada will allow up to 49,000 vehicles per year to be imported from China at a 6.1% tariff, with the quota potentially rising to 70,000 units within five years, according to Prime Minister Mark Carney.
Half of the initial quota is reserved for vehicles priced under CAD 35,000, a threshold above current Tesla models, though the electric vehicle maker could still benefit from the rule change, as noted in a Reuters report.
Tesla had already prepared for Chinese exports to Canada in 2023 by equipping its Shanghai Gigafactory to produce a Canada-specific version of the Model Y. That year, Tesla began shipping vehicles from Shanghai to Canada, contributing to a sharp 460% year-over-year increase in China-built vehicle imports through Vancouver.
When Ottawa imposed a 100% tariff in 2024, however, Tesla halted those shipments and shifted Canadian supply to its U.S. and Berlin factories. With tariffs now reduced, Tesla could quickly resume China-to-Canada exports.
Beyond manufacturing flexibility, Tesla could also benefit from its established retail presence in Canada. The automaker operates 39 stores across the country, while Chinese brands like BYD and Nio have yet to enter the Canadian market directly. Tesla’s relatively small lineup, which comprises four core models plus the Cybertruck, allows it to move faster on marketing and logistics than competitors with broader portfolios.
Elon Musk
Tesla confirms that work on Dojo 3 has officially resumed
“Now that the AI5 chip design is in good shape, Tesla will restart work on Dojo 3,” Elon Musk wrote in a post on X.
Tesla has restarted work on Dojo 3, the next iteration of its in-house AI training supercomputer, now that the AI5 chip design has reached a stable stage.
Tesla CEO Elon Musk confirmed the update in a recent post on X.
Tesla’s Dojo 3 initiative restarted
In a post on X, Musk said that with the AI5 chip design now “in good shape,” Tesla will resume work on Dojo 3. He added that Tesla is hiring engineers interested in working on what he expects will become the highest-volume AI chips in the world.
“Now that the AI5 chip design is in good shape, Tesla will restart work on Dojo3. If you’re interested in working on what will be the highest volume chips in the world, send a note to AI_Chips@Tesla.com with 3 bullet points on the toughest technical problems you’ve solved,” Musk wrote in his post on X.
Musk’s comment followed a series of recent posts outlining Tesla’s broader AI chip roadmap. In another update, he stated that Tesla’s AI4 chip alone would achieve self-driving safety levels well above those of human drivers, AI5 would make vehicles “almost perfect” while significantly enhancing Optimus, and AI6 would be focused on Optimus and data center applications.
Musk then highlighted that AI7/Dojo 3 will be designed to support space-based AI compute.
Tesla’s AI roadmap
Musk’s latest comments helped resolve some confusion that emerged last year about Project Dojo’s future. At the time, Musk stated on X that Tesla was stepping back from Dojo because it did not make sense to split resources across multiple AI chip architectures.
He suggested that clustering large numbers of Tesla AI5 and AI6 chips for training could effectively serve the same purpose as a dedicated Dojo successor. “In a supercomputer cluster, it would make sense to put many AI5/AI6 chips on a board, whether for inference or training, simply to reduce network cabling complexity & cost by a few orders of magnitude,” Musk wrote at the time.
Musk later reinforced that idea by responding positively to an X post stating that Tesla’s AI6 chip would effectively be the new Dojo. Considering his recent updates on X, however, it appears that Tesla will be using AI7, not AI6, as its dedicated Dojo successor. The CEO did state that Tesla’s AI7, AI8, and AI9 chips will be developed in short, nine-month cycles, so Dojo’s deployment might actually be sooner than expected.
Elon Musk
Elon Musk’s xAI brings 1GW Colossus 2 AI training cluster online
Elon Musk shared his update in a recent post on social media platform X.
xAI has brought its Colossus 2 supercomputer online, making it the first gigawatt-scale AI training cluster in the world, and it’s about to get even bigger in a few months.
Elon Musk shared his update in a recent post on social media platform X.
Colossus 2 goes live
The Colossus 2 supercomputer, together with its predecessor, Colossus 1, is used by xAI primarily to train and refine the company’s Grok large language model. In a post on X, Musk stated that Colossus 2 is already operational, making it the first gigawatt training cluster in the world.
What’s even more remarkable is that the system is expected to be upgraded to 1.5 GW of power in April. Even in its current form, however, Colossus 2 already draws more power than San Francisco at peak demand.
Commentary from users of the social media platform highlighted the speed of execution behind the project. Colossus 1 went from site preparation to full operation in 122 days, while Colossus 2 has now gone live past the 1 GW mark and is targeting a total capacity of roughly 2 GW, a pace that far exceeds that of xAI’s primary rivals.
Funding fuels rapid expansion
The Colossus 2 launch follows xAI’s recently closed, upsized $20 billion Series E funding round, which exceeded its initial $15 billion target. The company said the capital will be used to accelerate infrastructure scaling and AI product development.
The round attracted a broad group of investors, including Valor Equity Partners, Stepstone Group, Fidelity Management & Research Company, Qatar Investment Authority, MGX, and Baron Capital Group. Strategic partners NVIDIA and Cisco also continued their support, helping xAI build what it describes as the world’s largest GPU clusters.
xAI said the funding will accelerate its infrastructure buildout, enable rapid deployment of AI products to billions of users, and support research tied to its mission of understanding the universe. The company noted that its Colossus 1 and 2 systems now represent more than one million H100 GPU equivalents, alongside recent releases including the Grok 4 series, Grok Voice, and Grok Imagine. Training is also already underway for its next flagship model, Grok 5.
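As a rough sanity check (not an xAI figure), the quoted “one million H100 GPU equivalents” lines up with a site in the 1 GW class once facility overhead is factored in. The sketch below assumes NVIDIA’s published ~700 W thermal design power for an H100 SXM module and a hypothetical overhead multiplier for cooling, networking, and power conversion:

# Back-of-the-envelope power check for "one million H100 GPU equivalents".
# The TDP is NVIDIA's published figure; the overhead factor is an assumption.
H100_TDP_W = 700                 # H100 SXM thermal design power, watts
GPU_COUNT = 1_000_000            # per xAI's "more than one million" claim
OVERHEAD = 1.4                   # hypothetical multiplier for non-GPU load

gpu_gw = H100_TDP_W * GPU_COUNT / 1e9
print(f"GPUs alone:    ~{gpu_gw:.1f} GW")               # ~0.7 GW
print(f"With overhead: ~{gpu_gw * OVERHEAD:.1f} GW")    # ~1.0 GW

Under those assumptions the accelerators alone account for roughly 0.7 GW, which is broadly consistent with the 1 GW figure cited for Colossus 2 and the planned step-up to 1.5 GW.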