News
The saga continues with Model X driver involved in Montana crash
Mr. Pang is back this time with a second open letter to Tesla
The Tesla Model X driver involved in a Montana crash while using Autopilot is stirring up controversy once again, this time asking Tesla Motors to reveal additional details from the incident. It seems that language differences play a large role in this dispute. Acting as Mr. Pang’s representative, Steven Xu sent us a second open letter penned to Elon Musk, in which Pang takes issue with Tesla’s account of the accident. The open letter reads as follows:
Here is the second letter from my friend, Mr. Pang.
To Tesla Team:
It has been weeks since I published my letter, and no one has tried to contact us to discuss the crash. Fully understanding the cause of this crash is critical for all Tesla drivers. After a while, Tesla published a response to our letter. Most of it fits the story; however, there are a few points I would like to address.
“From this data, we learned that after you engaged Autosteer, your hands were not detected on the steering wheel for over two minutes. This is contrary to the terms of use when first enabling the feature and the visual alert presented you every time Autosteer is activated.”
I admit that my hands were off the steering wheel after I engaged Autopilot. I did so because I put too much faith in the system. I also believe most Tesla drivers would do the same thing when they engage Autopilot, including Elon. The problem is that Tesla has over-advertised this feature by calling it “Autopilot.” It should be named “advanced driving assistant.” It is possible Tesla knew an accident like this would come sooner or later, and thought that setting the terms by saying “please keep hands on the steering wheel at all times” would free Tesla of responsibility.
2. “As road conditions became increasingly uncertain, the vehicle again alerted you to put your hands on the wheel.”
The road conditions were better than fine. The lane markings were absolutely clear, the road was flat, and there was no oncoming traffic. My eyes were never off the road. However, everything happened too fast for me to take control; it was over in less than a second.
3. “No steering torque was then detected until Autosteer was disabled with an abrupt steering action. Immediately following detection of the first impact, adaptive cruise control was also disabled, the vehicle began to slow, and you applied the brake pedal.”
No one should dodge the cause of the Autopilot malfunction. From this explanation, I realize you are implying that I applied some sort of force to the steering wheel. I have no idea how Tesla arrived at this conclusion. There are two points I want to make. First, my hands were not on the steering wheel. Second, there was no obstacle on the road to alter the steering direction. The one and only thing controlling the vehicle and steering it off the road was the Autopilot software itself. I also wonder: if my hands had been on the steering wheel applying force, would Tesla have blamed me for the collision? It seems that whenever an accident occurs under Autopilot, whether hands are on the wheel or not, Tesla can always find a way out by citing an “abrupt steering action.”
Tesla also claimed that after the abrupt steering action, “adaptive cruise control was also disabled, the vehicle began to slow.”
This is nowhere near the truth. The reality is that the vehicle NEVER attempted to slow from the first pole it hit to the last. It took only about a second to hit 12 wooden poles. I believe that if I had not braked, the vehicle would have continued cruising. Mr. Huang was injured severely due to the high-speed impact.
Tesla, as a company with global impact, should respect the truth of every incident. Nothing is more important than human life. Lying to or misleading the public about what really happened is unacceptable.
Weeks ago I was contacted by Tesla regarding this accident. Since you could not find a Mandarin translator, we rearranged the call for four hours later. However, that was the last time Tesla tried to contact me. What I am asking is for the driving data from the collision to be fully revealed. The reliability of the Autopilot software matters to hundreds of thousands of Tesla drivers. I wish to know the entire story of what really happened to us in that collision.
Thanks
Sincerely
Mr. Pang
Steven Xu pointed us to comments on the Tesla Motors Club forum that seemingly offer Mr. Pang no support at all. In fact, based on those comments, there almost seems to be a cultural bias at play in this situation. One wonders whether things would seem different to those commenters if they were driving a car in China that displayed instructions only in Mandarin.
Pang’s complaint is very similar to one lodged by a Chinese customer last month whose Tesla crashed on the highway on the way to work. He claimed that the salesman he spoke to before purchasing his car told him specifically that the car could drive itself, and proved it by driving with his hands off the wheel during a test drive. Tesla later amended the language it uses to describe its Autopilot system on its Chinese website. It’s possible the same linguistic confusion had a bearing on Mr. Pang’s unfortunate accident.
At this point, it seems the matter will be handled by insurance companies and lawyers. Tesla apparently has had no further contact with Pang. Through Steven, Pang says, “Weeks ago I was contacted by Tesla regarding this accident. Since you could not find a Mandarin translator, we rearranged the call for four hours later. However, that was the last time Tesla tried to contact me.
“What I am asking is for the driving data from the collision to be fully revealed. The reliability of the Autopilot software matters to hundreds of thousands of Tesla drivers. I wish to know the entire story of what really happened to us in that collision.”
News
Tesla Model Y L gets new entertainment feature
Beyond audio quality, Immersive Sound X aligns with Tesla’s ecosystem of over-the-air updates, potentially allowing future refinements.
Tesla is including a new entertainment feature in the Model Y L, improving the vehicle even further and making it what appears to be the best configuration of the all-electric crossover globally.
Unfortunately, we in the U.S. do not yet have access to the vehicle, and the plans for it to enter the market remain up in the air, as CEO Elon Musk has said it could appear late this year. However, there is nothing concrete at this time.
Tesla’s latest enhancement to the Model Y L is a new Immersive Sound X feature, exclusive to the Model Y L.
Model YL has new sound system setting. Immersive Sound X. This is NOT on the new Y and 3 pic.twitter.com/7OpJuzyoGf
— Electric Future (@electricfuture5) March 16, 2026
It aims to transform the in-car listening experience into something truly cinematic. First introduced by Tesla China in October 2025, this advanced audio mode is now rolling out to deliveries in Australia and New Zealand, highlighting Tesla’s approach to region-specific premium upgrades.
At its core, Immersive Sound X leverages real-time sound extraction technology to create a customizable 3D soundstage. Using advanced algorithms, it analyzes audio tracks to separate direct sounds, such as vocals or lead instruments, from ambient elements like echoes and reverb.
The system then positions direct sounds front and center while diffusing ambient sounds to the side and rear speakers, simulating an expansive virtual environment. This results in a heightened sense of depth and spatial awareness, making listeners feel as if they’re in a concert hall or studio.
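Tesla has not published the algorithm behind Immersive Sound X, but the direct-versus-ambient split described above is reminiscent of classic mid/side stereo decomposition. The sketch below is purely illustrative under that assumption; the function name and the `immersion` parameter are hypothetical stand-ins, not Tesla's actual API or processing chain:

```python
def split_direct_ambient(left, right, immersion=0.5):
    """Toy mid/side decomposition standing in for the 'direct vs. ambient'
    extraction described above. Tesla's real algorithm is proprietary and
    far more sophisticated; this only illustrates the general idea.

    mid  ~ content shared by both channels (vocals, lead instruments)
    side ~ content that differs between channels (reverb, ambience)
    """
    mid = [0.5 * (l + r) for l, r in zip(left, right)]   # routed front/center
    side = [0.5 * (l - r) for l, r in zip(left, right)]  # diffused to side/rear
    # `immersion` loosely mimics the adjustable slider: it scales how much
    # ambience is pushed outward into the virtual space.
    rear = [immersion * s for s in side]
    return mid, rear

# A mono source (identical channels) has no ambient component at all:
front, rear = split_direct_ambient([1.0, -0.5, 0.25], [1.0, -0.5, 0.25])
print(front)  # [1.0, -0.5, 0.25]
print(rear)   # [0.0, 0.0, 0.0]
```

The point of the toy example is the routing logic, not audio fidelity: correlated content stays up front, while whatever differs between the channels is what gets scaled and spread to the rear speakers.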
What sets Immersive Sound X apart from the standard Immersive Sound found in other Tesla models is its hardware dependency and enhanced processing. The Model Y L boasts an 18-speaker system with a subwoofer, compared to the 15-speaker setup, plus a subwoofer, in the Model Y Long Range’s previous premium audio configuration.
This upgrade provides more “kick” and precision, enabling finer control over the soundstage. Unlike traditional surround sound, which requires multi-channel mixes like Dolby Atmos, Immersive Sound X works with any stereo source from platforms like Spotify or Apple Music, so every owner will be able to use it.
You can fine-tune the experience via an adjustable immersion slider, scaling the “size” of the virtual space to personal preference for a more customized sound.
An Auto mode intelligently adapts based on media type, whether it’s music, podcasts, or videos, ensuring optimal immersion without manual tweaks. This feature is unavailable on standard Model Y variants (with 7 or 15 speakers) or Model 3 trims, underscoring Tesla’s strategy to differentiate higher trims through superior hardware and software integration.
Beyond audio quality, Immersive Sound X aligns with Tesla’s ecosystem of over-the-air updates, potentially allowing future refinements.
For audiophiles and casual listeners alike, it elevates mundane commutes into immersive journeys, proving Tesla’s commitment to blending cutting-edge tech with user-centric design.
Elon Musk
Elon Musk teases crazy outlook for xAI against its competitors
Musk’s response was vintage hyperbole, designed to rally supporters and dismiss doubters, something his responses on social media often do.
Elon Musk has never been one to shy away from crazy timelines, massive expectations, and outrageous outlooks. However, his recent plans for xAI and where he believes it will end up compared to its competitors are sure to stimulate conversation.
In a bold and characteristic response on X, Elon Musk fired back at a recent analysis that positioned his AI venture, xAI, as lagging behind industry frontrunners.
The post, from March 14, came as a direct reply to forecaster Peter Wildeford’s assessment, which drew from benchmarks and reporting to rank AI developers.
xAI will catch up this year and then exceed them all by such a long distance in 3 years that you will need the James Webb telescope to see who is in second place
— Elon Musk (@elonmusk) March 14, 2026
Wildeford placed Anthropic, Google, and OpenAI in a virtual tie at the top, with xAI and Meta trailing by about seven months. Chinese players like Moonshot, DeepSeek, zAI, and Alibaba were estimated to be nine months behind, while France’s Mistral lagged by about a year and a half.
Musk’s response was vintage hyperbole, designed to rally supporters and dismiss doubters, something his responses on social media often do.
He claimed xAI would “catch up this year,” meaning by the end of 2026, erasing that seven-month deficit against the leaders. But he didn’t stop there.
Musk escalated his vision to 2029, predicting xAI would “exceed them all by such a long distance” that observers would need the James Webb Space Telescope, NASA’s orbiting observatory stationed about 930,000 miles from Earth, to spot whoever lands in second place. This analogy underscores Musk’s confidence in xAI’s trajectory, implying an astronomical lead that could redefine the AI landscape.
Breaking down these claims reveals Musk’s strategic optimism. First, the short-term catch-up: xAI, launched in 2023, has already released models like Grok, but recent benchmarks, including those for Grok 4.2, have shown it falling short in capabilities compared to rivals.
Anthropic’s Claude series, Google’s Gemini, and OpenAI’s GPT models dominate in areas like reasoning, coding, and multimodal tasks. Musk’s assertion suggests aggressive scaling in compute, talent, or architecture, perhaps leveraging xAI’s ties to Tesla’s Dojo supercomputers or Musk’s vast resources, to close the gap swiftly.
The longer-term dominance by 2029 paints an even more audacious picture. Musk envisions xAI not just parity but supremacy, outpacing competitors in innovation speed and model sophistication.
This could involve breakthroughs in energy-efficient training, real-world integration such as Tesla’s robotics, or AI alignment, consistent with Musk’s stated goal of “understanding the universe.”
Critics, however, point to parallels with Tesla’s Full Self-Driving delays; one reply highlighted Musk’s 2023 promise of FSD readiness. Musk has made this promise for many years, and although the system has been strong and improving, it is still a ways off from the completely autonomous operation that was expected by now.
Musk’s comment highlights the intensifying U.S.-centric AI race, with xAI challenging the “three-way” dominance noted by Wharton professor Ethan Mollick, whom Wildeford quoted. As geopolitical tensions rise—evident in the Chinese firms’ lag—Musk’s tease could spur investment and talent wars.
Yet, it also invites scrutiny: Will xAI deliver, or is this another telescope-needed mirage? In an industry where timelines slip but stakes soar, Musk’s words keep the spotlight on xAI’s ambitious path forward.
Elon Musk
Tesla Terafab set for launch: Inside the $20B AI chip factory that will reshape the auto industry
Tesla is set to launch the “Terafab Project”: a vertically integrated chip fabrication effort combining logic processing, memory, and advanced packaging.
Tesla is making one of the boldest bets in its history. On March 14, Elon Musk posted on X that the “Terafab Project launches in 7 days,” pointing to March 21, 2026 as the start date for what he has described as a vertically integrated chip fabrication effort combining logic processing, memory, and advanced packaging.
Tesla first confirmed Terafab on its January 28, 2026 earnings call, where Musk told investors the company needs to build a chip fabrication facility to avoid a supply constraint projected to materialize within three to four years. But the seeds were planted even earlier. At Tesla’s annual general meeting last year, Musk warned that even in the best-case scenario for chip production from their suppliers, it still wouldn’t be enough, and declared that building a “gigantic chip fab” simply had to be done.
While there has been no official announcement on where Tesla plans to break ground on the massive Terafab, all signs point to the North Campus of Giga Texas in Austin.
Months of speculation have surrounded Tesla’s North Campus expansion at Giga Texas, where drone footage captured by observer Joe Tegtmeyer revealed massive construction site preparation just north of the existing factory, on a scale that rivals the original Giga Texas footprint itself.
The project is projected to produce 100–200 billion AI and memory chips annually, targeting 100,000 wafer starts per month, at an estimated cost of $20 billion. Tesla is targeting 2-nanometer process technology, anticipated to be the most advanced node in commercial production. The Tesla AI5 chip, packing 40x–50x more compute performance and 9x more memory than the AI4, will be among the first products the Terafab is set to produce. This highly optimized, massively powerful inference chip is designed to make Full Self-Driving (FSD) and Tesla’s Optimus robots faster, safer, and capable of full autonomy.
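Taken at face value, the quoted figures allow a quick back-of-envelope check. The sketch below uses only the article's own numbers; the derived dies-per-wafer value is illustrative, and its sheer size suggests the headline "chips" count must include very large volumes of tiny dies (memory, for example) or be a loose approximation, since a 300 mm wafer typically yields far fewer large logic dies:

```python
# Back-of-envelope on the Terafab figures quoted above.
# Inputs are the article's numbers; derived values are rough illustrations only.
wafer_starts_per_month = 100_000
chips_per_year_low = 100_000_000_000    # "100 billion chips annually" (low end)
chips_per_year_high = 200_000_000_000   # high end of the quoted range

wafers_per_year = wafer_starts_per_month * 12
implied_low = chips_per_year_low / wafers_per_year
implied_high = chips_per_year_high / wafers_per_year

print(f"{wafers_per_year:,} wafer starts per year")
print(f"Implied dies per wafer: ~{implied_low:,.0f} to ~{implied_high:,.0f}")
```

Running the numbers gives 1.2 million wafer starts per year and an implied average on the order of tens of thousands of dies per wafer, which only adds up if small memory dies dominate the count.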
This is where Terafab becomes a genuine game-changer. If Tesla successfully builds a 2nm chip fab at scale, it becomes one of only a handful of entities capable of producing AI silicon in-house, with competitive implications that extend far beyond Tesla’s own vehicles, potentially positioning Tesla as a chip supplier or licensor to other industries.

Credit: @serobinsonjr/X
The next-gen Tesla AI chips will power advancements in Full Self-Driving software, the Cybercab Robotaxi program, and the Optimus humanoid robot line. Musk’s projections for Optimus require chip volumes that no existing external supplier can commit to on Tesla’s timeline. Competitors like Waymo and GM’s Cruise remain dependent on third-party silicon, leaving them exposed to the same supply chain vulnerabilities Tesla is now working to eliminate entirely.
The Terafab launch this week may not mean a factory opens its doors overnight, but it signals Tesla is serious about owning the entire AI stack, from software to silicon.
