Driver of Model X crash in Montana pens open letter to Musk, calls Tesla drivers “lab rats” [Updated]

Pang, the driver of the Model X that crashed in Montana earlier this month, has posted an open letter to Elon Musk and Tesla asking the company to “take responsibility for the mistakes of Tesla products”. He accuses Tesla of using drivers as “lab rats” to test its Autopilot system.

In an email sent to us and also uploaded to the Tesla Motors Club forum, Pang provides a detailed account of what happened the day of the crash. He says he and a friend drove about 600 miles on Interstate 90 on the way to Yellowstone National Park. When he exited the highway to get onto Montana Route 2, he drove for about a mile, saw that conditions were clear, and turned on Autopilot again. Pang describes what happened next as follows:

“After we drove about another mile on state route 2, the car suddenly veered right and crashed into the safety barrier post. It happened so fast, and we did not hear any warning beep. Autopilot did not slow down at all after the crash, but kept going in the original speed setting and continued to crash into more barrier posts in high speed. I managed to step on the brake, turn the car left and stopped the car after it crashed 12 barrier posts.

“After we stopped, we heard the car making abnormal loud sound. Afraid that the battery was broken or short circuited, we got out and ran away as fast as we could. After we ran about 50 feet, we found the sound was the engine were still running in high speed. I returned to the car and put it in parking, that is when the loud sound disappeared.”

Pang goes on to explain how his Tesla Model X, driving on Autopilot, continued to travel on its own even after veering off the road and crashing into a roadside stake. “I was horrified by the fact that the Tesla autopilot did not slow down the car at all after the initial crash. After we crashed on the first barrier post, autopilot continued to drive the car with the speed of 55 to 60 mph, and crashed another 11 posts. Even after I stopped the car, it was still trying to accelerate and spinning the engine in high speed. What if it is not barrier posts on the right side, but a crowd?”

Photo credit: Steven Xu

After the accident, Tesla reviewed the driving logs from the Model X and reported that the car was operating for more than two minutes with no hands on the steering wheel, despite numerous alarms and warnings issued by the car. Pang says he never heard any audible warnings. Comments on TMC range from the incredulous to the acerbic. Most feel Teslas simply don’t operate the way Pang said his car did. Among other discrepancies, the cars are designed to put themselves in Park if the driver’s door is opened with no one in the driver’s seat.

But that hasn’t stopped Pang from voicing his strong opinions on Tesla’s Autopilot system. “It is clear that Tesla is selling a beta product with bugs to consumers, and ask the consumers to be responsible for the liability of the bugging autopilot system. Tesla is using all Tesla drivers as lab rats.”

A car that crashes but continues to accelerate is certainly a scary thought, and there is no way to fully resolve the discrepancy between what Pang says happened and Tesla’s account of what occurred. In an updated email sent to us by a friend acting as English translator for Pang, who speaks Mandarin, we have learned that Tesla has reached out to Pang to address the matter.

The original open letter from Pang reads as follows:

A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Driver’s Safety

From the survivor of the Montana Tesla autopilot crash

My name is Pang. On July 8, 2016, I drove my Tesla Model X from Seattle heading to Yellowstone National Park, with a friend, Mr. Huang, in the passenger seat. When we were on highway I90, I turned on autopilot, and drove for about 600 miles. I switched autopilot off while we exited I90 in Montana to state route 2. After about 1 mile, we saw that road condition was good, and turned on autopilot again. The speed setting was between 55 and 60 mph.

After we drove about another mile on state route 2, the car suddenly veered right and crashed into the safety barrier post. It happened so fast, and we did not hear any warning beep. Autopilot did not slow down at all after the crash, but kept going in the original speed setting and continued to crash into more barrier posts in high speed. I managed to step on the brake, turn the car left and stopped the car after it crashed 12 barrier posts.

After we stopped, we heard the car making abnormal loud sound. Afraid that the battery was broken or short circuited, we got out and ran away as fast as we could. After we ran about 50 feet, we found the sound was the engine were still running in high speed. I returned to the car and put it in parking, that is when the loud sound disappeared.

Our cellphone did not have coverage, and asked a lady passing by to call 911 on her cellphone. After the police arrived, we found the right side of the car was totally damaged. The right front wheel, suspension, and head light flied off far, and the right rear wheel was crashed out of shape. We noticed that the barrier posts is about 2 feet from the white line. The other side of the barrier is a 50 feet drop, with a railroad at the bottom, and a river next. If the car rolled down the steep slope, it would be really bad.

Concerning this crash accident, we want to make several things clear:

1. We know that while Tesla autopilot is on but the driver’s hand is not on the steering wheel, the system will issue warning beep sound after a while. If the driver’s hands continue to be off the steering wheel, autopilot will slow down, until the driver takes over both the steering wheel and gas pedal. But we did not hear any warning beep before the crash, and the car did not slow down either. It just veered right in a sudden and crashed into the barrier posts. Apparently the autopilot system malfunctioned and caused the crash. The car was running between 55 and 60 mph, and the barrier posts are just 3 or 4 feet away. It happened in less than 1/10 of a second from the drift to crash. A normal driver is impossible to avoid that in such a short time.

2. I was horrified by the fact that the Tesla autopilot did not slow down the car at all after the initial crash. After we crashed on the first barrier post, autopilot continued to drive the car with the speed of 55 to 60 mph, and crashed another 11 posts. Even after I stopped the car, it was still trying to accelerate and spinning the engine in high speed. What if it is not barrier posts on the right side, but a crowd?

3. Tesla never contacted me after the accident. Tesla just issued conclusion without thorough investigation, but blaming me for the crash. Tesla were trying to cover up the lack of dependability of the autopilot system, but blaming everything on my hands not on the steering wheel. Tesla were not interested in why the car veered right suddenly, nor why the car did not slow down during the crash. It is clear that Tesla is selling a beta product with bugs to consumers, and ask the consumers to be responsible for the liability of the bugging autopilot system. Tesla is using all Tesla drivers as lab rats. We are willing to talk to Tesla concerning the accident anytime, anywhere, in front of the public.

4. CNN’s article later about the accident was quoting out of context of our interview. I did not say that I do not know either Tesla or me should be responsible for the accident. I might consider buying another Tesla only if they can iron out the instability problems of their system.

As a survivor of such a bad accident, a past fan of the Tesla technology, I now realized that life is the most precious fortune in this world. Any advance in technology should be based on the prerequisite of protecting life to the maximum extend. In front of life and death, any technology has no right to ignore life, any pursue and dream on technology should first show the respect to life. For the sake of the safety of all Tesla drivers and passengers, and all other people sharing the road, Mr. Musk should stand up as a man, face up the challenge to thoroughly investigate the cause of the accident, and take responsibility for the mistakes of Tesla product. We are willing to publicly talk to you face to face anytime to give you all the details of what happened. Mr. Musk, you should immediately stop trying to cover up the problems of the Tesla autopilot system and blame the consumers.

Tesla’s Response on TMC

TM Ownership, Saturday at 12:11 PM
Dear Mr. Pang,

We were sorry to hear about your accident, but we were very pleased to learn both you and your friend were ok when we spoke through your translator on the morning of the crash (July 9). On Monday immediately following the crash (July 11), we found a member of the Tesla team fluent in Mandarin and called to follow up. When we were able to make contact with your wife the following day, we expressed our concern and gathered more information regarding the incident. We have since made multiple attempts (one Wednesday, one Thursday, and one Friday) to reach you to discuss the incident, review detailed logs, and address any further concerns and have not received a call back.

As is our standard procedure with all incidents experienced in our vehicles, we have conducted a thorough investigation of the diagnostic log data transmitted by the vehicle. Given your stated preference to air your concerns in a public forum, we are happy to provide a brief analysis here and welcome a return call from you. From this data, we learned that after you engaged Autosteer, your hands were not detected on the steering wheel for over two minutes. This is contrary to the terms of use when first enabling the feature and the visual alert presented you every time Autosteer is activated. As road conditions became increasingly uncertain, the vehicle again alerted you to put your hands on the wheel. No steering torque was then detected until Autosteer was disabled with an abrupt steering action. Immediately following detection of the first impact, adaptive cruise control was also disabled, the vehicle began to slow, and you applied the brake pedal.

Following the crash, and once the vehicle had come to rest, the passenger door was opened but the driver door remained closed and the key remained in the vehicle. Since the vehicle had been left in Drive with Creep Mode enabled, the motor continued to rotate. The diagnostic data shows that the driver door was later opened from the outside and the vehicle was shifted to park. We understand that at night following a collision the rotating motors may have been disconcerting, even though they were only powered by minimal levels of creep torque. We always seek to learn from customer concerns, and we are looking into this behavior to see if it can be improved. We are also continually studying means of better encouraging drivers to adhere to the terms of use for our driver assistance features.

We are still seeking to speak with you. Please contact Tesla service so that we can answer any further questions you may have.

Sincerely,
The Tesla team

Tesla Full Self-Driving shows stunning maneuver in Europe to silence skeptics

Credit: Tesla

Tesla Full Self-Driving, fresh on the heels of its approval for operation on European roads for the first time, showed off a stunning maneuver that will certainly silence any skeptics on the continent.

Following its approval in the Netherlands, Full Self-Driving is working toward a significant expansion into more parts of Europe.

In a striking demonstration of autonomous driving prowess, Tesla’s Full Self-Driving (FSD) system recently showcased its capabilities on the narrow rural roads of the Netherlands. Captured in two in-car videos, the system encountered scenarios that would challenge even the most experienced human drivers.

In the first clip, a wide tractor occupied more than half the lane on a tight two-way road. Rather than braking abruptly or risking a collision, FSD smoothly edged the vehicle onto the adjacent bike path, using the extra space with precision, before seamlessly returning to the lane once clear.

The second clip was equally demanding: as the Tesla overtook a group of cyclists, an oncoming car approached at speed.

FSD maintained a safe, minimal buffer to the cyclists while timing the pass perfectly, avoiding any swerve or hesitation that could unsettle passengers or other road users.

This maneuver highlights FSD’s advanced spatial reasoning and predictive planning. On roads often under three meters wide, with no room for error, the system calculated available clearance in real time, incorporated shoulder and path geometry, and executed a controlled deviation without compromising safety.

It treated the bike path as a legitimate extension of navigable space, something many drivers might hesitate to do, while respecting Dutch road norms and cyclist priority.

Such feats align closely with a growing library of impressive FSD maneuvers documented on camera worldwide.

In urban Amsterdam, for instance, FSD has navigated the world’s densest cyclist environments, weaving through hundreds of unpredictable bike movements on canal-side streets with tram tracks and pedestrians.

One uncut drive showed it yielding smoothly at crossings, overtaking where needed, and even handling a near-perfect auto-park in a tight residential spot, demonstrating the same low-speed precision seen in the rural clips.

Teslas using FSD have tackled turbo roundabouts in the Netherlands, complex multi-lane circles notorious for geometry challenges, merging confidently while yielding to traffic. Similar clips depict smooth handling of construction zones, emergency vehicle pull-overs, and gated parking barriers, where the car stops precisely, waits for clearance, and proceeds without driver input.

Collectively, these examples illustrate FSD’s evolution toward handling the unpredictable.

The rural Netherlands maneuvers aren’t isolated. Instead, they reflect a pattern of spatial awareness, cyclist deference, and traffic anticipation seen from city streets to highways.

As FSD continues refining through real-world data, videos like this one are certainly building a compelling case for its readiness on Europe’s varied roads.

Tesla utilizes its ‘Rave Cave’ for awesome new safety feature

Credit: Tesla | X

Tesla is utilizing its ‘Rave Cave’ for an awesome new safety feature that will arrive with the upcoming Spring Update for 2026.

Part of the massive interior overhaul of both the Model 3 “Highland” and Model Y “Juniper” was the addition of interior accent lighting to help bring out the mood of the vehicle, increase the customization of the interior, and create a unique listening experience.

Tesla added a Sync Lights feature that will strobe the accent strips with the beat of the music.

It is one of the most distinctive non-functional features of a Tesla: it does not improve how the vehicle drives, but it makes for a cool, personal addition to the interior.

However, Tesla is going to take it one step further, as the Rave Cave lights will now be used for blind spot recognition. This feature will be added as the Spring 2026 Update starts to roll out.

Tesla writes:

“Accent lights now turn red when an object is in your blind spot and your turn signal is engaged, or when an approaching object is detected while parked.”

This neat new safety feature increases the likelihood that a driver operating their Tesla manually will notice the blind spot warnings currently shown on the A-pillar and on the center touchscreen.

These new alerts will warn drivers of cross traffic as they back out of a parking space with little to no visibility of what is coming. It is a great addition that will only increase the safety of the vehicles, while also utilizing lighting hardware that is already installed in these specific Model 3 and Model Y units.

The Model 3 and Model Y were the central focus of the Spring 2026 Update, especially considering that the Model S and Model X are basically gone, with only a few hundred units left. Additionally, Tesla included new Immersive Sound and Car Visualization features for the Model 3 and Model Y specifically in this new update.

Tesla parked 50+ Cybercabs outside its Texas Factory with some crash tested

Dozens of Tesla Cybercabs have been spotted at Giga Texas’ crash testing facility ahead of launch.

Tesla Cybercab fleet spotted at Gigafactory Texas on April 13, 2026 [Credit: Joe Tegtmeyer]

Drone footage captured by longtime Giga Texas observer Joe Tegtmeyer shows over 50 Tesla Cybercab units at the Austin factory campus, including several clustered by Tesla’s on-site crash testing facility.

The outbound lot at Gigafactory Texas sits just outside the factory exit and serves as the primary staging area where finished vehicles are held before being loaded onto transport carriers or dispatched for validation testing. On any given day, the lot holds a mix of Model Y and Cybertruck units alongside the growing Tesla Cybercab fleet, as can be seen in the drone footage captured by Joe Tegtmeyer.

Tesla Cybercab fleet spotted at Gigafactory Texas on April 13, 2026 [Credit: Joe Tegtmeyer]

Roughly 50 Cybercab units are visible across the campus, parked in tight, organized rows. Most of the visible units still carry steering wheels and pedals, temporary additions Tesla included to satisfy current safety regulations while the vehicles accumulate real-world data ahead of full regulatory approval for a steering wheel-free design.

Tesla Cybercab fleet spotted at Gigafactory Texas [Credit: Joe Tegtmeyer]

Tesla operates dedicated Crash Labs at both its Giga Texas and Fremont facilities that are purpose-built for controlled structural crash tests. Historically, automakers begin intensive crash testing roughly one to two months before volume production kicks off. The Cybertruck followed almost exactly that pattern, and the Cybercab appears to be on the same track, using the facility we first saw back in October 2025.

Tesla Cybercab crash test units spotted at Gigafactory Texas [Credit: Joe Tegtmeyer]

The first production Cybercab rolled off the Giga Texas line on February 17, 2026. Volume production is now targeted for April. Musk previously wrote on X that “the early production rate will be agonizingly slow, but eventually end up being insanely fast,” and separately stated Tesla is targeting at least 2 million Cybercab units per year. Commercial robotaxi service in Austin is targeted for late 2026.

 
