News
Tesla Autopilot Abusers need to be held accountable, but how?
Tesla Autopilot abusers need to be held accountable for their actions. For years, Tesla engineers have worked long and hard to improve Autopilot and Full Self-Driving, pouring hundreds of thousands of hours into these driver-assistance programs through software development and beyond. However, years of hard work, diligence, and improvement can be wiped from the public’s perception in a minute by one foolish, irresponsible, and selfish act, often born of an owner’s need to show off their car’s semi-autonomous functionality to others.
The most recent example is Param Sharma, a self-proclaimed “rich as f***” social media influencer who has spent the last few days sparring with Tesla enthusiasts over his selfish and undeniably dangerous habit of climbing into the backseat while his car operates on Autopilot. Sharma has been seen on numerous occasions sitting in the backseat while the vehicle drives itself. It is almost certain that Sharma is using cheat devices to bypass the barriers Tesla has installed to ensure drivers are paying attention: a steering wheel sensor, seat sensors, and seatbelt sensors, all of which must register driver engagement while Autopilot is in use. Several companies and some owners have used DIY hack devices to bypass these safety thresholds. These are hazardous acts for several reasons, the most important being a lack of regard for other human lives.
This is a preview from our weekly newsletter. Each week, I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
While Tesla fans and enthusiasts are undoubtedly confident in the abilities of Autopilot and Full Self-Driving, they will also admit that these suites need to be used responsibly and as the company describes. Tesla has never claimed that its vehicles can drive themselves, which would be characterized as Level 5 autonomy. The company also states that drivers must keep their hands on the steering wheel at all times, and it has installed several safety features to ensure the car’s operator complies. If these precautions are not followed, the driver risks being put in “Autopilot Jail,” where the feature is unavailable for the remainder of the drive.
As previously mentioned, however, there are cheat devices for all of these safety features. This is where Tesla cannot necessarily control what goes on, and law enforcement, in my opinion, bears more responsibility here than the company does. It is law enforcement’s job to stop this behavior when an officer sees it occurring. Nobody should be able to climb into the backseat of a vehicle while it is driving. At least not until many years of testing are completed and many miles of fully autonomous operation prove the technology accurate and robust enough to handle real-world traffic.
The reason Tesla should step in, in my opinion, and create a list of repeat offenders who have proven themselves too irresponsible and untrustworthy for Autopilot and FSD, is that if an accident happens while these influencers or everyday drivers are taking advantage of Autopilot’s capabilities, Tesla, along with every other company working toward Level 5 autonomous vehicles, takes a huge step backward. Not only will Tesla face the harshest criticism from the media, but that criticism will intensify if the company appears to be taking no real steps to prevent such abuse. We in the Tesla community know what the vehicles can and cannot do and what safety precautions have been installed to prevent these incidents. Mainstream media outlets, however, do not have an explicit and in-depth understanding of Tesla’s capabilities, and there is plenty of evidence to suggest they have no intention of improving their comprehension of what Tesla does daily.
While talking to someone about this subject on Thursday, they highlighted that this isn’t Tesla’s concern. Even if that is strictly true, I don’t think it is an acceptable answer to the abuses going on with these cars. Tesla should take matters into its own hands, and I believe it can, because it has done so before. Elon Musk and Tesla recently expanded the FSD Beta testing pool, but the company also revoked access from some testers who did not use the functionality properly. Why should AP/FSD be any different? Just because someone pays for something doesn’t mean the company cannot revoke access to it. If you pay for online gaming and then hack or use abusive language, there are major consequences: your console can be banned, and you would have to buy a completely new unit to ever play online again.
While unfortunate, Tesla will have to take a stand against those who abuse Autopilot, in my opinion. The company needs to impose heavier consequences, simply because an accident caused by abuse or misuse of these functionalities could set the company back several years and stall its work toward Level 5 autonomy. There is entirely too much at stake to let people off the hook. I believe Tesla’s actions should follow law enforcement action: when police officers catch someone violating the proper use of the system, the normal reckless driving charges should apply, with increasingly severe consequences for every subsequent offense. Perhaps after the third offense, Tesla could be contacted and could remove AP/FSD from the car. There could be a probationary period or a zero-tolerance policy; that would be up to the company.
I believe this needs to be taken seriously, and there need to be consequences, because of the blatant disregard for other people and their work. The irresponsible use of AP/FSD by childish drivers means Tesla’s hard work is being jeopardized by horrible behavior. While many people don’t enjoy driving, it still requires responsibility, and everyone on the road is trusting you to drive responsibly. Abuse could cost your life or, even worse, someone else’s.
A big thanks to our long-time supporters and new subscribers! Thank you.
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
Elon Musk
Tesla Semi’s official battery capacity leaked by California regulators
A California regulatory filing just confirmed the exact battery size inside each Tesla Semi variant.
A regulatory filing published by the California Air Resources Board in April 2026 has put official numbers on what Tesla Semi owners and fleet buyers have long wanted confirmed: the exact battery capacities of both the Long Range and Standard Range Semi truck variants. CARB is California’s independent air quality regulator, and it certifies zero-emission powertrains before they can be sold or operated in the state. When a manufacturer submits a vehicle for certification, the resulting executive order becomes a public document, making it one of the most reliable sources for confirmed production specs on any EV.
The document lists two certified powertrain configurations. The Long Range Semi carries a usable battery capacity of 822 kWh, while the Standard Range version comes in at 548 kWh. Both use lithium-ion NCMA chemistry and share the same peak and steady-state motor output ratings of 800 kW and 525 kW respectively. Cross-referencing Tesla’s published efficiency figure of approximately 1.7 kWh per mile under full load, the 822 kWh pack supports roughly 480 miles of real-world range, which aligns closely with Tesla’s advertised 500-mile figure for the Long Range trim. The 548 kWh Standard Range pack works out to approximately 320 miles, again consistent with Tesla’s stated 325-mile target.
Here is a direct comparison of the two versions based on the CARB filing and published specs:
| Tesla Semi Spec | Long Range | Standard Range |
| --- | --- | --- |
| Battery Capacity | 822 kWh | 548 kWh |
| Battery Chemistry | NCMA Li-ion | NCMA Li-ion |
| Peak / Continuous Motor Power | 800 kW / 525 kW | 800 kW / 525 kW |
| Estimated Range | ~500 miles | ~325 miles |
| Efficiency | ~1.7 kWh/mile | ~1.7 kWh/mile |
| Est. Price | ~$290,000 | ~$260,000 |
| GVW Rating | 82,000 lbs | 82,000 lbs |
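The range estimates above follow directly from the CARB pack capacities and Tesla’s quoted efficiency. Here is a minimal sketch of that arithmetic, assuming the published ~1.7 kWh/mile full-load figure; the helper function name is my own, not anything from Tesla or CARB:

```python
# Back-of-the-envelope range estimates from the CARB-certified pack
# capacities and Tesla's published full-load efficiency (~1.7 kWh/mile).

def estimated_range_miles(usable_kwh: float, kwh_per_mile: float = 1.7) -> float:
    """Range estimate: usable battery energy divided by energy used per mile."""
    return usable_kwh / kwh_per_mile

for variant, pack_kwh in [("Long Range", 822.0), ("Standard Range", 548.0)]:
    # Round to the nearest 10 miles, matching the article's rough figures.
    print(f"{variant}: ~{round(estimated_range_miles(pack_kwh), -1):.0f} miles")
```

Run as-is, this prints roughly 480 miles for the Long Range pack and 320 miles for the Standard Range pack, consistent with the real-world estimates cited above and just shy of Tesla’s advertised 500- and 325-mile targets.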
The timing of this certification is not incidental. On April 29, 2026, Semi Program Director Dan Priestley confirmed on X that high-volume production is now ramping at Tesla’s dedicated 1.7-million-square-foot facility in Sparks, Nevada. A key advantage of the Nevada location is vertical integration: the 4680 battery cells powering the Semi are manufactured in the same complex, eliminating the supply chain bottleneck that had delayed the program for years.
Tesla’s long-term goal is to reach a production capacity of 50,000 trucks annually at the Nevada factory, which would represent roughly 20 percent of the entire North American Class 8 market. With CARB certification now in hand and the production line running, the regulatory and manufacturing groundwork for that target is in place.
News
Tesla crushes NHTSA’s brand-new ADAS safety tests – first vehicle to ever pass
Tesla became the first company to pass the United States government’s new Advanced Driver Assistance Systems (ADAS) testing with the Model Y, clearing every one of the new evaluations.
In a landmark announcement on May 7, the National Highway Traffic Safety Administration (NHTSA) declared the 2026 Tesla Model Y the first vehicle to pass its newly established ADAS benchmarks under the New Car Assessment Program (NCAP).
Model Y vehicles manufactured on or after November 12, 2025, met rigorous pass/fail criteria for four newly added tests—pedestrian automatic emergency braking, lane keeping assistance, blind spot warning, and blind spot intervention—while also satisfying the program’s original four ADAS requirements: forward collision warning, crash imminent braking, dynamic brake support, and lane departure warning.
The NHTSA has just officially announced that the 2026 @Tesla Model Y is the first vehicle model to pass the agency’s new advanced driver assistance system tests.
2026 Tesla Model Y vehicles, manufactured on or after Nov. 12, 2025, successfully met the new criteria for four… pic.twitter.com/as8x1OsSL5
— Sawyer Merritt (@SawyerMerritt) May 7, 2026
NHTSA Administrator Jonathan Morrison hailed the achievement as a milestone:
“Today’s announcement marks a significant step forward in our efforts to provide consumers with the most comprehensive safety ratings ever. By successfully passing these new tests, the 2026 Tesla Model Y demonstrates the lifesaving potential of driver assistance technologies and sets a high bar for the industry. We hope to see many more manufacturers develop vehicles that can meet these requirements.”
The updates to NCAP, finalized in late 2024 and effective for 2026 models, reflect growing recognition that ADAS features are no longer optional luxuries but essential tools for preventing crashes.
Pedestrian automatic emergency braking, for instance, targets one of the fastest-rising causes of roadway fatalities, while blind spot intervention and lane keeping assistance address common sources of side-swipes and run-off-road incidents. By incorporating objective, performance-based evaluations rather than mere presence of the technology, NHTSA aims to give buyers clearer data on real-world effectiveness.
This milestone arrives at a pivotal moment when vehicle autonomy is transitioning from science fiction to everyday reality.
Tesla’s Full Self-Driving (FSD) software and the impending rollout of robotaxis underscore a broader industry shift toward higher levels of automation. Yet regulators and consumers remain cautious: safety data must keep pace with technological ambition.
The Model Y’s perfect score on these ADAS benchmarks validates that current driver-assist systems—when engineered rigorously—can dramatically reduce human error, which still accounts for the vast majority of crashes.
For Tesla, the result reinforces its long-standing claim of building the safest vehicles on the road. More importantly, it signals to the entire auto sector that meeting elevated federal standards is achievable and expected.
As autonomy edges closer to Level 3 and beyond, where drivers may disengage more fully, such independent verification becomes critical. It builds public trust, informs purchasing decisions, and accelerates the development of systems that could one day eliminate tens of thousands of annual traffic deaths.
In an era when software-defined vehicles promise transformative mobility, the 2026 Model Y’s NHTSA triumph is more than a manufacturer accolade—it is a regulatory green light that autonomy’s future must be built on proven, testable safety foundations. The bar has been raised. The industry, and the roads we share, will be safer for it.
News
Tesla to fix 219k vehicles in recall with simple software update
Tesla is going to fix the nearly 219,000 vehicles it recalled over a rearview camera issue with a simple software update, so owners will not need to travel to a service center to resolve the problem.
Tesla is formally recalling 218,868 U.S. vehicles after regulators discovered a software glitch that can delay the rearview camera image by up to 11 seconds when drivers shift into reverse.
The affected models include certain 2024-2025 Model 3 and Model Y, as well as 2023-2025 Model S and Model X vehicles running software version 2026.8.6 and equipped with Hardware 3 computers. The National Highway Traffic Safety Administration (NHTSA) determined the lag violates Federal Motor Vehicle Safety Standard 111 on rear visibility and could increase crash risk.
Yet this is no ordinary recall. Owners do not need to schedule a service-center visit, hand over keys, or wait for parts.
Tesla identified the issue on April 10, halted further deployment of the faulty firmware the same day, and began pushing a corrective over-the-air (OTA) software update on April 11.
By the time the NHTSA posted the recall notice on May 6, more than 99.92 percent of the affected fleet had already received the fix. Tesla reports no crashes, injuries, or fatalities linked to the glitch.
The episode underscores a deeper problem with regulatory language. For decades, “recall” meant hauling a vehicle to a dealership for hardware repairs or replacements. That definition no longer fits software-defined cars. When a fix arrives wirelessly in minutes — identical to an iPhone update — the term evokes unnecessary alarm and misleads the public about the actual risk and remedy.
Elon Musk has repeatedly called for exactly this change. After earlier NHTSA actions, he stated plainly: “The terminology is outdated & inaccurate. This is a tiny over-the-air software update.” On another occasion, he added that labeling OTA fixes as recalls is “anachronistic and just flat wrong.”
The terminology is outdated & inaccurate. This is a tiny over-the-air software update. To the best of our knowledge, there have been no injuries.
— Elon Musk (@elonmusk) September 22, 2022
Musk’s point is simple: regulators must evolve their vocabulary to match the technology. Traditional recalls involve physical intervention and downtime; OTA updates do not. Retaining the old label distorts consumer perception, inflates perceived defect rates, and slows the industry’s shift to faster, safer software iteration.
Tesla’s rapid, remote remedy demonstrates the safety advantage of over-the-air capability. Problems that once required weeks of dealer appointments are now resolved in hours, often before most owners notice. As more automakers adopt software-first designs, the entire regulatory framework needs to catch up.
Updating “recall” terminology would align language with reality, reduce public confusion, and recognize that modern vehicles are no longer static hardware — they are continuously improving computers on wheels.
For the 219,000 Tesla owners involved, the process is already complete. The camera works, the car is safe, and no one left their driveway. That is the new standard — and the vocabulary should reflect it.