News
Tesla Autopilot abusers need to be held accountable, but how?
Tesla Autopilot abusers need to be held accountable for their actions. For years, Tesla engineers have worked long and hard to improve Autopilot and Full Self-Driving. Hundreds of thousands of hours have gone into these driver-assistance programs, whether through software development or other efforts. However, years of hard work, diligence, and improvement can be wiped away in the public’s perception in a minute by one foolish, irresponsible, and selfish act, often born of an owner’s need to show off their car’s semi-autonomous capabilities to others.
The most recent example of this is Param Sharma, a self-proclaimed “rich as f***” social media influencer who has spent the last few days sparring with Tesla enthusiasts over his selfish and undeniably dangerous habit of jumping into the backseat while his car operates on Autopilot. Sharma has been seen on numerous occasions sitting in the backseat of his car while the vehicle drives itself. It is almost certain that Sharma is using several cheat devices to bypass the barriers Tesla has installed to ensure drivers are paying attention: a steering wheel sensor, seat sensors, and seatbelt sensors, all of which must register an attentive driver while Autopilot is in use. We have seen several companies and some owners use DIY hacks to defeat these safety thresholds. These are hazardous acts for several reasons, the most important being the blatant disregard for other human lives.
While Tesla fans and enthusiasts are undoubtedly confident in the abilities of Autopilot and Full Self-Driving, they will also admit that these suites need to be used responsibly and as the company describes. Tesla has never indicated that its vehicles can drive themselves, which would be characterized as “Level 5 autonomy.” The company also states that drivers must keep their hands on the steering wheel at all times, and it has installed several safety features to ensure the car’s operator complies. If these safety precautions are not followed, the driver runs the risk of being put in “Autopilot Jail,” where the feature will not be available to them for the remainder of their drive.
As previously mentioned, however, there are cheat devices for all of these safety features. This is where Tesla cannot necessarily control what goes on, and law enforcement, in my opinion, bears more responsibility than the company does. It is law enforcement’s job to stop this behavior when an officer sees it occurring. Nobody should be able to climb into the backseat of their vehicle while it is driving. At least not until many years of testing are completed, and many miles of fully autonomous driving prove the technology accurate and robust enough to handle real-world traffic.
The reason Tesla should step in, in my opinion, and create a list of repeat offenders who have proven themselves too irresponsible and untrustworthy for Autopilot and FSD, is that if an accident happens while these influencers or everyday drivers are taking advantage of Autopilot’s capabilities, Tesla, along with every other company working to develop Level 5 autonomous vehicles, takes a huge step backward. Not only will Tesla feel the most criticism from the media, but that criticism will be poured on if the company takes no real steps to prevent the abuse. We in the Tesla community know what the vehicles can do and what safety precautions have been installed to prevent these incidents from happening. However, mainstream media outlets do not have an explicit, in-depth understanding of Tesla’s capabilities, and there is plenty of evidence to suggest they have no intention of improving their comprehension of what Tesla does daily.
While I was talking to someone about this subject on Thursday, they highlighted that this isn’t Tesla’s concern. And while I believe that, strictly speaking, it isn’t, I don’t think that’s an acceptable answer to all of the abuse going on with these cars. Tesla should take matters into its own hands, and I believe it can, because it has done so before. Elon Musk and Tesla recently decided to expand the FSD Beta testing pool, but the company also revoked access from some people who showed they would not use the functionality properly. Why would AP/FSD be any different? Just because someone pays for something doesn’t mean the company cannot revoke access to it. If you pay for access to online video games and hack or use abusive language, there are major consequences: your console can get banned, and you would need to buy a completely new unit to ever play online again.
While unfortunate, Tesla will have to make a stand against those who abuse Autopilot, in my opinion. There need to be heavier consequences from the company, simply because an accident caused by abuse or misuse of these functionalities could set the company back several years and stall its work toward Level 5 autonomy. There is entirely too much at stake here to let people off the hook. I believe that Tesla’s actions should follow law enforcement action: when police officers find someone violating the proper use of the system, the normal reckless driving charges should apply, and there should be increasingly severe consequences for every subsequent offense. Perhaps after the third offense, Tesla could be contacted and could have AP/FSD removed from the car, as sketched below. There could be a probationary period or a zero-tolerance policy; it would all be up to the company.
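To make that escalation concrete, here is a minimal, purely hypothetical sketch of what such a strike policy could look like. The class, thresholds, and consequences are all my own assumptions for illustration, not anything Tesla has built or announced:

```python
# Hypothetical sketch of an escalating AP/FSD strike policy (my assumptions,
# not an actual Tesla system). One record per vehicle; each confirmed report
# of misuse escalates the consequence, with revocation on the third strike.
from dataclasses import dataclass

@dataclass
class AbuseRecord:
    vin: str
    offenses: int = 0

    def record_offense(self) -> str:
        """Log a confirmed misuse report and return the resulting consequence."""
        self.offenses += 1
        if self.offenses == 1:
            return "warning issued to owner"
        if self.offenses == 2:
            return "temporary AP/FSD suspension (probationary period)"
        # Third strike and beyond: the zero-tolerance outcome
        return "AP/FSD access revoked for this vehicle"

record = AbuseRecord(vin="TEST-VIN-0001")  # placeholder identifier
for _ in range(3):
    print(record.record_offense())
```

Whether the cutoff sits at two strikes or three, or whether a probationary period precedes revocation, would, as noted above, be entirely up to the company.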
I believe this needs to be taken seriously, and there need to be consequences, because of the blatant disregard for other people and their work. The irresponsible use of AP/FSD by childish drivers means that Tesla’s hard work is being jeopardized by horrible behavior. While many people don’t enjoy driving, it still requires responsibility, and everyone on the road is trusting you to drive responsibly. Failing to do so could cost your life or, even worse, someone else’s.
A big thanks to our long-time supporters and new subscribers!
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
Elon Musk
Elon Musk reveals unfortunate truth of Tesla Full Self-Driving development
Tesla’s Full Self-Driving suite is one of the most significant technological developments in passenger travel in decades, but it is not all sunshine and rainbows, even with major strides in safety, CEO Elon Musk revealed.
In a candid reply to a dramatic video of Tesla’s Full Self-Driving (FSD) system averting disaster, Elon Musk laid bare a harsh reality facing autonomous vehicle technology.
The clip shows a Model 3 traveling at over 65 mph on a foggy, rain-soaked highway when a pedestrian suddenly steps into traffic.
Full Self-Driving instantly detects the threat and swerves safely, preventing what could have been a fatal collision for both the pedestrian and the driver.
Musk’s response was unequivocal:
“Tesla self-driving saves a lot of lives – the statistics are unequivocal. That doesn’t mean it’s perfect, of course.” With a projected 10x safety improvement over human drivers, FSD would prevent roughly 90% of the world’s approximately one million annual auto fatalities. The remaining 10%, roughly 100,000 deaths, would still expose Tesla to relentless lawsuits, while the vast majority of lives saved would go unnoticed. “The 90% who are still alive mostly won’t even know that Tesla saved them. Nonetheless, it is the right thing to do.”
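The arithmetic behind those figures is worth making explicit. Here is a back-of-the-envelope sketch, assuming “10x safer” simply means one tenth the fatality rate of human drivers over the same miles driven, using the rough numbers from Musk’s post:

```python
# Back-of-the-envelope math behind the "10x safer" claim (illustrative only).
annual_fatalities_human = 1_000_000  # rough worldwide auto deaths per year, per the post
safety_multiplier = 10               # projected improvement over human drivers

# One tenth the fatality rate over the same miles means one tenth the deaths.
fatalities_with_fsd = annual_fatalities_human / safety_multiplier  # 100,000
lives_saved = annual_fatalities_human - fatalities_with_fsd        # 900,000

print(f"Remaining deaths: {fatalities_with_fsd:,.0f} "
      f"({fatalities_with_fsd / annual_fatalities_human:.0%} of today's total)")
print(f"Lives saved: {lives_saved:,.0f} "
      f"({lives_saved / annual_fatalities_human:.0%} prevented)")
```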
This “unfortunate truth,” as Musk implicitly framed it, highlights a fundamental asymmetry in how society perceives safety technology. Human drivers cause the overwhelming majority of crashes through distraction, fatigue, or error.
Tesla self-driving saves a lot of lives – the statistics are unequivocal.
That doesn’t mean it’s perfect, of course.
Even when we improve safety 10X, saving 90% of the million lives lost in auto accidents every year, Tesla will still get sued for the 10% who did die. The 90%… https://t.co/OrNB1mO5eF
— Elon Musk (@elonmusk) April 6, 2026
Yet when FSD errs, the incident becomes headline news and a courtroom target. Prevented tragedies, by contrast, leave no trace.
Survivors simply continue their journeys, unaware of the split-second intervention that kept them alive. The result is a distorted public narrative that amplifies failures while rendering successes invisible.
We have seen this in various headlines throughout the years, including the mainstream media’s habit of naming the manufacturer in accident coverage only when that manufacturer is Tesla.
The video’s real-world example underscores FSD’s current capabilities. In near-zero visibility, the system’s cameras and neural network reacted faster than any human could, demonstrating the life-saving potential Musk cites.
Tesla’s latest safety data already shows FSD (Supervised) performing significantly better than the U.S. average, with crashes occurring far less frequently per mile driven.
Still, regulatory scrutiny, liability concerns, and media focus on edge-case failures continue to slow widespread adoption. Musk’s frank admission suggests Tesla is prepared to push forward despite the legal and perceptual headwinds.
As FSD edges closer to unsupervised autonomy, Musk’s post serves as both a progress report and a reality check. The technology is already saving lives today.
The unfortunate truth is that proving it and scaling it responsibly will require society to value statistical lives saved as much as dramatic stories of those lost. In the race toward safer roads, perception may prove as formidable an obstacle as the fog and rain in that viral video.
News
Tesla Full Self-Driving v14.3: First Impressions
Tesla started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members earlier today, and I had the opportunity to see some of the improvements that were made from v14.2.2.5.
While a lot of things got better, and I truly enjoyed using Full Self-Driving again after being stuck with the wildly confusing and frustrating v14.2.2.5, Tesla still has one major problem on its hands, and it has to do with Navigation and Routing. I truly believe these will be the biggest challenges Tesla faces with autonomy: getting the car to simply go the correct way, not conflict with what the navigation says, and take the simplest and most sensible route to a destination.
Here’s what I noticed as improvements in my first hour with v14.3. This is not a full review, nor is it reflective of everything I will likely experience with this new version. It is simply what I saw as noticeable improvements over the past version, v14.2.2.5.
There is also a more streamlined version on X, available in the thread below:
Tesla Full Self-Driving v14.3 testing now: pic.twitter.com/9UuP11Fv9f
— TESLARATI (@Teslarati) April 7, 2026
Yellow Light Behavior is Significantly Better
On v14.2.2.5, I had so many instances of the car slamming on the brakes to stop at a yellow light when proceeding through was clearly the safer option. There were several times when the car would be about 20 feet from the line, traveling at 15-20 MPH, the light would turn yellow, and it would slam on the brakes to stop. Because of this, I constantly nudged it through yellow lights by putting my foot on the accelerator.
The instances I’m talking about here would not have been close calls — the car would have likely moved through the intersection completely before the light would turn red.
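A quick bit of kinematics shows why braking in that situation is the wrong call. This is a rough sketch with assumed values (20 MPH, 20 feet from the line, roughly 0.3g as a comfortable stop), not anything pulled from Tesla’s planner:

```python
# Rough dilemma-zone math for the yellow-light scenario above (assumed values).
MPH_TO_FTPS = 5280 / 3600            # 1 mph = ~1.467 ft/s

speed_ftps = 20 * MPH_TO_FTPS        # ~29.3 ft/s at 20 MPH
distance_to_line = 20.0              # feet from the stop line
comfortable_decel = 10.0             # ft/s^2, roughly 0.3g

# Distance needed to stop comfortably: v^2 / (2a)
stopping_distance = speed_ftps**2 / (2 * comfortable_decel)  # ~43 ft

# Time to cross the line if the car simply keeps going
time_to_line = distance_to_line / speed_ftps                 # ~0.7 s

print(f"Comfortable stopping distance: {stopping_distance:.0f} ft "
      f"(only {distance_to_line:.0f} ft available)")
print(f"Time to reach the line at speed: {time_to_line:.1f} s "
      f"(a typical yellow lasts 3-4 s)")
```

With roughly 43 feet needed for a comfortable stop and only 20 feet available, the car either brakes hard or crosses the line with seconds of yellow still left, which is exactly why slamming the brakes there feels wrong.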
On multiple occasions this evening, FSD proceeded through yellow lights safely, without hesitation or any brake stabbing. It was refreshing:
🚨 Here’s an EXCELLENT example:
v14.2.2.5 would have slammed the brakes and stopped at this stop sign. I would have tapped the accelerator to proceed.
You can see the light turns yellow and the car makes — in my opinion — the correct decision to proceed. https://t.co/hHMikimkbp pic.twitter.com/Iesta1OYoV
— TESLARATI (@Teslarati) April 7, 2026
This was a huge complaint with v14.2.2.5. Sometimes it’s safer to go through a yellow light, especially when you have traffic behind you; slamming on the brakes is a great way to get rear-ended.
Parking Performance
I had four instances of parking, and FSD v14.3 did a genuinely flawless job. I was very impressed with how solid it was, but also with how efficiently it moved into the spot. With past versions, when there was traffic around, I usually chose to park manually simply because FSD took its time getting into a spot. I don’t see that being an issue anymore.
I complained about parking a lot and shared several images on X and Facebook of those examples:
Still a few issues with parking on FSD v14.2.2.4 pic.twitter.com/BphvVWDPqe
— TESLARATI (@Teslarati) February 5, 2026
No issues with it this evening. 4/4. Here are two looks:
Highway Performance
FSD v14.3 passed the five cars shown in this image:
The sixth was 200-300 yards ahead of the fifth. On v14.2.2.5, FSD would usually stay in the left lane, especially on the Hurry and Mad Max profiles. It did not do that here; instead, it moved back into the right lane after passing the fifth car.
Speed was not much of a concern here, even though the car was going 21 MPH over the limit. Although that was fast, I had a line of cars behind me traveling at the same speed, and FSD had merged only about a half mile prior, so I chose to let it continue.
There were no instances of camping in the left lane for extended periods of time. I do want to do more testing with the Speed Profiles because they were in need of some work with the previous version. I am starting to side with those who want a Max Speed setting, which was removed last year.
Navigation and Routing Still Need Work
I was heading back toward where I came from, so I turned “Avoid Highways” on to take a different way. This confused the Routing system: instead of turning left and then right, as the route indicated, the car turned right, then signaled for another right, essentially tracing a big rectangle. It then ignored that second right-hand turn and continued straight. I ended up turning “Avoid Highways” off and letting the car pick the same route that had brought me there.
I have truly complained so much about Navigation and Routing that I’m starting to feel somewhat bad. It is obviously a massive challenge for some reason, but I am confident it will improve. I recall seeing Tesla hiring for this area a few months back, so perhaps there is hope for it to get better.
Smarter Behavior When Approaching Exits/Routing
This probably should be grouped in with Highway Behavior, but I wanted to highlight it on its own.
The highway exit pictured was always frustrating on v14.2.2.5. In the Hurry speed profile, I saw it try to execute passes on multiple cars with as little as 0.6 miles to spare before the exit.
With three cars ahead of it, it chose to reduce speed and just wait until the exit. It was refreshing to see an improvement here, so I hope this behavior persists. Sometimes there’s just no reason to pass when you’re less than a mile from getting off the highway anyway.
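For a sense of scale, a quick calculation with assumed speeds shows how little a pass buys you that close to an exit:

```python
# How much time does passing actually save before an exit? (assumed values)
exit_distance_mi = 0.6   # miles to the exit, per the example above
slow_mph = 70.0          # assumed speed if you stay behind traffic
fast_mph = 75.0          # assumed speed after completing a pass

time_slow = exit_distance_mi / slow_mph * 3600  # seconds to the exit, no pass
time_fast = exit_distance_mi / fast_mph * 3600  # seconds to the exit, passing

print(f"Staying in lane: {time_slow:.0f} s, passing: {time_fast:.0f} s, "
      f"saved: {time_slow - time_fast:.1f} s")
```

About two seconds saved, and that generously ignores the time spent actually executing the pass.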
Larger Visibility Warnings
Tesla seems to have increased the size of these “Camera Visibility Limited” warnings. Previously, they were just small thumbnails:
🚨 The warnings of “Camera Visibility Limited” appear to be larger with v14.3
Previously, it was a small thumbnail. Haven’t seen it this magnified before. https://t.co/iKJLsZ8P4Q pic.twitter.com/qRWwFyIZNd
— TESLARATI (@Teslarati) April 7, 2026
Stop Sign Behavior
This is probably the biggest improvement of all, because how FSD behaved at stop signs in v14.2.2.5 was so incredibly terrible and disruptive to the flow of a busy intersection.
There are several four-way, all-stop intersections near me. In the past, FSD would stop well behind the stop sign or the white-painted line on the road. It would then inch forward, stopping again at the line, essentially making two stops at a single intersection.
If there is visibility, I truly don’t care where FSD stops, as long as it stops once. Stopping twice just isn’t ideal or logical. I can’t imagine many humans would do it; I know I wouldn’t.
I didn’t have that issue this evening:
🚨 Here’s a look with some commentary – Previously, FSD would stop where it did in this video, then again at the white line, before proceeding. https://t.co/xwyVGMy28y pic.twitter.com/MObgUa7DoA
— TESLARATI (@Teslarati) April 7, 2026
This one was pretty tight, too, in the sense that my car and the other one arrived at the intersection at about the same time. FSD may have stopped first, but the other vehicle was probably at around the same point I was when FSD decided to stop. I was happy to see the assertiveness to proceed; it felt like the ideal call to just go through, and it didn’t stop a second time up at the line. I’d have been fine with a stop at the line, as long as that was the only stop it made.
News
Tesla Full Self-Driving v14.3 rolls out: here’s what’s new
Tesla has officially started rolling out Full Self-Driving v14.3 to Early Access Program (EAP) members, and there are a lot of new improvements.
We are in EAP and will be on the road with v14.3 in the coming hours, so we’ll have a lot of things to discuss over the next few days, especially coming from v14.2.2.5, which I called the most “confusing” FSD release of all time.
🚨 Tesla Full Self-Driving v14.3 is here and it is coming with so many new features
Looks like there will be some MAJOR improvements to the general performance.
Truly seems like it will be significantly different than v14.2 pic.twitter.com/mhdfBLuDup
— TESLARATI (@Teslarati) April 7, 2026
Tesla brought out a lot of improvements, according to the v14.3 release notes, which list a vast number of fixes, new features, and new capabilities.
Here’s what Tesla’s release notes for the v14.3 release state:
- Improved parking location pin prediction, now shown on a map with a P icon.
- Increased decisiveness of parking spot selection and maneuvering.
- Rewrote the AI compiler and runtime from the ground up with MLIR, resulting in 20% faster reaction time and improving model iteration speed.
- Enhanced response to emergency vehicles, school buses, right-of-way violators, and other rare vehicles.
- Mitigated unnecessary lane biasing and minor tailgating behaviors.
- Improved handling of small animals by focusing RL training on harder examples and adding rewards for better proactive safety.
- Improved traffic light handling at complex intersections with compound lights, curved roads, and yellow light stopping – driven by training on hard RL examples sourced from the Tesla fleet.
- Upgraded the Reinforcement Learning (RL) stage of training the FSD neural network, resulting in improvements in a wide variety of driving scenarios.
- Upgraded the neural network vision encoder, improving understanding in rare and low-visibility scenarios, strengthening 3D geometry understanding, and expanding traffic sign understanding.
- Improved handling for rare and unusual objects extending, hanging, or leaning into the vehicle path by sourcing infrequent events from the fleet.
- Improved handling of temporary system degradations by maintaining control and automatically recovering without driver intervention, reducing unnecessary disengagements.
Tesla also listed a handful of future improvements:
- Expand reasoning to all behaviors beyond destination handling
- Add pothole avoidance
- Improve driver monitoring system sensitivity with better eye gaze tracking, eye wear handling, and higher accuracy in variable lighting situations
CEO Elon Musk has said that v14.3 could be “where the last big piece of the puzzle finally lands.” We have high expectations for this release because, in a lot of ways, v14.2’s final version was extremely disappointing and seemed to be a regression more than anything.
Nevertheless, Full Self-Driving v14.3 is going to be quite an interesting test, considering this is also the first time Musk has stated the car will feel “sentient.”
Reasoning will be a bigger piece of the puzzle with this release, although there were some elements of it in v14.2.
We plan to travel plenty of miles with it over the next few days, so we’ll keep you posted on what our thoughts are.

