
Tesla Autopilot Abusers need to be held accountable, but how?

(Credit: My Tesla Adventure/YouTube)


Tesla Autopilot abusers need to be held accountable for their actions. For years, Tesla engineers have worked long and hard to improve Autopilot and Full Self-Driving, pouring hundreds of thousands of hours into these driver-assistance programs, whether through software and coding or other work. However, years of hard work, diligence, and improvement can be wiped away from the public’s perception in a minute by one foolish, irresponsible, and selfish act, usually born of an owner’s need to show off the car’s semi-autonomous features to others.

The most recent example is Param Sharma, a self-proclaimed “rich as f***” social media influencer who has spent the last few days sparring with Tesla enthusiasts over his selfish and undeniably dangerous habit of climbing into the backseat while his car operates on Autopilot. Sharma has been seen on numerous occasions sitting in the backseat of his car while the vehicle drives itself. He is almost certainly using several cheat devices to bypass the barriers Tesla has installed to ensure drivers are paying attention. These include a steering wheel sensor, seat sensors, and seatbelt sensors, all of which the driver must engage or keep connected while Autopilot is in use. We have seen several companies and some owners use DIY hack devices to bypass these safety thresholds. These are hazardous acts for several reasons, the most important being a lack of regard for other human lives.


While Tesla fans and enthusiasts are undoubtedly confident in the abilities of Autopilot and Full Self-Driving, they will also admit that these suites need to be used responsibly and as the company prescribes. Tesla has never claimed that its vehicles can drive themselves, which would be characterized as “Level 5 autonomy.” The company also states that drivers must keep their hands on the steering wheel at all times. Tesla has installed several safety features to ensure the car’s operator follows these requirements. If these precautions are not followed, the driver risks being put in “Autopilot Jail,” where the feature will be unavailable for the remainder of the drive.

As previously mentioned, however, there are cheat devices for all of these safety features. This is where Tesla cannot necessarily control what goes on, and where law enforcement, in my opinion, bears more responsibility than the company does. It is law enforcement’s job to stop this behavior if an officer sees it occurring. Nobody should be able to climb into the backseat of their vehicle while it is driving. At least not until many years of testing are completed and many miles of fully autonomous operation have proven the system accurate and robust enough to handle real-world traffic.

The reason Tesla should step in, in my opinion, and create a list of repeat offenders who have proven themselves too irresponsible and untrustworthy for Autopilot and FSD is simple: if an accident happens while these influencers or everyday drivers are taking advantage of Autopilot’s capabilities, Tesla, along with every other company working toward Level 5 autonomous vehicles, takes a huge step backward. Not only will Tesla face the most criticism from the media, but that criticism will be poured on under the narrative that the company is taking no real steps to prevent such incidents. Those of us in the Tesla community know what the vehicles can do and what safety precautions have been installed to prevent these incidents. Mainstream media outlets, however, do not have an explicit and in-depth understanding of Tesla’s capabilities, and there is plenty of evidence to suggest they have no intention of improving their comprehension of what Tesla does daily.

While I was talking to someone about this subject on Thursday, they pointed out that this isn’t Tesla’s concern. And while I agree that, strictly speaking, it isn’t, I don’t think that’s an acceptable answer to the abuse going on with these cars. Tesla should take matters into its own hands, and I believe it will, because it has done so before. Elon Musk and Tesla recently decided to expand the FSD Beta testing pool, but the company also revoked access from some testers who decided not to use the functionality properly. Why should the case of AP/FSD be any different? Just because someone pays for something doesn’t mean the company cannot revoke access to it. If you pay for access to play video games online and then hack or use abusive language, there are major consequences: your console can be banned, and you would have to buy a completely new unit if you ever wished to play online again.

While unfortunate, Tesla will have to take a stand against those who abuse Autopilot, in my opinion. The company needs to impose heavier consequences simply because an accident caused by abuse or misuse of these functionalities could set it back several years and leave its work toward Level 5 autonomy in limbo. There is entirely too much at stake to let people off the hook. I believe Tesla’s actions should follow law enforcement’s. When police officers catch someone misusing the system, the normal reckless driving charges should apply, with progressively harsher consequences for each subsequent offense. Perhaps after a third offense, Tesla could be notified and could remove AP/FSD from the car. There could be a probationary period or a zero-tolerance policy; that would be up to the company.
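To make the escalation concrete, here is a minimal, purely hypothetical sketch of how such a strike-based policy could be modeled. The class name, the three-offense threshold, and the probation window are my own assumptions for illustration; nothing here reflects any real Tesla system or API.

```python
# Hypothetical sketch of an escalating-enforcement policy for AP/FSD misuse.
# All names, thresholds, and durations are assumptions for illustration only.

from dataclasses import dataclass, field
from datetime import date, timedelta

REVOCATION_THRESHOLD = 3                 # assumed: third confirmed offense triggers removal
PROBATION_PERIOD = timedelta(days=365)   # assumed probationary window after a final warning


@dataclass
class DriverRecord:
    vin: str
    offenses: list = field(default_factory=list)  # dates of confirmed misuse reports

    def report_offense(self, when: date) -> str:
        """Record a confirmed misuse report and return the resulting action."""
        self.offenses.append(when)
        count = len(self.offenses)
        if count >= REVOCATION_THRESHOLD:
            return "revoke AP/FSD access"
        if count == REVOCATION_THRESHOLD - 1:
            return "final warning, probation until " + str(when + PROBATION_PERIOD)
        return "warning"


# Example: three reports against the same (fictional) vehicle
record = DriverRecord(vin="5YJ3E1EXXXX000000")
for d in (date(2021, 5, 10), date(2021, 5, 12), date(2021, 5, 14)):
    print(d, "->", record.report_offense(d))
```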

I believe this needs to be taken seriously, and there need to be consequences, because of the blatant disregard it shows for other people and for the work behind these systems. Irresponsible use of AP/FSD by childish drivers puts Tesla’s hard work in jeopardy. While many people don’t enjoy driving, it still requires responsibility, and everyone on the road is trusting you to drive responsibly. Failing to do so could cost your life or, even worse, someone else’s.

A big thanks to our long-time supporters and new subscribers!


I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!

Joey has been a journalist covering electric mobility at TESLARATI since August 2019. In his spare time, Joey plays golf, watches MMA, or cheers on any of his favorite sports teams, including the Baltimore Ravens and Orioles, Miami Heat, Washington Capitals, and Penn State Nittany Lions. You can get in touch with Joey at joey@teslarati.com. He is also on X @KlenderJoey. If you're looking for great Tesla accessories, check out shop.teslarati.com


Tesla Robotaxi ride-hailing without a Safety Monitor proves to be difficult


Credit: Grok Imagine

Tesla Robotaxi ride-hailing without a Safety Monitor is proving to be a difficult task, according to some riders who traveled to Austin hoping to ride in one of the vehicles operating with zero supervision.

Last week, Tesla officially removed Safety Monitors from some — not all — of its Robotaxi vehicles in Austin, Texas, answering skeptics who said the vehicles still needed supervision to operate safely and efficiently.


Tesla aimed to remove Safety Monitors before the end of 2025, and it did, but only for rides given to company employees. Last week, it opened those rides to the public, just a couple of weeks past its original goal, but the accomplishment was impressive nonetheless.

However, the small number of Robotaxis operating without Safety Monitors has made them difficult to hail. David Moss, who has gained attention recently for traveling over 10,000 miles in his Tesla on Full Self-Driving v14 without any interventions, made it to Austin last week.

He has tried to get a ride in a Safety Monitor-less Robotaxi for the better part of four days, and after 38 attempts, he has yet to grab one.

Tesla said last week that it was rolling out a controlled test of the Safety Monitor-less Robotaxis. Ashok Elluswamy, who heads the AI program at Tesla, confirmed that the company was “starting with a few unsupervised vehicles mixed in with the broader Robotaxi fleet with Safety Monitors,” and that “the ratio will increase over time.”

This is a good strategy, one that prioritizes safety and keeps the rollout controlled.

However, it will be interesting to see how quickly the company can scale these completely monitor-less rides. It has proven to be extremely difficult to get one, but that is understandable considering only a handful of the cars in the entire Austin fleet are operating with no supervision within the vehicle.


Tesla gives its biggest hint that Full Self-Driving in Europe is imminent


Credit: BLKMDL3 | X

Tesla has given its biggest hint that Full Self-Driving in Europe is imminent, as a new feature seems to show that the company is preparing for frequent border crossings.

Tesla owner and influencer BLKMDL3, also known as Zack, recently took his Tesla to the border of California and Mexico at Tijuana, and at the international crossing, Full Self-Driving showed an interesting message: “Upcoming country border — FSD (Supervised) will become unavailable.”

Due to regulatory approvals, once a Tesla operating on Full Self-Driving enters a new country, it is required to comply with the laws and regulations of that territory. Even where FSD is legal, it seems Tesla will shut the feature off temporarily until it confirms the vehicle is in a location where operation is approved.

This will be extremely important in Europe, where crossing a national border is about as routine as crossing a state line in the U.S., and far more frequent than international crossings are for drivers in America, Canada, or Mexico.

Tesla has been working to get FSD approved in Europe for several years and is getting close to being able to offer it to owners there. However, it is still working through a lot of the red tape required for European regulators to approve use of the system.

This feature would be especially useful in Europe, where border crossings are far more frequent than in the U.S. and where approvals may differ from country to country.

Tesla has been testing FSD in Spain, France, England, and other European countries, and plans to continue expanding this effort. European owners have been fighting for a very long time to utilize the functionality, but the red tape has been the biggest bottleneck in the process.


Tesla Europe builds momentum with expanding FSD demos and regional launches

Tesla operates Full Self-Driving in the United States, China, Canada, Mexico, Puerto Rico, Australia, New Zealand, and South Korea.


SpaceX Starship V3 gets launch date update from Elon Musk

The first flight of Starship Version 3 and its new Raptor V3 engines could happen as early as March.


Credit: SpaceX/X

Elon Musk has announced that SpaceX’s next Starship launch, Flight 12, is expected in about six weeks. This suggests that the first flight of Starship Version 3 and its new Raptor V3 engines could happen as early as March.

In a post on X, Elon Musk stated that the next Starship launch is in six weeks. He accompanied his announcement with a photo that seemed to have been taken when Starship’s upper stage was just about to separate from the Super Heavy Booster. Musk did not state whether SpaceX will attempt to catch the Super Heavy Booster during the upcoming flight.

The upcoming flight will mark the debut of Starship V3. The upgraded design includes the new Raptor V3 engine, which is expected to have nearly twice the thrust of the original Raptor 1, at a fraction of the cost and with significantly reduced weight. The Starship V3 platform is also expected to be optimized for manufacturability. 

The Starship V3 Flight 12 launch timeline comes as SpaceX pursues an aggressive development cadence for the fully reusable launch system. Previous iterations of Starship have racked up a mixed but notable string of test flights, including multiple integrated flight tests in 2025.

Interestingly enough, SpaceX has teased an aggressive timeframe for Starship V3’s first flight. Back in late November, the company noted on X that it would be aiming to launch Starship V3’s maiden flight in the first quarter of 2026. This was despite setbacks like a structural anomaly on the first V3 booster during ground testing.

“Starship’s twelfth flight test remains targeted for the first quarter of 2026,” the company wrote in its post on X. 
