News
Tesla Autopilot abusers need to be held accountable, but how?
Tesla Autopilot abusers need to be held accountable for their actions. For years, Tesla engineers have worked long and hard to improve Autopilot and Full Self-Driving, pouring hundreds of thousands of hours into these driver-assistance programs, from software and engineering work to everything surrounding it. Yet years of hard work, diligence, and improvement can be wiped from the public's perception in a minute by one foolish, irresponsible, and selfish act, often born of an owner's need to show off their car's semi-autonomous features to others.
The most recent example is Param Sharma, a self-proclaimed "rich as f***" social media influencer who has spent the last few days sparring with Tesla enthusiasts over his selfish and undeniably dangerous habit of climbing into the backseat while his car operates on Autopilot. Sharma has been seen on numerous occasions sitting in the backseat while the vehicle drives itself. He is almost certainly using several cheat devices to bypass the barriers Tesla has installed to ensure drivers are paying attention: a steering wheel sensor, seat sensors, and seatbelt sensors, all of which must register driver engagement while Autopilot is in use. Several companies and some owners have used DIY hack devices to defeat these safety thresholds. These are hazardous acts for several reasons, the most important being a complete disregard for other human lives.
This is a preview from our weekly newsletter. Each week I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
While Tesla fans and enthusiasts are undoubtedly confident in the abilities of Autopilot and Full Self-Driving, they will also admit that these suites need to be used responsibly and as the company describes. Tesla has never claimed that its vehicles can drive themselves, which would amount to "Level 5 Autonomy." The company also states that drivers must keep their hands on the steering wheel at all times, and it has installed several safety features to ensure the car's operator complies. If these precautions are not followed, the driver risks being put in "Autopilot Jail," where the feature is unavailable for the remainder of the drive.
As previously mentioned, however, there are cheat devices for all of these safety features. This is where Tesla cannot fully control what goes on, and law enforcement, in my opinion, bears more responsibility than the company does. It is law enforcement's job to stop this behavior when an officer sees it occurring. Nobody should be able to climb into the backseat of their vehicle while it is driving. At least not until many years of testing are completed and many miles of fully autonomous operation prove the technology accurate and robust enough to handle real-world traffic.
The reason Tesla should step in, in my opinion, and create a list of repeat offenders who have proven themselves too irresponsible and untrustworthy for Autopilot and FSD, is that if an accident happens while these influencers or everyday drivers are abusing Autopilot's capabilities, Tesla, along with every other company working toward Level 5 autonomous vehicles, takes a huge step backward. Tesla will not only face the harshest criticism from the media, but that criticism will pour in if the company appears to take no real steps to prevent such abuse. Those of us in the Tesla community know what the vehicles can do and what safety precautions are in place to prevent these incidents. However, mainstream media outlets do not have an explicit, in-depth understanding of Tesla's capabilities, and there is plenty of evidence to suggest they have no intention of improving their comprehension of what Tesla does daily.
While talking to someone about this subject on Thursday, I heard the argument that this isn't Tesla's concern. And while I believe that it technically isn't, I don't think that's an acceptable answer to the abuses going on with these cars. Tesla should take matters into its own hands, and I believe it can, because it has done so before. Elon Musk and Tesla recently expanded the FSD Beta testing pool, but the company also revoked access from testers who did not use the functionality properly. Why should AP/FSD be any different? Just because someone pays for something doesn't mean the company cannot revoke access to it. If you pay for online gaming and hack or use abusive language, there are major consequences: your console can be banned, and you would have to buy a completely new unit to ever play online again.
While unfortunate, Tesla will have to take a stand against those who abuse Autopilot, in my opinion. There need to be heavier consequences from the company, simply because an accident caused by abuse or misuse of these functionalities could set Tesla back several years and stall its work toward solving Level 5 autonomy. There is entirely too much at stake here to let people off the hook. I believe Tesla's actions should follow law enforcement action. When police officers find someone violating the proper use of the system, the normal reckless driving charges should apply, with increasingly severe consequences for every subsequent offense. Perhaps after the third offense, Tesla could be contacted and could remove AP/FSD from the car. There could be a probationary period or a zero-tolerance policy; it would all be up to the company.
I believe this needs to be taken seriously, and there need to be consequences, because of the blatant disregard for other people and for Tesla's work. The irresponsible use of AP/FSD by childish drivers jeopardizes years of hard work through horrible behavior. While many people don't enjoy driving, it still requires responsibility, and everyone on the road is trusting you to drive responsibly. It could cost your life or, even worse, someone else's.
A big thanks to our long-time supporters and new subscribers! Thank you.
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don't bite, so be sure to reach out!
News
Tesla FSD V14.2.1 is earning rave reviews from users in diverse conditions
Tesla’s Full Self-Driving (Supervised) software continues its rapid evolution, with the latest V14.2.1 update drawing widespread praise for its smoother performance and smarter decision-making.
Videos and firsthand accounts from Tesla owners highlight V14.2.1 as an update that improves navigation responsiveness, sign recognition, and overall fluidity, among other things. Some drivers have even described it as “more alive than ever,” hinting at the system eventually feeling “sentient,” as Elon Musk has predicted.
FSD V14.2.1 first impressions
Early adopters are buzzing about how V14.2.1 feels less intrusive while staying vigilant. In a post shared on X, Tesla owner @LactoseLunatic described the update as a “huge leap forward,” adding that the system remains “incredibly assertive but still safe.”
Another Tesla driver, Devin Olsenn, who logged ~600 km on V14.2.1, reported no safety disengagements, with the car feeling “more alive than ever.” The Tesla owner noted that his wife now defaults to using FSD V14, as the system is already very smooth and refined.
Adverse weather and regulatory zones are testing grounds where V14.2.1 shines, at least according to testers in snowy areas. Tesla watcher Sawyer Merritt shared a video of his first snowy drive on unplowed rural roads in New Hampshire, where FSD erred on the side of caution. As per Merritt, FSD V14.2.1 was "extra cautious" but performed well overall.
Sign recognition and freeway prowess
Sign recognition also seemed to show improvements with FSD V14.2.1. Longtime FSD tester Chuck Cook highlighted a clip from his upcoming first-impressions video, showcasing improved school zone behavior. “I think it read the signs better,” he observed, though in standard mode, it didn’t fully drop to 15 mph within the short timeframe. This nuance points to V14.2.1’s growing awareness of temporal rules, a step toward fewer false positives in dynamic environments.
FSD V14.2.1 also seems to excel in high-stress highway scenarios. Fellow FSD tester @BLKMDL3 posted a video of FSD V14.2.1 managing a multi-lane freeway closure due to a police chase-related accident. “Perfectly handles all lanes of the freeway merging into one,” the Tesla owner noted in his post on X.
FSD V14.2.1 was released on Thanksgiving, much to the pleasant surprise of Tesla owners. The update's release notes are almost identical to the previous iteration's, save for one new line item that reads: "Camera visibility can lead to increased attention monitoring sensitivity."
News
Tesla FSD Supervised ride-alongs in Europe begin in Italy, France, and Germany
The program allows the public to hop in as a non-driving observer to witness FSD navigate urban streets firsthand.
Tesla has kicked off passenger ride-alongs for Full Self-Driving (Supervised) in Italy, France, and Germany.
The program, detailed on Tesla’s event pages, arrives ahead of a potential early 2026 Dutch regulatory approval that could unlock an EU-wide rollout for FSD.
Hands-Off Demos
Tesla’s ride-along invites participants to “ride along in the passenger seat to experience how it handles real-world traffic & the most stressful parts of daily driving, making the roads safer for all,” as per the company’s announcement on X through its official Tesla Europe & Middle East account.
Sign-ups via localized pages offer free slots through December, with Tesla teams piloting vehicles through city streets, roundabouts and highways.
“Be one of the first to experience Full Self-Driving (Supervised) from the passenger seat. Our team will take you along as a passenger and show you how Full Self-Driving (Supervised) works under real-world road conditions,” Tesla wrote. “Discover how it reacts to live traffic and masters the most stressful parts of driving to make the roads safer for you and others. Come join us to learn how we are moving closer to a fully autonomous future.”
Building trust towards an FSD Unsupervised rollout
Tesla’s FSD (Supervised) ride-alongs could be an effective tool to build trust and get regular car buyers and commuters used to the idea of vehicles driving themselves. By seating riders shotgun, Tesla gives participants a front-row seat to the bleeding edge of consumer-grade autonomous driving systems.
FSD (Supervised) has already rolled out to several countries, including the United States, Canada, Australia, New Zealand, and parts of China. So far, it has been received positively by drivers, as it makes driving tasks and long trips significantly easier and more pleasant.
FSD is a key safety feature as well, which became all too evident when a Tesla operating on FSD was hit by what seemed to be a meteorite in Australia. The vehicle continued safely despite the impact, though the same would likely not have been true had the car been driven manually.
News
Swedish union rep pissed that Tesla is working around a postal blockade they started
Tesla Sweden is now using dozens of private residences as a way to obtain license plates for its vehicles.
Two years into their postal blockade, Swedish unions are outraged that Tesla is still able to provide its customers’ vehicles with valid plates through various clever workarounds.
Seko chairman Gabriella Lavecchia called it "embarrassing" that the world's largest EV maker, led by CEO Elon Musk, refuses to simply roll over and accept the unions' demands.
Unions shocked Tesla won’t just roll over and surrender
The postal unions’ blockade began in November 2023 when Seko and IF Metall-linked unions stopped all mail to Tesla sites to force a collective agreement. License plates for Tesla vehicles instantly became the perfect pressure point, as noted in a Dagens Arbete report.
Tesla responded by implementing initiatives to work around the blockades. A recent investigation from Arbetet revealed that Tesla Sweden is now using dozens of private residences, including one employee’s parents’ house in Trångsund and a customer-relations staffer’s home in Vårby, as a way to obtain license plates for its vehicles.
Seko chairman Gabriella Lavecchia is not pleased that Tesla Sweden is working around the unions’ efforts yet again. “It is embarrassing that one of the world’s largest car companies, owned by one of the world’s richest people, has sunk this low,” she told the outlet. “Unfortunately, it is completely frivolous that such a large company conducts business in this way.”
Two years on and plates are still being received
The Swedish Transport Agency has confirmed Tesla is still using several different workarounds to overcome the unions’ blockades.
As noted by DA, Tesla Sweden previously used different addresses to receive its license plates. At one point, the electric vehicle maker used addresses for car care shops. Tesla Sweden reportedly used this strategy in Östermalm in Stockholm, as well as in Norrköping and Gothenburg.
Another strategy that Tesla Sweden reportedly implemented involved replacement plates being ordered by private individuals when vehicles change hands from Tesla to car buyers. There have also been cases where the police have reportedly issued temporary plates to Tesla vehicles.
