News
Tesla Autopilot ‘easily tricked’ by Consumer Reports in bizarre test
Consumer Reports claims to have shown that Tesla Autopilot can be “easily tricked” into driving without anyone in the driver’s seat. The test process was extremely bizarre and required certain items that most drivers would never have in their vehicles.
CR released a report on April 22nd titled “CR Engineers Show a Tesla Will Drive With No One in the Driver’s Seat.” The test was a response to the recent and very public Tesla Model S crash in Texas, in which two men tragically passed away after their all-electric sedan slammed into a tree at high speed. Investigators are attempting to determine whether the vehicle was “driverless,” a claim made by several mainstream media outlets. CEO Elon Musk weighed in just days after the crash and its widespread coverage, stating that Autopilot could not have functioned on the road where the crash occurred because it lacked the lane lines required to activate Basic Autopilot.
The CR test required the vehicle, a Tesla Model Y, to already be in motion; engineers engaged Autopilot and then dialed the set speed down to 0, which brought the car to a stop. Next, Jake Fisher, CR’s Senior Director of Auto Testing, placed a “small, weighted chain on the steering wheel, to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, because that would disengage Autopilot.” The Autopilot set speed was then raised so that the vehicle would accelerate from its stationary position. The car drove up and down the half-mile lane of the CR test track, even though nobody was in the driver’s seat or controlling the vehicle. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient,” Fisher said. The engineers urged nobody to try the experiment at home, but who would have a custom weighted chain sitting around to experiment with anyway?
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” Fisher added, but he wasn’t done throwing shade at Tesla. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.” GM’s Super Cruise and Ford’s recently released BlueCruise are the systems Fisher is referencing, but the comparisons don’t really add up.
Tesla Autopilot is backed by over 23 billion real-world miles of driving data, which the company uses to train the neural networks behind the system and improve its performance. With every mile driven, Tesla’s semi-autonomous driving features become more robust, more precise, and more adaptable to human behavior. Ford and GM have accumulated only a fraction of that data. Tesla, meanwhile, recently released its Q1 2021 Safety Report, which found that Autopilot is nearly 10 times safer than unassisted human driving.
The test performed by CR is extremely bizarre because drivers would not normally have these items in their vehicles, or even in their possession, to begin with. Tesla maintains that drivers are responsible for remaining attentive throughout the entire driving experience. The company has never claimed to have released a program capable of Level 5 autonomy, where a driver needs to pay no attention to the road or the vehicle’s surroundings. Yet Tesla’s highly publicized crash has raised questions from those who have a historical distaste for the company and its products. Consumer Reports has not been keen on Tesla in the past. The publication has indicated that GM’s Super Cruise, despite being less effective or safe than Autopilot based on available data, holds a commanding lead over Tesla’s semi-autonomous driving program.
It is worth noting that Tesla has several safeguards intended to prevent anyone from letting the vehicle drive itself. These include a steering wheel monitoring system, which will bring the car to a complete stop if the driver is not holding the wheel. The system also requires a driver to be in the seat to function, and the company recently revoked FSD software access from several drivers who were abusing the program by being inattentive. Additional safety features, such as a cabin camera with facial recognition, will monitor the driver’s eyes and face to ensure they are paying attention to the road.
What are your thoughts on the CR study? Let us know in the comments, or let me know at @KlenderJoey on Twitter. You can email me at joey@teslarati.com as well.
Elon Musk
Tesla’s Elon Musk: 10 billion miles needed for safe Unsupervised FSD
As per the CEO, roughly 10 billion miles of training data are required due to reality’s “super long tail of complexity.”
Tesla CEO Elon Musk has provided an updated estimate for the training data needed to achieve truly safe unsupervised Full Self-Driving (FSD).
10 billion miles of training data
Musk’s comment came as a reply to Apple and Rivian alum Paul Beisel, who posted an analysis on X about the gap between tech demonstrations and real-world products. In his post, Beisel highlighted Tesla’s data-driven lead in autonomy and argued that rivals would find it difficult to quickly field a legitimate competitor to FSD.
“The notion that someone can ‘catch up’ to this problem primarily through simulation and limited on-road exposure strikes me as deeply naive. This is not a demo problem. It is a scale, data, and iteration problem— and Tesla is already far, far down that road while others are just getting started,” Beisel wrote.
Musk responded to Beisel’s post, stating that “Roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving. Reality has a super long tail of complexity.” This is quite interesting considering that in his Master Plan Part Deux, Elon Musk estimated that worldwide regulatory approval for autonomous driving would require around 6 billion miles.
FSD’s total training miles
As 2025 came to a close, Tesla community members observed that FSD was nearing 7 billion miles driven, with over 2.5 billion of those miles logged on city streets. The 7-billion-mile mark was passed just a few days later. This likely makes Tesla the company with the most real-world training data for an autonomous driving program today.
Elon Musk referenced the difficulties of achieving autonomy recently when he commented on Nvidia’s Alpamayo program. As per Musk, “they will find that it’s easy to get to 99% and then super hard to solve the long tail of the distribution.” These sentiments were echoed by Tesla VP of AI Software Ashok Elluswamy, who also noted on X that “the long tail is sooo long, that most people can’t grasp it.”
News
Tesla earns top honors at MotorTrend’s SDV Innovator Awards
MotorTrend’s SDV Awards were presented during CES 2026 in Las Vegas.
Tesla emerged as one of the most recognized automakers at MotorTrend’s 2026 Software-Defined Vehicle (SDV) Innovator Awards.
As noted in a press release from the publication, two key Tesla employees were honored for their work on AI, autonomy, and vehicle software. MotorTrend’s SDV Awards were presented during CES 2026 in Las Vegas.
Tesla leaders and engineers recognized
The fourth annual SDV Innovator Awards celebrate pioneers and experts who are pushing the automotive industry deeper into software-driven development. Among the most notable honorees for this year was Ashok Elluswamy, Tesla’s Vice President of AI Software, who received a Pioneer Award for his role in advancing artificial intelligence and autonomy across the company’s vehicle lineup.
Tesla also secured recognition in the Expert category, with Lawson Fulton, a staff Autopilot machine learning engineer, honored for his contributions to Tesla’s driver-assistance and autonomous systems.
Tesla’s software-first strategy
While automakers like General Motors, Ford, and Rivian also received recognition, Tesla’s multiple awards stood out given the company’s outsized role in popularizing software-defined vehicles over the past decade. From frequent OTA updates to its data-driven approach to autonomy, Tesla has consistently treated vehicles as evolving software platforms rather than static products.
This has made Tesla’s vehicles stand out in their respective segments, as they are arguably the only cars that genuinely get better over time. This is especially true for vehicles equipped with the company’s Full Self-Driving system, which become progressively more intelligent and autonomous with each update. The majority of Tesla’s software updates are also free, which is much appreciated by customers worldwide.
Elon Musk
Judge clears path for Elon Musk’s OpenAI lawsuit to go before a jury
The decision keeps alive Musk’s claims that OpenAI’s shift toward a for-profit structure violated early assurances made to him as a co-founder.
A U.S. judge has ruled that Elon Musk’s lawsuit accusing OpenAI of abandoning its founding nonprofit mission can proceed to a jury trial.
The decision keeps alive Musk’s claims that OpenAI’s shift toward a for-profit structure violated early assurances made to him as a co-founder. These claims are directly disputed by OpenAI.
Judge says disputed facts warrant a trial
At a hearing in Oakland, U.S. District Judge Yvonne Gonzalez Rogers stated that there was “plenty of evidence” suggesting that OpenAI leaders had promised that the organization’s original nonprofit structure would be maintained. She ruled that those disputed facts should be evaluated by a jury at a trial in March rather than decided by the court at this stage, as noted in a Reuters report.
Musk helped co-found OpenAI in 2015 but left the organization in 2018. In his lawsuit, he argued that he contributed roughly $38 million, or about 60% of OpenAI’s early funding, based on assurances that the company would remain a nonprofit dedicated to the public benefit. He is seeking unspecified monetary damages tied to what he describes as “ill-gotten gains.”
OpenAI, however, has repeatedly rejected Musk’s allegations. The company has stated that Musk’s claims were baseless and part of a pattern of harassment.
Rivalries and Microsoft ties
The case unfolds against the backdrop of intensifying competition in generative artificial intelligence. Musk now runs xAI, whose Grok chatbot competes directly with OpenAI’s flagship ChatGPT. OpenAI has argued that Musk is a frustrated commercial rival who is simply attempting to slow down a market leader.
The lawsuit also names Microsoft as a defendant, citing its multibillion-dollar partnerships with OpenAI. Microsoft has urged the court to dismiss the claims against it, arguing there is no evidence it aided or abetted any alleged misconduct. Lawyers for OpenAI have also pushed for the case to be thrown out, claiming that Musk failed to show sufficient factual basis for claims such as fraud and breach of contract.
Judge Gonzalez Rogers, however, declined to end the case at this stage, noting that a jury would also need to consider whether Musk filed the lawsuit within the applicable statute of limitations. Still, the dispute between Elon Musk and OpenAI is now headed for a high-profile jury trial in the coming months.