News
Do autonomous cars make us worse drivers?
Autonomous cars are coming. So is the first fatality associated with them. Statistically, that milestone should occur in the next 18 months. What will happen then?
On May 31, 2009, an Airbus A330 on its way from Rio de Janeiro to Paris plunged from an altitude of 35,000 feet into the Atlantic, killing all 228 people on board. Just prior to the crash, the airplane was operating in autopilot mode. A reconstruction of the disaster revealed that input from several sensors had been compromised by ice, which caused them to give false readings. Updated sensors that were less susceptible to ice accumulation were waiting to be installed after the plane arrived in Paris.
Because of the false readings, the autopilot system disengaged, returning control to the pilots. However, the senior pilot was sleeping at the time. The two junior pilots were not as highly trained in high-altitude flight as they might have been, partly because using machines to control aircraft under those conditions was the norm.
Faced with the unexpected, the pilots behaved poorly. At one point they can be heard saying on the cockpit recorder, “We completely lost control of the airplane, and we don’t understand anything! We tried everything!” While they tried to rouse the sleeping senior pilot, the nose of the aircraft climbed until a stall was induced. Stall is the point at which the wings become barn doors instead of airfoils. The Airbus A330 dropped from the sky like a rock.
In his excellent story about the crash published in Vanity Fair, William Langewiesche offered this conclusion: “Automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.”
The Tesla community has seen similar incidents lately: the driver in Salt Lake City who accidentally activated Summon, causing his car to drive into the back of a truck; the woman on a California freeway who rear-ended a car that suddenly slowed in front of her; the man in Europe who crashed into the back of a van that had stalled in the high-speed lane of a highway. He, at least, had the courage to admit his error: “Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time too. My bad.”
After each of these incidents, the tendency has been for many to defend the machine and blame the human. But in a recent article for The Guardian, author Martin Robbins says, “Combine an autopilot with a good driver, and you get an autopilot with, if not a bad driver, at least not such a good one.” He says that statistically, the time when a car operating in autonomous mode causes a fatality is rapidly approaching.
On average, a person is killed in a traffic accident in the United States once every 100 million miles. Elon Musk says Tesla’s Autopilot is half as likely to be involved in a collision as a human driver. That would suggest that somewhere around the 200 million mile mark someone will die as a result of an automobile driven by a machine.
Tesla has already passed the 100 million mile mark for cars driving in Autopilot mode and continues to log 2.6 million miles per day. Statistically speaking, the time when a self-driving car kills somebody is rapidly approaching. And since most autonomous cars on the road are Teslas, the odds are excellent that it will be a Tesla involved in that first fatality.
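The arithmetic behind that claim can be sketched as a quick back-of-envelope calculation. This is a naive linear projection using the figures quoted above, not a proper statistical model (in reality the timing of a first fatality is random and could fall well before or after the expected mileage):

```python
# Figures quoted in the article (illustrative projection only).
US_FATALITY_INTERVAL_MILES = 100e6  # roughly one US traffic death per 100 million miles
TESLA_RISK_FACTOR = 0.5             # Musk's claim: Autopilot half as likely to crash
AUTOPILOT_MILES_LOGGED = 100e6      # Autopilot miles already driven
DAILY_AUTOPILOT_MILES = 2.6e6       # Autopilot miles added per day

# Halving the risk doubles the expected mileage between fatalities.
expected_first_fatality_miles = US_FATALITY_INTERVAL_MILES / TESLA_RISK_FACTOR

# Miles (and days, at the current pace) until the fleet reaches that mark.
remaining_miles = expected_first_fatality_miles - AUTOPILOT_MILES_LOGGED
days_remaining = remaining_miles / DAILY_AUTOPILOT_MILES

print(f"Expected first-fatality mileage: {expected_first_fatality_miles / 1e6:.0f} million miles")
print(f"Days until that mark at the current pace: {days_remaining:.0f}")
```

At the stated pace, the fleet crosses the 200-million-mile threshold in a matter of weeks, which is why the article calls the moment "rapidly approaching."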
What will happen then? Robbins goes back in history for an answer. In 1896, Bridget Driscoll became the first person in England to be killed by a motor car. The reaction among the public and the press was a fatalistic acceptance that progress has a price. Within a few years, the speed limit in England was raised from 8 mph — which it was when Ms. Driscoll was killed — to 20 mph, despite the thousands of deaths already being recorded on English roads by then.
Regulators around the world are racing to catch up with the explosion of new autonomous driving technology. But Robbins concludes, “By the time they do, it’s likely that the technology will already be an accepted fact of life, its safety taken for granted by consumers, its failures written off as the fault of its error-prone human masters.”
The point is that injuries and fatalities will continue to occur as cars come to rely more and more on machines for routine driving chores. But in that transition period between now and the time when Level 4 autonomy becomes the norm — the day when cars come from the factory with no way for humans to control them directly — we need to accept that complacency and an inflated belief in the power of machines to protect us from harm may actually render us less competent behind the wheel.
We will need to remain vigilant, if for no other reason than that telling a jury “It’s not my fault! The machine failed!” will not insulate us from the legal requirement to operate our private automobiles in a safe and prudent manner.
News
Tesla starts rolling out FSD V14.2.1 to AI4 vehicles including Cybertruck
FSD V14.2.1 was released just about a week after the initial FSD V14.2 update was rolled out.
It appears that the Tesla AI team burned the midnight oil, allowing them to release FSD V14.2.1 on Thanksgiving. The update has been reported by Tesla owners with AI4 vehicles, including Cybertruck owners.
For the Tesla AI team, at least, it appears that work really does not stop.
FSD V14.2.1
Initial posts about FSD V14.2.1 were shared by Tesla owners on social media platform X. According to these owners, V14.2.1 appears to be a point update designed to polish the features and capabilities that have been available in FSD V14. A look at the release notes for FSD V14.2.1, however, shows that an extra line has been added.
“Camera visibility can lead to increased attention monitoring sensitivity.”
Whether this will result in more frequent attention alerts for drivers remains to be seen. It should become evident once the first batch of Tesla owners who received V14.2.1 start sharing their first-drive impressions of the update. Despite the release falling on Thanksgiving, it would not be surprising if such videos appear today just the same.
Rapid FSD releases
What is rather interesting and impressive is the fact that FSD V14.2.1 was released just about a week after the initial FSD V14.2 update was rolled out. This bodes well for Tesla’s FSD users, especially since CEO Elon Musk has stated in the past that the V14.2 series will be for “widespread use.”
FSD V14 has so far received numerous positive reviews from Tesla owners, with many drivers noting that the system now drives better than most human drivers because it is cautious, confident, and considerate at the same time. The only question now, really, is whether the V14.2 series makes it to the company’s wide FSD fleet, which is still populated by numerous HW3 vehicles.
News
Waymo rider data hints that Tesla’s Cybercab strategy might be the smartest, after all
These observations all but validate Tesla’s controversial two-seat Cybercab strategy, which has caught a lot of criticism since it was unveiled last year.
Toyota Connected Europe designer Karim Dia Toubajie has highlighted a particular trend that became evident in Waymo’s Q3 2025 occupancy stats. As it turned out, 90% of the trips taken by the driverless taxis carried two or fewer passengers.
Toyota designer observes a trend
Karim Dia Toubajie, Lead Product Designer (Sustainable Mobility) at Toyota Connected Europe, analyzed Waymo’s latest California Public Utilities Commission filings and posted the results on LinkedIn this week.
“90% of robotaxi trips have 2 or less passengers, so why are we using 5-seater vehicles?” Toubajie asked. He continued: “90% of trips have 2 or less people, 75% of trips have 1 or less people.” He accompanied his comments with a graphic of Waymo’s occupancy rates: 71% of trips carried one passenger, 15% carried two, 6% carried three, 5% carried none, and only 3% carried four.
The data excludes operational trips like depot runs or charging, and Toubajie pointed out that most of the time, Waymo’s massive self-driving taxis are really just transporting one or two people, and at times no passengers at all. “This means that most of the time, the vehicle being used significantly outweighs the needs of the trip,” the Toyota designer wrote in his post.
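The cumulative figures Toubajie quotes fall out directly from the occupancy breakdown in the graphic. A minimal sketch, using the rounded per-occupancy percentages reported above (note that because each share is rounded, the sums land about a point above the quoted 90%/75% figures):

```python
# Waymo occupancy shares from the reported graphic: passengers -> percent of trips.
occupancy_share = {0: 5, 1: 71, 2: 15, 3: 6, 4: 3}

# Sanity check: the reported shares cover all trips.
assert sum(occupancy_share.values()) == 100

# Share of trips a two-seat vehicle could serve (zero, one, or two passengers).
two_or_fewer = sum(pct for n, pct in occupancy_share.items() if n <= 2)

# Share of trips with at most one passenger.
one_or_fewer = sum(pct for n, pct in occupancy_share.items() if n <= 1)

print(f"Trips with <=2 passengers: {two_or_fewer}%")  # ~90% as quoted
print(f"Trips with <=1 passenger:  {one_or_fewer}%")  # ~75% as quoted
```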
Cybercab suddenly looks perfectly sized
Toubajie gave a nod to Tesla’s approach. “The Tesla Cybercab announced in 2024, is a 2-seater robotaxi with a 50kWh battery but I still believe this is on the larger side of what’s required for most trips,” he wrote.
With Waymo’s own numbers now showing that 90% of demand fits in two seats or fewer, the steering wheel-free, lidar-free Cybercab looks like the smartest play in the room. The Cybercab is designed to be easy to produce, with CEO Elon Musk commenting that its production line would resemble a consumer electronics factory more than an automotive plant. This means that the Cybercab could saturate the roads quickly once it is deployed.
While the Cybercab will likely take the lion’s share of Tesla’s ride-hailing passengers, the Model 3 sedan and Model Y crossover would be perfect for the remaining roughly 9% of trips that require larger vehicles. This should be easy to implement for Tesla, as the Model Y and Model 3 are both mass-market vehicles.
Elon Musk
Elon Musk and James Cameron find middle ground in space and AI despite political differences
Musk responded with some positive words for the director on X.
Avatar director James Cameron has stated that he can still agree with Elon Musk on space exploration and AI safety despite their stark political differences.
In an interview with Puck’s The Town podcast, the liberal director praised Musk’s achievements with SpaceX and said that higher priorities, such as space travel and artificial intelligence, should unite them. Musk responded with some positive words for the director on X.
A longtime mutual respect
Cameron and Musk have bonded over technology for years. As far back as 2011, Cameron told NBC News that “Elon is making very strong strides. I think he’s the likeliest person to step into the shoes of the shuttle program and actually provide human access to low Earth orbit. So… go, Elon.” Cameron was right, as SpaceX would go on to become the dominant force in spaceflight over the years.
Even after Musk’s embrace of conservative politics and his roles as senior advisor and former DOGE head, Cameron refused to cancel his relationship with the CEO. “I can separate a person and their politics from the things that they want to accomplish if they’re aligned with what I think are good goals,” Cameron said. Musk appreciated the director’s comments, stating that “Jim understands physics, which is rare in Hollywood.”
Shared AI warnings
Both men have stated that artificial intelligence could be an existential threat to humanity, though Musk has noted that Tesla’s products such as Optimus could usher in an era of sustainable abundance. Musk recently predicted that money and jobs could become irrelevant with advancing AI, while Cameron warned of a deeper crisis, as noted in a Fox News report.
“Because the overall risk of AI in general… is that we lose purpose as people. We lose jobs. We lose a sense of, ‘Well, what are we here for?’” Cameron said. “We are these flawed biological machines, and a computer can be theoretically more precise, more correct, faster, all of those things. And that’s going to be a threshold existential issue.”
He concluded: “I just think it’s important for us as a human civilization to prioritize. We’ve got to make this Earth our spaceship. That’s really what we need to be thinking.”

