Elon Musk’s cautionary statements about uncontrolled experimentation with artificial intelligence (AI) have caused some to ridicule him as a fear-monger, and have given many in the mainstream press the idea that he is opposed to using AI, which is very far from the truth. In fact, AI is a major component of Tesla’s Autopilot system, and the company applies it in several other areas as well.
It was only recently that Tesla publicly revealed that it is working on its own AI hardware. At the NIPS machine learning conference in December, Elon Musk announced that Tesla is “developing specialized AI hardware that we think will be the best in the world.” The company has offered few details, but it’s widely assumed that the main application will be processing the algorithms for Tesla’s Autopilot software.
As Bernard Marr reports in a recent article in Forbes, there’s little doubt that Tesla is way ahead of its potential rivals in the data-gathering department. Every Model S and X built with the Autopilot hardware suite, which was introduced in September 2014, has the potential to become self-driving, and all Tesla vehicles, Autopilot-enabled or not, continually gather data and send it to the cloud. The company has many more sensors on the roads than any of its Detroit or Silicon Valley rivals, and the number will mushroom when Model 3 production hits its stride.
Tesla crowd-sources data from its vehicles, and could one day also gather data on its drivers through internal cameras that detect hand placement on instruments or a person’s state of alertness. The company uses the information not only to improve Autopilot by generating data-dense maps, but also to diagnose driving behavior. Many believe that this sort of data will prove to be a valuable commodity that could be sold to third parties (much as data on web-browsing habits is today). McKinsey and Company has estimated that the market for vehicle-gathered data could be worth $750 billion a year by 2030.
Forbes explains that the AI built into Tesla’s system operates at several levels. Machine learning in the cloud educates the entire fleet, while within each individual vehicle, “edge computing” can make decisions about actions a car needs to take immediately. There’s also a third level of decision-making, in which cars can form networks with other Tesla vehicles nearby in order to share local information. In the future, when there are lots of autonomous cars on the road, these networks could also interface with cars from other makers, and systems such as traffic cameras, road-based sensors, and mobile phones.
At this point, no one knows what new forms of AI technology the mad scientists in Palo Alto are cooking up, but Forbes found some clues on the Facebook page of Tesla’s hardware partner Nvidia: “In contrast to the usual approach to operating self-driving cars, we did not program any explicit object detection, mapping, path planning or control components into this car. Instead, the car learns on its own to create all necessary internal representations necessary to steer, simply by observing human drivers.”
This end-to-end approach, in which the network builds its own internal representations by observing human drivers rather than being trained on explicitly labeled objects, contrasts with the more familiar approach of supervised learning, in which algorithms are trained beforehand on labeled examples of correct and incorrect decisions. Each approach has its pros and cons, and it’s likely that Tesla’s strategy includes both.
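The distinction can be sketched with a toy example: in the "learning by observing human drivers" style, the training signal is simply what the human actually did behind the wheel, not a hand-applied object label. The snippet below is an illustrative sketch only, not Tesla or Nvidia code; all of its data and variable names (frames, human_steering, and so on) are hypothetical.

```python
import numpy as np

# Illustrative sketch (not Tesla/Nvidia code): "learning by observing
# human drivers" reduces to regressing a control output (a steering
# angle) directly from sensor features, using the human's recorded
# actions as the training signal -- no hand-labeled objects involved.

rng = np.random.default_rng(0)

# Fake "camera" features: 200 frames, 8 features each (hypothetical).
frames = rng.normal(size=(200, 8))

# The human driver's steering is some unknown function of the scene;
# here we pretend it is linear so the toy model can recover it exactly.
true_weights = rng.normal(size=8)
human_steering = frames @ true_weights

# Fit the policy by least squares on (frame, human action) pairs.
learned_weights, *_ = np.linalg.lstsq(frames, human_steering, rcond=None)

# The learned policy now predicts a steering angle for a new frame.
new_frame = rng.normal(size=8)
predicted = new_frame @ learned_weights
```

A real system would of course use a deep network over raw pixels rather than a linear fit, but the training signal is the same: the human's own driving.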
Forbes reports that Tesla’s use of AI is not limited to Autopilot – the company employs machine learning in the design and manufacturing processes, to process customer data, and even to scan the text in online forums for insights into commonly-reported problems. It’s ironic that some in the press choose to portray Elon Musk as an AI Luddite, when in fact Tesla may be one of the most sophisticated users of the technology.
===
Note: Article originally published on evannex.com, by Charles Morris
Source: Forbes
SpaceX Starship Version 3 booster crumples in early testing
Photos of the incident’s aftermath suggest that Booster 18 will likely be retired.
SpaceX’s new Starship first-stage booster, Booster 18, suffered major damage early Friday during its first round of testing in Starbase, Texas, just one day after rolling out of the factory.
Based on videos of the incident, the lower section of the rocket booster appeared to crumple during a pressurization test.
Booster test failure
SpaceX began structural and propellant-system verification tests on Booster 18 Thursday night at the Massey’s Test Site, only a few miles from Starbase’s production facilities, as noted in an Ars Technica report. At 4:04 a.m. CT on Friday, a livestream from LabPadre Space captured the booster’s lower half experiencing a sudden destructive event around its liquid oxygen tank section. Post-incident images, shared on X by @StarshipGazer, showed notable deformation in the booster’s lower structure.
Neither SpaceX nor Elon Musk had commented as of Friday morning, but the vehicle’s condition suggests it is likely a complete loss. This is quite unfortunate, as Booster 18 is part of the Starship V3 program, which includes design fixes and upgrades intended to improve reliability. Though SpaceX maintains a rapid Starship production line at Starbase, Booster 18 was widely expected to validate the improvements implemented in the V3 program.
Tight deadlines
SpaceX needs Starship boosters and upper stages to begin demonstrating rapid reuse, tower catches, and early operational Starlink missions over the next two years. More critically, NASA’s Artemis program depends on an on-orbit refueling test in the second half of 2026, a requirement for the vehicle’s expected crewed lunar landing around 2028.
While SpaceX is known for diagnosing failures quickly and returning to testing at unmatched speed, losing the newest-generation booster at the very start of its campaign highlights the immense challenge of scaling Starship into a reliable, high-cadence launch system. It would not be a surprise, however, if the company manages to figure out what happened to Booster 18 in the near future.
Tesla FSD (Supervised) is about to go on “widespread” release
In a comment last October, Elon Musk stated that FSD V14.2 is “for widespread use.”
Tesla has begun rolling out Full Self-Driving (Supervised) V14.2, and with this, the wide release of the system could very well begin.
The update introduces a new high-resolution vision encoder, expanded emergency-vehicle handling, smarter routing, new parking options, and more refined driving behavior, among other improvements.
FSD V14.2 improvements
FSD (Supervised) V14.2’s release notes highlight a fully upgraded neural-network vision encoder capable of reading higher-resolution features, giving the system improved awareness of emergency vehicles, road obstacles, and even human gestures. Tesla also expanded its emergency-vehicle protocols, adding controlled pull-overs and yielding behavior for police cars, fire trucks, and ambulances, among others.
A deeper integration of navigation and routing into the vision network now allows the system to respond to blocked roads or detours in real time. The update also enhances decision-making in several complex scenarios, including unprotected turns, lane changes, vehicle cut-ins, and interactions with school buses. All in all, these improvements should make FSD (Supervised) V14.2 drive noticeably more smoothly and comfortably.
Elon Musk’s predicted wide release
The significance of V14.2 grows when paired with Elon Musk’s comments from October. While responding to FSD tester AI DRIVR, who praised V14.1.2 for fixing “95% of indecisive lane changes and braking” and who noted that it was time for FSD to go on wide release, Musk stated that “14.2 for widespread use.”
FSD V14 has so far received a substantial number of positive reviews from Tesla owners, many of whom have stated that the system now drives better than some human drivers, as it is confident, cautious, and considerate at the same time. With V14.2 now rolling out, it remains to be seen whether the update also makes it to the company’s wide FSD fleet, which is still populated by a large number of HW3 vehicles.
Tesla FSD V14.2 starts rolling out to initial batch of vehicles
It would likely only be a matter of time before FSD V14.2 videos are posted and shared on social media.
Tesla has begun pushing Full Self-Driving (Supervised) v14.2 to its initial batch of vehicles. The update was initially observed by Tesla owners and veteran FSD users on social media platform X on Friday.
So far, reports of the update have been shared by Model Y owners in California whose vehicles are equipped with the company’s AI4 hardware, though it would not be surprising if more Tesla owners across the country receive the update as well.
Based on the release notes of the update, key improvements in FSD V14.2 include a revamped neural network for better detection of emergency vehicles, obstacles, and human gestures, as well as options to select arrival spots.
Following are the release notes of FSD (Supervised) V14.2, as shared on X by longtime FSD tester Whole Mars Catalog.


Release Notes
2025.38.9.5
FSD (Supervised) v14.2
Full Self-Driving (Supervised) v14.2 includes:
- Upgraded the neural network vision encoder, leveraging higher resolution features to further improve scenarios like handling emergency vehicles, obstacles on the road, and human gestures.
- Added Arrival Options for you to select where FSD should park: in a Parking Lot, on the Street, in a Driveway, in a Parking Garage, or at the Curbside.
- Added handling to pull over or yield for emergency vehicles (e.g. police cars, fire trucks, ambulances).
- Added navigation and routing into the vision-based neural network for real-time handling of blocked roads and detours.
- Added additional Speed Profile to further customize driving style preference.
- Improved handling for static and dynamic gates.
- Improved offsetting for road debris (e.g. tires, tree branches, boxes).
- Improved handling of several scenarios, including unprotected turns, lane changes, vehicle cut-ins, and school buses.
- Improved FSD’s ability to manage system faults and recover smoothly from degraded operation for enhanced reliability.
- Added alerting for residue build-up on interior windshield that may impact front camera visibility. If affected, visit Service for cleaning!
Upcoming Improvements:
- Overall smoothness and sentience
- Parking spot selection and parking quality
