News
Mars’ invisible auroras come to life in new NASA animation
The night sky on Mars pulses with ultraviolet light that moves across the red planet. The phenomenon is similar to Earth’s aurora borealis, lasting from sunset until midnight, and NASA has created a new animation using data obtained from an instrument on the MAVEN spacecraft to demonstrate what it might look like to future astronauts in orbit. That is, it’s what they would see if the glow weren’t invisible to the naked human eye.
“The ultraviolet glow comes mostly from an altitude of about 70 kilometers (approximately 40 miles), with the brightest spot about a thousand kilometers (approximately 600 miles) across, and is as bright in the ultraviolet as Earth’s northern lights,” explained Zac Milby, a researcher at the University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP).

Milby is one of several LASP scientists who published a recent study on the ‘nightglow’ data in the journal JGR Space Physics. The team analyzed two Martian years’ worth of images taken by the Imaging Ultraviolet Spectrograph (IUVS) on the Mars Atmosphere and Volatile Evolution (MAVEN) mission spacecraft to find the reported wind and UV emission behavior. “The MAVEN spacecraft orbiting Mars has obtained a new type of imaging data which reveals the effects of global‐scale winds and waves in the upper atmosphere,” the study’s Plain Language Summary details.
With missions to Mars in the near-term plans of both NASA and SpaceX, learning as much as possible about the planet’s atmosphere is essential. Preparations for both on-site scientific study and astronaut safety will need to account for Martian conditions, so studies like this one provide highly relevant data. The recent findings also demonstrate the importance of cooperative international efforts in exploring deep space, since MAVEN wasn’t the first spacecraft to demonstrate the existence of the ‘nightglow’ phenomenon. Rather, the European Space Agency’s Mars Express orbiter mission revealed it in 2003, paving the way for follow-up and further study.

It’s also worth noting that although Mars-based astronauts may not be able to see the UV aurora-type nightglow with their normal eyesight, instruments could likely overcome that limitation. Commercially available UV cameras already exist on Earth, for instance, including a UV camera smartphone add-on that one could imagine being used on Mars to watch the glow, much like the augmented reality starfinder apps on the market today.
Of course, much more science-oriented UV cameras would likely accompany any travelers to the planet, but it’s fun to imagine what may be available to more tourist-types or even Martian colonists some day.
You can watch a video of NASA’s animation of the UV nightglow wind on Mars below:
Elon Musk
Tesla AI Head says future FSD feature has already partially shipped
Tesla’s Head of AI, Ashok Elluswamy, says that something that was expected with version 14.3 of the company’s Full Self-Driving platform has already partially shipped with the current build of version 14.2.
Tesla and CEO Elon Musk have teased on several occasions that reasoning will be a big piece of future Full Self-Driving builds, helping bring forth the “sentient” narrative that the company has pushed for these more advanced FSD versions.
Back in October on the Q3 Earnings Call, Musk said:
“With reasoning, it’s literally going to think about which parking spot to pick. It’ll drop you off at the entrance of the store, then go find a parking spot. It’s going to spot empty spots much better than a human. It’s going to use reasoning to solve things.”
Musk said in the same month:
“By v14.3, your car will feel like it is sentient.”
Amazingly, Tesla Full Self-Driving v14.2.2.2, the most recent iteration released, already comes close to this sentient feeling. However, more improvements are needed, and reasoning appears to be central to future plans for decision-making in general, alongside other refinements and features.
On Thursday evening, Elluswamy revealed that some of the reasoning features have already rolled out, confirming that reasoning has been added to navigation route changes during construction, as well as to parking decisions.
He added that “more and more reasoning will ship in Q1.”
🚨 Tesla’s Ashok Elluswamy reveals Nav decisions when encountering construction and parking options contain “some elements of reasoning”
More uses of reasoning will be shipped later this quarter, a big tidbit of info as we wait v14.3 https://t.co/jty8llgsKM
— TESLARATI (@Teslarati) January 9, 2026
Interestingly, parking improvements were hinted at being added in the initial rollout of v14.2 several months ago. These had not rolled out to vehicles quite yet, as they were listed under the future improvements portion of the release notes, but it appears things have already started to make their way to cars in a limited fashion.
Tesla Full Self-Driving v14.2 – Full Review, the Good and the Bad
As reasoning becomes more involved across the Full Self-Driving suite, we will likely see cars make better decisions in terms of routing and navigation, a common complaint among owners (including me).
Additionally, overall operation should become smoother and more comfortable for owners, which is hard to believe considering how good it already is. Nevertheless, there are absolutely improvements that need to be made before Tesla can introduce completely unsupervised FSD.
Elon Musk
Tesla’s Elon Musk: 10 billion miles needed for safe Unsupervised FSD
As per the CEO, roughly 10 billion miles of training data are required due to reality’s “super long tail of complexity.”
Tesla CEO Elon Musk has provided an updated estimate for the training data needed to achieve truly safe unsupervised Full Self-Driving (FSD).
10 billion miles of training data
Musk’s comment came as a reply to Apple and Rivian alum Paul Beisel, who posted an analysis on X about the gap between tech demonstrations and real-world products. In his post, Beisel highlighted Tesla’s data-driven lead in autonomy and argued that it would not be easy for rivals to quickly become legitimate competitors to FSD.
“The notion that someone can ‘catch up’ to this problem primarily through simulation and limited on-road exposure strikes me as deeply naive. This is not a demo problem. It is a scale, data, and iteration problem— and Tesla is already far, far down that road while others are just getting started,” Beisel wrote.
Musk responded to Beisel’s post, stating that “Roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving. Reality has a super long tail of complexity.” This is quite interesting considering that in his Master Plan Part Deux, Elon Musk estimated that worldwide regulatory approval for autonomous driving would require around 6 billion miles.
FSD’s total training miles
As 2025 came to a close, Tesla community members observed that FSD was nearing 7 billion miles driven, with over 2.5 billion of those miles on city streets. The 7-billion-mile mark was passed just a few days later, suggesting that Tesla likely has more training data for its autonomous driving program than any other company today.
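Putting the article’s two figures side by side, here’s a quick back-of-the-envelope sketch (using the ~7 billion miles reported by the community and Musk’s ~10 billion-mile target; both are approximate figures, not official Tesla data):

```python
# Rough gap between FSD miles driven so far and Musk's stated target.
# Both numbers are approximations quoted in this article, not exact figures.
miles_driven = 7_000_000_000    # ~7 billion FSD miles as of early 2026
miles_target = 10_000_000_000   # Musk's ~10 billion-mile estimate

remaining = miles_target - miles_driven
progress = miles_driven / miles_target

print(f"Remaining miles: {remaining:,}")          # 3,000,000,000
print(f"Progress toward target: {progress:.0%}")  # 70%
```

By this crude measure, the fleet would be roughly 70% of the way to the stated threshold, though the “long tail of complexity” Musk describes means the remaining miles are likely the hardest ones to learn from.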
The difficulties of achieving autonomy were referenced by Elon Musk recently, when he commented on Nvidia’s Alpamayo program. As per Musk, “they will find that it’s easy to get to 99% and then super hard to solve the long tail of the distribution.” These sentiments were echoed by Tesla VP for AI software Ashok Elluswamy, who also noted on X that “the long tail is sooo long, that most people can’t grasp it.”
News
Tesla earns top honors at MotorTrend’s SDV Innovator Awards
MotorTrend’s SDV Awards were presented during CES 2026 in Las Vegas.
Tesla emerged as one of the most recognized automakers at MotorTrend’s 2026 Software-Defined Vehicle (SDV) Innovator Awards.
As noted in a press release from the publication, two key Tesla employees were honored for their work on AI, autonomy, and vehicle software. MotorTrend’s SDV Awards were presented during CES 2026 in Las Vegas.
Tesla leaders and engineers recognized
The fourth annual SDV Innovator Awards celebrate pioneers and experts who are pushing the automotive industry deeper into software-driven development. Among the most notable honorees for this year was Ashok Elluswamy, Tesla’s Vice President of AI Software, who received a Pioneer Award for his role in advancing artificial intelligence and autonomy across the company’s vehicle lineup.
Tesla also secured recognition in the Expert category, with Lawson Fulton, a staff Autopilot machine learning engineer, honored for his contributions to Tesla’s driver-assistance and autonomous systems.
Tesla’s software-first strategy
While automakers like General Motors, Ford, and Rivian also received recognition, Tesla’s multiple awards stood out given the company’s outsized role in popularizing software-defined vehicles over the past decade. From frequent OTA updates to its data-driven approach to autonomy, Tesla has consistently treated vehicles as evolving software platforms rather than static products.
This has made Tesla’s vehicles unique in their respective segments, as they are arguably the only cars that genuinely get better over time. This is especially true for vehicles equipped with the company’s Full Self-Driving system, which are becoming progressively more capable and autonomous. The majority of Tesla’s software updates are also free, which is much appreciated by customers worldwide.