News
First sounds of wind on Mars captured by NASA’s InSight Lander
This just in from Elysium Planitia on Mars: the sound of wind from an alien world. On its 10th day as a new resident of the red planet, NASA’s InSight lander transmitted air vibration data picked up from its trembling solar panels, representing a steady breeze about 99 million miles away. The combination of photos sent back from the craft with the sound of Martian wind gives Earth residents a unique moment to feel like they’ve joined the craft themselves. “It’s fun to imagine that I’m there,” mused Don Banfield during a JPL media teleconference discussing the recording. Banfield is InSight’s Auxiliary Payload Sensor Subsystem (APSS) Science Lead.
InSight, short for “Interior Exploration using Seismic Investigations, Geodesy and Heat Transport”, launched aboard an Atlas V rocket on May 5, 2018 and successfully landed on the Martian surface on November 26, 2018. The craft is a seismic investigator sent to study the red planet’s interior, eventually drilling 10-16 feet down into the crust to gather geological data. The craft’s landing was live streamed online for viewers around the world, and it greeted Earthlings with a photo of its new home’s surface shortly after. It sent back more photos of the surrounding area prior to the wind recording.
The thin CO2 atmosphere on Mars doesn’t carry higher-pitched sounds well, so the vibrations recorded by InSight’s pressure sensor sit low on the audio spectrum, under 50 Hz, and are difficult to hear. However, after the recording was sped up by a factor of 100, shifting its frequencies well into the audible range, it became possible to hear what sounds like a steady wind blowing across the regolith. Dust devils tracked moving across the Martian surface in the area had motion consistent with the wind recordings, confirming what InSight’s scientists were hearing.
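To see why speeding a recording up raises its pitch, here is a minimal sketch using a synthetic signal (this is not InSight data; the 15 Hz “rumble” and the sample rates are invented for illustration). Replaying the same samples at 100x the original rate multiplies every frequency component by 100:

```python
import numpy as np

def dominant_freq(samples, sample_rate):
    """Return the strongest frequency component of a signal, via FFT."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic stand-in for low-frequency wind rumble: a 15 Hz sine wave,
# far below comfortable listening range.
rate = 4000  # samples per second
t = np.arange(0, 4.0, 1.0 / rate)
rumble = np.sin(2 * np.pi * 15 * t)

# "Speeding up" by 100x means replaying the very same samples at 100x
# the original sample rate, which scales every frequency by 100.
sped_up_rate = rate * 100

low = dominant_freq(rumble, rate)           # ~15 Hz: inaudible rumble
high = dominant_freq(rumble, sped_up_rate)  # ~1500 Hz: easily audible
```

The samples themselves never change; only the playback rate does, which is why the sped-up Martian recording sounds like ordinary wind rather than sub-bass rumble.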

The way InSight picks up and translates sound is similar to how a human ear works: air pressure vibrates the eardrum, and that vibration pattern is carried through the inner ear bones to the cochlea, where tiny hairs translate the vibrations into electrical signals sent to the brain. InSight’s solar panels are like its eardrums, the spacecraft structure itself like its inner ear bones, its instruments like its cochlea, and the electronics box translating and transmitting signals is like its brain. The “sounds” we hear from Mars are translated data from wind-caused vibrations.
Ironically, wind noise is not actually a desired outcome from InSight’s instruments. According to the scientists participating in NASA’s teleconference discussing the event, the inlet for the pressure sensor was specifically designed to minimize any chatter from air movement. Likewise, InSight’s seismographic gear will be placed wherever input from the lander itself — vibrations from the craft’s own movement — will least contaminate the seismic recordings. It should be noted, though, that the Martian wind gracing our human ears for the first time is only a taste of what’s to come from InSight’s instruments.
Once the wind and thermal shield (the white dome in the photos) has been lifted from the lander in a few weeks, all of InSight’s instruments will be exposed to the Martian environment for data collection. For now, the lander’s Earth-based team is first focusing on understanding the area the craft is in to pick the best place to set its instruments. After the main mission begins, however, a full study of Mars’ atmosphere will be underway and we could hear, among other natural events, the sounds of exploding meteors.

While wind may be a unique sound to hear on an alien world, it’s not the first time a NASA craft has entertained our ears and imaginations. Electromagnetic vibrations have been recorded all across our solar system, perhaps the most famous of which originated from the Voyager 1 spacecraft launched in 1977. The data collected from the craft’s radio-capturing instruments has been converted into audio files – you can even find a full album’s worth of the sounds on a variety of streaming sites. Some of the recordings are meditation-worthy, others a touch unnerving. We humans have additionally added some recordings of our own to space via Voyager’s famous “golden record”, the sounds of which are also available for listening online.
If you’re craving a full Martian soundtrack, you’ll be happy to know that NASA’s Mars 2020 rover is planned to provide just that. It will have two microphones on board, one of which will record the actual landing of the rover. Combined with telemetry data and surface photographs, Mars is on its way to its own documentary with inputs completely provided “on-location”. Stay tuned!
Elon Musk
What is Digital Optimus? The new Tesla and xAI project explained
Tesla and xAI announced their groundbreaking joint project, Digital Optimus, also nicknamed “Macrohard” in a humorous jab at Microsoft, earlier this week.
This software-based AI agent is designed to automate complex office workflows by observing and replicating human interactions with computers. As the first major outcome of Tesla’s $2 billion investment in xAI, it represents a powerful fusion of hardware efficiency and advanced reasoning.
At its core, Digital Optimus operates through a dual-process architecture inspired by human cognition.
Macrohard or Digital Optimus is a joint xAI-Tesla project, coming as part of Tesla’s investment agreement with xAI.
Grok is the master conductor/navigator with deep understanding of the world to direct digital Optimus, which is processing and actioning the past 5 secs of…
— Elon Musk (@elonmusk) March 11, 2026
Tesla’s specialized AI acts as “System 1”—the fast, instinctive executor—processing the past five seconds of real-time computer screen video along with keyboard and mouse actions to perform immediate tasks.
xAI’s Grok model serves as “System 2,” the strategic “master conductor” or navigator, providing high-level reasoning, world understanding, and directional oversight, much like an advanced turn-by-turn navigation system.
Combined, the two create a powerful AI-based assistant capable of completing everything from accounting work to HR tasks.
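The System 1 / System 2 split described above can be sketched in a few lines of Python. This is purely illustrative: the class names, the five-second frame window, and the instruction strings are invented for the example and are not part of any published Tesla or xAI API.

```python
from collections import deque

class FastExecutor:
    """'System 1' stand-in: reacts using only a short rolling window
    of recent screen frames (the 'past five seconds' described above)."""
    def __init__(self, window_seconds=5, fps=1):
        self.frames = deque(maxlen=window_seconds * fps)

    def observe(self, frame):
        self.frames.append(frame)  # old frames fall off automatically

    def act(self, instruction):
        # Execute immediately, using only the recent context window.
        return f"{instruction}: acting on {len(self.frames)} recent frames"

class SlowPlanner:
    """'System 2' stand-in: holds the long-horizon goal and issues
    high-level directions, like a turn-by-turn navigator."""
    def __init__(self, goal):
        self.goal = goal

    def next_instruction(self, step):
        return f"step {step} toward '{self.goal}'"

# The planner directs; the executor handles moment-to-moment work.
planner = SlowPlanner("reconcile March invoices")
executor = FastExecutor()
for frame in ["frame-a", "frame-b", "frame-c"]:
    executor.observe(frame)
log = [executor.act(planner.next_instruction(s)) for s in range(2)]
```

The design point the sketch captures is the asymmetry: the executor keeps almost no state (a bounded frame buffer), while the planner carries the goal and never touches raw frames.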
The system runs primarily on Tesla’s low-cost AI4 inference chip, minimizing reliance on expensive Nvidia compute from xAI while maintaining competitive, real-time performance.
Elon Musk described it as “the only real-time smart AI system” capable, in principle, of emulating the functions of entire companies, handling everything from accounting and HR to repetitive digital operations.
Timelines point to swift deployment. Though the project was announced just days ago, Musk expects Digital Optimus to be ready for users within about six months, targeting a rollout around September 2026.
It will integrate into all AI4-equipped Tesla vehicles, enabling parked cars to handle office work during downtime. Millions of dedicated units are also planned for deployment at Supercharger stations, tapping into roughly 7 gigawatts of available power.
Oh and it works in all AI4-equipped cars, so your car can do office work for you when not driving.
We’re also deploying millions of dedicated Digital Optimus units in the field at Superchargers where we have ~7 gigawatts of available power.
— Elon Musk (@elonmusk) March 12, 2026
Digital Optimus directly supports Tesla’s broader autonomy strategy. It leverages the same end-to-end neural networks, computer vision, and real-time decision-making tech that power Full Self-Driving (FSD) software and the physical Optimus humanoid robot.
By repurposing idle vehicle compute and extending AI4 hardware beyond driving, the project scales Tesla’s autonomy ecosystem from roads to digital workspaces.
As a virtual counterpart to physical Optimus, it divides labor: software agents manage screen-based tasks while humanoid robots tackle physical ones, accelerating Tesla’s vision of general-purpose AI for productivity, Robotaxi fleets, and beyond.
In essence, Digital Optimus bridges Tesla’s vehicle and robotics autonomy with enterprise-scale AI, promising massive efficiency gains. No other company currently matches its real-time capabilities on such accessible hardware.
It really could be one of the most crucial developments Tesla and xAI have undertaken together, as it could revolutionize how people work and travel.
News
Tesla adds awesome new driving feature to Model Y
Tesla is adding an awesome new driving feature to Model Y vehicles, effective on “Juniper” refresh models considered model year 2026 or newer.
Tesla is rolling out a new “Comfort Braking” feature with Software Update 2026.8. The feature is exclusive to the new Model Y, and is currently unavailable for any other vehicle in the Tesla lineup.
Tesla writes in the release notes for the feature:
“Your Tesla now provides a smoother feel as you come to a complete stop during routine braking.”
🚨 Tesla has added a new “Comfort Braking” update with 2026.8
“Your Tesla provides a smoother feel as you come to a complete stop during routine braking.” https://t.co/afqCpBSVeA pic.twitter.com/C6MRmzfzls
— TESLARATI (@Teslarati) March 13, 2026
Interestingly, we’re not too sure what catalyzed Tesla to improve braking smoothness, because braking hasn’t seemed overly abrupt or rough in our experience. Although the brake pedal in a Model Y is rarely used thanks to Regenerative Braking, it seems Tesla wanted to make the ride comfort even smoother for owners.
There is always room for improvement, though, and it seems that there is a way to make braking smoother for passengers while the vehicle is coming to a stop.
This is far from the first time Tesla has improved ride comfort through Over-the-Air updates: it has rolled out updates improving regenerative braking performance, handling while using Full Self-Driving, and Steer-by-Wire on the Cybertruck, as well as recent releases that have combated Active Road Noise.
Tesla holds a unique ability to change the functionality of its vehicles through software updates, which have come in handy for many things, including remedying certain recalls and shipping new features to the Full Self-Driving suite.
Tesla’s OTA process also seems to be the most seamless in the industry, even as more automakers gain the ability to ship improvements through simple software updates.
We’re really excited to test the update, and we’ll try out Comfort Braking as soon as it makes it to our Model Y.
News
Tesla finally brings a Robotaxi update that Android users will love
Tesla is finally bringing an update to its Robotaxi platform that Android users will love — mostly because it seems they will finally be able to use the ride-hailing platform that the company has had active since last June.
Based on a decompile of software version 26.2.0 of the Robotaxi app, Tesla looks to be ready to roll out access to Android users.
According to the breakdown, performed by Tesla App Updates, the company is preparing to roll out an Android version of the app as it is developing several features for that operating system.
🚨 It looks like Tesla is preparing to launch the Robotaxi app for Android users at last!
A decompile of v26.2.0 of the Robotaxi app shows some progress on the Android side for Robotaxi 🤖 🚗 https://t.co/mThmoYuVLy
— TESLARATI (@Teslarati) March 13, 2026
The breakdown of the software version shows that Tesla is actively developing an Android-compatible version of the Robotaxi app, and the company is developing Live Activities for Android:
“Strings like notification_channel_robotaxid_trip_name and android_native_alicorn_eta_text show exactly how Tesla plans to replicate the iOS Live Activities experience. Instead of standard push alerts, Android users are getting a persistent, dynamically updating notification channel.”
This is a big step forward for several reasons. From a face-value perspective, Tesla is finally ready to offer Robotaxi to Android users.
The company has routinely prioritized Apple releases because there is a higher concentration of iPhone users in its ownership base. Additionally, the development process for Apple is simply less laborious.
Secondly, the Robotaxi rollout has been a typical example of “slowly then all at once.”
Tesla initially released Robotaxi access to a handful of media members and influencers. Eventually, it was expanded to more users, so that anyone using an iOS device could download the app and hail a semi-autonomous ride in Austin or the Bay Area.
Opening up the platform to Android users may show that Tesla is preparing to let even more people utilize Robotaxi. And although the company still seems a few months away from offering fully autonomous rides to anyone with app access, expanding to an entirely different user base definitely seems like a step in the right direction.