News

First sounds of wind on Mars captured by NASA’s InSight lander

This just in from the Elysium Planitia of Mars: the sound of wind from an alien world. On its 10th day as a new resident of the red planet, NASA’s InSight lander sent back data representing a steady breeze from about 99 million miles away, its pressure sensor recording the Martian air directly while wind-driven vibrations rattled the craft’s solar panels. Paired with the photos the craft has sent back, the sound of Martian wind gives Earth residents a unique chance to feel like they’ve joined the craft themselves. “It’s fun to imagine that I’m there,” mused Don Banfield during a JPL media teleconference discussing the recording. Banfield is InSight’s Auxiliary Payload Sensor Subsystem (APSS) Science Lead.

InSight, short for “Interior Exploration using Seismic Investigations, Geodesy and Heat Transport”, launched aboard an Atlas V rocket on May 5, 2018 and successfully landed on the Martian surface on November 26, 2018. The craft is a seismic investigator sent to study the red planet’s deep interior, eventually hammering a heat probe 10 to 16 feet down into the crust to gather geophysical data. The landing was livestreamed online for viewers around the world, and the craft greeted Earthlings with a photo of its new home’s surface shortly after touchdown. It sent back more photos of the surrounding area prior to the wind recording.

The thin CO2 atmosphere on Mars doesn’t carry sound well, and the vibrations recorded by InSight’s pressure sensor sit low on the audio spectrum, under 50 Hz, making them difficult to hear. However, after the recording was sped up by a factor of 100, raising every frequency by the same factor, it became possible to hear what sounds like a steady wind blowing across the regolith. Dust devils tracked moving across the Martian surface in the area showed motion consistent with the wind recordings, confirming for InSight’s scientists what they were hearing.
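
For a sense of what “sped up by a factor of 100” means in practice, here is a minimal sketch, not NASA’s actual processing pipeline: it generates a synthetic low-frequency stand-in for the sensor data (a hypothetical ~10 Hz rumble sampled at an assumed 100 Hz) and simply writes the samples out at 100 times that rate, which shortens the playback and raises every frequency by the same factor.

```python
# Minimal sketch only: the 100 Hz sampling rate and the ~10 Hz synthetic
# "wind rumble" are assumptions standing in for InSight's real telemetry.
import numpy as np
from scipy.io import wavfile

sensor_rate_hz = 100                      # assumed original sampling rate
duration_s = 60                           # one minute of simulated data
t = np.arange(0, duration_s, 1.0 / sensor_rate_hz)

# Low-frequency stand-in signal: a ~10 Hz tone buried in a little noise.
signal = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)

# Writing the same samples at 100x the rate plays them back 100x faster,
# shifting the ~10 Hz rumble up to ~1 kHz, comfortably within human hearing
# (and compressing a minute of data into well under a second of audio).
speedup = 100
wavfile.write("sped_up_wind.wav", sensor_rate_hz * speedup,
              signal.astype(np.float32))
```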

A recent photo sent back from InSight as it settles into its Martian habitat. The white dome pictured is covering the craft’s instruments. | Credit: NASA/JPL

The way InSight picks up and translates sound is similar to how a human ear works: air pressure vibrates the eardrum, and that vibration is passed through the inner-ear bones to the cochlea, where tiny hairs translate it into electrical signals sent to the brain. InSight’s solar panels are like its eardrums, the spacecraft structure is like its inner ear, its instruments are like its cochlea, and the electronics box that translates and transmits signals is like its brain. The “sounds” we hear from Mars are translated data from wind-caused vibrations.

Ironically enough, wind noise is not actually a desired signal for InSight’s instruments. According to the scientists participating in NASA’s teleconference discussing the event, the inlet for the pressure sensor was specifically designed to minimize chatter from air movement. Likewise, InSight’s seismographic gear will be placed wherever the lander’s own movement is least likely to contaminate the seismic vibrations it is meant to record. It should be noted, though, that the Martian wind gracing human ears for the first time is only a taste of what’s to come from InSight’s instruments.

Once the wind and thermal shield (the white dome in the photos) has been lifted from the lander in a few weeks, all of InSight’s instruments will be exposed to the Martian environment for data collection. For now, the lander’s Earth-based team is focusing on understanding the area around the craft in order to pick the best spots to set its instruments. After the main mission begins, however, a full study of Mars’ atmosphere will be underway, and we could hear, among other natural events, the sounds of exploding meteors.

An artist’s depiction of InSight drilling on Mars. | Credit: NASA/JPL-Caltech

While wind may be a unique sound to hear on an alien world, it’s not the first time a NASA craft has entertained our ears and imaginations. Electromagnetic vibrations have been recorded all across our solar system, perhaps the most famous of which originated from the Voyager 1 spacecraft launched in 1977. The data collected from the craft’s radio-capturing instruments has been converted into audio files – you can even find a full album’s worth of the sounds on a variety of streaming sites. Some of the recordings are meditation-worthy, others a touch unnerving. We humans have additionally added some recordings of our own to space via Voyager’s famous “golden record”, the sounds of which are also available for listening online.

If you’re craving a full Martian soundtrack, you’ll be happy to know that NASA’s Mars 2020 rover is planned to provide just that. It will carry two microphones, one of which will record the rover’s actual landing. Combined with telemetry data and surface photographs, those recordings will put Mars on its way to its own documentary, captured entirely “on location”. Stay tuned!

Listen to the Martian wind yourself below:

Elon Musk

Starlink passes 9 million active customers just weeks after hitting 8 million

The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.

Credit: Starlink/X

SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark. 

9 million customers

In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or about 21,275 new users on average per day. 

“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote. 

That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.

Starlink’s momentum

Starlink’s momentum has been building for some time. SpaceX reported 4.6 million Starlink customers in December 2024, followed by 7 million by August 2025 and 8 million in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.

Starlink’s momentum is increasingly tied to SpaceX’s broader financial outlook. Elon Musk has said the satellite network is “by far” the company’s largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also suggested in the past that Starlink could have its own IPO in the future. 

News

NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel routine.

Credit: Grok Imagine

NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel routine. And just like smartphones today, removing it now would “actively hurt.”

Jim Fan’s hands-on FSD v14 impressions

Fan, a leading researcher in embodied AI who currently works on Physical AI at NVIDIA and spearheads the company’s Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.

“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X. 

Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”

The Physical Turing Test

The original Turing Test was conceived by Alan Turing in 1950, and it was aimed at determining whether a machine could exhibit behavior equivalent to, or indistinguishable from, that of a human. By focusing on text-based conversations, the original Turing Test set a high bar for natural language processing and machine learning.

This test has been passed by today’s large language models. However, the capability to converse in a humanlike manner is a completely different challenge from performing real-world problem-solving or physical interactions. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.

Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.

News

Tesla AI team burns the Christmas midnight oil by releasing FSD v14.2.2.1

The update was released just a day after FSD v14.2.2 started rolling out to customers. 

Credit: Grok

Tesla is burning the midnight oil this Christmas, with the Tesla AI team quietly rolling out Full Self-Driving (Supervised) v14.2.2.1 just a day after FSD v14.2.2 started rolling out to customers. 

Tesla owner shares insights on FSD v14.2.2.1

Longtime Tesla owner and FSD tester @BLKMDL3 shared some insights following several drives with FSD v14.2.2.1 in rainy Los Angeles conditions with standing water and faded lane lines. He reported zero steering hesitation or stutter, confident lane changes, and maneuvers executed with precision that evoked the performance of Tesla’s driverless Robotaxis in Austin.

Parking performance impressed, with most spots nailed perfectly in a single attempt and without shaky steering, including ones requiring tight, sharp turns. One minor offset happened only because another vehicle was parked over the line, which FSD accommodated by shifting over a few extra inches. In rain that typically erases road markings, FSD visualized lanes and turn lines better than humans can, positioning itself flawlessly when entering new streets as well.

“Took it up a dark, wet, and twisty canyon road up and down the hill tonight and it went very well as to be expected. Stayed centered in the lane, kept speed well and gives a confidence inspiring steering feel where it handles these curvy roads better than the majority of human drivers,” the Tesla owner wrote in a post on X.

Tesla’s FSD v14.2.2 update

Just a day before FSD v14.2.2.1’s release, Tesla rolled out FSD v14.2.2, which was focused on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing. According to the update’s release notes, FSD v14.2.2 upgrades the vision encoder neural network with higher resolution features, enhancing detection of emergency vehicles, road obstacles, and human gestures.

New Arrival Options also let users select preferred drop-off styles, such as Parking Lot, Street, Driveway, Parking Garage, or Curbside, with the navigation pin automatically adjusting to the ideal spot. Other refinements include pulling over for emergency vehicles, real-time vision-based detours around blocked roads, improved gate and debris handling, and Speed Profiles for customized driving styles.
