Mars travelers could get ‘Star Trek’ Tricorder-like features from smartphone biotech: study

Plans to take humans to the Moon and Mars come with numerous challenges, and the health of space travelers is no exception. One way to prevent or mitigate ill effects is to detect relevant changes in the body and the body’s surroundings, something biosensor technology is specifically designed to address on Earth. However, the strict size and weight limits for tech used in astronauts’ cramped habitats have impeded its development to date.

A recent study of existing smartphone-based biosensors by scientists from Queen’s University Belfast (QUB) in the UK identified several candidates under current use or development that could be also used in a space or Martian environment. When combined, the technology could provide functionality reminiscent of the “Tricorder” devices used for medical assessments in the Star Trek television and movie franchises, providing on-site information about the health of human space travelers and biological risks present in their habitats.

Biosensors focus on studying biomarkers, i.e., measurable signs of the body’s response to environmental conditions. Changes in blood composition, elevated levels of certain molecules in urine, and increases or decreases in heart rate, for example, are all considered biomarkers. Health and fitness apps tracking general health biomarkers have become common in the marketplace, with brands like Fitbit leading the charge for overall wellness sensing by tracking sleep patterns, heart rate, and activity levels using wearable biosensors. Astronauts and other future space travelers could likely use this kind of tech for basic health monitoring, but other challenges still need to be addressed in a similarly compact way.

The projected human health needs during spaceflight have been detailed by NASA on its Human Research Program website, specifically in its web-based Human Research Roadmap (HRR), where the agency publishes its scientific data for public review. Several hazards of human spaceflight are identified there, such as environmental and mental health concerns, and the QUB scientists used that information to organize their study. Their research produced a 20-page document reviewing the inner workings of the relevant devices found in their searches, complete with tables summarizing each device’s methods and suitability for space missions. Here are some of the highlights.

A chart showing the classification of scientific articles about relevant smartphone-based biosensors used in the Queen’s University Belfast study. | Credit: Biosensors/Queen’s University Belfast

Risks in the Spacecraft Environment

During spaceflight, the environment is a closed system, which has a two-fold effect: one, the immune system has been shown to lose functionality during long-duration missions, specifically through lowered white blood cell counts, and two, the weightless, non-competitive environment makes it easier for microbes to transfer between humans and increases their growth rates. In one Space Shuttle-era study, the number of reproduction-capable microbial cells in the vehicle increased by 300% within 12 days in orbit. Certain herpes viruses, such as those responsible for chickenpox and mononucleosis, have also been reactivated under microgravity, although the astronauts typically showed no symptoms despite active viral shedding (the virus had surfaced and was able to spread).

Frequent monitoring of the spacecraft environment and the crew’s biomarkers is the best way to mitigate these challenges. NASA addresses these issues to an extent with traditional instruments and equipment, although the data often cannot be processed until the experiments are returned to Earth. An attempt has also been made to rapidly quantify microorganisms aboard the International Space Station (ISS) via a handheld device called the Lab-on-a-Chip Application Development-Portable Test System (LOCAD-PTS). However, this device cannot yet distinguish between microorganism species, meaning it can’t tell the difference between pathogens and harmless organisms. The QUB study found several existing smartphone-based technologies, generally developed for use in remote medical care facilities, that could achieve better identification results.

NASA astronaut Karen Nyberg uses a fundoscope to image her eye while in orbit to study Visual Impairment Intracranial Pressure (VIIP) Syndrome. Smaller 3D printed retinal imaging adaptors for smartphones are being developed to perform the testing done by large devices similar to the instrument used here. | Credit: NASA

One of the devices described was a spectrometer (used to identify substances based on the light frequencies they absorb or emit), which used the smartphone’s flashlight and camera to generate data at least as accurate as traditional instruments. Another was able to identify concentrations of recombinant bovine somatotropin (rBST), an artificial growth hormone injected into cows, in test samples, and other systems were able to accurately detect syphilis and HIV as well as the Zika, chikungunya, and dengue viruses. All of the devices used smartphone attachments, some with 3D-printed parts. Of course, the pathogens detected are not likely to be common in a closed space habitat, but the underlying technology could be modified to meet specific detection needs.
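To illustrate the colorimetric principle such smartphone spectrometers rely on (a minimal sketch, not code from any device in the study; the function names and calibration values are hypothetical), absorbance can be computed from the camera’s measured light intensity via the Beer-Lambert law, and a concentration can then be read off a linear calibration curve:

```python
import math

def absorbance(i_sample: float, i_reference: float) -> float:
    """Beer-Lambert absorbance A = log10(I0 / I), where I0 is the
    light intensity measured through a blank and I through the sample."""
    return math.log10(i_reference / i_sample)

def concentration(a: float, slope: float, intercept: float = 0.0) -> float:
    """Invert a linear calibration curve A = slope * c + intercept,
    where the slope/intercept come from samples of known concentration."""
    return (a - intercept) / slope

# Example: flashlight reads 1000 through a blank, 250 through the sample
a = absorbance(250, 1000)          # log10(4) ≈ 0.602
c = concentration(a, slope=0.301)  # ≈ 2.0, in whatever units the curve uses
```

In a real device the intensities would come from averaged camera pixel values in a chosen color channel, and the calibration curve would be fit per analyte.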

The Stress of Spaceflight

A group of people crammed into a small space for long periods will be affected by the isolation and confinement, no matter how carefully they were selected or trained. Declines in mood, cognition, morale, or interpersonal interaction can impair team functioning or develop into a sleep disorder. On Earth, these stress responses may seem common, perhaps an expected part of being human, but missions in deep space and on Mars will be demanding and will need fully alert, well-communicating teams to succeed. NASA already uses devices to monitor these risks while also addressing the stress factor by managing habitat lighting, crew movement, and sleep amounts, and by recommending astronauts keep journals to vent as needed. However, an all-encompassing tool may be needed for longer-duration space travel.

As recognized by the QUB study, several “mindfulness” and self-help apps already exist in the market and, combined with general health monitors, could be used to address the stress factor in future astronauts. For example, the popular Fitbit app and similar products collect data on sleep patterns, activity levels, and heart rate that could potentially be linked to mental health apps recommending self-help programs via algorithms. The more recent “BeWell” app monitors physical activity, sleep patterns, and social interactions to analyze stress levels and recommend self-help treatments. Other apps, such as “StressSense” and “MoodSense”, use voice patterns and general phone communication data to assess stress levels.
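As a toy example of how wearable biomarkers might feed an algorithmic recommendation (a hypothetical sketch; the weights, thresholds, and messages are invented for illustration and do not reflect any of the apps named above):

```python
def stress_score(sleep_hours: float, resting_hr: int, active_minutes: int) -> float:
    """Combine three wearable biomarkers into a 0-100 stress score.
    Illustrative weighting only, not any real app's algorithm."""
    sleep_deficit = max(0.0, 8.0 - sleep_hours) / 8.0          # shortfall vs. 8 h
    hr_elevation = min(max(0, resting_hr - 60) / 40, 1.0)      # elevation over 60 bpm
    inactivity = max(0, 30 - active_minutes) / 30              # shortfall vs. 30 min
    return round(100 * (0.4 * sleep_deficit + 0.4 * hr_elevation + 0.2 * inactivity), 1)

def recommend(score: float) -> str:
    """Map the score to a self-help suggestion."""
    if score >= 60:
        return "suggest guided breathing and earlier sleep"
    if score >= 30:
        return "suggest a short mindfulness session"
    return "no intervention needed"
```

A well-rested crew member (8 hours of sleep, 60 bpm resting heart rate, 30 active minutes) scores 0, while poor sleep and an elevated heart rate push the score toward an intervention.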

A Tricorder-like setup is imagined by scientists at Queen’s University Belfast, utilizing the functionalities of existing smartphone-based biosensors. | Credit: Biosensors/Queen’s University Belfast

Advances in smartphone technology, such as high-resolution cameras, microphones, fast processors, wireless connectivity, and the ability to attach external devices, provide tools for an expanding number of “portable lab”-type functions. Despite what these biosensors could mean for human spaceflight, however, some of the devices have notable limitations to overcome. In particular, any device using antibodies or enzymes in its testing would risk the stability of those reagents due to radiation from galactic cosmic rays and solar particle events, and biosensor electronics could be damaged by the same radiation. New types of shielding may be necessary to ensure functionality beyond Earth and Earth orbit; alternatively, synthetic biology could supply testing elements genetically engineered to withstand the space and Martian environments.

Interest in smartphone-based solutions for space travelers has grown over the years as tech-centric societies have moved in the “app” direction overall. NASA itself has hosted a “Space Apps Challenge” for the past eight years, drawing thousands of participants to submit programs that interpret and visualize data for greater understanding of designated space and science topics. Some of the challenges could be directly relevant to the biosensor field: in the 2018 event, for example, contestants were asked to develop a sensor for humans on Mars to observe and measure variables in their environment; in 2017, contestants created visualizations of potential radiation exposure during polar or near-polar flights.


While the QUB study implied that a combination of existing biosensor technologies could be equivalent to a Tricorder, the direct development of such a device has been the subject of its own challenge. In 2012, the Qualcomm Tricorder XPRIZE competition was launched, asking competitors to develop a user-friendly device that could accurately diagnose 13 health conditions and capture 5 real-time vital signs. The winner of the prize, awarded in 2017, was a Pennsylvania-based family team called Final Frontier Medical Devices, now Basil Leaf Technologies, for their DxtER device. According to their website, the sensors inside DxtER can be used independently, and one of them is in a Phase 1 clinical trial. The second-place winner used a smartphone app to connect its health testing modules and generate a diagnosis from the data acquired from the user.

The march continues to develop the technology humans will need to safely explore regions beyond Earth orbit. Space is hard, but it was hard before we went there the first time, and it was hard before we put humans on the Moon. There may be plenty of challenges to overcome, but as the Queen’s University Belfast study demonstrates, we may already be solving them. It’s just a matter of realizing it and building on it.




Elon Musk launches TERAFAB: The $25B Tesla-SpaceXAI chip factory that will rewire the AI industry

Tesla, SpaceX, and xAI unveiled TERAFAB, a $25B chip factory targeting one terawatt of AI compute annually.

Tesla TERAFAB Factory in Austin, Texas

Elon Musk took the stage over the weekend at the defunct Seaholm Power Plant in Austin, Texas, to officially unveil TERAFAB, a $20-25 billion joint venture between Tesla, SpaceX, and xAI that he described as “the most epic chip building exercise in history by far.” The announcement marks the most ambitious infrastructure bet Musk has made since Gigafactory 1 in Sparks, Nevada, and it fuses three of his companies into a single, vertically integrated AI hardware machine for the first time.

TERAFAB is designed to consolidate every stage of semiconductor production under one roof, including chip design, lithography, fabrication, memory production, advanced packaging, and testing. At full capacity, the facility would scale to roughly 70% of the output of Taiwan Semiconductor Manufacturing Company (TSMC), currently the world’s largest semiconductor foundry.

Elon Musk’s stated goal is one terawatt of computing power annually, split between Tesla’s AI5 inference chips for vehicles and Optimus robots, and D3 chips built specifically for SpaceXAI’s orbital satellite constellation.



The logic behind the merger of these three entities is rooted in a supply chain crisis Musk has been signaling for over a year. At Tesla’s Q4 2025 earnings call, he warned investors that external chip capacity from TSMC, Samsung, and Micron would hit a ceiling within three to four years. “We’re very grateful to our existing supply chain, to Samsung, TSMC, Micron and others,” Musk acknowledged at the Terafab event, “but there’s a maximum rate at which they’re comfortable expanding.” Building in-house was, in his framing, not a strategic option, but a necessity.

The space angle is where the announcement becomes genuinely unprecedented. Musk said 80% of Terafab’s compute output would be directed toward space-based orbital AI satellites, arguing that solar irradiance in space is roughly 5x greater than at Earth’s surface, and that heat rejection in vacuum makes thermal scaling viable. This directly feeds the SpaceXAI vision, which is betting that within two to three years, running AI workloads in orbit will be cheaper than doing so on the ground. The satellites, powered by constant solar energy, would effectively turn low Earth orbit into the world’s largest data center.


This announcement threads together every major Musk initiative of the past two years: the xAI-SpaceX merger, Tesla’s $2.9 billion solar equipment talks with Chinese suppliers, the 100 GW domestic solar manufacturing push, the Optimus humanoid robot program, and Starship’s development. TERAFAB is the capstone that ties them into a single coherent architecture: chips made on Earth, launched by SpaceX, powered by Tesla solar, run by xAI, and ultimately extended to the Moon.


“I want us to live long enough to see the mass driver on the moon, because that’s going to be incredibly epic,” Musk said during the presentation.



Rolls-Royce makes shocking move on its EV future



Rolls Royce Wheels
Credit: BMW Group

After planning to go all-electric by the end of the decade, Rolls-Royce has made a shocking move on its EV future. The company is now tempering its expectations for electric vehicles, and its CEO aims to lean on its legacy of high-powered combustion engines to lead it into the future.

In a significant reversal, Rolls-Royce Motor Cars has scrapped its ambitious plan to become an all-electric manufacturer by 2030. The luxury British marque announced the decision amid sustained customer demand for traditional combustion engines and shifting regulatory landscapes.

When Rolls-Royce unveiled its first all-electric model, the Spectre, in 2022, former CEO Torsten Müller-Ötvös declared the brand would cease production of internal combustion engine vehicles by the end of the decade.

The move aligned with the industry’s broader push toward electrification, promising silent, effortless power befitting the “Rolls-Royce of cars.”


However, new CEO Chris Brownridge, who assumed the role in late 2023, has reversed course. “We can respond to our client demand … we build what is ordered,” Brownridge stated.

The company will continue offering its iconic V12 engines, which remain a cornerstone of its heritage and appeal to discerning buyers who appreciate the distinctive sound and character. He noted the original pledge was “right at the time,” but “the legislation has changed.”

The company is not abandoning electric vehicles entirely; the Spectre remains in production, and an electric Cullinan option is forthcoming. Still, the decision marks the end of a strict all-EV timeline. Relaxed emissions regulations and slowing EV demand, evidenced by a 47 percent drop in Spectre sales to 1,002 units in 2025, forced the reconsideration.

It was a sign that Rolls-Royce owners were perhaps not convinced that an all-EV future was the right move for the company.



Rolls-Royce joins a growing roster of automakers reevaluating aggressive electrification targets.

Fellow luxury brand Bentley has pushed its full electrification from 2030 to 2035, while continuing to offer hybrids and ICE models. Mercedes-Benz walked back its 2030 all-EV goal, now aiming for about 50% electrified sales while keeping combustion engines into the 2030s. Porsche has abandoned its 80% EV sales target by 2030, delaying models and extending hybrids.

Mainstream giants are following suit. Honda canceled its U.S. EV plans, including the 0-Series and Acura RSX, facing a $15.7 billion hit as it doubles down on hybrids. Ford and General Motors have incurred tens of billions in writedowns, canceling models and pivoting to hybrids amid an industry total exceeding $70 billion in charges.


This trend reflects a pragmatic shift driven by infrastructure gaps, consumer preferences, and policy changes. In the ultra-luxury segment, where emotional connection reigns, automakers are prioritizing flexibility over rigid deadlines, ensuring brands like Rolls-Royce evolve without alienating their core clientele.



Elon Musk teases expectations for Tesla’s AI6 self-driving chip



Credit: Grok

Tesla CEO Elon Musk is outlining expectations for the AI6 self-driving chip, which is still two generations away yet already figures prominently in the plans of the company and its serial entrepreneur CEO.

Musk provided fresh details on the company’s aggressive AI hardware roadmap, spotlighting the upcoming AI6 chip designed to supercharge Tesla’s self-driving tech, humanoid robots, and data center operations.

In a post on X dated March 19, Musk stated, “With some luck and acceleration using AI, we might be able to tape out AI6 in December.”

This optimistic timeline for tape-out—the stage where chip design is finalized before manufacturing—signals Tesla’s push to rapidly advance its silicon capabilities.

The announcement builds on progress with the predecessor AI5. Earlier in January, Musk announced that the AI5 design was “in good shape” and “almost done,” describing it as an “existential” project for the company that demanded his personal attention on weekends.

He characterized AI5 as roughly equivalent to Nvidia’s Hopper-class performance in a single system-on-chip (SoC) and Blackwell-level performance in a dual configuration, but at significantly lower cost and power usage.



Musk highlighted that AI5 “will punch far above its weight” thanks to Tesla’s co-designed AI software and hardware stack, making maximal use of every circuit. While capable of data center training tasks, it is primarily optimized for edge computing in Optimus robots and Robotaxi vehicles.

For AI6, Musk envisions substantial gains. “In the same half reticle and same process node, we think a single AI6 chip has the potential to match a dual SoC AI5,” he explained.

The company is targeting ambitious nine-month development cycles for future chips, allowing rapid iteration to AI7, AI8, and beyond. AI5/AI6 engineering remains Musk’s top time allocation at Tesla, with the CEO calling AI5 “good” and AI6 “great.”


Samsung is expected to manufacture the AI6 chips, following deals worth billions, while AI5 will leverage TSMC and Samsung production. These chips will form the backbone of Tesla’s Full Self-Driving system, enabling safer and more capable autonomy, alongside powering dexterous movements in Optimus bots and efficient inference in expanding data centers.


Musk has also restarted work on the Dojo 3 supercomputer project now that AI5 is progressing. Long-term plans include in-house manufacturing via the Terafab facility.

By accelerating chip development with AI tools, Tesla aims to reduce dependence on third-party GPUs and deliver high-performance, energy-efficient solutions tailored to its ecosystem. Success with AI6 could mark a major milestone in Tesla’s journey toward full autonomy and robotics leadership, though timelines remain subject to manufacturing realities.
