
Mars travelers could get ‘Star Trek’ Tricorder-like features from smartphone biotech: study


Plans to take humans to the Moon and Mars come with numerous challenges, and the health of space travelers is no exception. One way any ill effects can be prevented or mitigated is by detecting relevant changes in the body and its surroundings, something that biosensor technology is specifically designed to address on Earth. However, the strict size and weight limits for equipment used in astronauts’ cramped habitats have impeded the technology’s development for spaceflight to date.

A recent study of existing smartphone-based biosensors by scientists from Queen’s University Belfast (QUB) in the UK identified several candidates, in current use or under development, that could also be used in a space or Martian environment. When combined, the technology could provide functionality reminiscent of the “Tricorder” devices used for medical assessments in the Star Trek television and movie franchises, providing on-site information about the health of human space travelers and the biological risks present in their habitats.

Biosensors focus on measuring biomarkers, i.e., indicators of the body’s response to environmental conditions. For example, changes in blood composition, elevated levels of certain molecules in urine, and increases or decreases in heart rate are all considered biomarkers. Health and fitness apps that track general biomarkers have become common in the marketplace, with brands like Fitbit leading the charge in overall wellness sensing by using wearable biosensors to monitor sleep patterns, heart rate, and activity levels. Astronauts and other future space travelers could likely use this kind of tech for basic health monitoring, but there are other challenges that need to be addressed in a compact way.
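
To make the biomarker idea concrete, here is a minimal, hypothetical Python sketch of how a wearable-style monitor might flag a drift in resting heart rate against a traveler’s own baseline. The window size, threshold, and readings are illustrative assumptions, not values from the QUB study or any specific product.

```python
# Hypothetical sketch: flag a resting-heart-rate drift against a personal
# rolling baseline. Thresholds and readings are illustrative assumptions.
from statistics import mean, stdev

def flag_heart_rate_drift(daily_resting_hr, baseline_days=14, z_threshold=2.0):
    """Return True if the latest resting heart rate deviates notably
    from the traveler's own rolling baseline."""
    if len(daily_resting_hr) <= baseline_days:
        return False  # not enough history to establish a baseline
    baseline = daily_resting_hr[-(baseline_days + 1):-1]
    mu, sigma = mean(baseline), stdev(baseline)
    latest = daily_resting_hr[-1]
    return sigma > 0 and abs(latest - mu) / sigma > z_threshold

# Example: two weeks near 58-62 bpm, then a jump to 74 bpm gets flagged.
history = [60, 59, 61, 58, 60, 62, 59, 60, 61, 58, 60, 59, 61, 60, 74]
print(flag_heart_rate_drift(history))  # True
```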

The projected human health needs of spaceflight have been detailed by NASA on its Human Research Program website, and more specifically in its web-based Human Research Roadmap (HRR), where the agency publishes its scientific data for public review. Several hazards of human spaceflight are identified there, such as environmental and mental health concerns, and the QUB scientists used that information to organize their study. Their research produced a 20-page document reviewing the inner workings of the relevant devices found in their searches, complete with tables summarizing each device’s methods and suitability for use in space missions. Here are some of the highlights.

A chart showing the classification of scientific articles about relevant smartphone-based biosensors used in the Queen’s University Belfast study. | Credit: Biosensors/Queen’s University Belfast

Risks in the Spacecraft Environment

During spaceflight, the environment is a closed system, and that has a two-fold effect: one, the immune system has been shown to lose functionality on long-duration missions, specifically through lowered white blood cell counts, and two, the weightless, non-competitive environment makes it easier for microbes to transfer between humans while their growth rates increase. In one space shuttle era study, the number of microbial cells in the vehicle able to reproduce increased by 300% within 12 days of being in orbit. Certain herpes viruses, such as those responsible for chickenpox and mononucleosis, have also been reactivated under microgravity, although the astronauts typically didn’t show symptoms despite active viral shedding (the virus had surfaced and was able to spread).
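
For a rough sense of what that figure implies, here is a quick back-of-the-envelope calculation; it assumes simple exponential growth and reads “increased by 300%” as a four-fold rise, which is an interpretation rather than a number reported in the study.

```python
# Back-of-the-envelope only: assumes simple exponential growth and reads
# "+300%" as 4x the starting count of culturable microbes.
import math

days = 12
growth_factor = 4.0  # a 300% increase means four times the original count

# Doubling time under exponential growth: t_double = t * ln(2) / ln(growth)
doubling_time = days * math.log(2) / math.log(growth_factor)
print(round(doubling_time, 1))  # 6.0 days
```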

Frequent monitoring of the spacecraft environment and the crew’s biomarkers is the best way to mitigate these challenges, and NASA is addressing the issue to an extent with traditional instruments and equipment, although the data often cannot be processed until the experiments are returned to Earth. An attempt has also been made to rapidly quantify microorganisms aboard the International Space Station (ISS) via a handheld device called the Lab-on-a-Chip Application Development-Portable Test System (LOCAD-PTS). However, this device cannot yet distinguish between microorganism species, meaning it can’t tell the difference between pathogens and harmless microbes. The QUB study found several existing smartphone-based technologies, generally developed for use in remote medical care facilities, that could achieve better identification results.

NASA astronaut Karen Nyberg uses a fundoscope to image her eye while in orbit to study Visual Impairment Intracranial Pressure (VIIP) Syndrome. Smaller 3D printed retinal imaging adaptors for smartphones are being developed to perform the testing done by large devices similar to the instrument used here. | Credit: NASA

One of the devices described was a spectrometer (used to identify substances based on the wavelengths of light they emit) that used the smartphone’s flashlight and camera to generate data at least as accurate as traditional instruments. Another was able to identify concentrations of recombinant bovine somatotropin (rBST), an artificial growth hormone injected into cows, in test samples, and other systems were able to accurately detect syphilis and HIV as well as the Zika, chikungunya, and dengue viruses. All of the devices used smartphone attachments, some of them with 3D-printed parts. Of course, the types of pathogens detected are not likely to be common in a closed space habitat, but the technology driving them could be modified to meet specific detection needs.
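
As an illustration of the general principle behind such a spectrometer, and not the specific device reviewed in the study, the sketch below maps the brightest column of a camera frame to a wavelength using an invented linear calibration; the constants and the synthetic frame are assumptions for demonstration only.

```python
# Illustrative sketch: a diffraction grating spreads light across the camera
# sensor, each pixel column corresponds to a wavelength, and the brightest
# band identifies the dominant emission. Calibration constants are invented.
import numpy as np

def peak_wavelength(image, nm_per_pixel=0.9, nm_at_pixel0=380.0):
    """Map the brightest pixel column of a grayscale frame to a wavelength."""
    column_intensity = image.sum(axis=0)           # total brightness per column
    peak_col = int(np.argmax(column_intensity))    # brightest column index
    return nm_at_pixel0 + peak_col * nm_per_pixel  # linear pixel-to-nm calibration

# Example with a synthetic 100x200 frame whose bright band starts at column 138.
frame = np.zeros((100, 200))
frame[:, 138:143] = 1.0
print(round(peak_wavelength(frame), 1))  # 504.2 nm, i.e. green light
```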

The Stress of Spaceflight

A group of people crammed together in a small space for long periods will be affected by the isolation and confinement, no matter how carefully they are selected or trained. Declines in mood, cognition, morale, or interpersonal interaction can impair team functioning or develop into sleep disorders. On Earth, these stress responses may seem common, or perhaps an expected part of being human, but missions in deep space and on Mars will be demanding and will need fully alert, well-communicating teams to succeed. NASA already uses devices to monitor these risks while also addressing the stress factor by managing habitat lighting, crew movement, and sleep amounts, and by recommending astronauts keep journals to vent as needed. However, an all-encompassing tool may be needed for longer-duration space travel.

As recognized by the QUB study, several “mindfulness” and self-help apps already exist on the market and could be used to address the stress factor for future astronauts when combined with general health monitors. For example, the popular Fitbit app and similar products collect data on sleep patterns, activity levels, and heart rate, which could potentially be linked to mental health apps that recommend self-help programs using algorithms. The more recent “BeWell” app monitors physical activity, sleep patterns, and social interactions to analyze stress levels and recommend self-help treatments, while other apps, such as “StressSense” and “MoodSense”, use voice patterns and general phone communication data to assess stress levels.
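
To illustrate how such a combination might work in practice, here is a hedged sketch that folds sleep, resting heart rate, and activity into a single stress score and maps it to a suggested action. The weights, thresholds, and recommendations are invented for illustration and are not the algorithms used by BeWell, StressSense, MoodSense, or any Fitbit product.

```python
# Illustrative only: weights, thresholds, and recommendations are assumptions,
# not the logic of any real wellness app named in the article.
def stress_score(sleep_hours, resting_hr, active_minutes):
    """Return a 0-100 score; higher means more signs of strain."""
    sleep_deficit = max(0.0, 8.0 - sleep_hours) / 8.0            # short sleep
    hr_elevation = min(max(0.0, resting_hr - 60.0) / 40.0, 1.0)  # elevated resting HR
    inactivity = max(0.0, 30.0 - active_minutes) / 30.0          # low activity
    return round(100 * (0.4 * sleep_deficit + 0.4 * hr_elevation + 0.2 * inactivity), 1)

def recommendation(score):
    if score >= 60:
        return "suggest a guided relaxation session and flag for crew medical review"
    if score >= 30:
        return "suggest a short mindfulness exercise"
    return "no action needed"

score = stress_score(sleep_hours=5.5, resting_hr=78, active_minutes=10)
print(score, "->", recommendation(score))  # 43.8 -> suggest a short mindfulness exercise
```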

A Tricorder-like setup is imagined by scientists at Queen’s University Belfast, utilizing the functionalities of existing smartphone-based biosensors. | Credit: Biosensors/Queen’s University Belfast

Advances in smartphone technology such as high-resolution cameras, microphones, fast processors, wireless connectivity, and the ability to attach external devices provide tools that can support an expanding number of “portable lab” type functionalities. Despite the possibilities these biosensors could hold for human spaceflight, however, some of the devices have notable limitations that would need to be overcome. In particular, any device that relies on antibodies or enzymes for its testing would risk having those reagents degraded by radiation from galactic cosmic rays and solar particle events, and biosensor electronics could be damaged by the same radiation. New types of shielding may need to be developed to ensure their functionality beyond Earth and Earth orbit, or, alternatively, synthetic biology could supply testing elements genetically engineered to withstand the space and Martian environments.

Smartphone-based solutions for space travelers have been garnering more attention over the years as tech-centric societies have moved in the “app” direction overall. NASA itself has hosted a “Space Apps Challenge” for the last 8 years, drawing thousands of participants to submit programs that interpret and visualize data for greater understanding of designated space and science topics. Some of the challenges could be directly relevant to the biosensor field. For example, in the 2018 event, contestants were asked to develop a sensor to be used by humans on Mars to observe and measure variables in their environments; in 2017, contestants created visualizations of potential radiation exposure during polar or near-polar flights.

While the QUB study implied that combining existing biosensor technologies could produce the equivalent of a Tricorder, the direct development of such a device has been the subject of its own dedicated challenge. In 2012, the Qualcomm Tricorder XPRIZE competition was launched, asking competitors to develop a user-friendly device that could accurately diagnose 13 health conditions and capture 5 real-time vital signs. The winner of the prize, awarded in 2017, was a Pennsylvania-based family team called Final Frontier Medical Devices, now Basil Leaf Technologies, with its DxtER device. According to the company’s website, the sensors inside DxtER can be used independently, and one of them is in a Phase 1 clinical trial. The second-place winner of the competition used a smartphone app to connect its health testing modules and generate a diagnosis from the data acquired from the user.

The march continues to develop the technology humans will need to safely explore regions beyond Earth orbit. Space is hard, but it was hard before we went there the first time, and it was hard before we put humans on the Moon. There may be plenty of challenges to overcome, but as the Queen’s University Belfast study demonstrates, we may already be solving them. It’s just a matter of realizing it and expanding on it.

Accidental computer geek, fascinated by most history and the multiplanetary future on its way. Quite keen on the democratization of space. | It's pronounced day-sha, but I answer to almost any variation thereof.


Nvidia CEO Jensen Huang regrets not investing more in Elon Musk’s xAI

The CEO stated that Nvidia is already an investor in xAI, but he wished he had given the artificial intelligence startup more money.


Credit: Elon Musk/X

Nvidia CEO Jensen Huang revealed that one of his investment regrets is not putting more money into Elon Musk’s artificial intelligence startup, xAI. 

Speaking in a CNBC interview, Huang said Nvidia is already an investor in xAI but wished he had given the artificial intelligence startup more money, citing Musk’s record of building transformative companies such as Tesla and SpaceX.

A new wave of transformative AI firms

Huang said he’s very excited about xAI’s latest financing round. He described Musk’s company as part of a powerful new generation of AI developers, alongside OpenAI and Anthropic, that are reshaping the computing landscape.

“I’m super excited about the financing opportunity they’re doing. The only regret I have about xAI, we’re an investor already, is that I didn’t give him more money. You know, almost everything that Elon’s part of, you really want to be part of as well,” the Nvidia CEO stated.

The CEO also clarified Nvidia’s investment in xAI, revealing that Elon Musk had offered the investment opportunity to the chipmaker. “He (Musk) gave us the opportunity to invest in xAI. I’m just delighted by that,” Huang stated.


AI investment boom

Huang contrasted today’s AI-driven economy with the early days of the internet. “Back then, all the internet companies combined were maybe $30 or $40 billion in size,” he said. “If you look at the hyperscalers now, that’s about $2.5 trillion of business already operating today.”

He also stated that the ongoing shift from CPU-based computing to GPU-powered generative AI represents a “multi-trillion-dollar buildout” that Nvidia is looking to support. Huang added that every Nvidia engineer now works with AI coding assistants such as Cursor, which he called his “favorite enterprise AI service,” and that this has led to a major productivity boost across the company.

Watch Nvidia CEO Jensen Huang’s CNBC interview in the video below.


Stifel raises Tesla price target by 9.8% over FSD, Robotaxi advancements

Stifel also maintained a “Buy” rating for the electric vehicle maker.


Credit: Tesla China

Investment firm Stifel has raised its price target for Tesla (NASDAQ:TSLA) shares to $483 from $440, citing increased confidence in the company’s self-driving and Robotaxi programs. The new price target suggests an 11.5% upside from Tesla’s closing price on Tuesday.
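
As a quick sanity check on the headline figures (a reader’s arithmetic rather than anything from Stifel’s note), the snippet below recomputes the roughly 9.8% increase in the target and the closing price implied by the quoted 11.5% upside.

```python
# Reader's arithmetic, not Stifel's model: check the target increase and the
# closing price implied by the quoted 11.5% upside.
old_target, new_target = 440.0, 483.0
quoted_upside = 0.115

target_increase = (new_target - old_target) / old_target   # ~0.0977
implied_close = new_target / (1 + quoted_upside)           # ~433.18

print(f"Target increase: {target_increase:.1%}")   # Target increase: 9.8%
print(f"Implied close: ${implied_close:.2f}")      # Implied close: $433.18
```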

Stifel also maintained a “Buy” rating despite acknowledging that Tesla’s timeline for fully unsupervised driving may be ambitious.

Building confidence

In a note to clients, Stifel stated that it believes “Tesla is making progress with modest advancements in its Robotaxi network and FSD,” as noted in a report from Investing.com. The firm expects unsupervised FSD to become available for personal use in the U.S. by the end of 2025, with a wider ride-hailing rollout potentially covering half of the U.S. population by year-end.

Stifel also noted that Tesla’s Robotaxi fleet could expand from “tiny to gigantic” within a short time frame, possibly making a material financial impact on the company by late 2026. The firm views Tesla’s vision-based approach to autonomy as central to this long-term growth, suggesting that continued advancements could unlock new revenue streams across both the consumer and mobility sectors.

Tesla’s FSD goals still ambitious

While Stifel’s tone remains optimistic, the firm’s analysts acknowledged that Tesla’s aggressive autonomy timeline may face execution challenges. The note described the 2025 unsupervised FSD target as “a stretch,” though still achievable in the medium term.


“We believe Tesla is making progress with modest advancements in its Robotaxi network and FSD. The company has high expectations for its camera-based approach including; 1) Unsupervised FSD to be available for personal use in the United States by year-end 2025, which appears to be a stretch but seems more likely in the medium term; 2) that it will ‘probably have ride hailing in probably half of the populations of the U.S. by the end of the year’,” the firm noted.


Tesla Cybertruck gets Full Self-Driving v14 release date, sort of


Tesla Cybertruck owners are wondering when they will get access to the company’s Full Self-Driving version 14.1, which rolled out to other owners for the first time today.

Cybertruck owners typically receive Full Self-Driving updates slightly later than other drivers, as the process for the all-electric pickup is different: it is a larger vehicle that requires additional attention from Tesla before FSD versions are rolled out, so updates arrive slightly delayed. CEO Elon Musk has also said the all-wheel steering requires a bit more attention before rollout.

After some owners got access to the v14.1 Full Self-Driving suite this morning, Cybertruck owners sought a potential timeframe for when they would be able to experience it for themselves.

Tesla owners show off improvements with new Full Self-Driving v14 rollout

They were able to get an answer from Ashok Elluswamy, Tesla’s Head of AI, who said:

“We got you. Coming soon.”

The Cybertruck’s release of FSD v14.1 will not be a watered-down version, either. Elluswamy then confirmed that Tesla would be rolling out the full-featured FSD v14 for the pickup, meaning it will be able to reverse and park itself, among other features.

Elluswamy said it would be capable of these features, which were absent from previous FSD releases for the Cybertruck.

Tesla’s rollout of FSD v14.1 brings several notable changes and improvements to the suite, including more refined operation in parking garages, a new ability to choose parking preferences upon arriving at your destination, a new driving mode called “Sloth,” which is even more reserved than “Chill,” and general operational improvements.

Those who were lucky enough to receive the suite have already started showing off the improvements, and they definitely seem to be a step up from what v13’s more recent versions were capable of.

CEO Elon Musk called v14 “sentient” a few weeks back, and it does seem to be moving in that direction. He also stated that additional releases with more capabilities would arrive in the coming weeks, though many owners are still waiting for this first version.
