
Mars travelers could get ‘Star Trek’ Tricorder-like features from smartphone biotech: study


Plans to take humans to the Moon and Mars come with numerous challenges, and the health of space travelers is no exception. One way ill effects can be prevented or mitigated is by detecting relevant changes in the body and its surroundings, something biosensor technology is specifically designed to address on Earth. However, the strict size and weight limits imposed by astronauts' cramped habitats have impeded its development for spaceflight to date.

A recent study of existing smartphone-based biosensors by scientists from Queen's University Belfast (QUB) in the UK identified several candidates, in current use or under development, that could also be used in a space or Martian environment. When combined, the technology could offer functionality reminiscent of the “Tricorder” devices used for medical assessments in the Star Trek television and movie franchises, providing on-site information about the health of human space travelers and the biological risks present in their habitats.

Biosensors focus on studying biomarkers, i.e., measurable indicators of the body's response to environmental conditions. Changes in blood composition, elevated levels of certain molecules in urine, and increases or decreases in heart rate, for example, are all considered biomarkers. Health and fitness apps that track general health biomarkers have become common in the marketplace, with brands like Fitbit leading the charge in overall wellness sensing by tracking sleep patterns, heart rate, and activity levels using wearable biosensors. Astronauts and other future space travelers could likely use this kind of tech for basic health monitoring, but other challenges need to be addressed in a similarly compact way.

The projected human health needs during spaceflight have been detailed by NASA on its Human Research Program website, most specifically in its web-based Human Research Roadmap (HRR), where the agency publishes its scientific data for public review. The roadmap identifies several hazards of human spaceflight, such as environmental and mental health concerns, and the QUB scientists used that information to organize their study. Their research produced a 20-page review of the inner workings of the relevant devices found in their searches, complete with tables summarizing each device's methods and its suitability for use in space missions. Here are some of the highlights.

A chart showing the classification of scientific articles about relevant smartphone-based biosensors used in the Queen’s University Belfast study. | Credit: Biosensors/Queen’s University Belfast

Risks in the Spacecraft Environment

During spaceflight, the closed environment has a two-fold effect: first, the immune system has been shown to lose functionality on long-duration missions, specifically through lowered white blood cell counts, and second, the weightless, low-competition environment makes it easier for microbes to transfer between humans and allows their growth rates to increase. In one Space Shuttle-era study, the number of reproducing microbial cells in the vehicle increased by 300% within 12 days of being in orbit. Certain herpes viruses, such as those responsible for chickenpox and mononucleosis, have also been reactivated under microgravity, although the astronauts typically didn't show symptoms despite active viral shedding (the virus had resurfaced and was able to spread).

Frequent monitoring of the spacecraft environment and the crew's biomarkers is the best way to mitigate these challenges, and NASA is addressing the issue to an extent with traditional instruments and equipment, although oftentimes the data cannot be processed until the experiments are returned to Earth. An attempt has also been made to rapidly quantify microorganisms aboard the International Space Station (ISS) via a handheld device called the Lab-on-a-Chip Application Development-Portable Test System (LOCAD-PTS). However, this device cannot yet distinguish between microorganism species, meaning it can't tell the difference between pathogens and harmless microbes. The QUB study found several existing smartphone-based technologies, generally developed for use in remote medical care settings, that could achieve better identification results.

NASA astronaut Karen Nyberg uses a fundoscope to image her eye while in orbit to study Visual Impairment Intracranial Pressure (VIIP) Syndrome. Smaller 3D printed retinal imaging adaptors for smartphones are being developed to perform the testing done by large devices similar to the instrument used here. | Credit: NASA

One of the devices described was a spectrometer (used to identify substances based on the light frequencies they emit or absorb) that used the smartphone's flashlight and camera to generate data at least as accurate as traditional instruments. Another was able to identify concentrations of recombinant bovine somatotropin (rBST), an artificial growth hormone injected into cows, in test samples, and other systems were able to accurately detect syphilis and HIV as well as the Zika, chikungunya, and dengue viruses. All of the devices used smartphone attachments, some with 3D-printed parts. Of course, the pathogens detected are not likely to be common in a closed space habitat, but the technology driving these systems could be modified to meet specific detection needs.
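The study doesn't spell out the signal processing behind that spectrometer, but the general principle of a camera-based instrument is straightforward: a grating attachment spreads incoming light across the sensor, and software maps pixel positions to wavelengths. The sketch below is a hypothetical Python illustration of that idea (the file name, reference pixel positions, and calibration wavelengths are all made-up placeholders), not the device's actual code.

```python
# Minimal sketch of how a smartphone-camera spectrometer readout could be processed.
# Assumptions (not from the QUB study): a diffraction-grating attachment spreads
# light across the sensor, and two reference lines with known pixel positions
# are available for a linear wavelength calibration.
import numpy as np
from PIL import Image

def extract_spectrum(image_path: str) -> np.ndarray:
    """Collapse the 2D camera frame into a 1D intensity-per-column profile."""
    frame = np.asarray(Image.open(image_path).convert("L"), dtype=float)
    return frame.mean(axis=0)  # average over rows: one intensity value per pixel column

def calibrate_wavelengths(n_columns: int, ref_px: tuple, ref_nm: tuple) -> np.ndarray:
    """Map pixel columns to wavelengths using two known reference lines."""
    slope = (ref_nm[1] - ref_nm[0]) / (ref_px[1] - ref_px[0])
    return ref_nm[0] + slope * (np.arange(n_columns) - ref_px[0])

if __name__ == "__main__":
    # "capture.png" and the calibration points are hypothetical placeholders.
    intensity = extract_spectrum("capture.png")
    wavelengths = calibrate_wavelengths(len(intensity), ref_px=(120, 860), ref_nm=(436.0, 611.0))
    peak_nm = wavelengths[int(np.argmax(intensity))]
    print(f"Strongest emission near {peak_nm:.1f} nm")
```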

The Stress of Spaceflight

A group of people crammed together in a small space for long periods will feel the effects of isolation and confinement, no matter how carefully they are selected or trained. Declines in mood, cognition, morale, or interpersonal interaction can impair team functioning or develop into sleep disorders. On Earth, these stress responses may seem common, perhaps even an expected part of being human, but missions in deep space and on Mars will be demanding and will need fully alert, well-communicating teams to succeed. NASA already uses devices to monitor these risks and addresses the stress factor by managing habitat lighting, crew movement and sleep schedules, and recommending that astronauts keep journals to vent as needed. However, an all-encompassing tool may be needed for longer-duration missions.

As recognized by the QUB study, several “mindfulness” and self-help apps already exist on the market and, combined with general health monitors, could be used to address the stress factor in future astronauts. For example, the popular Fitbit app and similar products collect data on sleep patterns, activity levels, and heart rate, which could potentially be linked to mental health apps that use algorithms to recommend self-help programs, as in the sketch below. The more recent “BeWell” app monitors physical activity, sleep patterns, and social interactions to analyze stress levels and recommend self-help treatments. Other apps, such as “StressSense” and “MoodSense”, use voice patterns and general phone communication data to assess stress levels.
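None of those apps publish their scoring logic here, but the kind of linkage the study envisions can be illustrated with a toy heuristic: normalize a few wearable metrics and combine them into a rough stress index that triggers a self-help recommendation. The weights, thresholds, and field names below are invented for illustration and are not taken from Fitbit, BeWell, or the QUB paper.

```python
# Toy illustration of combining wearable biomarkers into a stress index.
# All weights, thresholds, and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class DailyMetrics:
    sleep_hours: float        # total sleep recorded by the wearable
    resting_heart_rate: int   # beats per minute
    active_minutes: int       # minutes of moderate-or-higher activity

def stress_index(m: DailyMetrics) -> float:
    """Return a 0-1 score where higher means more likely stressed."""
    sleep_deficit = max(0.0, (7.5 - m.sleep_hours) / 7.5)        # short sleep raises the score
    elevated_hr = max(0.0, (m.resting_heart_rate - 60) / 40)     # resting HR above ~60 bpm raises it
    inactivity = max(0.0, (30 - m.active_minutes) / 30)          # low activity raises it
    score = 0.4 * sleep_deficit + 0.4 * min(elevated_hr, 1.0) + 0.2 * inactivity
    return min(score, 1.0)

if __name__ == "__main__":
    today = DailyMetrics(sleep_hours=4.0, resting_heart_rate=85, active_minutes=5)
    score = stress_index(today)
    if score > 0.5:
        print(f"Stress index {score:.2f}: suggest a guided breathing or journaling session")
    else:
        print(f"Stress index {score:.2f}: no intervention suggested")
```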

A Tricorder-like setup is imagined by scientists at Queen's University Belfast, utilizing the functionalities of existing smartphone-based biosensors. | Credit: Biosensors/Queen's University Belfast

Advances in smartphone technology, such as high-resolution cameras, microphones, fast processors, wireless connectivity, and the ability to attach external devices, provide tools that can support an expanding number of “portable lab”-type functions. Despite the possibilities these biosensors hold for human spaceflight, however, some of the devices have notable limitations that would need to be overcome. In particular, any device using antibodies or enzymes in its testing would risk having those reagents degraded by radiation from galactic cosmic rays and solar particle events, and biosensor electronics could be damaged by the same radiation. New types of shielding may be needed to ensure their functionality beyond Earth and Earth orbit; alternatively, synthetic biology could supply testing elements genetically engineered to withstand the space and Martian environments.

Interest in smartphone-based solutions for space travelers has grown over the years as tech-centric societies have moved in the “app” direction overall. NASA itself has hosted a “Space Apps Challenge” for the last eight years, drawing thousands of participants who submit programs that interpret and visualize data for a better understanding of designated space and science topics. Some of the challenges are directly relevant to the biosensor field: in the 2018 event, contestants were asked to develop a sensor for humans on Mars to observe and measure variables in their environment; in 2017, contestants created visualizations of potential radiation exposure during polar or near-polar flights.

While the QUB study suggested that combining existing biosensor technology could amount to a Tricorder equivalent, the direct development of such a device has been the subject of its own challenge. In 2012, the Qualcomm Tricorder XPRIZE competition was launched, asking competitors to develop a user-friendly device that could accurately diagnose 13 health conditions and capture 5 real-time vital signs. The prize, awarded in 2017, went to a Pennsylvania-based family team called Final Frontier Medical Devices, now Basil Leaf Technologies, for their DxtER device. According to their website, the sensors inside DxtER can be used independently, and one of them is in a Phase 1 clinical trial. The second-place finisher used a smartphone app to connect its health testing modules and generate a diagnosis from the data acquired from the user.

The march continues to develop the technology humans will need to safely explore regions beyond Earth orbit. Space is hard, but it was hard before we went there the first time, and it was hard before we put humans on the Moon. There may be plenty of challenges to overcome, but as the Queen's University Belfast study demonstrates, we may already be solving them. It's just a matter of realizing it and building on it.


Elon Musk confirms Tesla FSD V14.2 will see widespread rollout

Musk shared the news in a post on social media platform X.


Elon Musk has confirmed that Tesla will be implementing a wide rollout of Full Self-Driving (FSD) V14 with the system’s V14.2 update. Musk shared the news in a post on social media platform X. 

FSD V14.1.2 earns strong praise from testers

Musk’s comment came as a response to Tesla owner and longtime FSD tester AI DRIVR, who noted that it might be time to release Full Self-Driving to the fleet because V14.1.2 has already become very refined.

“95% of the indecisive lane changes and braking have been fixed in FSD 14.1.2. I haven’t touched my steering wheel in two days. I think it’s time, Tesla AI,” the longtime FSD tester wrote.

AI DRIVR’s comment received quite a bit of support from fellow Tesla drivers, some of whom noted that the improvements that were implemented in V14.1.2 are substantial. Others also agreed that it’s time for FSD to see a wide release.

Replying to the FSD tester, CEO Elon Musk noted that FSD V14’s wide release would happen with V14.2. “14.2 for widespread use,” Musk wrote.


Mad Max mode makes headlines

One of the key features that was introduced with FSD’s current iteration is Mad Max mode, which allows for higher speeds and more frequent lane changes than the previous “Hurry” mode. Videos and social media posts from FSD testers have shown the system deftly handling complex traffic, merging seamlessly, and maintaining an assertive but safe driving behavior with Mad Max mode engaged.

Tesla AI head Ashok Elluswamy recently noted in a post on X that Mad Max mode was built to handle congested daytime traffic, making it extremely useful for drivers who tend to find themselves in heavy traffic during their daily commutes. With Musk now confirming that FSD V14.2 will go on wide release, it might only be a matter of time before the larger Tesla fleet gets to experience the notable improvements of FSD’s V14 update.


Multiple Tesla Cybercab units spotted at Giga Texas crash test facility

The vehicles were covered, but one could easily recognize the Cybercab’s sleek lines and compact size.


It appears that Tesla is ramping up its activities surrounding the development and likely initial production of the Cybercab at Giga Texas. This was, at least, hinted at in a recent drone flyover of the massive electric vehicle production facility in Austin. 

Cybercab sightings fuel speculations

As observed by longtime Giga Texas drone operator Joe Tegtmeyer, Tesla had several covered Cybercab units outside the complex’s crash testing facility at the time of his recent flyover. Though under covers, the vehicles’ sleek lines and compact size made them easy to recognize as Cybercabs. Tegtmeyer also observed during his flyover that production of the Model Y Standard seems to be hitting its pace.

The drone operator noted that the seven covered Cybercabs might be older prototypes being decommissioned or new units awaiting crash tests. Either scenario points to a ramp-up in Cybercab activity at Giga Texas, however. “In either case, this is another datapoint indicating production is getting closer to happening,” Tegtmeyer wrote on X, highlighting that the autonomous two-seaters were quite exciting to see.

Cybercab production targets

This latest sighting follows reports of renewed Cybercab appearances at both the Fremont Factory and Giga Texas. A test unit was recently spotted driving on Giga Texas’ South River Road. Another Cybercab, seen at Tesla’s Fremont Factory, appeared to be manually driven, suggesting that the vehicle’s current prototypes may still be produced with temporary steering controls.

The Tesla Cybercab is designed to be the company’s highest-volume vehicle, with CEO Elon Musk estimating that the autonomous two-seater should see a production rate of about 2 million units per year. To accomplish this, Tesla will build the Cybercab using its “Unboxed” process, which should help the vehicle’s production line achieve outputs more akin to those of consumer electronics production lines.


Teslas in the Boring Co. Vegas Loop are about to get a big change

Elon Musk has a big update for Teslas that operate within the Boring Company’s Vegas Loop.


The Boring Company’s Vegas Loop entrance. | Credit: Sam Morris, LVCVA/Las Vegas News Bureau

Tesla vehicles operating in the Boring Company’s Vegas Loop are about to get a big change, CEO Elon Musk said.

In Las Vegas, the Boring Company operates the Vegas Loop, an underground tunnel system that uses Teslas to drop people off at various hotspots on the Strip. It’s been active for a few years now and is expanding to other resorts, hotels, and destinations.

Currently, there are stops at three resorts: Westgate, the Encore, and Resorts World. However, there will eventually be “over 100 stations and span over 68 miles of tunnel,” the Vegas Loop website says.

The Loop uses Tesla Model 3 and Model Y vehicles to take passengers to their desired destinations. They are currently driven using the Full Self-Driving suite, but each car also has a safety driver on board.


Tesla and the Boring Company have been working to remove drivers from the vehicles used in the Loop, and it now appears there is a set timeline for getting them out, according to CEO Elon Musk.

Musk says the Boring Co. will no longer rely on safety drivers in its Teslas. Instead, Tesla will look to remove the safety drivers from the cars within the next month or two, a timeline similar to the one Musk has given for the Robotaxi platform in Austin.

In Texas, where Robotaxi has operated since June, safety monitors still ride in the passenger’s seat to ensure a safe experience for riders.

When the route takes the vehicle on the highway, safety monitors move into the driver’s seat.

However, Tesla wants to be able to remove safety monitors from its vehicles in Austin by the end of the year, Musk has said recently.

In early September, Musk said that the safety monitors are “just there for the first few months to be extra safe.” He then added that there “should be no safety driver by end of year.”
