

News
Mars travelers could get ‘Star Trek’ Tricorder-like features from smartphone biotech: study
Plans to take humans to the Moon and Mars come with numerous challenges, and the health of space travelers is no exception. One way any ill effects can be prevented or mitigated is by detecting relevant changes in the body and the body’s surroundings, something that biosensor technology is specifically designed to address on Earth. However, the small size and weight requirements for tech used in astronauts’ cramped habitats have impeded its development to date.
A recent study of existing smartphone-based biosensors by scientists from Queen’s University Belfast (QUB) in the UK identified several candidates, in current use or under development, that could also be used in a space or Martian environment. Combined, the technology could provide functionality reminiscent of the “Tricorder” devices used for medical assessments in the Star Trek television and movie franchises, offering on-site information about the health of human space travelers and the biological risks present in their habitats.
Biosensors focus on biomarkers, i.e., measurable indicators of the body’s response to its conditions and environment. Changes in blood composition, elevated levels of certain molecules in urine, and increases or decreases in heart rate are all examples of biomarkers. Health and fitness apps that track general health biomarkers have become common in the marketplace, with brands like Fitbit leading the charge in overall wellness sensing by tracking sleep patterns, heart rate, and activity levels through wearable biosensors. Astronauts and other future space travelers could likely use this kind of tech for basic health monitoring, but there are other challenges that need to be addressed in a compact way.
The projected human health needs during spaceflight have been detailed by NASA on its Human Research Program website, and more specifically in its web-based Human Research Roadmap (HRR), where the agency publishes its scientific data for public review. Several hazards of human spaceflight are identified there, including environmental and mental health concerns, and the QUB scientists used that framework to organize their study. Their research produced a 20-page document reviewing the inner workings of the relevant devices found in their searches, complete with tables summarizing each device’s methods and suitability for use in space missions. Here are some of the highlights.
Risks in the Spacecraft Environment
During spaceflight, the environment is a closed system, and that has a twofold effect: first, the immune system has been shown to lose functionality over long-duration missions, specifically through lowered white blood cell counts; second, the weightless, low-competition environment makes it easier for microbes to transfer between humans, and their growth rates increase. In one space shuttle-era study, the number of microbial cells in the vehicle able to reproduce increased by 300% within 12 days of being in orbit. Certain herpes viruses, such as those responsible for chickenpox and mononucleosis, have also been reactivated under microgravity, although the astronauts typically showed no symptoms despite active viral shedding (the virus had resurfaced and was able to spread).
Frequent monitoring of the spacecraft environment and the crew’s biomarkers is the best way to mitigate these challenges. NASA addresses these issues to an extent with traditional instruments and equipment, although oftentimes the data cannot be processed until the experiments are returned to Earth. An attempt has also been made to rapidly quantify microorganisms aboard the International Space Station (ISS) via a handheld device called the Lab-on-a-Chip Application Development-Portable Test System (LOCAD-PTS). However, this device cannot yet distinguish between microorganism species, meaning it can’t tell the difference between pathogens and harmless organisms. The QUB study found several existing smartphone-based technologies, generally developed for use in remote medical care settings, that could achieve better identification results.

One of the devices described was a spectrometer (an instrument that identifies substances by the wavelengths of light they absorb or emit), which used the smartphone’s flashlight and camera to generate data at least as accurate as that of traditional instruments. Another was able to identify concentrations of recombinant bovine somatotropin (rBST), an artificial growth hormone injected into cows, in test samples, and other systems were able to accurately detect syphilis and HIV as well as the Zika, chikungunya, and dengue viruses. All of the devices used smartphone attachments, some of them with 3D-printed parts. Of course, the types of pathogens detected are not likely to be common in a closed space habitat, but the technology driving them could be modified to meet specific detection needs.
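For a sense of how a smartphone spectrometer attachment typically works (this is an illustrative sketch, not the specific device from the QUB study), a diffraction grating spreads the flashlight’s beam across the camera sensor so that each pixel column corresponds to a wavelength. The file names and the two calibration points below are hypothetical placeholders, and the analysis is only a rough reduction of the camera image to an absorbance curve.

```python
# Minimal, illustrative sketch of smartphone-spectrometer data reduction.
# Assumes a grating attachment spreads light horizontally across the sensor,
# and that "blank.png" / "sample.png" are hypothetical flashlight captures
# taken through a blank cuvette and a sample cuvette.
import numpy as np
from PIL import Image

def column_intensity(path: str) -> np.ndarray:
    """Collapse a grayscale capture into one intensity value per pixel column."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    return img.mean(axis=0)

# Hypothetical two-point wavelength calibration (pixel column -> nanometers),
# e.g. from two known laser lines shone through the same attachment.
PIXEL_A, WAVELENGTH_A = 120, 450.0
PIXEL_B, WAVELENGTH_B = 860, 650.0

def pixel_to_wavelength(cols: np.ndarray) -> np.ndarray:
    slope = (WAVELENGTH_B - WAVELENGTH_A) / (PIXEL_B - PIXEL_A)
    return WAVELENGTH_A + slope * (cols - PIXEL_A)

blank = column_intensity("blank.png")
sample = column_intensity("sample.png")

# Beer-Lambert-style absorbance: A = -log10(I_sample / I_blank)
absorbance = -np.log10(np.clip(sample, 1e-6, None) / np.clip(blank, 1e-6, None))
wavelengths = pixel_to_wavelength(np.arange(absorbance.size))

peak = wavelengths[np.argmax(absorbance)]
print(f"Strongest absorbance near {peak:.0f} nm")
```

A real device would add careful calibration and substance-specific reference curves, but the principle is the same: the phone’s camera and flashlight stand in for the light source and detector of a benchtop instrument.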
The Stress of Spaceflight
A group of people crammed together in a small space for long periods will be affected by the isolation and confinement no matter how carefully they are selected or trained. Declines in mood, cognition, morale, or interpersonal interaction can impair team functioning or develop into sleep disorders. On Earth, these stress responses may seem common, perhaps even an expected part of being human, but missions in deep space and on Mars will be demanding and will need fully alert, well-communicating teams to succeed. NASA already uses devices to monitor these risks while also addressing the stress factor by managing habitat lighting, crew movement, and sleep amounts, and by recommending that astronauts keep journals to vent as needed. However, an all-encompassing tool may be needed for longer-duration missions.
As the QUB study recognized, several “mindfulness” and self-help apps already exist on the market and could be used to address the stress factor in future astronauts when combined with general health monitors. For example, the popular Fitbit app and similar products collect data on sleep patterns, activity levels, and heart rate, which could potentially be linked to mental health apps that recommend self-help programs using algorithms. The more recent “BeWell” app monitors physical activity, sleep patterns, and social interactions to analyze stress levels and recommend self-help treatments, while other apps, such as “StressSense” and “MoodSense”, use voice patterns and general phone communication data to assess stress levels.
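As a loose illustration of the kind of pipeline envisioned here (this is not an implementation of BeWell, StressSense, or any Fitbit feature), the sketch below combines hypothetical daily sleep, heart-rate, activity, and social-interaction readings into a toy stress score and picks a suggested response. Every weight, threshold, and suggestion is invented for the example.

```python
# Illustrative only: a toy stress-score heuristic from wearable-style metrics.
# The weights, thresholds, and suggestions are hypothetical, not from any real app.
from dataclasses import dataclass

@dataclass
class DailyMetrics:
    sleep_hours: float
    resting_heart_rate: float  # beats per minute
    active_minutes: float
    social_interactions: int   # e.g. logged calls or messages

def stress_score(m: DailyMetrics) -> float:
    """Return a 0-100 score where higher means more signs of stress."""
    score = 0.0
    score += max(0.0, 7.5 - m.sleep_hours) * 12          # penalize short sleep
    score += max(0.0, m.resting_heart_rate - 65) * 1.5   # penalize elevated resting HR
    score += max(0.0, 30 - m.active_minutes) * 0.5       # penalize inactivity
    score += max(0, 5 - m.social_interactions) * 3       # penalize social withdrawal
    return min(score, 100.0)

def recommendation(score: float) -> str:
    if score < 25:
        return "No action needed; keep current routine."
    if score < 60:
        return "Suggest a guided breathing or mindfulness session."
    return "Flag for a scheduled check-in with the crew medical officer."

day = DailyMetrics(sleep_hours=5.5, resting_heart_rate=72,
                   active_minutes=15, social_interactions=2)
s = stress_score(day)
print(f"Stress score: {s:.0f} -> {recommendation(s)}")
```

A production system would replace these hand-tuned rules with validated models, but the overall flow, wearable data in, a risk estimate and a suggested intervention out, is what the study describes.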
Advances in smartphone technology, such as high-resolution cameras, microphones, fast processors, wireless connectivity, and the ability to attach external devices, provide tools that can support a growing number of “portable lab” functions. Unfortunately, despite what these biosensors could mean for human spaceflight, some of the devices have notable limitations that would need to be overcome. In particular, any device that relies on antibodies or enzymes in its testing would risk the stability of those components due to radiation from galactic cosmic rays and solar particle events, and biosensor electronics could be damaged by the same radiation. New types of shielding may be needed to ensure their functionality beyond Earth and Earth orbit; alternatively, synthetic biology could supply testing elements genetically engineered to withstand the space and Martian environments.
Interest in smartphone-based solutions for space travelers has grown over the years as tech-centric societies have moved in the “app” direction overall. NASA itself has hosted a “Space Apps Challenge” for the last 8 years, drawing thousands of participants who submit programs that interpret and visualize data for a greater understanding of designated space and science topics. Some of the challenges could be directly relevant to the biosensor field. For example, in the 2018 event, contestants were asked to develop a sensor to be used by humans on Mars to observe and measure variables in their environment; in 2017, contestants created visualizations of potential radiation exposure during polar or near-polar flights.
While the QUB study implied that a combination of existing biosensor technologies could be equivalent to a Tricorder, the direct development of such a device has been the subject of its own dedicated challenge. In 2012, the Qualcomm Tricorder XPRIZE competition was launched, asking competitors to develop a user-friendly device that could accurately diagnose 13 health conditions and capture 5 real-time vital signs. The prize, awarded in 2017, went to a Pennsylvania-based family team called Final Frontier Medical Devices, now Basil Leaf Technologies, for their DxtER device. According to their website, the sensors inside DxtER can be used independently, and one of them is in a Phase 1 clinical trial. The second-place winner of the competition used a smartphone app to connect its health testing modules and generate a diagnosis from the data acquired from the user.
The march continues to develop the technology humans will need to safely explore regions beyond Earth orbit. Space is hard, but it was hard before we went there the first time, and it was hard before we put humans on the moon. There may be plenty of challenges to overcome, but as the Queen’s University Belfast study demonstrates, we may already be solving them. It’s just a matter of realizing it and expanding on it.
News
Tesla Full Self-Driving v14.1 first impressions: Robotaxi-like features arrive
Tesla Full Self-Driving v14.1 is here, and we got to experience it for ourselves.

Tesla rolled out Full Self-Driving v14.1 yesterday, the first public release of its most robust and accurate FSD iteration yet. Luckily, I was able to get my hands on it through the Early Access Program.
The major changes in FSD v14.1 were revealed in the release notes, which outline several notable improvements in areas such as driving styles, parking, and overall navigation. Here is everything Tesla outlined in its release notes:
- Added Arrival Options for you to select where FSD should park: in a Parking Lot, on the Street, in a Driveway, in a Parking Garage, or at the Curbside.
- Added handling to pull over or yield for emergency vehicles (e.g. police cars, fire trucks, ambulances).
- Added navigation and routing into the vision-based neural network for real-time handling of blocked roads and detours.
- Added additional Speed Profile to further customize driving style preference.
- Improved handling for static and dynamic gates.
- Improved offsetting for road debris (e.g. tires, tree branches, boxes).
- Improved handling of several scenarios including: unprotected turns, lane changes, vehicle cut-ins, and school buses.
- Improved FSD’s ability to manage system faults and recover smoothly from degraded operation for enhanced reliability.
- Added alerting for residue build-up on interior windshield that may impact front camera visibility. If affected, visit Service for cleaning!
I wanted to try it for myself. My big must-dos came from my complaints with v13.2.9: parking when arriving at a destination, navigation when leaving a destination, and a general improvement in the car traveling at an acceptable rate of speed, even when using the “Hurry” driving style.
Here’s what I noticed with the new Full Self-Driving v14.1:
Speed Profiles are More Realistic
I drive on “Hurry” about 95% of the time when using Full Self-Driving. In past versions, most notably v13.2.9, my Tesla would slowly reach the speed limit and then tend to hang out at about 1-2 MPH above or below it.
My first observation with v14.1 was the vehicle’s tendency to get right up to speed and, since I was still on Hurry, drive slightly above the speed limit. It never got out of line; it traveled at speeds I would typically drive at manually.
I think this is a big improvement on its own, because I felt that I was pressing the accelerator too frequently in past FSD versions. Oftentimes, it just wasn’t going fast enough to justify the “Hurry” label; it felt more conservative and more like a student driver than anything.
Check it out:
🚨 Tesla Full Self-Driving v14.1 travels at more realistic speeds on local roads.
With 13.2.9, even on Hurry, it would hover the speed limit a little too much, often times traveling 1-2 MPH below or over.
It now travels at more realistic speeds. The removal of Max Speed and… pic.twitter.com/DPC0oBl3SC
— TESLARATI (@Teslarati) October 8, 2025
This was among my favorite improvements, and it was the first thing I noticed as the car navigated me to the Supercharger, where my next positive is.
Navigating into parking lots, self-parking at Supercharger
One of the changes noted in the release notes was the addition of Arrival Options, which determine how the car should park at a destination. Since I was going to charge, the car had already chosen “Charger” as the arrival option.
Pulling into a gas station or convenience store, especially during work days, can be stressful, as they are usually congested and full of foot and vehicle traffic. In past FSD versions, I have noticed the car being slightly “jumpy” and even hesitant to proceed through the lot.
Driving through parking lots was a noticeable improvement. It seems as if the car is much more confident in making its way through, while still being aware and cautious enough to safely navigate to the Supercharger.
It then backed straight into a Supercharger stall that had recently been repaired and is once again active. I was initially annoyed that it chose this specific stall, because it had been inactive for a while, but Tesla has since gotten it back up and running; the car chose it and backed into the spot flawlessly:
🚨 Check out Tesla Full Self-Driving v14 choosing and backing into a Supercharger
After selecting this Supercharger at the beginning of my trip, my Tesla had already selected “Charger” as the arrival option pic.twitter.com/jqLNwQ9x0o
— TESLARATI (@Teslarati) October 8, 2025
This was super cool to experience, and I think it is a testament to how hard the Tesla AI team has worked. CEO Elon Musk recently stated that FSD would enable automatic parking at Superchargers, and it was really awesome to experience that firsthand.
I decided to leave the Supercharger and go to an auto parts store to pick up some interior cleaner and some microfiber towels. I love keeping my Tesla clean!
I also thought it would be a great opportunity to see how it would react to another parking lot, how it would navigate it, and let it choose a parking spot. It did it all flawlessly:
🚨 Here’s Tesla Full Self-Driving v14.1 navigating to a store, pulling in, choosing a parking spot, and backing right in
From v13.2.9, this is a drastic improvement. Typically, manual parking was required in past versions when arriving at retail locations. pic.twitter.com/kgFMu6dxnW
— TESLARATI (@Teslarati) October 8, 2025
I had zero complaints here. All of it was done really well.
Making a choice after being caught in the middle of an intersection
I arrived at a tight intersection in Dallastown, PA, and what my car did next has catalyzed quite a conversation on X.
It proceeded out into the middle of the intersection while the light was green. It had to yield to oncoming traffic, and as it waited, the light turned yellow, then red.
Most people, including myself, would have completed the turn and proceeded through the intersection since the car was already past the line. However, FSD chose to back up and wait for the next light cycle, which I felt was also a more than acceptable option:
🚨 Super cool thing Tesla FSD v14.1 did: it proceeded thru this intersection to turn left, but the light had gone to red before the turn could be completed.
It put itself in reverse and backed up to the “Stop Here on Red” sign/line. Didn’t proceed at a red or impede others. pic.twitter.com/AKb1AI32fK
— TESLARATI (@Teslarati) October 8, 2025
There are some conflicting perspectives on what it chose to do here. Some said they would have proceeded and would want FSD to proceed as well. I can agree with that perspective, but I also think backing up is not the worst thing in the world. In Pennsylvania, I couldn’t find the exact law that says which choice is correct; I did see that a left turn on red is only legal when you’re turning from one one-way street onto another.
I’m not totally sure what is “correct” here, but I think either option is fine. I have personally done both, and I’ve seen other drivers do both. I was more than fine with the car doing this, and I was honestly impressed that it did.
Navigated a busy grocery store lot, found suitable parking
This is not the busiest my local grocery store gets, but it was still congested enough for me to be impressed.
FSD decided to do one loop in the parking lot before it found a spot that it felt was good enough for me. I was perfectly fine with where it chose to park, and I thought it did a really great job. I was impressed with how stress-free I felt, as I have noted in the past that parking lots are definitely an area where Tesla needs to improve.
I was happy with its performance:
🚨 Here, @Tesla Full Self-Driving v14.1 searched for a parking space at the grocery store.
It did one loop, navigating safely through pedestrians and carts before it decided this spot was good enough for me.
This truly will take the stress out of parking at busy stores pic.twitter.com/73U3Bl7Odm
— TESLARATI (@Teslarati) October 8, 2025
Strange right turn signal as if it saw an emergency vehicle
This was the first bug I noticed with FSD v14.1. While traveling on a local road, it put the right turn signal on and approached the curb as if it was pulling over for an emergency vehicle or as if it was going to park on the street.
It then realized its mistake and proceeded:
Now for a couple bugs 🐞
Tesla FSD v14.1 put its right turn signal on as if it was going to pull over. It did move closer to curb, but then realized this wasn’t the correct maneuver for our route.
It proceeded without much issue pic.twitter.com/yoUoyzWMDM
— TESLARATI (@Teslarati) October 8, 2025
I’m not entirely sure what caused this, but I was a tad confused. There were no police cars, ambulances, or anyone with flashing lights to my rear. There was a dump truck on the other side of the road, and I suspect the way the car navigated “around” it is what triggered the behavior.
Navigation is still making strange decisions
I’ve written about navigation and my discontent with some of its decisions. It seems v14.1 didn’t resolve much of anything with navigation, and it did a couple of things wrong.
The first was that it tried to take the illogical and pointless path out of the Supercharger. I wrote about this a few days ago, as FSD tried to take my car the wrong way.
It did it again, but I overrode the decision, and it was all okay:
Bug 🐞 no. 3: I have this issue at this Supercharger and I talk about it frequently.
Navigation takes illogical and strange exit from Supercharger. I override, turning left instead of right, Nav adjusts and picks correct routing. Hoping this is resolved soon. pic.twitter.com/4VsCGHZbYW
— TESLARATI (@Teslarati) October 8, 2025
This is a minor issue, but it is still pretty frustrating. Hopefully, the navigation will learn after I make this correction enough times.
The next navigation issue was more frustrating than the Supercharger one, especially because the car completely ignored the route. The navigation very clearly had the vehicle heading straight, but out of nowhere, the right turn signal went on. I overrode it, but the car still turned right anyway:
Bug 🐞 no. 2: Navigation clearly shows the route continuing straight through the traffic light. I noticed the right turn signal coming on, so I overrode it.
The car turned right anyway. I took over and drove manually until I was able to get to a stop so I could re-activate FSD pic.twitter.com/nxt4UlRqkK
— TESLARATI (@Teslarati) October 8, 2025
I ended up taking over here and driving until I could get to a stop sign.
Final Thoughts
I am really impressed with all of the changes Tesla made in FSD v14.1, and while there were a handful of bugs, things were tremendously better than they were with v13.2.9.
News
Nvidia CEO Jensen Huang regrets not investing more in Elon Musk’s xAI
The CEO stated that Nvidia is already an investor in xAI, but he wished he had given the artificial intelligence startup more money.

Nvidia CEO Jensen Huang revealed that one of his investment regrets is not putting more money into Elon Musk’s artificial intelligence startup, xAI.
Speaking in a CNBC interview, Huang said Nvidia is already an investor in xAI but that he wished he had given the artificial intelligence startup more money, citing Musk’s record of building transformative companies such as Tesla and SpaceX.
A new wave of transformative AI firms
Huang said he is very excited about xAI’s latest financing round. He described Musk’s company as part of a powerful new generation of AI developers, alongside OpenAI and Anthropic, that are reshaping the computing landscape.
“I’m super excited about the financing opportunity they’re doing. The only regret I have about xAI, we’re an investor already, is that I didn’t give him more money. You know, almost everything that Elon’s part of, you really want to be part of as well,” the Nvidia CEO stated.
The CEO also clarified Nvidia’s investment in xAI, revealing that Elon Musk had offered the investment opportunity to the chipmaker. “He (Musk) gave us the opportunity to invest in xAI. I’m just delighted by that,” Huang stated.
AI investment boom
Huang contrasted today’s AI-driven economy with the early days of the internet. “Back then, all the internet companies combined were maybe $30 or $40 billion in size,” he said. “If you look at the hyperscalers now, that’s about $2.5 trillion of business already operating today.”
He also stated that the ongoing shift from CPU-based computing to GPU-powered generative AI represents a “multi-trillion-dollar buildout” that Nvidia is looking to support. Huang added that every Nvidia engineer now works with AI coding assistants such as Cursor, which he called his “favorite enterprise AI service,” and said it has led to a major productivity boost across the company.
Watch Nvidia CEO Jensen Huang’s CNBC interview in the video below.
Investor's Corner
Stifel raises Tesla price target by 9.8% over FSD, Robotaxi advancements
Stifel also maintained a “Buy” rating for the electric vehicle maker.

Investment firm Stifel has raised its price target for Tesla (NASDAQ:TSLA) shares to $483 from $440, citing increased confidence in the company’s self-driving and Robotaxi programs. The new price target suggests an 11.5% upside from Tesla’s closing price on Tuesday.
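For readers who want to check the headline math, the figures follow directly from the numbers above; the Tuesday closing price shown here is inferred from the stated 11.5% upside rather than quoted directly.

```python
# Quick check of the figures cited above.
old_target = 440.0
new_target = 483.0

raise_pct = (new_target / old_target - 1) * 100
print(f"Price target increase: {raise_pct:.1f}%")      # ~9.8%, matching the headline

# The stated 11.5% upside implies Tuesday's close was roughly new_target / 1.115.
implied_close = new_target / 1.115
print(f"Implied closing price: ${implied_close:.0f}")  # roughly $433 (inferred, not quoted)
```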
Stifel also maintained a “Buy” rating despite acknowledging that Tesla’s timeline for fully unsupervised driving may be ambitious.
Building confidence
In a note to clients, Stifel stated that it believes “Tesla is making progress with modest advancements in its Robotaxi network and FSD,” as noted in a report from Investing.com. The firm expects unsupervised FSD to become available for personal use in the U.S. by the end of 2025, with a wider ride-hailing rollout potentially covering half of the U.S. population by year-end.
Stifel also noted that Tesla’s Robotaxi fleet could expand from “tiny to gigantic” within a short time frame, possibly making a material financial impact on the company by late 2026. The firm views Tesla’s vision-based approach to autonomy as central to this long-term growth, suggesting that continued advancements could unlock new revenue streams across both the consumer and mobility sectors.
Tesla’s FSD goals still ambitious
While Stifel’s tone remains optimistic, the firm’s analysts acknowledged that Tesla’s aggressive autonomy timeline may face execution challenges. The note described the 2025 unsupervised FSD target as “a stretch,” though still achievable in the medium term.
“We believe Tesla is making progress with modest advancements in its Robotaxi network and FSD. The company has high expectations for its camera-based approach, including: 1) Unsupervised FSD to be available for personal use in the United States by year-end 2025, which appears to be a stretch but seems more likely in the medium term; 2) that it will ‘probably have ride hailing in probably half of the populations of the U.S. by the end of the year’,” the firm noted.