News
Engineers develop bio-machine nose that can “sniff” and classify odors
Engineers from Brown University in Rhode Island have invented a small, low-cost sensor device that can classify odors using input from a mimicked “sniffing” action. It’s called TruffleBot, and it’s here to raise the bar on electronic “noses”. It also works with Raspberry Pi, an inexpensive mini-computer popular with electronics hobbyists, students, and others in the “maker” crowd.
Generally, an electronic nose is a device comprising several chemical sensors whose readings are fed through a pattern-recognition system to identify odors. In traditional devices, the chemical responses alone are used for classification. The engineers behind this invention, however, decided to incorporate non-chemical data to account for the mechanics of the smelling process used in nature. Their approach proved successful, achieving roughly 95-98% identification accuracy, compared with about 80-90% for the chemical sensors alone.
According to the inventors’ published paper, the guiding knowledge that made TruffleBot so useful in odor detection was this: Different smells have different impacts on the air around them, and measuring the variations enables more accurate identification. Did you know that beer odor decreases air pressure and increases temperature? The changes are slight, but TruffleBot can sense them.
This is where the “sniffing” comes in. The device uses air pumped through four obstructed pathways before sending it through chemical and non-chemical sensors. Odors impact the air surrounding them, and the movement of the air through obstacles (“sniffing”) enables the odors’ impact to be more accurately measured.
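The fusion idea described above, combining overlapping chemical readings with distinctive pressure/temperature changes before classification, can be sketched in a few lines of Python. Everything below is invented for illustration: the odor names, the sensor distributions, and the nearest-centroid classifier are assumptions, not the actual TruffleBot pipeline.

```python
import random

random.seed(0)

# Toy sketch of sensor fusion: two "odors" whose chemical signatures
# overlap, but whose pressure/temperature effects differ. All numbers
# are invented for illustration; this is not the TruffleBot pipeline.

def sample(odor):
    chem = random.gauss(1.0, 0.5)  # chemical channel: same distribution for both odors
    if odor == "beer":             # mechanical channels: distinct per odor
        dp, dt = random.gauss(-0.2, 0.05), random.gauss(0.3, 0.05)
    else:
        dp, dt = random.gauss(0.0, 0.05), random.gauss(0.0, 0.05)
    return [chem, dp, dt]

def centroid(rows):
    # per-dimension mean of a list of feature vectors
    return [sum(r[i] for r in rows) / len(rows) for i in range(len(rows[0]))]

def classify(x, centroids, dims):
    # nearest centroid, using only the selected feature dimensions
    return min(centroids, key=lambda lbl: sum((x[i] - centroids[lbl][i]) ** 2 for i in dims))

train = {o: [sample(o) for _ in range(50)] for o in ("beer", "water")}
cents = {o: centroid(rows) for o, rows in train.items()}
tests = [(o, sample(o)) for o in ("beer", "water") for _ in range(100)]

accs = {}
for name, dims in [("chemical only", [0]), ("chemical + mechanical", [0, 1, 2])]:
    accs[name] = sum(classify(x, cents, dims) == o for o, x in tests) / len(tests)
    print(f"{name}: {accs[name]:.0%}")
```

With the chemical channel alone the two classes are statistically indistinguishable, so accuracy hovers near chance; adding the mechanical channels separates them cleanly, mirroring the accuracy jump the researchers report.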
A chart detailing how TruffleBot processes odors. | Credit: Brown University
So, where exactly would one need an electronic nose? Everywhere. Devices with this chemical-sensing ability are being used in agriculture, military, and commercial applications to identify all sorts of environmental data. Essentially, electronic noses are useful in any industrial application that has odor involved.
Nasal Marketing
Did you know that it’s possible to trademark a smell in the United States? It’s not easy to accomplish, given the demanding requirements, but a few such trademarks exist. The fact that Play-Doh, a product whose smell is probably one of its most distinctive features, was granted a trademark for the scent only this year is testament to the difficulty of obtaining such a mark. However, the fact that some companies have found enough incentive to make sure only they can give your nose a particular chemical experience says a lot about that sense’s importance from a marketing perspective.
On one hand, utilizing smell in marketing might seem a little manipulative. After all, deliberately creating an air freshener that reminds someone of a beloved, deceased relative might not seem like a particularly ethical way to target their money. On the other hand, though, the motivation for marketers to use scent as a tool involves a sort of “chicken or the egg” question.
To summarize part of an article in the journal Sensors on the role scent plays in society and commerce, the aroma of products has a direct impact on their appeal to customers and thus the success of the product. In fact, a change in a product’s formula that impacts its smell can have, and often has had, devastating effects on sales. In other words, it’s not enough for a company to create a good product; it has to be a good-smelling product.
Hacking the Human Nose
It’s probably no surprise that the commercial industry has categorized consumer preferences when it comes to smells. As the first sense fully developed after birth, our noses link us to things like memories, emotions, and chemical communication (think pheromones). Is it any wonder, then, that businesses might be interested in the functionality of the organ that is doing the receiving?
Turns out, there’s an enormous amount of science behind “hacking” a nose. Identifying smells is more than just categorizing chemical mixtures as “floral” or “masculine”. The multitude of possible chemical combinations generates such a vast amount of data that scientists have implemented computer neural networks to analyze and classify it. Also, the actual mechanics of smelling something impact the way the smell is received and processed in the brain. Computers and scientific instruments come in handy there as well. To really get to the core of human response to an aroma, lots of non-human tools are needed, and this is essentially where the TruffleBot fits in the greater realm of “olfactory” science.
I think this is a Sumerian variant for “fruity”. | Credit: AstroJane’s bathroom collection.
More Than Just Your Money
Perhaps one of the most innovative uses found for electronic noses is in disease research. One of the limitations of human smell is its overall weakness. A dog’s sense of smell is around 40 times better than a human’s, and a bear’s is a whopping 2,100 times superior to ours. So when researchers learned that certain diseases give off certain odors, the human nose wasn’t exactly the first choice for sensing them.
An electronic nose makes good use of the simple fact that organic matter releases chemicals into the air. For example, when a plant has been impacted by a fungus, the changes brought on in the plant’s structure release what are called “volatile organic compounds” (VOCs). These VOCs can be detected by the sensors in an electronic nose, providing information on the type of disease present without destroying the plants being tested.
Humans have some amazing things to gain from electronic noses, too. Using sensors to process odors from VOCs, conditions like digestive diseases, kidney diseases, and diabetes, among many others, are all receiving scientific attention for non-invasive diagnosis by these types of devices. With improvements brought on by inventions like TruffleBot, especially combined with its low cost and resulting accessibility, a future involving remote diagnoses for any number of illnesses and diseases seems more possible every day.
Elon Musk
Tesla Full Self-Driving v14.2.1 texting and driving: we tested it
We decided to test it, and our main objective was to pin down more definitively when it would allow you to grab your phone and look at it without any nudge from the in-car driver monitoring system.
On Thursday, Tesla CEO Elon Musk said that Full Self-Driving v14.2.1 would enable texting and driving “depending on [the] context of surrounding traffic.”
I’d also like to add that, while Tesla had said back in early November that it hoped to allow this capability within one to two months, I still would not recommend you do it. Even if Tesla or Musk says it will allow you to do so, you should take into account the fact that many laws do not allow you to look at your phone. Be sure to refer to your local regulations surrounding texting and driving, and stay attentive to the road and its surroundings.
The Process
Based on Musk’s post on X, which said the ability to text and drive would be totally dependent on the “context of surrounding traffic,” I decided to try to find three levels of congestion: low, medium, and high.
I also tried, as best I could, to glance up at the road regularly, a natural reaction, but while the phone was in my hand I spent most of the time looking at the screen. I limited each look at the phone to a few seconds, five to seven at most. On local roads, I didn’t go over five seconds; once I got to the highway, I made sure no other cars were directly in front of me.
Also, at any time I saw a pedestrian, I put my phone down and was fully attentive to the road. I also made sure there were no law enforcement officers around; I am still very aware of the law, which is why I would never do this myself if I were not testing it.
I also limited the testing to no more than one minute per attempt.
I am fully aware that this test might ruffle some feathers. I’m not one to text and drive, and I tried to keep this test as abbreviated as possible while still getting some insight into how often it would require me to look at the road again.
The Results
Low Congestion Area
I picked a local road close to where I live at a time when I knew there would be very little traffic. I grabbed my phone and looked at it for no more than five seconds before I would glance up at the road to ensure everything was okay:
In full: the Low Congestion Area pic.twitter.com/6DqlBnekPn
— TESLARATI (@Teslarati) December 4, 2025
I still looked up at the road regularly; after hitting that five-second threshold, I would glance up, then look back down.
I had no nudges during this portion of the test. Traffic was extremely light, and other vehicles appeared only infrequently.
Medium Congestion Area
This area had significantly more traffic and included a stop at a traffic light. I still kept the consecutive time of looking at my phone to about five seconds.
I would quickly glance at the road to ensure everything was okay, then look back down at my phone, spending enough time looking at a post on Instagram, X, or Facebook to determine what it was about, before then peeking at the road again.
There was once again no alert to look at the road, and I started to question whether I was even looking at my phone long enough to get an alert:
In full: the Medium Congestion Area pic.twitter.com/gnhIfBVe6Q
— TESLARATI (@Teslarati) December 4, 2025
In past versions of Full Self-Driving, especially dating back to v13, even looking out the window for too long would get me a nudge, and the glances that triggered it, at a house or a view, were about the same length as my phone checks here, sometimes longer, sometimes shorter.
High Congestion Area
I decided to use the highway as a High Congestion Area, and it finally gave me an alert to look at the road.
As strange as it is, I felt more comfortable looking down at my phone for a longer amount of time on the highway, considering there is a lower chance of a sudden stop or a dangerous maneuver by another car, and I was traveling just 5 MPH over the limit in the left lane.
This is where I finally got an alert from the driver monitoring system, and I immediately put my phone down and returned to looking at the road:
In full: the High Congestion Area pic.twitter.com/K9rIn4ROvm
— TESLARATI (@Teslarati) December 4, 2025
Once I was able to trigger an alert, I considered the testing over. I think in the future I’d like to try this again with someone else in the car to keep their eyes on the road, but I’m well aware that we can’t always have company while driving.
My True Thoughts
Although this capability is apparently enabled based on what Musk said, I still do not feel totally comfortable with it. I would never consider shooting off a text or responding to messages just because Full Self-Driving is enabled, and there are two reasons for that.
The first is that if an accident were to happen, it would be my fault. And although it would be my fault, people would treat it as Tesla’s fault, based on how media headlines usually cover accidents involving these cars.
Secondly, I am still well aware that it’s against the law to use your phone while driving. In Pennsylvania, we have the Paul Miller Law, which prohibits people from even holding their phones, even at stop lights.
I’d feel much more comfortable using my phone if liability were taken off of me in case of an accident. I trust FSD, but I am still erring on the side of caution, especially considering Tesla’s website still indicates vehicle operators have to remain attentive while using either FSD or Autopilot.
Check out our full test below:
Elon Musk
Tesla CEO Elon Musk announces major update with texting and driving on FSD
“Depending on context of surrounding traffic, yes,” Musk said in regards to FSD v14.2.1 allowing texting and driving.
Tesla CEO Elon Musk has announced a major update with texting and driving capabilities on Full Self-Driving v14.2.1, the company’s latest version of the FSD suite.
Tesla Full Self-Driving, even in its most mature and capable versions, is still a Level 2 autonomous driving suite, meaning it requires attention from the vehicle operator.
You cannot sleep, and you should not take attention away from driving; ultimately, you are still solely responsible for what happens with the car.
The vehicles utilize a cabin-facing camera to enable attention monitoring, and if you take your eyes off the road for too long, you will be admonished and advised to pay attention. After five strikes, FSD and Autopilot will be disabled.
However, at the Annual Shareholder Meeting in early November, Musk announced that the company would review the safety statistics but aimed to allow people to text and drive “within the next month or two.”
He said:
“I am confident that, within the next month or two, we’re gonna look at the safety statistics, but we will allow you to text and drive.”
Does anyone think v14.3 will enable this? pic.twitter.com/N2yn0SK70M
— TESLARATI (@Teslarati) November 23, 2025
Today, Musk confirmed that the current version of Full Self-Driving, which is FSD v14.2.1, does allow for texting and driving “depending on context of surrounding traffic.”
Depending on context of surrounding traffic, yes
— Elon Musk (@elonmusk) December 4, 2025
There are some legitimate questions with this capability, especially as laws in nearly all U.S. states specifically prohibit texting and driving. It will be interesting to see how the legality plays out, because if a police officer sees you texting, they won’t know that you’re on Full Self-Driving, and you’ll likely be pulled over.
Some states prohibit drivers from even holding a phone when the car is in motion.
It is certainly a move toward unsupervised Full Self-Driving operation, but it is worth noting that, per Musk’s words, it will only allow the vehicle operator to do so depending on the context of surrounding traffic.
He did not outline any specific conditions under which FSD would allow a driver to text and drive.
News
Tesla Semi just got a huge vote of confidence from 300-truck fleet
The confidential meeting marks a major step for the mid-sized carrier in evaluating the electric truck for its regional routes.
The Tesla Semi is moving closer to broader fleet adoption, with Keller Logistics Group wrapping up a key pre-production planning session with the electric vehicle maker’s team this week.
Keller’s pre-production Tesla Semi sessions
Keller Logistics Group, a family-owned carrier with over 300 tractors and 1,000 trailers operating in the Midwest and Southeast, completed the session to assess the Tesla Semi’s fit for its operations. The company’s routes typically span 500-600 miles per day, positioning it as an ideal tester for the Semi’s day cab configuration in standard logistics scenarios.
Details remain under mutual NDA, but the meeting reportedly focused on matching the truck to yard, shuttle and regional applications while scrutinizing economics like infrastructure, maintenance and incentives.
What Keller’s executives are saying
CEO Bryan Keller described the approach as methodical. “For us, staying ahead isn’t a headline, it’s a habit. From electrification and yard automation to digital visibility and warehouse technology, our teams are continually pressure-testing what’s next. The Tesla Semi discussion is one more way we evaluate new tools against our standards for safety, uptime, and customer ROI. We don’t chase trends, we pressure-test what works,” Keller said.
Benjamin Pierce, Chief Strategy Officer, echoed these sentiments. “Electrification and next-generation powertrains are part of a much broader transformation. Whether it’s proprietary yard systems like YardLink™, solar and renewable logistics solutions, or real-time vehicle intelligence, Keller’s approach stays the same, test it, prove it, and deploy it only when it strengthens service and total cost for our customers,” Pierce said.