
News

How Starlink & T-Mobile’s partnership will impact 5G for the better for AI cameras

Credit: Smarter AI


Starlink and T-Mobile’s partnership could be revolutionary for cellular service, and Smarter AI CEO Chris Piche shared some thoughts on how the new partnership will impact 5G capability for the automotive industry. 

Chris, who has created services including AT&T TV, BBM Video, Poly Video, and STUN/TURN/ICE, discussed the effect of 5G on vehicles and telecommunications in an interview with Teslarati.

AI Cameras, Tesla, Starlink & autonomous vehicles

Before founding Smarter AI, the Top 40 under 40 entrepreneur’s company created a technology that BlackBerry licensed to enable voice and video calling. This gave Chris a front-row seat to witness the speed at which technology can transform markets. 

Smarter AI is a software platform for artificial intelligence cameras. 

“Smarter AI is to cameras as Android and iOS are to phones,” he told me. The company’s first vertical market is transportation, which includes vehicle camera systems such as dash cams and camera systems for larger vehicles. 


“The connection here with Tesla, Starlink, and T-Mobile is all around autonomous transportation. Today’s autonomous transportation, whether it’s in a Tesla or another kind of vehicle, all relies on line-of-sight situational awareness. In Tesla’s case, they rely in some cases exclusively and in other cases primarily on cameras and computer vision to try to understand what’s happening around the car.”

“Many of their competitors use LiDAR and don’t rely on cameras. But in both cases, it’s all based on line of sight. What they can actually see in a straight line.”

Seeing beyond the line of sight

Chris told me that one of the new technologies Smarter AI and other companies are developing is called vehicle-to-vehicle (V2V) or vehicle-to-everything (V2X) communication.

“These technologies enable cars to see beyond line of sight. Imagine you’re coming to an intersection and are planning to take a turn.”

Instead of waiting to see what’s ahead of you on the street you’re turning onto, the technology will tell you exactly what is ahead. There could be a stopped car, a pedestrian about to jaywalk, or some type of temporary obstruction that you are unaware of. 


“Imagine if there was a camera system located at the intersection. Imagine that as your vehicle is approaching that intersection, your vehicle could communicate with the camera and the camera could tell your vehicle that there’s some sort of obstacle.”

An autonomous vehicle would use this information to determine whether it can make that turn. This technology, Chris told me, relies on high-capacity, high-availability communications networks such as 5G. 
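The intersection-camera scenario Chris describes can be sketched roughly as follows. This is a hypothetical illustration, not any real V2X protocol: the message fields, function names, and thresholds are all invented for clarity.

```python
# Hypothetical sketch of the V2X flow described above: a roadside camera
# reports obstacles it sees past the intersection, and an approaching
# vehicle uses that report to decide whether its turn is clear.
from dataclasses import dataclass

@dataclass
class ObstacleReport:
    kind: str          # e.g. "stopped_car", "pedestrian"
    lane: str          # which lane the obstacle occupies
    distance_m: float  # distance past the intersection, in meters

def turn_is_clear(reports: list[ObstacleReport], target_lane: str,
                  min_gap_m: float = 30.0) -> bool:
    """Return True if no reported obstacle blocks the target lane
    within the minimum safe gap."""
    return all(r.lane != target_lane or r.distance_m >= min_gap_m
               for r in reports)

# The camera's report arrives before the obstacle is in line of sight:
reports = [ObstacleReport("stopped_car", lane="northbound", distance_m=12.0)]
print(turn_is_clear(reports, target_lane="northbound"))  # False: wait
print(turn_is_clear(reports, target_lane="southbound"))  # True: proceed
```

The point of the sketch is that the decision uses information the car’s own sensors could not have captured, which is why the reliability of the underlying network matters so much.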

 

Starlink & T-Mobile’s partnership could help with the challenges of implementing V2V and V2X

“One of the challenges with implementing technologies like V2V or V2X on top of 5G is that 5G deployments tend to be pretty good and getting better in large urban areas.” 

5G is pretty spotty in Baton Rouge, and for me personally, 4G LTE works faster than 5G even though there’s a tower across the street from me. Chris, who is in Las Vegas, said coverage is pretty good for a friend of his on AT&T; he isn’t on AT&T himself, and his coverage is as spotty as mine. 


“But this agreement with Starlink and T-Mobile has the promise or the potential to either eliminate or significantly reduce the spottiness in the 5G coverage and that will enable technologies that are designed on top of 5G such as V2V and V2X to work either more reliably in urban areas where 5G is already available but is a little bit spotty,” he said.

“It would also enable these technologies to work in other areas where there is no 5G. We think this is a really significant announcement in terms of the promise of autonomous transportation and bringing it much closer to being a reality.”

 

How V2V and V2X could improve Tesla’s Autopilot

Chris told me he’s been using Tesla’s Autopilot for around five years. 

“It’s so good. It’s to the point that for the things it can see, it’s a way better driver than I am,” he said, adding that whenever he drives for more than a couple of minutes, he engages Autopilot. However, there are a couple of things that it lacks. 


“It can’t see that far ahead and it lacks context. Sometimes, if there’s a car making a turn in front of my car, the Autopilot won’t understand the context that maybe this other car is momentarily in front of mine. And if I was driving, I’d keep driving. I wouldn’t take my foot off the accelerator or slam on the brakes unless I could see that something was going wrong with the turn that the other car was making.”

One way to improve Autopilot is through V2V or V2X, Chris explained. 

“In V2V, my car would talk to the car that’s making the turn in front of me and they would orchestrate the speed and direction of both of the cars so that the car in front of me could make its turn and my car could continue driving without slamming on the brakes.”

“With V2X, that would enable my car to talk to the cameras, traffic lights, and intersections to gain situational awareness about either other cars that aren’t equipped with the same technology or about other objects such as bicycles, pedestrians, or other obstacles on the street.”
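The V2V orchestration Chris describes can be sketched as a simple speed negotiation. This is an illustrative toy, not a real protocol: the numbers, names, and the assumption that the turning car broadcasts its time-to-clear are all hypothetical.

```python
# Illustrative V2V sketch: the following car receives, over V2V, how long
# the turning car needs to clear the conflict point, and eases off just
# enough to arrive after the lane is clear instead of braking hard.

def adjusted_speed(gap_m: float, turn_clear_time_s: float,
                   current_speed_mps: float) -> float:
    """Highest speed at which the following car reaches the conflict
    point no sooner than the turning car has cleared it."""
    max_speed = gap_m / turn_clear_time_s  # arrive exactly as the lane clears
    return min(current_speed_mps, max_speed)

# Turning car reports it needs 4 s to clear; the gap is 60 m.
speed = adjusted_speed(gap_m=60.0, turn_clear_time_s=4.0,
                       current_speed_mps=20.0)
print(speed)  # 15.0 m/s: ease off slightly rather than slam the brakes
```

Without the V2V report, the following car only knows there is something in its lane and has to brake defensively; with it, both cars can keep moving.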

Note: Johnna is a Tesla shareholder and supports its mission. 


Your feedback is important. If you have any comments or concerns, or see a typo, you can email me at johnna@teslarati.com. You can also reach me on Twitter at @JohnnaCrider1.

Teslarati is now on TikTok. Follow us for interactive news & more.

Johnna Crider is a Baton Rouge writer covering Tesla, Elon Musk, EVs, and clean energy, and supports Tesla's mission. Johnna also interviewed Elon Musk, and you can listen here.


Elon Musk

Starlink passes 9 million active customers just weeks after hitting 8 million

The milestone highlights the accelerating growth of Starlink, which has now been adding over 20,000 new users per day.


Credit: Starlink/X

SpaceX’s Starlink satellite internet service has continued its rapid global expansion, surpassing 9 million active customers just weeks after crossing the 8 million mark. 


9 million customers

In a post on X, SpaceX stated that Starlink now serves over 9 million active users across 155 countries, territories, and markets. The company reached 8 million customers in early November, meaning it added roughly 1 million subscribers in under seven weeks, or about 21,275 new users on average per day. 
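The growth arithmetic in the paragraph above checks out. Taking "under seven weeks" as roughly 47 days (an assumption; the article does not give the exact span):

```python
# Sanity-checking the article's figure: ~1 million new customers between
# the 8-million mark in early November and the 9-million announcement,
# spread over an assumed 47 days ("under seven weeks").
new_customers = 9_000_000 - 8_000_000
days = 47  # assumed span
print(round(new_customers / days))  # ~21,000+ per day, matching the article
```

A 47-day span gives about 21,277 per day, in line with the article's "about 21,275" and comfortably above the "over 20,000 new users per day" headline figure.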

“Starlink is connecting more than 9M active customers with high-speed internet across 155 countries, territories, and many other markets,” Starlink wrote in a post on its official X account. SpaceX President Gwynne Shotwell also celebrated the milestone on X. “A huge thank you to all of our customers and congrats to the Starlink team for such an incredible product,” she wrote. 

That growth rate reflects both rising demand for broadband in underserved regions and Starlink’s expanding satellite constellation, which now includes more than 9,000 low-Earth-orbit satellites designed to deliver high-speed, low-latency internet worldwide.


Starlink’s momentum

Starlink’s momentum has been building steadily. SpaceX reported 4.6 million Starlink customers in December 2024, followed by 7 million by August 2025, and 8 million in November. Independent data also suggests Starlink usage is rising sharply, with Cloudflare reporting that global web traffic from Starlink users more than doubled in 2025, as noted in an Insider report.

Starlink’s momentum is increasingly tied to SpaceX’s broader financial outlook. Elon Musk has said the satellite network is “by far” the company’s largest revenue driver, and reports suggest SpaceX may be positioning itself for an initial public offering as soon as next year, with valuations estimated as high as $1.5 trillion. Musk has also suggested in the past that Starlink could have its own IPO in the future. 


News

NVIDIA Director of Robotics: Tesla FSD v14 is the first AI to pass the “Physical Turing Test”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like routine.


Credit: Grok Imagine

NVIDIA Director of Robotics Jim Fan has praised Tesla’s Full Self-Driving (Supervised) v14 as the first AI to pass what he described as a “Physical Turing Test.”

After testing FSD v14, Fan stated that his experience with FSD felt magical at first, but it soon started to feel like routine. And just like smartphones today, removing it now would “actively hurt.”

Jim Fan’s hands-on FSD v14 impressions

Fan, a leading researcher in embodied AI who works on Physical AI at NVIDIA and spearheads the company’s Project GR00T initiative, noted that he was actually late to the Tesla game. He was, however, one of the first to try out FSD v14.

“I was very late to own a Tesla but among the earliest to try out FSD v14. It’s perhaps the first time I experience an AI that passes the Physical Turing Test: after a long day at work, you press a button, lay back, and couldn’t tell if a neural net or a human drove you home,” Fan wrote in a post on X. 

Fan added: “Despite knowing exactly how robot learning works, I still find it magical watching the steering wheel turn by itself. First it feels surreal, next it becomes routine. Then, like the smartphone, taking it away actively hurts. This is how humanity gets rewired and glued to god-like technologies.”


The Physical Turing Test

The original Turing Test was conceived by Alan Turing in 1950, and it was aimed at determining if a machine could exhibit behavior that is equivalent to or indistinguishable from a human. By focusing on text-based conversations, the original Turing Test set a high bar for natural language processing and machine learning. 

This test has been passed by today’s large language models. However, the capability to converse in a humanlike manner is a completely different challenge from performing real-world problem-solving or physical interactions. Thus, Fan introduced the Physical Turing Test, which challenges AI systems to demonstrate intelligence through physical actions.

Based on Fan’s comments, Tesla has demonstrated these intelligent physical actions with FSD v14. Elon Musk agreed with the NVIDIA executive, stating in a post on X that with FSD v14, “you can sense the sentience maturing.” Musk also praised Tesla AI, calling it the best “real-world AI” today.


News

Tesla AI team burns the Christmas midnight oil by releasing FSD v14.2.2.1

The update was released just a day after FSD v14.2.2 started rolling out to customers. 


Credit: Grok

Tesla is burning the midnight oil this Christmas, with the Tesla AI team quietly rolling out Full Self-Driving (Supervised) v14.2.2.1 just a day after FSD v14.2.2 started rolling out to customers. 

Tesla owner shares insights on FSD v14.2.2.1

Longtime Tesla owner and FSD tester @BLKMDL3 shared some insights following several drives with FSD v14.2.2.1 in rainy Los Angeles conditions with standing water and faded lane lines. He reported zero steering hesitation or stutter, confident lane changes, and maneuvers executed with precision that evoked the performance of Tesla’s driverless Robotaxis in Austin.

Parking performance impressed, with most spots nailed perfectly in a single attempt, including tight spots requiring sharp turns, without shaky steering. One minor offset happened only because another vehicle was parked over the line, which FSD accommodated by a few extra inches. In rain that typically erases road markings, FSD visualized lanes and turn lines better than humans, positioning itself flawlessly when entering new streets as well.

“Took it up a dark, wet, and twisty canyon road up and down the hill tonight and it went very well as to be expected. Stayed centered in the lane, kept speed well and gives a confidence inspiring steering feel where it handles these curvy roads better than the majority of human drivers,” the Tesla owner wrote in a post on X.

Tesla’s FSD v14.2.2 update

Just a day before FSD v14.2.2.1’s release, Tesla rolled out FSD v14.2.2, which was focused on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing. According to the update’s release notes, FSD v14.2.2 upgrades the vision encoder neural network with higher resolution features, enhancing detection of emergency vehicles, road obstacles, and human gestures.


New Arrival Options also let users select preferred drop-off styles, such as Parking Lot, Street, Driveway, Parking Garage, or Curbside, with the navigation pin automatically adjusting to the ideal spot. Other refinements include pulling over for emergency vehicles, real-time vision-based detours for blocked roads, improved gate and debris handling, and Speed Profiles for customized driving styles.
