News

Intel CEO believes autonomous driving data is the new oil

The LA Auto Show may be remembered more for its technologies than the actual cars it showcased. That’s because automakers and technology companies are no longer isolated; instead, they’re part of a new and fascinating picture in which, when it comes to the future of automobiles, “data is the new oil.”

Intel CEO Brian Krzanich, delivering a keynote address at the Automobility LA conference (part of the LA Auto Show) on Nov. 15, argued that the convergence of automobiles, data, and connectivity will become as valuable to the industry as oil is to automobiles today. Krzanich stated:

“We are in a time when technology is valued not just for the devices it produces, but for the experiences it makes possible. Data has the potential to radically change the way we think about the driving experience: as consumers, as automakers, as technologists, and as citizens of our communities.”

Intel’s interest in self-driving vehicles has grown over the last year, particularly since it acquired machine vision company Itseez, Inc. this past May. With Itseez in its portfolio, Intel is developing computer vision algorithms and implementations for automobiles, among other applications. Additionally, a partnership with BMW and Mobileye, the system-on-a-chip maker and former Tesla partner, may produce an open platform for designing autonomous vehicles.

“It’s not enough just to capture the data,” Krzanich argued. “We have to turn the data into an actionable set of insights to get the full value out of it. To do that requires an end-to-end computing solution from the car through the network and to the cloud — and strong connectivity.”

Krzanich’s keynote speech marks the first time that Intel, the semiconductor conglomerate, has had a prominent role at an automobile show. It follows an editorial he wrote earlier in the year outlining five key points to accelerate Intel’s transformation from a PC company into one that powers the cloud and billions of smart, connected computing devices. According to Krzanich:

  • The cloud is the most important trend shaping the future of the smart, connected world. Virtualization and software are increasingly defining infrastructure in the cloud and data center.
  • The many “things” that make up the PC Client business and the Internet of Things are made much more valuable by their connection to the cloud. The Internet of Things encompasses all smart devices – every device, sensor, console and any other client device – that are connected to the cloud. Everything that a “thing” does can be captured as a piece of data, measured real-time, and is accessible from anywhere. The biggest opportunity in the Internet of Things is its ubiquity.
  • Memory and programmable solutions such as FPGAs, which are integrated circuits that can be programmed in the field after manufacture, will deliver entirely new classes of products for the data center and the Internet of Things. Breakthrough innovations and products to the cloud and data center infrastructure are revolutionizing the performance and architecture of the data center, with growth for years to come.
  • 5G will become the key technology for access to the cloud: providing computing power to a device and connecting it to the cloud makes that device more valuable. The autonomous vehicle is a prime example: it needs connectivity to the cloud, and the cloud needs machine learning capabilities, with the most up-to-date algorithms and data sets, for the vehicle to operate safely. In this way, connectivity is fundamental to every one of the cloud-to-thing segments we will drive.
  • Moore’s Law, in which Intel co-founder Gordon Moore in 1965 noticed that the number of transistors per square inch on integrated circuits had doubled every year since their invention, will continue. This concept has fueled the recent technology revolution.
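Moore's observation describes exponential growth: a transistor count that doubles every fixed period. A minimal sketch of that projection is below; the starting count and the two-year doubling period are illustrative assumptions, not Intel figures (Moore's original 1965 observation was a yearly doubling, later revised to roughly every two years).

```python
def projected_transistors(initial_count: int, years: float,
                          doubling_period_years: float = 2.0) -> float:
    """Project a transistor count under Moore's Law-style growth:
    the count doubles once every `doubling_period_years`."""
    return initial_count * 2 ** (years / doubling_period_years)

# With an arbitrary starting count of 1 million transistors and a
# two-year doubling period, ten years means five doublings: 32x growth.
print(projected_transistors(1_000_000, 10))  # 32000000.0
```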

Krzanich elaborated at the Automobility LA conference that autonomous cars may soon utilize LIDAR, sonar, radar, GPS, and camera sensors. A single autonomous vehicle could generate approximately 4 terabytes (4,000 GB) of data daily. “Every autonomous car will generate the data equivalent of almost 3,000 people. Extrapolate this further and think about how many cars are on the road. Let’s estimate just 1 million autonomous cars worldwide — that means automated driving will be representative of the data of 3 billion people,” Krzanich said.
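Krzanich's figures can be sanity-checked with simple arithmetic. The per-person data rate of roughly 1.5 GB per day used below is an assumption (a figure Intel cited around that time), not stated in this article:

```python
GB_PER_CAR_PER_DAY = 4_000    # "approximately 4 terabytes" per car, per the keynote
GB_PER_PERSON_PER_DAY = 1.5   # assumed average per-person daily data generation

# How many people's worth of data does one autonomous car produce?
people_equivalent = GB_PER_CAR_PER_DAY / GB_PER_PERSON_PER_DAY
print(round(people_equivalent))  # 2667 -- "almost 3,000 people"

# Scale to Krzanich's estimate of 1 million autonomous cars worldwide.
autonomous_cars = 1_000_000
total = people_equivalent * autonomous_cars
print(f"{total / 1e9:.1f} billion people")  # 2.7 billion people
```

The result lands near the "3 billion people" in the quote, which suggests a per-person rate in this ballpark is what the keynote assumed.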

The keynote speech built on an Intel press statement that its Intel Capital division will invest $250 million over the next two years in autonomous vehicle technologies, targeting “areas where technology can directly mitigate risks while improving safety, mobility, and efficiency at a reduced cost; and companies that harness the value of the data to improve reliability of automated driving systems.”

Source: Brian Krzanich Editorial

Carolyn Fortuna is a writer and researcher with a Ph.D. in education from the University of Rhode Island. She brings a social justice perspective to environmental issues. Follow her on Twitter, Facebook, and Google+.


Elon Musk

SpaceX Starship Version 3 booster crumples in early testing

Photos of the incident’s aftermath suggest that Booster 18 will likely be retired.


Credit: SpaceX/X

SpaceX’s new Starship first-stage booster, Booster 18, suffered major damage early Friday during its first round of testing in Starbase, Texas, just one day after rolling out of the factory. 

Based on videos of the incident, the lower section of the rocket booster appeared to crumple during a pressurization test. Photos of the incident’s aftermath suggest that Booster 18 will likely be retired. 

Booster test failure

SpaceX began structural and propellant-system verification tests on Booster 18 Thursday night at the Massey’s Test Site, only a few miles from Starbase’s production facilities, as noted in an Ars Technica report. At 4:04 a.m. CT on Friday, a livestream from LabPadre Space captured the booster’s lower half experiencing a sudden destructive event around its liquid oxygen tank section. Post-incident images, shared on X by @StarshipGazer, showed notable deformation in the booster’s lower structure.

Neither SpaceX nor Elon Musk had commented as of Friday morning, but the vehicle’s condition suggests it is likely a complete loss. The setback is notable because Booster 18 is part of the Starship V3 program, which includes design fixes and upgrades intended to improve reliability. While SpaceX maintains a rapid Starship production line at Starbase, Booster 18 was expected to validate the improvements implemented in the V3 program.

Tight deadlines

SpaceX needs Starship boosters and upper stages to begin demonstrating rapid reuse, tower catches, and early operational Starlink missions over the next two years. More critically, NASA’s Artemis program depends on an on-orbit refueling test in the second half of 2026, a requirement for the vehicle’s expected crewed lunar landing around 2028.


While SpaceX is known for diagnosing failures quickly and returning to testing at unmatched speed, losing the newest-generation booster at the very start of its campaign highlights the immense challenge of scaling Starship into a reliable, high-cadence launch system. It would not be a surprise, however, if the company determines what happened to Booster 18 in short order.


News

Tesla FSD (Supervised) is about to go on “widespread” release

In a comment last October, Elon Musk stated that FSD V14.2 is “for widespread use.”


Tesla has begun rolling out Full Self-Driving (Supervised) V14.2, and with this, the wide release of the system could very well begin. 

The update introduces a new high-resolution vision encoder, expanded emergency-vehicle handling, smarter routing, new parking options, and more refined driving behavior, among other improvements.

FSD V14.2 improvements

FSD (Supervised) V14.2’s release notes highlight a fully upgraded neural-network vision encoder capable of reading higher-resolution features, giving the system improved awareness of emergency vehicles, road obstacles, and even human gestures. Tesla also expanded its emergency-vehicle protocols, adding controlled pull-overs and yielding behavior for police cars, fire trucks, and ambulances, among others.

A deeper integration of navigation and routing into the vision network now allows the system to respond to blocked roads or detours in real time. The update also enhances decision-making in several complex scenarios, including unprotected turns, lane changes, vehicle cut-ins, and interactions with school buses. All in all, these improvements should help FSD (Supervised) V14.2 drive more smoothly and comfortably.

Elon Musk’s predicted wide release

The significance of V14.2 grows when paired with Elon Musk’s comments from October. Responding to FSD tester AI DRIVR, who praised V14.1.2 for fixing “95% of indecisive lane changes and braking” and noted that it was time for FSD to go on wide release, Musk replied, “14.2 for widespread use.”

FSD V14 has so far received a substantial number of positive reviews from Tesla owners, many of whom say the system now drives better than some humans, being confident, cautious, and considerate at once. With V14.2 now rolling out, it remains to be seen whether the update also makes it to the company’s wider FSD fleet, which is still populated by a large number of HW3 vehicles.


News

Tesla FSD V14.2 starts rolling out to initial batch of vehicles

It would likely only be a matter of time before FSD V14.2 videos are posted and shared on social media.


Credit: Grok Imagine

Tesla has begun pushing Full Self-Driving (Supervised) v14.2 to its initial batch of vehicles. The update was initially observed by Tesla owners and veteran FSD users on social media platform X on Friday.

So far, reports of the update have been shared by Model Y owners in California whose vehicles are equipped with the company’s AI4 hardware, though it would not be surprising if more Tesla owners across the country receive the update as well. 

Based on the release notes of the update, key improvements in FSD V14.2 include a revamped neural network for better detection of emergency vehicles, obstacles, and human gestures, as well as options to select arrival spots. 

It would likely only be a matter of time before FSD V14.2 videos are posted and shared on social media.

Following are the release notes of FSD (Supervised) V14.2, as shared on X by longtime FSD tester Whole Mars Catalog.


Release Notes

2025.38.9.5

Currently Installed

FSD (Supervised) v14.2

Full Self-Driving (Supervised) v14.2 includes:

  • Upgraded the neural network vision encoder, leveraging higher resolution features to further improve scenarios like handling emergency vehicles, obstacles on the road, and human gestures.
  • Added Arrival Options for you to select where FSD should park: in a Parking Lot, on the Street, in a Driveway, in a Parking Garage, or at the Curbside.
  • Added handling to pull over or yield for emergency vehicles (e.g. police cars, fire trucks, ambulances).
  • Added navigation and routing into the vision-based neural network for real-time handling of blocked roads and detours.
  • Added additional Speed Profile to further customize driving style preference.
  • Improved handling for static and dynamic gates.
  • Improved offsetting for road debris (e.g. tires, tree branches, boxes).
  • Improved handling of several scenarios, including unprotected turns, lane changes, vehicle cut-ins, and school buses.
  • Improved FSD’s ability to manage system faults and recover smoothly from degraded operation for enhanced reliability.
  • Added alerting for residue build-up on interior windshield that may impact front camera visibility. If affected, visit Service for cleaning!

Upcoming Improvements:

  • Overall smoothness and sentience
  • Parking spot selection and parking quality