

Apple Car project continues evolving with larger test fleet & fresh new hire


Apple’s long-rumored “Project Titan,” also known as the Apple Car initiative, has gone through several changes over the years. While the Cupertino-based tech giant initially appeared to be focused on manufacturing its own vehicle, the company has since opted to concentrate on developing self-driving technologies instead. Apple CEO Tim Cook, for one, stated back in June 2017 that Apple was “very focused on autonomous systems.” Since then, the iPhone maker has gone all-in on the self-driving race. Today, the company commands the largest fleet of autonomous vehicles on California’s roads, surpassing even veterans in the field such as Waymo.

The growth of Apple’s self-driving fleet in California has been nothing short of astounding. According to a MacRumors report, information obtained from the California Department of Motor Vehicles revealed that Apple started with a fleet of 27 autonomous vehicles in January. By March, the tech giant was operating 45 self-driving vehicles. By mid-May, the company had 55 vehicles and 83 drivers in its fleet. Just two weeks after that, Apple’s fleet of self-driving cars had grown to 62 vehicles and 87 drivers. In comparison, Waymo has 51 autonomous vehicles testing on California roads.

Apple’s self-driving cars are easily identified by their hefty rooftop rigs, which house an array of cameras and advanced LiDAR equipment. The vehicles run Apple’s in-development autonomous driving software. Like some of Google’s fleet, Apple has chosen Lexus as its vehicle supplier of choice, using Lexus RX450h SUVs as its test cars. Each of Apple’s self-driving vehicles is deployed with a safety driver, as the company’s permit does not yet allow fully driverless operations.

Apart from growing its fleet, Apple is also growing the talent pool for its self-driving initiatives. Just recently, the company hired senior self-driving car engineer Jaime Waydo, who previously worked as an engineer at NASA’s Jet Propulsion Laboratory. What is particularly notable about Waydo’s background, however, is that she worked for Waymo before joining Apple’s self-driving car project. The former NASA engineer oversaw systems engineering at Waymo, while also helping Google’s self-driving car subsidiary make pivotal decisions about the driverless operations of its test fleet in Arizona.

Apple’s self-driving car project is among the company’s largest, most ambitious initiatives to date, with CEO Tim Cook dubbing it the “mother of all AI projects.” In a way, Tim Cook’s statement rings true, considering that Apple has made its name and established its reputation in consumer technology, not in automotive engineering. While the company does have experience with artificial intelligence and machine learning thanks to products like the iPhone and voice-activated assistants like Siri, a self-driving car system is an entirely different challenge. It is, after all, one that Google is still trying to master despite being in the industry since 2009, and one that Tesla is still learning despite having more than 150,000 vehicles in its fleet gathering data every day.


Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips, or even just to say hello, send a message to his email, simon@teslarati.com, or his handle on X, @ResidentSponge.



Tesla Model 3 gets perfect 5-star Euro NCAP safety rating

Tesla prides itself on producing some of the safest vehicles on the road today.


Credit: Tesla Singapore/X

Tesla prides itself on producing some of the safest vehicles on the road today. Based on recent findings from the Euro NCAP, the 2025 Model 3 sedan continues this tradition, with the vehicle earning a 5-star overall safety rating from the agency.

Standout Safety Features

As shown on the Euro NCAP’s official website, the 2025 Model 3 achieved scores of 90% for Adult Occupants, 93% for Child Occupants, 89% for Vulnerable Road Users, and 87% for Safety Assist. This rating, as per the Euro NCAP, applies to the Model 3 Rear Wheel Drive, Long Range Rear Wheel Drive, Long Range All Wheel Drive, and Performance All Wheel Drive.

The Euro NCAP highlighted a number of the Model 3’s safety features, such as its Active Hood, which automatically lifts during collisions to mitigate injury risks to vulnerable road users, and Automatic Emergency Braking System, which now detects motorcycles through an upgraded algorithm. The Euro NCAP also mentioned the Model 3’s feature that prevents initial door opening if someone is approaching the vehicle’s blind spot.

New Safety Features

In a post from its official Tesla Europe & Middle East account on X, Tesla noted that it is also introducing new features that make the Model 3 even safer than it is today. These include functions like head-on collision avoidance and crossing traffic AEB, as well as Child Left Alone Detection, among other safety features.

“We also introduced new features to improve Safety Assist functionality even further – like head-on collision avoidance & crossing traffic AEB – to detect & respond to potential hazards faster, helping avoid accidents in the first place. 

“Lastly, we released Child Left Alone Detection – if an unattended child is detected, the vehicle will turn on HVAC & alert caregivers via phone app & the vehicle itself (flashing lights/audible alert). Because we’re using novel in-cabin radar sensing, your Tesla is able to distinguish between adult vs child – reduced annoyance to adults, yet critical safety feature for kids,” Tesla wrote in its post on X.

Below is the Euro NCAP’s safety report on the 2025 Tesla Model 3 sedan.

Euroncap 2025 Tesla Model 3 Datasheet by Simon Alvarez on Scribd



USDOT Secretary visits Tesla Giga Texas, hints at national autonomous vehicle standards

The Transportation Secretary also toured the factory’s production lines and spoke with CEO Elon Musk.


Credit: Elon Musk/X

United States Department of Transportation (USDOT) Secretary Sean Duffy recently visited Tesla’s Gigafactory Texas complex, where he toured the factory’s production lines and spoke with CEO Elon Musk. In a video posted following his Giga Texas visit, Duffy noted that he believes there should be a national standard for autonomous vehicles in the United States.

Duffy’s Giga Texas Visit

As seen in videos of his Giga Texas visit, the Transportation Secretary seemed to appreciate the work Tesla has been doing to put the United States at the forefront of innovation. “Tesla is one of the many companies helping our country reach new heights. USDOT will be right there all the way to make sure Americans stay safe,” Duffy wrote in a post on X.

He also praised Tesla for its autonomous vehicle program, highlighting that “We need American companies to keep innovating so we can outcompete the rest of the world.”

National Standard

While speaking with Tesla CEO Elon Musk, the Transportation Secretary stated that other autonomous ride-hailing companies have been lobbying for a national standard for self-driving cars. Musk shared the sentiment, stating that “It’d be wonderful for the United States to have a national set of rules for autonomous driving as opposed to 50 independent sets of rules on a state-by-state rules basis.”

Duffy agreed with the CEO’s point, stating that, “You can’t have 50 different rules for 50 different states. You need one standard.” He also noted that the Transportation Department has asked autonomous vehicle companies to submit data. By doing so, the USDOT could develop a standard for the entire United States, allowing self-driving cars to operate in a manner that is natural and safe.



Tesla posts Optimus’ most impressive video demonstration yet

The humanoid robot was able to complete all the tasks through a single neural network.


Credit: Tesla Optimus/X

When Elon Musk spoke with CNBC’s David Faber in an interview at Giga Texas, he reiterated the idea that Optimus will be one of Tesla’s biggest products. Seemingly to highlight the CEO’s point, the official Tesla Optimus account on social media platform X shared what could very well be the most impressive demonstration of the humanoid robot’s capabilities to date.

Optimus’ Newest Demonstration

In its recent video demonstration, the Tesla Optimus team featured the humanoid robot performing a variety of tasks. These include household chores such as taking out the trash, using a broom and a vacuum cleaner, tearing a paper towel, stirring a pot of food, opening a cabinet, and closing a curtain, among others. The video also featured Optimus picking up a Model X fore link and placing it on a dolly.

What was most notable in the Tesla Optimus team’s demonstration was the fact that the humanoid robot was able to complete all the tasks through a single neural network. The robot also learned its actions directly from first-person videos of humans performing similar tasks. This system should pave the way for Optimus to learn and refine new skills quickly and reliably.

Tesla VP for Optimus Shares Insight

In a follow-up post on X, Tesla Vice President of Optimus (Tesla Bot) Milan Kovac stated that one of the team’s goals is to have Optimus learn straight from internet videos of humans performing tasks, including footage captured in third person or by random cameras.

“We recently had a significant breakthrough along that journey, and can now transfer a big chunk of the learning directly from human videos to the bots (1st person views for now). This allows us to bootstrap new tasks much faster compared to teleoperated bot data alone (heavier operationally).

“Many new skills are emerging through this process, are called for via natural language (voice/text), and are run by a single neural network on the bot (multi-tasking). Next: expand to 3rd person video transfer (aka random internet), and push reliability via self-play (RL) in the real-, and/or synthetic- (sim / world models) world,” Kovac wrote in his post on X.
