Tesla’s Full Self-Driving regulation process to be further examined by California DMV


The California Department of Motor Vehicles is taking a closer look at Tesla’s Full Self-Driving Beta program to determine whether it should fall under the DMV’s autonomous vehicle regulations. Those regulations already apply to other companies developing autonomous vehicles, including Waymo, Cruise, and Zoox.

According to a report from the Los Angeles Times, the California DMV is reviewing whether Tesla’s Full Self-Driving Beta program should be included in the group of companies that test autonomous vehicles on public roads. Tesla was previously excluded from that group because the functionality requires a human driver. Tesla maintains that its cars are not fully autonomous and that drivers must remain attentive at all times.

However, on January 5th, the California DMV told Tesla that it would “revisit” whether the automaker should be included in the autonomous vehicle testing regulations program. Inclusion would require Tesla to report all system failures and accidents to the California DMV, as other companies in the program are already required to do. The agency cited several reasons for reconsidering Tesla’s status, including open NHTSA investigations.

“Recent software updates, videos showing dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in this space prompted the reevaluation,” the DMV said in the letter to Lena Gonzalez, a Senator from Long Beach, California.

The Beta program is limited, and Tesla only allows drivers with a Safety Score of 98 or higher to receive the exclusive updates. Even so, some users with qualifying scores have not yet been added to the program, as Tesla remains extremely cautious about the rollout. Concerns about some of FSD Beta’s capabilities have prompted immediate fixes from Tesla. For example, October’s FSD 10.3 release produced inaccurate Forward Collision Warning alerts when no actual threat was present. Tesla issued an update to fix the bug less than 24 hours later.

Tesla periodically publishes safety statistics, but it has not done so since Q2 2021. That report cited one crash for every 4.41 million miles driven with Autopilot engaged, which includes Autosteer and active safety features. Tesla drivers not using Autopilot recorded one crash for every 1.2 million miles driven. By comparison, NHTSA data shows one accident in the United States for every 484,000 miles driven.
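To put those figures side by side, a quick back-of-the-envelope calculation (using only the numbers cited above; the variable names are illustrative) shows the relative crash rates implied by Tesla’s Q2 2021 report and the NHTSA average:

```python
# Miles driven per crash, as cited in Tesla's Q2 2021 safety report
# and NHTSA data (figures from the article above).
MILES_PER_CRASH_AUTOPILOT = 4_410_000     # Autopilot engaged
MILES_PER_CRASH_NO_AUTOPILOT = 1_200_000  # Tesla drivers without Autopilot
MILES_PER_CRASH_US_AVERAGE = 484_000      # U.S. average per NHTSA

# Higher miles-per-crash means fewer crashes per mile, so these ratios
# express how many times less frequent crashes were with Autopilot engaged.
autopilot_vs_no_autopilot = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_NO_AUTOPILOT
autopilot_vs_us_average = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_US_AVERAGE

print(f"Autopilot vs. non-Autopilot Teslas: {autopilot_vs_no_autopilot:.1f}x")  # ~3.7x
print(f"Autopilot vs. U.S. average:         {autopilot_vs_us_average:.1f}x")    # ~9.1x
```

Note that these are raw ratios from Tesla’s own reporting; they do not adjust for where Autopilot is typically used (mostly highways), which is a common caveat raised about such comparisons.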

I’d love to hear from you! If you have any comments, concerns, or questions, please email me at joey@teslarati.com. You can also reach me on Twitter @KlenderJoey, or if you have news tips, you can email us at tips@teslarati.com.

Joey Klender: Transportation Writer | Penn State Alum | Future World Series of Poker Bracelet Holder 🚀 🛰 ☀️ 🚘 🧠 🕳