OPINION: Tesla’s ‘Safety Score’ Beta needs broader terms for factoring your score

(Credit: Angel Wong/YouTube)

Tesla’s “Safety Score” Beta is one of the most impressive ideas to improve driving safety, in my opinion. An article from Model 3 owner and Tesla enthusiast Nick Howard explained that Tesla is essentially gamifying the act of driving, encouraging owners to drive in a manner that earns them a higher score. If you know anything about the Tesla community, you know that it is filled with die-hard fans who are half-jokingly battling it out for an elusive score of 100. While Tesla has outlined the ways that driving behaviors can affect the score for better or for worse, I believe that other scenarios need to be spelled out so owners are perfectly clear on how their score could be affected by their hobbies or driving style. While I disagree with Consumer Reports’ claim that the Safety Score is a bad idea (an argument that, frankly, makes little sense to me), I do believe that some owners are confused about what makes their score higher or lower, especially as many of them are attempting to enter the elusive Full Self-Driving Beta program.

If you’ve taken a peek at Tesla’s Support page that outlines the numerous factors that can affect a driver’s Safety Score, it seems pretty straightforward. There are cut-and-dried behaviors that tend to be recognized universally as “aggressive,” including tailgating, hard braking, and aggressive turning. Additionally, Forward Collision Warnings per 1,000 miles and forced Autopilot disengagements are counted among the behaviors that can affect your score, though these are exclusive to Tesla, of course, since they depend on the company’s own driver-assistance features.
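
For readers who like to think of the score in concrete terms, here is a minimal sketch of how per-mile factor rates could roll up into a single 0-100 number. The structure (measure each behavior as a rate, weight it, and subtract from a perfect score) mirrors the kind of formula Tesla describes on its Support page, but every weight and field name below is invented purely for illustration; none of it is Tesla’s actual coefficient set.

```python
# Illustrative only: the weights are invented and are NOT Tesla's published coefficients.
from dataclasses import dataclass

@dataclass
class DrivingFactors:
    fcw_per_1000_miles: float        # Forward Collision Warnings per 1,000 miles
    hard_braking_pct: float          # share of braking done above a harsh-deceleration threshold
    aggressive_turning_pct: float    # share of turns above a lateral-acceleration threshold
    unsafe_following_pct: float      # share of time spent tailgating
    forced_autopilot_disengagement: float  # 1.0 if Autopilot was force-disengaged, else 0.0

def illustrative_safety_score(f: DrivingFactors) -> float:
    """Roll factor rates into a 0-100 score; the weights here are hypothetical."""
    penalty = (
        0.5 * f.fcw_per_1000_miles
        + 1.2 * f.hard_braking_pct
        + 1.0 * f.aggressive_turning_pct
        + 1.5 * f.unsafe_following_pct
        + 5.0 * f.forced_autopilot_disengagement
    )
    return max(0.0, min(100.0, 100.0 - penalty))

# A driver with modest event rates lands in the low 90s under these made-up weights.
print(illustrative_safety_score(DrivingFactors(1.0, 2.0, 1.0, 3.0, 0.0)))  # 91.6
```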

It’s very self-explanatory: Drive safely and receive a higher score. But are there not instances where things could get a tad confusing for some drivers, especially those with scores just below the perfect 100 threshold?

One example that I saw over the weekend was from Richard Marrero, a Tesla owner who was curious about taking his vehicle to the local racetrack. When Tesla owners floor the accelerator the moment a stoplight turns green, it is understandable for their Safety Scores to take a hit. But what if that kind of driving happens on a closed circuit? Marrero may drive like a saint on the road but might want to push his vehicle to the limit at a local dragstrip or raceway. After all, why have a high-performance car with face-melting acceleration if you can’t test it from time to time?

There are other examples that could affect a Safety Score that are technically out of the driver’s control. In some instances, the action taken by the driver may actually be safer than the alternatives, yet it can still reduce the Safety Score. Tesla Joy, a Model 3 owner, encountered this predicament on October 1, according to a tweet. Her Safety Score was reduced due to hard braking at a “quick-changing yellow light.” Nearly everyone with a driver’s license can attest that some yellow lights are noticeably shorter than others, and the quick-changing yellow is one of the most polarizing moments in a daily drive. Some will tell you to just run through it; others will argue that the safer thing to do is slow down and stop. Whichever way you choose to handle this scenario, you are likely to encounter someone who would have handled the premature yellow light differently.

However, I don’t necessarily believe that there is a “wrong” way to handle it. The textbook approach, based on my 11-plus years behind the wheel, is to slow down and come to a stop, since a yellow light is, above all, a signal to slow down. Tesla Joy did exactly what most Learner’s Permit booklets describe, yet she was still docked points.

There are undoubtedly more examples of how Tesla could do a better job of explaining what actions are not favorable for the Safety Score system, and I would love to hear your thoughts or examples of things that have affected your score. Tesla did a wonderful job of outlining the most obvious actions that influence the Safety Score, but there are other questions that need to be answered so drivers are clear on what else could hurt their scores. After all, the wider the FSD Beta testing group is, the more data Tesla will obtain through its Neural Network.

Don’t hesitate to contact us with tips! Email us at tips@teslarati.com, or you can email me directly at joey@teslarati.com.

Joey has been a journalist covering electric mobility at TESLARATI since August 2019. In his spare time, Joey is playing golf, watching MMA, or cheering on any of his favorite sports teams, including the Baltimore Ravens and Orioles, Miami Heat, Washington Capitals, and Penn State Nittany Lions. You can get in touch with Joey at joey@teslarati.com. He is also on X @KlenderJoey. If you're looking for great Tesla accessories, check out shop.teslarati.com

Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco

“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.

Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle in FSD navigating pitch-black San Francisco streets during the city’s widespread power outage. 

While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only approach kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.

Tesla FSD handles total darkness

The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, but the vehicle drove confidently and cautiously, just like a proficient human driver.

Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post. 

Musk and the Tesla AI team’s posts highlight the idea that FSD operates a lot like an experienced human driver. Because the system does not depend on a large suite of sensors or a complicated symphony of inputs, its vehicles can, in theory, navigate challenging circumstances as they emerge. That certainly seemed to be the case in San Francisco.

Waymo’s blackout struggles

Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed fleets of Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do once the lights went out.

In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”

A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”

Waymo scrutinized after self-driving taxis cause traffic jams during SF blackout

It’s not farfetched to speculate that it would have been a doomsday scenario for Tesla had FSD behaved this way.

Credit: @AnnTrades/X

A power outage across San Francisco over the weekend forced numerous Waymo self-driving taxis to stop at darkened intersections and cause traffic blockages in multiple locations across the city. The disruption left riders stranded, frustrated drivers blocked, and city officials stepping in as the Alphabet-owned company temporarily suspended service amid the widespread gridlock.

Needless to say, it would likely have been a doomsday scenario for Tesla had FSD behaved in a similar way, especially if fleets of its robotaxis blocked traffic for numerous drivers. 

Power outage halts Waymo fleet

The outage knocked out electricity for tens of thousands of customers, leaving traffic signals dark across large parts of the city, as noted in a report from the New York Times. Waymo vehicles began stopping at intersections and remained stationary for extended periods, seemingly unable to operate. Tow truck operators worked through the night removing immobilized vehicles, while videos circulated online showing Waymos with hazard lights flashing as traffic backed up around them.

Waymo later confirmed that it had paused its Bay Area ride-hailing service after the San Francisco mayor’s office contacted the company about the congestion its vehicles were contributing to. Service began coming back online shortly after 3:30 p.m. local time, though some users still reported being unable to request rides. Waymo maintained that no injuries or accidents were reported during the outage.

Autonomous cars during emergencies

The incident surprised industry observers since autonomous vehicles are designed to function during signal outages and temporary connectivity losses. Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.” Experts suggested the problem may have been linked to the vehicles’ reliance on remote assistance teams, which help resolve complex situations the cars cannot handle independently.
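
To make the described behavior a bit more concrete, here is a toy decision sketch of the policy described above: treat a dark signal as an all-way stop, hold longer than usual to confirm the intersection, and eventually fall back to outside help. Everything in it, from the class names to the 20-second threshold, is invented for illustration and is not Waymo’s actual software.

```python
# Toy sketch of the behavior described above, not Waymo's actual software.
# All names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class IntersectionView:
    signal_powered: bool
    is_clear: bool           # no conflicting cross traffic detected
    has_right_of_way: bool   # this vehicle's turn under all-way-stop rules

def decide(view: IntersectionView, waited_s: float, max_wait_s: float = 20.0) -> str:
    if view.signal_powered:
        return "obey_signal"
    # Dark signal: behave like an all-way stop and confirm the intersection before moving.
    if view.is_clear and view.has_right_of_way:
        return "proceed"
    if waited_s < max_wait_s:
        return "hold"  # the "stationary longer than usual" case during a citywide outage
    return "request_remote_assistance"  # stalled vehicles ultimately needed outside help

# A dark but clear intersection where the car has the right of way -> "proceed"
print(decide(IntersectionView(signal_powered=False, is_clear=True, has_right_of_way=True), waited_s=3.0))
```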

“Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”

Tesla aims to combat common Full Self-Driving problem with new patent

Tesla writes in the patent that its autonomous and semi-autonomous vehicles are heavily reliant on camera systems to navigate and interact with their environment.

(Credit: @samsheffer/X)

Tesla is aiming to combat a common Full Self-Driving problem with a new patent.

One issue with Tesla’s vision-based approach is that sunlight glare can become a troublesome element of everyday travel. Full Self-Driving is certainly an amazing technology, but there are still things Tesla is aiming to figure out with its development.

Unfortunately, it is extremely difficult to get around this issue, and even humans need ways to combat it when they’re driving, as we commonly use sunglasses or sun visors to give us better visibility.

Cameras obviously do not have these options for fighting sun glare, but a recently published Tesla patent aims to address the issue with a “glare shield.”

Tesla writes in the patent that its autonomous and semi-autonomous vehicles are heavily reliant on camera systems to navigate and interact with their environment.

The ability to see surroundings is crucial for accurate performance, and glare is one element of interference that has yet to be confronted.

The patent describes a glare shield that utilizes “a textured surface composed of an array of micro-cones, or cone-shaped formations, which serve to scatter incident light in various directions, thereby reducing glare and improving camera vision.”

The patent was first spotted by Not a Tesla App.

The design of the micro-cones is the first piece of the puzzle in fighting excess glare. The patent says they are “optimized in size, angle, and orientation to minimize Total Hemispherical Reflectance (THR) and reflection penalty, enhancing the camera’s ability to accurately interpret visual data.”

Additionally, there is an electromechanical system for dynamic orientation adjustment, which will allow the micro-cones to move based on the angle of external light sources.
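
To picture what tuning the cone geometry against a reflectance target might look like, here is a deliberately simplified sketch: it sweeps a single tilt angle against a made-up glare proxy and keeps the tilt that deflects an off-axis light source the most. The cost function, angle range, and numbers are all assumptions made for illustration; the patent only names the objective of minimizing THR and the reflection penalty, not this math.

```python
# Deliberately simplified: the reflectance proxy below is invented for illustration and is
# not taken from Tesla's patent, which only states the goal of minimizing THR.
import math

def reflectance_proxy(cone_tilt_deg: float, light_angle_deg: float) -> float:
    # Assume glare into the lens is worst when the cone faces are square to the light
    # and falls off as they tilt away from it.
    misalignment = math.radians(abs(cone_tilt_deg - light_angle_deg))
    return math.cos(misalignment) ** 2  # 1.0 = worst glare, ~0.0 = mostly deflected

def best_tilt(light_angle_deg: float) -> int:
    # Sweep candidate tilt angles in 5-degree steps and keep the one with the least glare,
    # standing in for the "dynamic orientation adjustment" driven by the light's angle.
    candidates = range(0, 91, 5)
    return min(candidates, key=lambda t: reflectance_proxy(t, light_angle_deg))

print(best_tilt(20.0))  # -> 90: the sweep tilts the array as far from the 20-degree source as it can
```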

This is not the only approach Tesla is mulling to resolve issues with sunlight glare, as the company has also worked on two other ways to combat the problem. One that it has discussed is a direct photon count.

CEO Elon Musk said during the Q2 Earnings Call:

“We use an approach which is direct photon count. When you see a processed image, so the image that goes from the sort of photon counter — the silicon photon counter — that then goes through a digital signal processor or image signal processor, that’s normally what happens. And then the image that you see looks all washed out, because if you point the camera at the sun, the post-processing of the photon counting washes things out.”
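
As a rough way to see the “washed out” effect Musk is describing, consider the toy example below: a naive post-processing step that scales everything to the brightest pixel crushes the differences between dim objects once the sun enters the frame, while the raw counts still carry that detail. This is a simplified illustration with invented numbers, not Tesla’s actual imaging pipeline.

```python
# Toy model of the "washed out" effect: not Tesla's actual pipeline, numbers are invented.
import numpy as np

raw_counts = np.array([50.0, 55.0, 60.0, 1_000_000.0])  # three dim objects plus the sun

# Naive ISP-style path: linear scale to 8-bit based on the brightest pixel in the frame.
display = np.round(raw_counts / raw_counts.max() * 255)
print(display)  # [0. 0. 0. 255.] -> the three dim objects become indistinguishable

# Raw-count path: the relative differences between the dim objects are preserved.
log_counts = np.round(np.log10(raw_counts), 3)
print(log_counts)  # [1.699 1.74 1.778 6.] -> dim objects remain separable
```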

Future hardware iterations, like Hardware 5 and Hardware 6, could also integrate better solutions for the sun glare issue, such as neutral density filters or heated lenses, aiming to solve glare more effectively.
