The recently launched NHTSA probe into Tesla's Autopilot poses a greater threat to the automaker's reputation than to its financial situation, according to a new note from Morgan Stanley. Headed by analyst Adam Jonas, the note outlines the potential risks Tesla could face from the probe, which will investigate 11 instances of Autopilot-equipped cars crashing into emergency vehicles.
The note outlines four potential risk factors Tesla could face in a long, drawn-out battle to clear its name of any wrongdoing. In the early years of autonomous driving development, nearly every instance of a vehicle being involved in an accident has increased skepticism about the potential of future self-driving cars. Tesla Autopilot operates only at Level 2 autonomy, with Level 5 representing a fully self-driving vehicle, and the company explicitly states that drivers should remain alert while the system is engaged, never taking their eyes off the road.
However, this is not the case for every driver. While Tesla vehicles with Autopilot engaged are involved in accidents significantly less frequently than the national average, according to NHTSA statistics, Tesla crashes seem to attract more media attention than any other incident on the road. After all, we don't hear about every Chevy Malibu or Ford F-150 crash that occurs, but the false narrative that Teslas drive themselves still circulates in the form of catchy headlines and misleading articles.
Reputational risk is one of the most notable points in the Morgan Stanley note and the one the analysts expand on the most. “Vehicle safety actions and recalls (both voluntary and involuntary) are a fact of life in the auto industry, despite cars achieving greater capability and quality over time. While we are not making any changes to our Tesla model and price target at this time, the NHTSA serves as a reminder to investors about the importance of vehicle safety as we turn over greater portions of driving to software in a network,” the note said.
Of course, semi-autonomous driving, and automotive autonomy in general, is a young and relatively new technology. There are bound to be mistakes and incidents, just as there were with early automobiles. Accidents happen, but the early adopters of motor vehicles did not give up on making them better and safer, and the same will hold true as more companies enter the autonomous driving sector.
“The regulatory, legal, and moral/ethical nuances are difficult, if not impossible, to model. As human driving transitions to computer driving, accident frequency is expected to decline by 90% or more (some experts insist accident frequency must ultimately fall by greater than 99.9%). At the same time, accident ‘fault’ transitions from someone to something,” the note also states. “Just our view, but there is no moral equivalency between a ‘human-caused’ traffic fatality and a ‘system-caused’ traffic fatality. Over time, we believe the industry should be in position to provide vehicle data for 3rd party validation to prove the significant societal/health and safety benefits of autonomy.”
As noted yesterday in an interview with former Ford CEO Mark Fields, the NHTSA probe into Tesla could take up to 18 months. Morgan Stanley reiterates this point in its note, especially given Autopilot’s “high profile nature.” Unfortunately, Tesla’s flashy name and mainstream persona as a revolutionary automaker have put it at center stage for this kind of attention. Even commentators with a sizable platform may not understand all of Autopilot’s functionality or safety precautions. Still, many of the accidents are being described as the software’s fault, even though many are actually the result of driver error.
At the time of writing, TSLA stock was trading at $689.79, up over 3.6%.
Disclosure: Joey Klender is a TSLA Shareholder.