In a series of tweets over the weekend, Tesla Motors (NASDAQ:TSLA) Chairman Elon Musk tried hard to convey to his loyal followers the goodness of Tesla’s Autopilot and the thinking, or more appropriately the lack thereof, behind the SolarCity (NASDAQ:SCTY) acquisition.
While Mr. Musk was busy tweeting, there was yet another accident in which a customer attributed fault to the Autopilot feature. While the credibility of the customer’s claim has yet to be confirmed, debate continues to rage on the internet over the safety of Autopilot.
As is common whenever such a topic is discussed, Tesla fans routinely find the hapless drivers at fault for not knowing the limitations of the Autopilot. Here is a telling quote from the comment section of the linked article:
“Model S owner chiming in. I think we’re seeing people trust it too much. If I let autopilot have its way I’d be dead a dozen+ different times. You can’t be stupid about it – you have to judge road quality, lighting conditions, etc before turning it on. You also need to pay attention because sometimes it’ll lose the line and just veer suddenly.
The rash of accidents we’re seeing is almost certainly due to people putting too much trust in it. Notice, also, it’s almost always new vehicles and not older ones? New drivers unfamiliar with the limits of the technology trusting it too much.
This driver, on an undivided country road without cell phone reception? He/she had zero business using autopilot in such a circumstance. Inattention and too much trust caused this one, plain and simple.”
Note that this comment comes from a person who would have been dead a dozen-plus times using this “safe” system had he not intervened. It is telling that Tesla has such a loyal fan base for this apparent death trap. Unfortunately for Tesla, this type of fanatical customer base will not be the norm as the car reaches the mass market. Evolution, after all, ensures that this type of thinking does not prosper; the human race is known to pick proper survival strategies over time.
The real question for Musk here is this: Is Autopilot saving drivers or are fanatic customers saving Autopilot?
In the context of this question, note that the perspective of the user quoted above is not unique. There is an abundance of stories on the Tesla Motors Club forums and YouTube about drivers trying to stay safe in spite of Autopilot. The overwhelming number of such stories shows that customers are saving Autopilot, not the other way around.
Against this background comes Mr. Musk’s vacuous tweeting. We believe these tweets show that Mr. Musk is at risk of losing whatever little credibility he has left on the subject.
Several things are clear from these tweets:
– Mr. Musk is trying to redefine what “beta” means, which we believe is ill-advised.
– Mr. Musk is tacitly accepting that the current dataset is not statistically significant – an argument we have made in the past.
– Mr. Musk is not even sure whether this beta will be “sufficient” after the system hits 1 billion miles in about six months.
It is mind-boggling that the chief executive of a multibillion-dollar auto company claims his company’s products are safe based on a data set that he admits is not statistically significant, while also conceding that the sufficiency of testing will remain unknown even after the next six months.
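To see why the existing data set is statistically insignificant, consider a back-of-the-envelope Poisson calculation. The figures below are illustrative assumptions, not verified Tesla data: roughly 130 million Autopilot miles with one fatality, against a U.S. average of roughly one fatality per 94 million vehicle miles.

```python
import math

# Illustrative assumptions -- not verified Tesla data:
autopilot_miles = 130e6        # assumed Autopilot miles driven
miles_per_us_fatality = 94e6   # assumed U.S. average miles per fatality
observed_fatalities = 1

# Expected fatalities over the Autopilot mileage if Autopilot were
# exactly as safe as the national average (Poisson model).
lam = autopilot_miles / miles_per_us_fatality

# Probability of seeing this few fatalities (or fewer) purely by
# chance under the national rate: P(X <= 1) for X ~ Poisson(lam).
p_at_most_observed = math.exp(-lam) * sum(
    lam**k / math.factorial(k) for k in range(observed_fatalities + 1)
)

print(f"expected fatalities at national rate: {lam:.2f}")
print(f"P(<= {observed_fatalities} fatality | national rate): "
      f"{p_at_most_observed:.2f}")
# The probability comes out around 0.6 -- a single fatality over this
# mileage is entirely consistent with the national average rate.
```

In other words, under these assumptions a single data point cannot distinguish “safer than human drivers” from “about the same” or even “somewhat worse”; far more mileage, or a less rare metric than fatalities, would be needed.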
In the interest of not misleading the customers and investors who believed the company’s safety claims built on statistically insignificant data, can Tesla put out a press release that more accurately depicts the technology’s status?
We will not hold our breath waiting for Tesla to do the right thing here and will instead focus on what is wrong with this new set of tweets from Mr. Musk.
First, we need to clarify the definition of Autopilot. Autopilot is a set of technologies that come together to give the customer a sense that the car can drive itself in certain situations. The key pieces of Tesla’s Autopilot technology are: automatic emergency braking, adaptive cruise control, Autosteer, Autopark, and Summon.
Second, we want to address what it means to be in “beta.” In our recent article about the Tesla Autopilot situation, we said, “Forget beta, this system is likely not even ready for alpha.” Below, we discuss the nature of alpha versus beta testing and share our view on where Tesla likely stands in its product evolution. While there is wide latitude in what companies call an alpha test or a beta test, the following is a reasonable approximation of the differences:
| | Alpha | Beta | Tesla-specific remark |
|---|---|---|---|
| What is the main goal? | Identify and resolve key technical challenges | Identify and remove smaller annoyances | “Public beta” is used by many companies to accelerate bug finding |
| When does it happen? | When the product is functional but not ready for release | When the product is almost ready for launch and has minor bugs | Autopilot may be pre-alpha, as the hardware may not be capable of the intended functionality without LIDAR |
| Who does the testing? | Typically experts and in-house testers | Experts and knowledgeable / professional customers | Many Tesla customers are hard-core enthusiasts, but their competence to conduct beta testing is questionable |
| How long does it take? | Several months or years for complex products | Usually weeks or months before the final product is released | Tesla has been running the test for years, and it is doubtful it will reach production with the deployed hardware |
| What does it cost the customer? | Alpha systems are almost never sold to customers, as they are not considered customer-ready | Beta systems are typically deployed free, with the customer obliged to provide feedback | Tesla charges for a product that is, by definition, not ready; courts may find this increases Tesla’s liability |
The most telling aspect of the above discussion is that Tesla Autopilot is not in beta by the traditional definition. Note that Tesla has been claiming “beta” status for almost two years now, and even by Mr. Musk’s own tweets, the end point is not in sight. In reality, Mr. Musk’s tweets only confirm that the system may not even be in alpha, as there is considerable doubt about its sufficiency.
To call the current Autopilot system a beta and foist it on unsuspecting customers is irresponsible and reckless at best. From an investor viewpoint, the shortcomings of the current system are being somewhat hidden by the cult-like nature of the vocal customer base.
Investors should also take the commentary of Tesla drivers with a grain of salt. When it comes to driving, most people think their skills are above average. This illusory superiority, we can be certain, extends to Tesla beta testers. A look at the comment sections of Tesla-related Seeking Alpha articles and the Tesla Motors Club forums indicates that most Tesla drivers seem to think they understand the limitations of the Autopilot system. The irony is that this thinking persists even after a driver reports a new, never-before-seen problem on his or her $100K Tesla. The reality is that, given the immaturity of the system, Tesla’s documentation likely falls well short of what it should be. Many of the system’s limitations and problematic corner cases are likely undocumented. Consequently, the vast majority of drivers who think they understand the system likely have significant gaps in knowledge that they are not aware of.
Nothing demonstrates this better than the recent PR duel between Tesla and Mobileye, in which Mobileye claimed its system was not designed to prevent the recent Florida accident, whereas Tesla felt its IP should have detected the obstacle but did not for various reasons. The bottom line is that Tesla itself does not know the limitations of its half-baked system. We are not talking about small bugs, but ones big enough, literally, for a truck to drive through.
Based on information taken from internet forums, we strongly believe that by the time statistically significant data is available, it will be abundantly clear that Autopilot is unsafe.
By glibly going on a PR offensive in defense of a half-baked technology, Tesla is not only putting lives at risk but also doing considerable damage to its brand.
If Tesla really has the data to stand behind its claims, Elon Musk must stop blogging and put out one or more detailed whitepapers showing the math and statistics behind his claim that Autopilot is safer than human driving.
The key to presenting statistically significant data is to find metrics for which such data is likely to be available. We cannot wait for fatalities to provide statistically significant data, because fatalities are rare in modern cars, especially cars like the Model S and Model X with their 5-star crash ratings.
In the current context, instead of waiting for fatalities, it would be much better for Tesla to present comparative accident or incident data. For example, Tesla could publish comparative incident and accident rates for Teslas with Autopilot versus Teslas without it, shown with proper controls.
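Because incidents are far more common than fatalities, such a comparison could reach statistical significance quickly. A minimal sketch of the kind of test Tesla could run, using a normal-approximation z-test on the difference of two Poisson rates, is below. The incident counts and mileages are purely hypothetical placeholders, not real Tesla data:

```python
import math

def compare_incident_rates(n1, miles1, n2, miles2):
    """Compare two incident rates with a normal-approximation z-test
    on the difference of Poisson rates.
    Returns (rate1, rate2, z), rates in incidents per million miles."""
    r1 = n1 / miles1
    r2 = n2 / miles2
    # Standard error of the rate difference under the Poisson model:
    se = math.sqrt(n1 / miles1**2 + n2 / miles2**2)
    z = (r1 - r2) / se
    return r1 * 1e6, r2 * 1e6, z

# Hypothetical counts for illustration only -- not real Tesla data:
# 120 incidents over 40M Autopilot miles vs. 150 incidents over
# 100M non-Autopilot miles.
rate_ap, rate_no_ap, z = compare_incident_rates(120, 40e6, 150, 100e6)
print(f"Autopilot:    {rate_ap:.1f} incidents / M miles")
print(f"No Autopilot: {rate_no_ap:.1f} incidents / M miles")
print(f"z = {z:.1f}")  # |z| > 1.96 suggests the gap is not chance
```

With even modest fleet mileage, a real difference in incident rates would produce a decisive z-score, which is exactly why comparative incident data, rather than fatality counts, is the honest way to settle the safety question.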
Anecdotal data we have seen on internet forums indicates that the accident and incident reports for Teslas with Autopilot are off the charts: hundreds of reported incidents on roughly 25K cars. Our very rough estimate, based on this empirical evidence, is that Teslas using Autopilot have a MUCH WORSE safety record than those without. We believe Autopilot is not just marginally bad but plain bad when it comes to safety.
We are willing to change our view on the subject if Tesla can provide comparative data that can demonstrate otherwise. We challenge Mr. Musk to present controlled comparative statistical data to customers and investors.