Another Tesla Crash, What It Teaches Us

A Tesla crashed on a test drive while AutoPilot was engaged. Nobody got hurt. But the minor incident gives us plenty to think about.

Earlier this week, I came across a report about a Tesla AutoPilot crash. It appeared on the Tesla Motors Club's site, posted by a Tesla fan planning to purchase a car.

The user's post on the website's forum read:

I was on the last day of my 7-day deposit period. I was really excited about the car. So I took my friend to a local Tesla store and we went for a drive. AP [AutoPilot] was engaged. As we went up a hill, the car was NOT slowing down approaching a red light at 50 mph. The salesperson suggested that my friend not brake, letting the system do the work. It didn’t. The car in front of us had come to a complete stop. The salesperson then said, “brake!” Full braking didn’t stop the car in time and we rear-ended the car in front of us HARD. All airbags deployed. The car was totaled. I have heard from a number of AP owners that there are limitations to the system (of course) but, wow! The purpose of this post isn’t to assign blame, but I mention this for the obvious reason that AP isn’t autonomous and it makes sense to have new drivers use this system in very restricted circumstances before activating it in a busy urban area.

Thankfully, nobody got hurt. The post got no traction in the media; no reporter appears to be following it up (except for this publication). It could easily have been filed under the rubric "minor accidents," the sort of news we all ignore.


Germany Says ‘Nein’ to Tesla Calling Its Tech ‘Autopilot’

Tesla Motors' cars kinda drive themselves, but its Autopilot technology has caused some headaches for the company. Consumer Reports called on Tesla to recall the feature, calling it "too much autonomy, too soon." A Florida man died in a crash while using Autopilot.

And now, German transport minister Alexander Dobrindt asked Tesla to ditch the term “Autopilot,” arguing it can lead consumers to think the car is far more capable than it is.

Before going further, a note: Model S and Model X vehicles with Autopilot can stay in their lane and maintain a safe speed. The technology is meant for highways, where there are fewer obstacles to deal with, and requires drivers to keep their hands on the wheel and remain alert. Autopilot is designed to assist drivers, not replace them.

The electric automaker said nein to Dobrindt. In a statement, the company said it duly warns drivers of the system’s limits (whether drivers pay attention is another matter). And it defended using the word autopilot: “This is how the term has been used for decades in aerospace: to denote a support system that operates under the direct supervision of a human pilot.”


German Government Report Critical of Tesla Autopilot

Der Spiegel says the Transport Ministry called the feature a “considerable traffic hazard.”

The Autopilot function on Tesla's Model S car represents a "considerable traffic hazard," according to an internal report for Germany's Transport Ministry seen by the magazine Der Spiegel.

Experts at the Federal Highway Research Institute carried out tests on the electric car and criticized it on a number of points, the magazine reported on Friday.

For example, drivers are not alerted by the Autopilot system when the vehicle gets into a situation that the computer cannot solve, Spiegel cited the report as saying.

In addition, the car's sensors do not see far enough behind the vehicle during an overtaking maneuver, and the emergency brake performs inadequately, according to the report.

Spiegel said Transport Minister Alexander Dobrindt was aware of the report but did not want to take the model out of service.

The ministry told Reuters that a final evaluation had not yet been made and that further tests were being conducted.


Autopilot tech supplier fears Tesla is pushing safety envelope too far

The chief technology officer of a technology supplier that enables Tesla’s semi-autonomous Autopilot driving technology believes the carmaker is pushing the safety envelope too far.

“It is not designed to cover all possible crash situations in a safe manner,” Amnon Shashua, CTO and executive chairman at Israel-based Mobileye NV, told Reuters Wednesday.

Shashua’s comments came the same day a second fatal accident was revealed through a lawsuit filed against Tesla by the father of a man allegedly driving a Model S with the Autopilot engaged.


China Crash Raises Fresh Questions About Tesla’s Disclosures

Tesla is already the subject of a Securities and Exchange Commission investigation into whether it breached securities laws by failing to disclose the Florida crash prior to selling $2 billion worth of shares. The January crash would seemingly raise similar concerns.

With Tesla, still a minnow in the global auto market, soon facing new competition from deep-pocketed rivals including General Motors’ Chevy Bolt, its brand reputation has never been more important. The longer questions linger about the safety of its vehicles, and the company’s willingness to discuss potential issues, the more that reputation is put at risk.


Today’s news indicates the NHTSA probe into autopilot performance has widened


  • Today’s news indicates the NHTSA probe into autopilot performance has widened.
  • This widening has a chance to catch Tesla’s “big lie” regarding autopilot safety.
  • This article explains the big lie and why it might be caught. This might have demand implications.

“So far, so good”

We've all heard about the Tesla (NASDAQ:TSLA) fatal Autopilot crash, about which I wrote an article titled "A Detailed View Into The Tesla Autopilot Fatality," examining it in great depth.

We also all know that NHTSA launched a probe into it. Today, however, we got additional information on NHTSA's investigation that is still relevant. I will explain why.

Tesla’s Defense Of Autopilot

Tesla's defense of its Autopilot feature rests on repeating a big lie often enough. As the saying goes:

If you tell a lie big enough and keep repeating it, people will eventually come to believe it.

So what is this lie I speak of? Let me quote straight from Tesla's blog (emphasis mine):

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Why Is This A Lie?

It’s a lie because:

  • When the death occurred, Autopilot had been used for less than 100 million miles, not 130 million.
  • Autopilot is lane keeping and/or TACC (traffic-aware cruise control). The issue concerns only lane keeping and TACC used together, a combination that had certainly racked up only a fraction of those sub-100-million Autopilot miles.
  • Autopilot lane keeping is used only on safer-than-average roads and in safer-than-average conditions, so it makes no sense to compare it with accidents occurring on all roads and under all conditions.
  • Autopilot-capable cars are much newer than the average car, and thus much safer than the average car, Autopilot or not.
  • The Model S is a large four-door sedan, thus much safer than the average car, Autopilot or not.
  • Tesla's driver demographics are safer than average on account of age and income/intelligence, Autopilot or not (gender being the exception).
  • Tesla is sold in states and countries that are themselves safer than average (fewer fatalities per mile than average), Autopilot or not.
  • The average fatality rates include motorcyclists and pedestrians, who account for roughly one-third of fatalities in the US and more elsewhere in the world, making those rates directly non-comparable to a Tesla on Autopilot. Indeed, stripping out this effect would, all by itself, render the Model S on Autopilot less safe than average.

Take all of these effects together and it's obvious that a Tesla on Autopilot is much less safe than a comparable car driven by a comparable driver on comparable roads. Quite possibly it's more than an order of magnitude more dangerous, though quantifying that is a job for a PhD thesis.
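The last bullet can be sketched with back-of-envelope arithmetic, using only the figures quoted in this article (94 million miles per US fatality, roughly one-third of fatalities being motorcyclists and pedestrians, and Tesla's claimed 130 million Autopilot miles per fatality). The one-third share is the article's approximation, not an official statistic:

```python
# Back-of-envelope adjustment using the article's own figures.
us_miles_per_fatality = 94e6   # Tesla's cited US average, all road users
tesla_claimed = 130e6          # Tesla's claimed miles per Autopilot fatality
non_occupant_share = 1 / 3     # approx. share of motorcyclist/pedestrian deaths

# Removing ~1/3 of fatalities while keeping the same miles driven raises
# the miles-per-fatality figure for car occupants alone:
occupant_miles_per_fatality = us_miles_per_fatality / (1 - non_occupant_share)

print(round(occupant_miles_per_fatality / 1e6))  # ~141 (million miles)
print(occupant_miles_per_fatality > tesla_claimed)  # True
```

On this single adjustment, the average car occupant already goes about 141 million miles per fatality, better than Tesla's claimed 130 million, before any of the other corrections listed above are applied.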

However, There’s More To It

Up until now, I haven't said why the new NHTSA information, contained in a letter NHTSA delivered to Tesla, is relevant to this matter.

It's relevant for two reasons:

First, the letter speaks of a June 14 meeting with Tesla, yet Tesla only saw fit to release this information to the market on June 30. This is of minor importance.

Second, NHTSA is asking not just for information on this particular fatal Autopilot accident, but also on other accidents where Autopilot might have been involved. This is of major importance.

This widening of the investigation can make a lot of difference regarding Tesla's claims of Autopilot safety (the big lie), because Tesla, as always, has seemingly taken liberties with what constitutes an Autopilot-related accident.

Take, for instance, the accident in which Arianna Simpson crashed into the back of another car while on Autopilot. Simpson realized, too late, that the car was not going to slow down before hitting the car ahead, so she braked (too late, of course; the accident was already unavoidable at that point). So what does Tesla say regarding this? According to press reports at the time:

In contrast, Tesla says that the vehicle logs show that its adaptive cruise control system is not to blame. Data points to Simpson hitting the brake pedal and deactivating autopilot and traffic aware cruise control, returning the car to manual control instantly.

It isn't a great stretch, then, to believe that internally Tesla won't classify this as an "autopilot-related" accident. To use a metaphor: if Tesla's Autopilot were throwing you off a tall building, by this criterion it would still be perfectly safe and would cause no "autopilot-related" accidents, just as long as it disconnected moments before you hit the ground.

Obviously, accident data collected and classified according to such criteria would be mind-bogglingly misleading. It is here that NHTSA's widening of its Autopilot probe gains relevance: if Tesla doesn't pull the wool over NHTSA's eyes, NHTSA can get data on actual Autopilot safety (to which it would still have to apply numerous statistical adjustments). How could Tesla still pull the wool over NHTSA's eyes? By providing data only for accidents where Autopilot was still engaged at the moment of impact. Let's hope NHTSA is more intelligent than that.


NHTSA’s probe into autopilot performance has widened. Initially, it was seen as just covering the autopilot fatality, whereas now NHTSA is trying to get a better grip on autopilot performance in general.

This widened NHTSA probe might well expose Tesla's recurring untruth about the safety of its cars while on Autopilot. This isn't purely a Tesla issue, in the sense that other automakers with similar features will face similar problems, but Tesla is the only one actively trying to mislead the market regarding the feature's safety. Furthermore, Tesla makes Autopilot more of a selling point than other automakers do. Indeed, this whole episode is probably already having an actual impact on demand for Tesla cars, demand which, judging from Tesla's Model S sales during Q2 2016, Tesla can hardly spare.

As an illustration of how Tesla misleads regarding the safety of Autopilot, I can't resist reproducing the following slide:

Source: Eric’s feed on Twitter, based on IIHS data

This slide, too, is misleading for multiple reasons (including not being comprehensive), but it still illustrates the point quite well.


Secret Masterplan and a Reluctant Wall Street

Part 2, that is.

Tesla Motors CEO Elon Musk on Sunday tweeted his intention to soon publish part two of his “top secret Tesla masterplan” following an embattled several weeks for the Silicon Valley heavyweight.

The tweet, which read "Working on Top Secret Tesla Masterplan, Part 2. Hoping to publish later this week," comes amid inquiries into two crashes of Tesla cars, as well as ongoing questions regarding Musk's plan to combine his electric vehicle company, Tesla, with his solar panel company, SolarCity.

On July 1, a driver of a Tesla Model X in Pennsylvania crashed into a turnpike guard rail. The National Highway Traffic Safety Administration (NHTSA) announced last week it is investigating the crash “to determine whether automated functions were in use at the time of the crash.”

Tesla has said, “Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.”

The crash came on the heels of NHTSA’s disclosure it was investigating a May 7 crash in Florida that killed the driver of a Tesla Model S that was operating in Autopilot mode.

The Tesla Autopilot system allows the car to keep itself in a lane, maintain speed and operate for a limited time without a driver doing the steering.

Last month, Tesla proposed to buy SolarCity, of which Musk serves as chairman and principal shareholder. Musk expects the deal will help Tesla enter the market for sustainable energy for homes and businesses.

Jim Chanos of Kynikos Associates has called the proposed acquisition a "shameful example of corporate governance at its worst."
