Tesla’s Cheap Approach To Autopilot Might Not Lead Anywhere

Summary

  • More reports of Tesla autopilot users getting into accidents are emerging.
  • While many have already commented on Tesla’s handling of these incidents, I comment on the technical side of the autopilot technology Tesla is using.
  • Aside from product management issues, Tesla’s approach might not lead to fully autonomous driving in principle.

A self-driving car has killed someone. A Tesla car (NASDAQ: TSLA). Whether it was a product in beta, not a fully autonomous car, or a case of negligence on the driver’s end - this is the headline. Reports of further serious accidents keep emerging. Many contributors have already commented on how Tesla has handled the incident, in particular given the timing with regard to the company’s recent equity offering. For now, I am not interested in these issues. I am interested in Tesla’s technical product management of the autopilot. I will begin by discussing the narrative around the autopilot, both from a marketing and from a technical perspective.


The marketing perspective

To understand how Tesla got into this mess, the first question investors should ask themselves is why Tesla ever decided it needed an autopilot. After all, Tesla has had enough production problems on its plate, from delayed launches and reliability problems with the Model X to the Gigafactory.

The simple answer: branding. Tesla views itself as a technology company that solves humanity’s problems. It does not sell high-end luxury electric vehicles; it sells a 21st-century vision of technology bringing salvation. Its customers and shareholders buy participation in this vision first, products second. It’s about being on the right side of history.

Green energy is one aspect of that. Artificial intelligence is another big part of the narrative, and a feature like the autopilot helps brand Tesla as being at the forefront of both. Don’t just take it from me though. A quote from Jon McNeill (highlight mine):

(The autopilot feature) is one of the core stories of what’s going on here at Tesla.

That’s right, the autopilot is a story first, a product second. From the autopilot release:

The release of Tesla Version 7.0 software is the next step for Tesla Autopilot. We will continue to develop new capabilities and deliver them through over-the-air software updates, keeping our customers at the forefront of driving technology in the years ahead.

Now turning to the technical perspective, we have to keep in mind that this is the purpose of the autopilot. Tesla desperately wanted to be first to market here. How could it achieve that with substantially fewer resources, manpower and infrastructure than traditional car makers?

The technical narrative

Most readers here are familiar with the different levels of autonomous driving. Systems like Tesla’s autopilot are classified according to which tasks they take away from a human driver and whether the human driver is still in control. There are multiple approaches to achieving the different levels, and they depend on the end goal. For Tesla, that end goal is to deliver on the vision of cars that do not require a human driver at all (as stated by Elon Musk in recent conference calls). Google (NASDAQ:GOOG) (NASDAQ:GOOGL) has the same end goal and has been working on its autonomous car project for much longer than Tesla.

The two companies’ approaches are fundamentally different, and this is where we get back to the narrative. Both Google’s and Tesla’s cars use a wide array of sensors and radars to detect obstacles and map their environment. A high-level overview of Tesla’s autopilot functionality is given here.

The main difference is that Tesla relies on relatively cheap off-the-shelf technology from its partner Mobileye (NYSE:MBLY) and other OEMs, combined with in-house software. Google, on the other hand, has spent years collecting data from fewer cars with considerably more expensive sensor technology, notably LIDAR. LIDAR (light detection and ranging) surveys the environment by scanning it with laser light and measuring distances from the reflections. This is the device always seen on top of Google’s self-driving cars:

(Image: roof-mounted LIDAR unit on a Google self-driving car)
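
As a simple illustration of the time-of-flight principle behind LIDAR - my own sketch of the idea, not a description of any particular sensor’s processing pipeline - the distance to an obstacle follows directly from how long a laser pulse takes to come back:

    # Time-of-flight distance estimate: an illustrative sketch of the
    # principle only, not the pipeline of any particular LIDAR unit.
    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def distance_from_echo(round_trip_seconds: float) -> float:
        """The pulse travels to the obstacle and back, so halve the path."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A pulse returning after 200 nanoseconds corresponds to roughly 30 m.
    print(distance_from_echo(200e-9))  # ~29.98 metres

Sweeping such pulses across the scene many thousands of times per second is what produces the dense 3D point cloud that makes LIDAR so attractive - and so expensive.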

LIDAR is expensive - articles from last year cite $80,000 for a single sensor, and over $150,000 in sensor cost for every Google car.

LIDAR is, however, believed to be the best sensing solution if cost is not an issue. In the infamous accident, the white truck would likely have been detected. Tesla instead relies on a passive, optical approach - Mobileye’s approach: a single forward-facing camera whose images are interpreted by deep learning to discriminate objects in the environment.

Last year, Elon Musk said the following on the issue:

“I don’t think you need LIDAR. I think you can do this all with passive optical and then with maybe one forward RADAR,” Musk said at a press conference in October. “I think that completely solves it without the use of LIDAR. I’m not a big fan of LIDAR, I don’t think it makes sense in this context.”

Elon Musk’s statement here frankly does not make sense from a technical perspective - LIDAR is the superior technology, period. It would simply have been economically infeasible to put LIDAR on Tesla’s cars, and it would not have allowed Tesla to remotely enable the autopilot on existing cars with a software update.

Hence, Tesla came up with a narrative that sounds consistent, logical and as if it could be technically sound: while Tesla uses cheaper hardware, its superior software, continuously learning from the live data of tens of thousands of Tesla cars, will achieve the same outcome.

The New Yorker wrote a nice piece on this narrative. It frankly sounds like it was copied right out of marketing material:

Autopilot also gave Tesla access to tens of thousands of “expert trainers,” as Musk called them. When these de-facto test drivers overrode the system, Tesla’s sensors and learning algorithms took special note. The company has used its growing data set to continually improve the autonomous-driving experience for Tesla’s entire fleet. By late 2015, Tesla was gathering about a million miles’ worth of driving data every day.

A similarly superficial piece regurgitating this narrative has been blogged by Peter Diamandis, world-renowned AI researcher.

So the story goes: the underdog has shown up Google, of all technology companies, in much less time, with far fewer resources, and with cheaper hardware on its cars to boot. Like most things that sound too good to be true, this one is too. Here is why:

Computer vision is just not there

Readers might have noticed that many other car companies offer assisted driving, e.g. Mercedes (Intelligent Drive) and BMW (Driving Assistant Plus). Why do they not call it an autopilot, but instead use rather more modest feature names? Likely because they realize that there is a massive disconnect between these features and truly autonomous driving.

Now, I am not an expert in computer vision, but I do research in another subfield of deep learning. What I can tell you is this: the accuracy of deep learning models is simply not at the level you want for life-or-death situations. It’s just not there. It is maybe 99% even in very clinical settings, maybe 99.9%. This means that even in situations where you think (and Tesla said) it would work, it sometimes will not, e.g. because of unlucky combinations of weather, shadow/occlusion, reflection and color.
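
To see why even 99.9% is not reassuring, consider a back-of-envelope calculation. The per-frame accuracy and frame rate below are my own illustrative assumptions, not Tesla or Mobileye figures; only the one million fleet miles per day comes from the article quoted above. The point is how quickly small per-decision error rates compound across a fleet:

    # Back-of-envelope: how many frames does a 99.9%-accurate perception
    # system get wrong across a fleet? All inputs are illustrative
    # assumptions except the ~1M fleet miles/day cited above.
    per_frame_accuracy = 0.999        # assumed per-decision accuracy
    frames_per_mile = 30 * 60         # assumed ~30 fps at ~60 mph -> 1,800 frames/mile
    fleet_miles_per_day = 1_000_000   # roughly the figure cited for late 2015

    frames_per_day = frames_per_mile * fleet_miles_per_day
    expected_errors_per_day = frames_per_day * (1 - per_frame_accuracy)

    print(f"Frames evaluated per day:     {frames_per_day:,}")
    print(f"Misclassified frames per day: {expected_errors_per_day:,.0f}")

Under these assumptions the fleet sees on the order of a couple of million misclassified frames every day; the only question is how many of them coincide with a safety-critical moment.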

Releasing a deep learning-based system for chat bots, virtual assistants or a very narrow set of assisted-driving tasks is probably alright. Calling it an autopilot is irresponsible. The whole narrative of ‘the systems continuously get better from all the data’ is simply misleading, because deep-learning-based computer vision can only get so good right now. It creates a dangerous expectation in customers who do not understand the nature of the probabilistic models being used to make life-or-death decisions. This is entirely separate from the question of whether assisted driving makes sense - it does. The point is that these models are fundamentally different from the control-theory-based approaches used in other safety-critical technology, i.e. predictable analytical models that can be verified to give real-time guarantees.
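
To make that distinction concrete, here is a minimal sketch - my own illustration, not Tesla’s or Mobileye’s architecture. The first decision acts on a learned classifier’s probability, which says nothing about whether this particular frame is one of the rare misclassifications; the second is an analytical rule whose worst-case behavior can be derived on paper once its measured inputs are trusted:

    # Two styles of braking decision: a probabilistic, perception-driven one
    # and an analytical, control-theory-style one. Illustrative sketch only.

    def perception_decision(p_obstacle: float, threshold: float = 0.5) -> bool:
        """Brake if the model believes an obstacle is present.
        A confident 'no obstacle' output can still be wrong on the one frame that matters."""
        return p_obstacle > threshold

    def control_decision(distance_m: float, speed_mps: float,
                         max_decel_mps2: float = 6.0, margin_m: float = 5.0) -> bool:
        """Brake if the kinematic stopping distance exceeds the measured gap minus a margin.
        Given trusted inputs, this rule's behavior is bounded for every input."""
        stopping_distance_m = speed_mps ** 2 / (2.0 * max_decel_mps2)
        return stopping_distance_m > distance_m - margin_m

The sketch is simplistic, of course - in practice the ‘trusted distance’ itself has to come from a sensor, which is exactly why the quality of that sensor matters so much.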

More data does not close the fundamental gap between the accuracy required in safety-critical systems and the state of the art in computer vision. I am not alone in this opinion. From one of the most renowned deep learning experts in the world, back in May:

It’s irresponsible to ship driving system that works 1,000 times and lulls false sense of safety, then… BAM!

The German car makers have not called their systems autopilot because they know this has the potential to mislead customers. Google has not released its car because it understands the problem with humans in the loop.

Google has substantially more expertise, resources and manpower in deep learning, software and computer vision than Tesla. This alone should have given journalists and analysts a sliver of doubt about the narrative of Tesla leapfrogging Google through intelligent use of software.

Funnily enough, a Tesla test vehicle with LIDAR has been spotted in Palo Alto recently:

(Image: Tesla test vehicle fitted with a roof-mounted LIDAR unit)

In my opinion, Tesla may have to admit that its current autopilot technology will never reach reliable, fully autonomous driving. Tesla might have to turn to LIDAR as well. This would completely destroy the narrative of offering the autopilot in anything but high-end models. The Model 3 would never be able to drive autonomously at the promised price point.

Summary

Tesla might be forced to completely walk back its autopilot narrative. By trying to push a marketing story of being at the forefront of innovation, the company has endangered customers. Its off-the-shelf technology might be good enough for a particular set of situations. That set will certainly expand in the future, but it might never reach safe full autonomy unless Tesla changes its hardware stack.

Tesla has not shown up Google. Ultimately, I expect Google to come out on top with fully autonomous driving. Good things take time, and Google is taking the time to get this right.

Author: BayesianLearner
