The Science of Automated Cars and an Impatient Business

Deadly Tesla Crash Exposes Confusion over Automated Driving

Amid a federal investigation, ignorance of the technology’s limitations comes into focus

How much do we really know about what so-called self-driving vehicles can and cannot do? The fatal traffic accident involving a Tesla Motors car that crashed while using its Autopilot feature offers a stark reminder that drivers of such vehicles are in uncharted territory—and of the steep cost of that uncertainty.

The sensor systems that enable Tesla’s hands-free driving are the result of decades of advances in computer vision and machine learning. Yet the failure of Autopilot—built into 70,000 Tesla vehicles worldwide since October 2014—to help avoid the May 7 collision that killed the car’s sole occupant demonstrates how far the technology has to go before fully autonomous vehicles can truly arrive.

The crash occurred on a Florida highway when an 18-wheel tractor-trailer made a left turn in front of a 2015 Tesla Model S that was in Autopilot mode and the car failed to apply the brakes, the National Highway Traffic Safety Administration (NHTSA)—which is investigating—said in a preliminary report. “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” according to a statement Tesla issued last week when news of the crash was widely reported. Tesla says Autopilot is disabled by default in the cars, and that before they engage the feature drivers are cautioned that the technology is still in the testing phase. Drivers are also warned that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” the company says.

In addition to investigating exactly what happened in Florida, Tesla is looking into a Pennsylvania crash that took place July 1—the day after the NHTSA announced its probe—involving a 2016 Tesla Model X that may have been using Autopilot at the time of the accident, according to the Detroit Free Press. Tesla says there is no evidence Autopilot was in use during the mishap, although the Pennsylvania State Police contend the driver said the car was using the self-driving feature.

FAULTY VISION

Tesla’s description of the Florida accident suggests the car’s computer vision system was likely the crux of the problem, says Ragunathan Rajkumar, a professor of electrical and computer engineering in Carnegie Mellon University’s CyLab and a veteran of the university’s efforts to develop autonomous vehicles—including the Boss SUV that won the Defense Advanced Research Projects Agency (DARPA) 2007 Urban Challenge. Computer vision allows machines to detect, interpret and classify objects recorded by a camera, but the technology is known to be imperfect “by a very good margin,” Rajkumar says.

The paradox of computer vision systems is that in order to classify an object quickly, they generally use low-resolution cameras that do not gather large amounts of data—typically two megapixels, much less than the average smartphone camera. “The only way you can get high reliability is for [a self-driving technology] to combine data from two or more sensors,” Rajkumar says. Automobiles with self-driving features typically include cameras and radar as well as light detection and ranging (LiDAR).
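To make Rajkumar’s point about combining sensors concrete, here is a minimal, hypothetical sketch of sensor fusion—not Tesla’s, Mobileye’s or any vendor’s actual software. The class names, thresholds and readings are invented for illustration; the idea is simply that a braking decision requires independent sensors to agree.

```python
# Hypothetical illustration of simple sensor fusion -- not any vendor's actual code.
# Braking is triggered only when two or more independent sensors confidently
# report an object in the danger zone, the reliability argument Rajkumar makes.

from dataclasses import dataclass


@dataclass
class Detection:
    sensor: str        # "camera", "radar" or "lidar"
    distance_m: float  # estimated distance to the detected object, in meters
    confidence: float  # the sensor's own confidence, from 0.0 to 1.0


def should_brake(detections, min_sensors=2, min_confidence=0.6, danger_distance_m=30.0):
    """Return True only if at least `min_sensors` distinct sensors report a
    confident detection inside the danger zone."""
    agreeing = {
        d.sensor
        for d in detections
        if d.confidence >= min_confidence and d.distance_m <= danger_distance_m
    }
    return len(agreeing) >= min_sensors


# Example: a camera misreads a bright, low-contrast obstacle (low confidence),
# while radar still sees a large object ahead.
readings = [
    Detection("camera", distance_m=25.0, confidence=0.3),
    Detection("radar", distance_m=24.0, confidence=0.9),
]
print(should_brake(readings))  # False -- only one sensor is confident, so no braking
```

In this toy setup a single confident sensor is not enough to act, which is why, as Rajkumar notes, high reliability depends on fusing data from multiple complementary sensors rather than trusting any one of them.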

Tesla vehicles rely on artificial vision technology provided by Mobileye, Inc. The company’s cameras act as sensors to help warn drivers when they are in danger of rear-ending another vehicle, and in some instances can trigger an emergency braking system. The company noted in a statement last week that the Tesla incident “involved a laterally crossing vehicle,” a situation to which the company’s current automatic emergency braking systems are not designed to respond. Features that could detect this type of lateral turn across a vehicle’s path will not be available from Mobileye until 2018, according to the company. Mobileye co-founder Amnon Shashua acknowledged the accident last week during a press conference called to introduce a partnership with automaker BMW and chipmaker Intel that promised to deliver an autonomous vehicle known as the iNEXT by 2021. “It’s not enough to tell the driver you need to be alert,” Shashua said. “You need to tell the driver why you need to be alert.” He provided no details on how that should be done, however.

BUYER BEWARE

Consumers must be made aware of any self-driving technology’s capabilities and limitations, says Steven Shladover, program manager for mobility at California Partners for Advanced Transportation Technology (PATH), a University of California, Berkeley, intelligent transportation systems research and development program. “My first reaction [to the crash] was that it was inevitable because the technology is limited in its capabilities and in many cases the users are really not aware of what those limitations are,” says Shladover, who wrote about the challenges of building so-called self-driving vehicles in the June 2016 Scientific American. “By calling something an ‘autopilot’ or using terms like ‘self-driving,’ they sort of encourage people to think that the systems are more capable than they really are, and that is a serious problem.”

Vehicles need better software, maps, sensors and communication as well as programming to deal with ethical issues before they can truly be considered “self-driving,” according to Shladover. These improvements will come, but there will always be scenarios that such systems are not ready to handle properly, he says. “Nobody has a software engineering methodology today that can ensure systems perform safely in complex applications, particularly in systems with a really low tolerance for faults, such as driving,” he adds.

Vehicles with increasingly advanced self-driving features are emerging as a significant force in the automobile industry for several reasons: Some major carmakers see the technology as a way to differentiate their brands and tout new safety features in their higher-end models. There is also demand for systems that monitor driver alertness at the wheel, as well as for software that issues warnings when a vehicle strays from its lane and takes over braking when a vehicle is cut off; the market for such features will only grow as they become more affordable. Further, motor vehicle deaths were up by 7.7 percent in 2015, and 94 percent of crashes can be tied back to human choice or error, according to an NHTSA report issued July 1.

The quest to roll out new autonomous driving features will unfold rapidly over the next five years, according to a number of companies working on the technology. In addition to the BMW iNEXT, GM’s hands-free, semiautonomous cruise control is expected in 2017. The next Mercedes E-Class will come with several autonomous features, including active lane-change assist that uses a radar- and camera-based system. Much of the progress beyond those core self-driving capabilities will depend on federal government guidance. In March the U.S. Department of Transportation’s National Transportation Systems Center reviewed federal motor vehicle safety standards and concluded that increasing levels of automation for parking, lane changing, collision avoidance and other maneuvers are acceptable—provided that the vehicle also has a driver’s seat, steering wheel, brake pedal and other features commonly found in today’s automobiles.

Google had started down a similar road toward offering self-driving features about six years ago—but it abruptly switched direction in 2013 to focus on fully autonomous vehicles, for reasons similar to the circumstances surrounding the Tesla accident. “Developing a car that can shoulder the entire burden of driving is crucial to safety,” Chris Urmson, director of Google parent corporation Alphabet, Inc.’s self-driving car project, told Congress at a hearing in March. “We saw in our own testing that the human drivers can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”

Just as Google’s experiments caused the company to rethink its efforts to automate driving, Tesla’s accident, although not necessarily a setback, “will justifiably lead to more caution,” Rajkumar says.

Author: Larry Greenemeier
