Autopilot tech supplier fears Tesla is pushing safety envelope too far

The chief technology officer of the company whose vision technology underpins Tesla’s semi-autonomous Autopilot system believes the carmaker is pushing the safety envelope too far.

“It is not designed to cover all possible crash situations in a safe manner,” Amnon Shashua, CTO and executive chairman at Israel-based Mobileye NV, told Reuters Wednesday.

Shashua’s comments came the same day a second fatal accident came to light through a lawsuit filed against Tesla by the father of a man who was allegedly driving a Model S with Autopilot engaged.


The Science of Automated Cars and an Impatient Business

Deadly Tesla Crash Exposes Confusion over Automated Driving

Amid a federal investigation, ignorance of the technology’s limitations comes into focus

How much do we really know about what so-called self-driving vehicles can and cannot do? The fatal traffic accident involving a Tesla Motors car that crashed while using its Autopilot feature offers a stark reminder that such drivers are in uncharted territory—and of the steep cost of that uncertainty.

The sensor systems that enable Tesla’s hands-free driving are the result of decades of advances in computer vision and machine learning. Yet the failure of Autopilot—built into 70,000 Tesla vehicles worldwide since October 2014—to help avoid the May 7 collision that killed the car’s sole occupant demonstrates how far the technology has to go before fully autonomous vehicles can truly arrive.

The crash occurred on a Florida highway when an 18-wheel tractor-trailer made a left turn in front of a 2015 Tesla Model S that was in Autopilot mode and the car failed to apply the brakes, the National Highway Traffic Safety Administration (NHTSA)—which is investigating—said in a preliminary report. “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” according to a statement Tesla issued last week when news of the crash was widely reported. Tesla says Autopilot is disabled by default in the cars, and that before they engage the feature, drivers are cautioned that the technology is still in the testing phase. Drivers are also warned that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” the company says.

In addition to investigating exactly what happened in Florida, Tesla is looking into a Pennsylvania crash that took place July 1—the day after the NHTSA announced its probe—involving a 2016 Tesla Model X that may have been using Autopilot at the time of the accident, according to the Detroit Free Press. Tesla says there is no evidence Autopilot was in use during the mishap, although the Pennsylvania State Police contend the driver said the car was using the self-driving feature.

FAULTY VISION

Tesla’s description of the Florida accident suggests the car’s computer vision system was likely the crux of the problem, says Ragunathan Rajkumar, a professor of electrical and computer engineering in Carnegie Mellon University’s CyLab and veteran of the university’s efforts to develop autonomous vehicles—including the Boss SUV that won the Defense Advanced Research Projects Agency (DARPA) 2007 Urban Challenge. Computer vision allows machines to detect, interpret and classify objects recorded by a camera, but the technology is known to be imperfect “by a very good margin,” Rajkumar says.

The paradox of computer vision systems is that to classify an object quickly, they generally use low-resolution cameras that do not gather large amounts of data—typically two megapixels, far lower resolution than the average smartphone camera. “The only way you can get high reliability is for [a self-driving technology] to combine data from two or more sensors,” Rajkumar says. Automobiles with self-driving features typically include cameras and radar as well as light detection and ranging (LiDAR).
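Rajkumar’s point about multisensor reliability can be made concrete with a toy calculation. The sketch below is a minimal illustration, not any carmaker’s actual implementation; the function names, confidence values and the 0.9 alarm threshold are all assumptions chosen for the example. Treating each sensor’s detection confidence as an independent probability, a weak camera detection fused with a strong radar return can still clear a threshold that the camera alone never would.

```python
# Minimal sensor-fusion sketch. Illustrative only: the confidence values
# and the 0.9 alarm threshold are assumed numbers, not any production
# system's parameters.

def fuse_confidences(camera_conf: float, radar_conf: float) -> float:
    """Probability that at least one of two independent detections is real:
    1 - (1 - p_camera) * (1 - p_radar)."""
    return 1.0 - (1.0 - camera_conf) * (1.0 - radar_conf)

def obstacle_detected(camera_conf: float, radar_conf: float,
                      threshold: float = 0.9) -> bool:
    """Report an obstacle only when the fused confidence clears the threshold."""
    return fuse_confidences(camera_conf, radar_conf) >= threshold

# A white trailer against a bright sky: the camera barely sees it (0.2),
# but the radar echo is strong (0.95), so the fused system still reacts.
print(obstacle_detected(camera_conf=0.2, radar_conf=0.95))  # True (fused 0.96)
print(obstacle_detected(camera_conf=0.2, radar_conf=0.0))   # False (camera alone)
```

The design point is the one Rajkumar makes: a single sensor’s blind spot suppresses the whole warning, whereas independent sensing modes fail in different conditions.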

Tesla vehicles rely on artificial vision technology provided by Mobileye NV. The company’s cameras act as sensors to help warn drivers when they are in danger of rear-ending another vehicle, and in some instances can trigger an emergency braking system. The company noted in a statement last week that the Tesla incident “involved a laterally crossing vehicle,” a situation to which the company’s current automatic emergency braking systems are not designed to respond. Features that could detect this type of lateral turn across a vehicle’s path will not be available from Mobileye until 2018, according to the company. Mobileye co-founder Amnon Shashua acknowledged the accident last week during a press conference called to introduce a partnership with automaker BMW and chipmaker Intel that promised to deliver an autonomous vehicle known as the iNEXT by 2021. “It’s not enough to tell the driver you need to be alert,” Shashua said. “You need to tell the driver why you need to be alert.” He provided no details on how that should be done, however.
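The distinction between a rear-end threat and a laterally crossing vehicle is easier to see with the textbook time-to-collision calculation that longitudinal emergency-braking logic is often described as using. The sketch below is a simplified illustration under assumed names and thresholds, not Mobileye’s actual algorithm: range divided by closing speed flags a car ahead that the vehicle is gaining on, but it says nothing about an object entering the lane from the side until that object is already in the vehicle’s path.

```python
# Simplified longitudinal time-to-collision (TTC) sketch. Illustrative
# only, not Mobileye's algorithm: braking logic of this textbook form
# reacts to objects ahead that the ego vehicle is closing on.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact with an object directly ahead.

    closing_speed_mps is ego speed minus lead-object speed along the lane;
    returns infinity when the gap is not shrinking.
    """
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

def should_brake(range_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 2.0) -> bool:
    """Trigger emergency braking when TTC falls below an assumed threshold."""
    return time_to_collision(range_m, closing_speed_mps) <= ttc_threshold_s

# Rear-end scenario: a car 15 m ahead, closing at 10 m/s, gives TTC 1.5 s.
print(should_brake(15.0, 10.0))   # True

# A laterally crossing trailer presents no closing object directly ahead
# until it is already in the lane, so this longitudinal check alone never
# fires in time; detecting the crossing requires a different capability.
```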

BUYER BEWARE

Consumers must be made aware of any self-driving technology’s capabilities and limitations, says Steven Shladover, program manager for mobility at California Partners for Advanced Transportation Technology (PATH), a University of California, Berkeley, intelligent transportation systems research and development program. “My first reaction [to the crash] was that it was inevitable because the technology is limited in its capabilities and in many cases the users are really not aware of what those limitations are,” says Shladover, who wrote about the challenges of building so-called self-driving vehicles in the June 2016 Scientific American. “By calling something an ‘autopilot’ or using terms like ‘self-driving,’ they sort of encourage people to think that the systems are more capable than they really are, and that is a serious problem.”

Vehicles need better software, maps, sensors and communication as well as programming to deal with ethical issues before they can truly be considered “self-driving,” according to Shladover. These improvements will come, but there will always be scenarios that the systems are not ready to handle properly, he says. “Nobody has a software engineering methodology today that can ensure systems perform safely in complex applications, particularly in systems with a really low tolerance for faults, such as driving,” he adds.

Vehicles with increasingly advanced self-driving features are emerging as a significant force in the automobile industry for several reasons: Some major carmakers see the technology as a way to differentiate their brands and tout new safety features in their higher-end models. There is also demand for systems that monitor driver alertness at the wheel, as well as software that warns when a vehicle strays from its lane and takes over braking when a vehicle is cut off; the market for such features will only grow as they become more affordable. Further, motor vehicle deaths were up by 7.7 percent in 2015, and 94 percent of crashes can be tied back to human choice or error, according to an NHTSA report issued July 1.

The quest to roll out new autonomous driving features will unfold rapidly over the next five years, according to a number of companies working on the technology. In addition to the BMW iNEXT, GM’s hands-free, semiautonomous cruise control is expected in 2017. The next Mercedes E-Class will come with several autonomous features, including active lane-change assist that uses a radar- and camera-based system. Much of the progress beyond those core self-driving capabilities will depend on federal government guidance. In March the U.S. Department of Transportation’s National Transportation Systems Center reviewed federal motor vehicle safety standards and concluded that increasing levels of automation for parking, lane changing, collision avoidance and other maneuvers are acceptable—provided that the vehicle also has a driver’s seat, steering wheel, brake pedal and other features commonly found in today’s automobiles.

Google had started down a similar road toward offering self-driving features about six years ago—but it abruptly switched direction in 2013 to focus on fully autonomous vehicles, for reasons similar to the circumstances surrounding the Tesla accident. “Developing a car that can shoulder the entire burden of driving is crucial to safety,” Chris Urmson, director of Google parent corporation Alphabet, Inc.’s self-driving car project, told Congress at a hearing in March. “We saw in our own testing that the human drivers can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”

Just as Google’s experiments caused the company to rethink its efforts to automate driving, Tesla’s accident, although not necessarily a setback, “will justifiably lead to more caution,” Rajkumar says.


SpaceX Attempted Monopoly

The most bitterly contested issue in Senate floor debate on the National Defense Authorization Act this year was whether the Department of Defense would be allowed to continue using Russian rocket engines to lift national-security satellites into orbit.  The engines, called RD-180s, provide first-stage thrust for Atlas launch vehicles built by United Launch Alliance, a joint venture of Boeing and Lockheed Martin.

United Launch Alliance, or ULA, had a monopoly on Pentagon launches until last year, when the Air Force certified Elon Musk’s SpaceX as a competing provider of launch services to the military.  By that time, though, the whole issue of how national-security payloads reach orbit had gotten caught up in the reaction to Moscow’s annexation of Crimea. The annexation and continuous military provocations that followed convinced many legislators that Russian engines needed to be jettisoned from the military space program as soon as possible.

That set the stage for an aggressive lobbying campaign by SpaceX to assure itself of a sizable market share in military launches by securing passage of a ban on using the Russian engines after 2019.  That’s when the Pentagon’s current multiyear contract for launch services will be completed.  If Russian engines were banned, then ULA would be unable to use Atlas for military launches.  Its other launch vehicle, Delta, costs about 35% more than Atlas and has no hope of beating SpaceX’s Falcon 9 launch vehicle in a price-based competition.

(Disclosure: Boeing and Lockheed Martin both contribute to my think tank, and Lockheed is also a consulting client – which is why I know some of the details that follow.)

If SpaceX’s lobbying campaign had succeeded, Musk’s company would have ended up with a monopoly on pretty much any military payload it was capable of lifting into orbit.  However, the campaign failed — on June 13 the Senate approved an amendment to the authorization bill that would allow use of up to 18 more RD-180 engines through the end of 2022.  Since the House had already passed a bill that would allow further use of the Russian engines, it appears the great engine debate is over, and ULA won.

The question is why it won. Reporting by Politico points to the heavy lobbying muscle that ULA owners Boeing and Lockheed Martin deployed in support of their joint venture — which certainly is a big part of the explanation.  But that doesn’t give credit to Senators for the deliberative process by which they weighed the arguments of both sides in a very emotional debate.  The real reason ULA won was that its backers told a more convincing story than the SpaceX side did, and legislators responded to the merits of their case.

Because the space launch industry is concentrated in a handful of states, most of the Senators were not voting on the basis of constituent interests.  It being an election year, though, they were all mindful of how voters might view sending money to Russia for rocket engines.  Vladimir Putin’s government is despised on both sides of the aisle, and SpaceX backers hammered away at the theme that every engine purchase was lining the pockets of Putin’s corrupt inner circle.

It was a potent theme.  In fact, the whole idea of relying on Russian engines to get missile warning and intelligence gathering satellites into orbit sounded a little crazy, given the threatening moves Moscow was making.  But SpaceX wasn’t capable of lifting these heavy satellites into high orbits, and Atlas was the cheapest option available that could get the job done.  Indeed, if Delta were grounded for any reason, Atlas was the only option.

The main goal of the Boeing-Lockheed-ULA lobbying campaign was to get this message out to Senators, many of whom had not followed the engine debate closely.  That approach was a bit like the strategy that Boeing followed in making the case for reauthorization of the Export-Import Bank.  Although there was strong opposition to the bank, it was almost entirely based on emotion rather than analysis.  Boeing and other Ex-Im backers spent years educating legislators to the facts, and in the end won a resounding bipartisan victory.

The biggest challenge in implementing such a fact-based strategy is fielding enough talent to reach legislators and their staffs.  In previous years, ULA had sustained a relatively sparse presence on Capitol Hill, failing to grasp the threat that SpaceX posed to its business.  With co-owners Boeing and Lockheed Martin leaving ULA to fend for itself most of the time, SpaceX was able to win many converts to its side.  ULA managed to dodge the bullet of an outright ban on the engines last year through a last-minute change to must-pass spending legislation, but it was too close for comfort.

With Senate Armed Services Committee chairman John McCain leading the charge for SpaceX and furious at last year’s 11th-hour reversal of a ban, ULA’s prospects for prevailing in the 2017 budget cycle looked uncertain.  So CEO Tory Bruno assembled a much bigger lobbying team, which Boeing and Lockheed heavily supplemented.  Reports that SpaceX and ULA spent similar amounts on lobbying in the first quarter don’t count all the additional muscle that Boeing and Lockheed deployed.

Protecting their space-launch franchise wasn’t the only reason Boeing and Lockheed got engaged in a big way.  The rhetoric employed by SpaceX backers was so strident and emotional that the companies could not let it go unanswered.  Senator McCain’s remarks in floor debate described a greedy “military-industrial-congressional complex” that was subverting the nation’s interests to keep using Russian engines, in the process rewarding Putin’s “corrupt cronies” and helping fund Russian aggression in Syria.

This overheated language gave the companies additional incentive to get their story out on Capitol Hill.  They were aided in that effort by Senator Bill Nelson of Florida, a senior member of the Armed Services Committee who brokered a compromise that McCain was willing to sign on to because it included a specific date after which RD-180s could no longer be used for military launches.  That date gave ULA sufficient time to field a successor to Atlas powered by American engines, meaning SpaceX would not be able to fashion a launch monopoly.

In floor debate on the amendment he sponsored with Senator Cory Gardner of Colorado, Nelson pointed out that Russian engines represented less than one-third of one percent of the value of U.S. imports from Russia, hardly enough to enrich anybody in Moscow.  He also noted ULA’s record of more than a hundred successful launches without a failure since its inception, and explained why it would take until 2022 for a new launch vehicle to be available so that military access to space and competition for launch services could be guaranteed.

Senator Nelson’s arguments were so sensible that in the end his amendment passed on a voice vote.  In other words, it wasn’t close.  There’s no denying that lobbying by Boeing, Lockheed Martin and ULA contributed to the defeat of the engine ban that SpaceX so fervently sought.  But the arguments lobbyists laid out to anyone who would listen were so compelling that even some members of the Tea Party signed on.  SpaceX lost because its position potentially endangered U.S. access to space and undermined competition.


U.S. Safety Agency Investigates Another Tesla Crash Involving Autopilot

The nation’s top auto safety regulator said on Wednesday that it had begun an investigation of a second crash involving a Tesla Motors car equipped with Autopilot technology, a system designed to let vehicles drive themselves for brief periods.

In the nonfatal crash, a Tesla sport utility vehicle rolled over last Friday on the Pennsylvania Turnpike after hitting barriers on both sides of the highway. Safety officials continue to investigate a fatal Florida accident in May. The driver of the Pennsylvania vehicle told the Pennsylvania State Police that he was operating it in Autopilot mode.

The accidents have put new scrutiny on Tesla’s Autopilot system and raised questions about whether the technology, which the company describes as only an experimental “beta” test, lulls drivers into a false sense of security.

Although Tesla drivers have posted YouTube videos of themselves operating the vehicles completely hands-free — even climbing into the back seat — the company has cautioned that Autopilot is meant only as an “auto-assist” feature that requires drivers to keep their hands on or near the steering wheel at all times.

In the Florida crash, the first known fatality involving an autonomous driving system, the driver was killed when his Tesla Model S sedan struck a tractor-trailer that was crossing the roadway.

An account given on Wednesday by a witness to the Florida accident seemed to indicate that the Autopilot system kept the car moving at highway speed even after the impact sheared off the vehicle’s top and the Tesla passed under the trailer and continued down the road.

“The car came from underneath the trailer,” said the witness, Terence Mulligan, who was named in the Florida Highway Patrol’s accident report. Mr. Mulligan, who was driving behind the tractor-trailer at the time, said: “The top was gone. It went right by me.”

Mr. Mulligan, in a telephone interview, said he turned and followed the Tesla, which did not slow down until it had left the road, crashed through two fences and hit a utility pole. His account jibed with the accident report by the Florida Highway Patrol, which said the car was traveling at 65 miles per hour when it hit the tractor-trailer.

Tesla has declined to comment on the details of the Florida crash, which is still under investigation by state and federal officials.

In a statement on Wednesday about the Pennsylvania crash, Tesla said it had “no reason to believe that Autopilot had anything to do with this accident” based on the information it had collected so far.

The Pennsylvania crash involved a Model X S.U.V. heading east on the Pennsylvania Turnpike about 100 miles east of Pittsburgh. The car scraped a guardrail on the right side of the road, crossed the roadway and hit the concrete median. It then rolled over onto its roof and came to a stop in the middle of the road.

Tesla vehicles have the ability to send data back to the company about their condition and operation. In a statement, the company said it received an automated alert from the Model X in Pennsylvania on July 1 showing that its airbags had deployed. But the company said more detailed information about the car’s operation was not received, a situation that could happen if the car’s antenna was damaged in the crash.

Details of the Pennsylvania crash were first reported by The Detroit Free Press. The Pennsylvania State Police declined to release additional details because an investigation is in progress.

The Pennsylvania driver, Albert Scaglione, said by phone on Wednesday that he had just been released from the hospital and declined to comment on the accident. “My attorneys will be releasing a statement shortly,” he said.

A passenger in the car, Tim Yanke, was reportedly not seriously injured.

The National Highway Traffic Safety Administration said on Wednesday that it was collecting information from the Pennsylvania State Police, Tesla and the driver to find out whether automated functions were in use at the time of the crash.

The federal safety agency has also sent a crash investigation team to Florida to determine if the Tesla Autopilot system was at fault in the accident on May 7, which killed Joshua Brown, a 40-year-old from Canton, Ohio.

In the Florida crash, charges are pending against Frank Baressi, the driver of the tractor-trailer that was hit by Mr. Brown’s Tesla. But no final determination on charges will be made until the inquiry is complete, Sgt. Kim Montes, a spokeswoman for the Florida Highway Patrol, said on Wednesday.

“We know the truck made a left turn, and the person going straight has the right of way,” she said, referring to Mr. Brown’s vehicle.

Mr. Baressi, reached by phone, declined to comment.

In an interview with The Associated Press last week, Mr. Baressi said he had heard a Harry Potter movie playing from Mr. Brown’s vehicle, but also acknowledged, “He went so fast through my trailer, I didn’t see him.”

Sergeant Montes said, “We don’t know if that’s accurate,” adding, “We may never know, obviously, given the damage of the vehicle. In a very violent crash, there’s not going to be a lot left inside a car that could be playing.”

A DVD player and a laptop computer were recovered from Mr. Brown’s vehicle after the crash.

Questions have been raised about why neither Tesla nor the federal safety agency notified the public sooner about the May 7 accident, if only to caution other drivers about using Tesla’s Autopilot feature.

When the federal investigation of Mr. Brown’s accident was disclosed last week, Tesla released a statement saying it had informed the agency of the crash “immediately after it occurred.”

But in a statement on Tuesday, Tesla said it did not tell the federal agency about the accident until nine days later.

The Florida Highway Patrol contacted Tesla, seeking help in downloading data from the car’s so-called black-box recorder, seven to 10 days after the crash.

The company said in a statement that it was obligated to notify the National Highway Traffic Safety Administration on a quarterly basis when it became aware of a fatal accident involving a Tesla vehicle.

“As part of its regular ongoing communication and not as part of any formal process, Tesla told N.H.T.S.A. about the accident while it was still in the process of conducting its investigation,” Tesla said. “This happened on May 16.”
