SEC slaps Tesla’s wrist over accounting practices

The Securities and Exchange Commission has been writing Tesla some strongly worded letters about its accounting practices.

The regulator criticized Tesla for using “individually tailored” measurements in its August earnings release, according to a series of letters between Tesla and the SEC reported by The Wall Street Journal.

The regulator has been cracking down on the use of non-GAAP information in earnings releases, the report said.

The SEC letters also criticized Tesla for failing to make a “substantive case” for using non-GAAP figures. An expert consulted by the Journal noted the strong language of the letters.

Tesla announced in early October that it would stop reporting non-GAAP figures. The company had previously defended their use by saying non-GAAP figures better reflected its finances.

Tesla did not return a request for comment.

How Tesla and Elon Musk Exaggerated Safety Claims About Autopilot and Cars

The autonomous program isn’t meant for most types of driving, and the automaker compares its new luxury vehicles to older, cheaper cars.

After years of hype about its autonomous driving system, the facts are coming out about how safe Tesla and Autopilot really are.

Elon Musk’s company admits three of its vehicles have crashed while Autopilot was engaged, including one fatal accident in which Joshua Brown’s vehicle ran directly into a semi truck. In the wake of Brown’s death, Musk claimed Autopilot would save thousands of lives if it were deployed universally today. That bold claim rests on insufficient data and on flawed comparisons between Tesla vehicles and all other cars that stack the deck in the company’s favor.

Musk’s fame as a self-described “applied physicist” and serial entrepreneur has generated a seemingly inexhaustible public faith in his intelligence and leadership, but by promoting such a flawed statistical comparison of his firm’s controversial system he calls that confidence into question.

Autonomous drive technology was never mentioned in Musk’s ambitious 2006 “Top Secret Master Plan” for Tesla, and appears to have been tacked on to its world-changing mission in response to Google’s massively hyped (but still not yet commercially deployed) self-driving car program. The reveal of Google’s radical, zero-human-control autonomous concept car in 2014 suddenly made Tesla’s cars-of-the-future look decidedly passé. And in contrast to Google, whose communication about autonomous drive focused entirely on the technology’s long-term safety benefits, Musk’s fixation on beating the competition to market looks more like a rush to protect Tesla’s high-tech image than the pure pursuit of safer roads. When Morgan Stanley in 2015 boosted Tesla’s stock price target by 90 percent based on the projection that Musk would lead the auto industry into an “autonomous utopia,” it showed that even a perceived advantage in autonomous drive could help Tesla raise the huge amounts of capital it needs to continue growing.

Musk set about creating this perception in 2013, when he said that Autopilot would be capable of handling “90 percent of miles driven” by 2016. By mid-2014, Musk was promising investors that the system could handle all freeway driving, “from onramp to exit,” within 12 months of deployment. Tesla still has yet to make good on those bold claims, and competitors argue that Tesla’s relatively simple sensor hardware will never be capable of safely performing at such a high level of autonomy.

In the wake of Brown’s fatal crash, Tesla’s sensor supplier Mobileye clarified that its current technology is not designed to prevent a crash with laterally moving traffic like the turning semi truck Brown’s Model S struck. This week, Tesla revealed another Autopilot accident in which a Model X swerved into wooden stakes at 55 mph on a canyon road.

Experts have understood Autopilot’s hardware limitations for some time, but Tesla owners and investors clearly believed that Autopilot was either an autonomous drive system or something very close to it. Brown clearly believed that Autopilot was “autonomous” and described it as such in the description of a video that Musk shared on Twitter. So great was his apparent faith in Autopilot’s autonomous capabilities that he was reportedly watching a DVD at the time of his fatal crash. The extent of Autopilot’s true abilities, which wax and wane with each over-the-air software and firmware update Tesla pushes to the car, is hotly debated on Tesla forums, where even Musk’s most devout acolytes waver between extolling its miraculous powers and blaming drivers for their inattentiveness, depending on the circumstances.

This ambiguity and overconfidence in semi-autonomous systems is why Google refuses to develop anything less than fully autonomous systems, which require no driver input, a level of performance the search giant insists requires the extensive testing and expensive LIDAR sensors that Musk has often dismissed. It’s also why major automakers are developing driver-alertness monitoring systems that they say will keep drivers in semi-autonomous vehicles from relying too heavily on their vehicles’ limited capabilities.

Rather than waiting for LIDAR costs to come down or building in a complex driver-alertness monitoring system, Tesla has chosen to blame its faithful beta testers for any problems that pop up in testing. One Tesla owner described this Catch-22 after being told that a crash was her fault because she had turned off Autopilot by hitting the brakes: “So if you don’t brake, it’s your fault because you weren’t paying attention,” she told The Wall Street Journal. “And if you do brake, it’s your fault because you were driving.”

Confusion about Autopilot’s actual abilities has persisted even after the first fatal crash was reported, with Musk dismissing questions about the wreck by claiming that road fatalities would be cut in half “if the Tesla Autopilot was universally available.” The shaky statistical basis for these claims is just the latest in a long line of confusing and contradictory statements about Autopilot’s abilities.

Before any crashes involving Autopilot had been reported, Musk first claimed that Autopilot is twice as safe as a human driver, asserting that the average distance driven before an airbag deployment was twice as long for vehicles with Autopilot. This position ignores the fact that airbag deployments are not the same as fatalities or injuries. In fact, about 3,400 Americans (about 10 percent of total annual road deaths) die each year in frontal crashes where airbags are not deployed. Moreover, Musk was drawing on just 47 million miles driven in Tesla vehicles, or about 0.0016 percent of the more than 3 trillion miles driven by Americans in 2014.
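For readers who want to check the scale of that sample-size gap, here is a quick back-of-the-envelope calculation using only the figures quoted above:

```python
# Back-of-the-envelope check of the sample-size gap cited above.
tesla_miles = 47e6      # miles driven in Tesla vehicles, as cited by Musk
us_miles_2014 = 3e12    # total US vehicle miles traveled in 2014, as cited

share = tesla_miles / us_miles_2014 * 100
print(f"Tesla's sample is about {share:.4f}% of one year of US driving")  # ~0.0016%
```

A sample that small leaves enormous statistical uncertainty in any per-mile fatality estimate.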

The fact that Autopilot is only supposed to be activated on divided freeways, without cross-traffic, cyclists, or pedestrians, skews the statistics so far in its favor that any comparison with broader traffic safety statistics “has no meaning,” according to Princeton automotive engineering professor Alan Kornhauser. And because Tesla has blamed drivers for wrecks in which they turned off Autopilot features by attempting to steer or brake just before a crash, the company may also be limiting the number of incidents it reports as involving Autopilot.

In Tesla’s response to the recent fatality, the company emphasized that Autopilot is responsible for fewer fatalities (one per 130 million miles driven) than the overall U.S. fleet average (one per 94 million miles driven). The accuracy of the latter figure has been called into question by Sam Abuelsamid, who points out that vehicle occupant deaths in the U.S. occur only once every 135.8 million miles, making the average U.S. fleet slightly safer on average than Tesla’s one death per 130 million miles.
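Abuelsamid’s objection can be restated as a simple rate comparison. The sketch below uses only the figures quoted above (miles driven per fatality, so higher means safer); it is an illustrative calculation, not an official methodology:

```python
# Miles driven per fatality, per the figures quoted above (higher = safer).
tesla_autopilot = 130e6   # Tesla's claim: one death per 130 million Autopilot miles
us_fleet_all = 94e6       # Tesla's baseline: all US road deaths, incl. pedestrians
us_occupants = 135.8e6    # Abuelsamid: vehicle-occupant deaths only

# Against Tesla's chosen all-deaths baseline, Autopilot looks safer...
print(tesla_autopilot > us_fleet_all)    # True
# ...but against the occupant-only baseline, the conclusion reverses.
print(tesla_autopilot > us_occupants)    # False
```

The choice of baseline alone flips the conclusion, which is Abuelsamid’s point.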

In addition to being potentially inaccurate, the company’s statistics are also fundamentally lacking in comparability because they fail to account for significant differences in vehicle age and vehicle cost—two attributes that significantly affect vehicle safety.

Tesla compares its fleet of (on average) 2-year-old cars to the U.S. fleet and its average age of 11.5 years, almost as old as the automaker itself. Data from the Insurance Institute for Highway Safety show that even when vehicles from 2004 were new, they were much less safe than modern vehicles are. At the time of manufacture, these vehicles were responsible for 79 fatalities per million registered-vehicle years—by comparison, the 2011 model-year vehicles cut the fatality rate by nearly two-thirds, to just 28 per million vehicle years. This yawning gap in safety is even wider now, since the 2004 vehicles are now over a decade old, and have likely racked up over 100,000 miles. Tesla holds up this aging U.S. fleet (including motorcycles, which are 26 times as likely to be involved in a fatal accident as passenger vehicles) as a reasonable safety comparison for its fleet.
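The “nearly two-thirds” figure follows directly from the IIHS rates quoted above, as a quick check confirms:

```python
# IIHS driver-death rates per million registered-vehicle years, as quoted above.
rate_2004_new = 79   # 2004 model-year vehicles, when new
rate_2011 = 28       # 2011 model-year vehicles

reduction = (rate_2004_new - rate_2011) / rate_2004_new
print(f"fatality rate cut by {reduction:.0%}")  # 65%, i.e. nearly two-thirds
```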

Tesla’s comparison also overlooks the dramatic cost difference between its luxury cars and an “average” vehicle on the road in the U.S. today.

The lowest-priced Model S starts at $66,000, about the same as an entry-level Audi A7, and climbs to around $135,000 with options, a similar price point to a fully optioned Lexus LS 600h hybrid. The Model X starts around $83,000 and tops out around $145,000, a price range bracketed by the Porsche Cayenne SUV on the low end and a Porsche 911 Turbo on the high end. Vehicles in this price range are engineered to be some of the safest vehicles on the road, and many come with a panoply of advanced driver-assistance safety functions, like Automatic Emergency Braking and Adaptive Cruise Control, making them the most direct competition for Tesla’s vehicles. Comparing a vehicle in this rarefied segment to the overall market, where the average new vehicle costs less than $35,000, is beyond disingenuous.

Given Tesla’s sophistication and resources—and its strong incentive to make a robust case for the safety of its vehicles—it is surprising that the company wasn’t able to put together a more compelling comparison.

If federal safety regulators find Musk’s public statements about Autopilot’s abilities similarly misleading, his bold “public beta test” could set back more than just his image and Tesla’s; it could raise suspicions that compromise the development of autonomous drive technology more broadly.

The Case for the SEC Investigation on Tesla

In an SEC filing made just three days after the fatal accident, Tesla warned investors that exactly this kind of crash could be material.

Apparently, a crash related to Tesla’s autopilot feature was material, before it wasn’t.

On Tuesday, Fortune reported that Elon Musk and Tesla Motors may have withheld a material fact from shareholders when the company failed to disclose that a driver had died while using the semi-self-driving “autopilot” feature in one of its vehicles. The fatal accident, the first known case related to the autopilot feature, occurred 11 days before Musk and Tesla sold $2 billion in shares in an offering on May 18. Yet the company made no mention of the crash in its offering documents. News of the accident didn’t come out until last week, when it was reported by federal highway authorities, six weeks after the offering.

Musk told Fortune via email that the deadly crash wasn’t “material” information that Tesla investors needed to know. After the article appeared on Tuesday, Musk called the article “BS” in a tweet and said that the fact that Tesla’s shares rose on Friday following the accident’s disclosure showed that the accident wasn’t material.

But back in early May, in an SEC filing, Tesla said exactly the opposite of what its founder is saying now. The company warned investors that a fatal crash related to its autopilot feature, even a single incident, would be a material event to “our brand, business, prospects, and operating results.” The disclosure said that the company may face product liability claims due to “failures of new technologies that we are pioneering, including autopilot in our vehicles,” adding that “product liability claims could harm our business, prospects, operating results and financial condition.”

What’s more, Tesla noted that it does not carry insurance against such events. Were the company to be sued because of a death related to, say, its autopilot feature, it could result in a “substantial monetary award” that would have to be paid from “company funds, not by insurance.”

The company made the disclosure on May 10, just three days after the fatal autopilot accident, and likely after the company knew about the crash. On Tuesday, following Fortune’s article, Tesla disclosed that it learned about the accident “shortly” after it occurred and told authorities at the National Highway Traffic Safety Administration (NHTSA) about it nine days after it happened, which was two days before the company’s $2 billion stock offering.

The offering consisted of Tesla selling approximately $1.7 billion of new stock while Musk exercised stock options for over 5.5 million shares that were set to expire on Dec. 3, 2016 and then sold nearly 2.8 million of those shares. Musk’s sale generated nearly $600 million in proceeds for the Tesla CEO, which, according to the company, he used for tax purposes.
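The reported proceeds and share count imply an average sale price of roughly $214 per share. This is an illustrative back-calculation from the figures quoted above, not a number taken from the filing:

```python
# Implied average price of Musk's share sale, from the figures quoted above.
proceeds = 600e6       # ~$600 million in proceeds, as reported
shares_sold = 2.8e6    # ~2.8 million shares sold, as reported

implied_price = proceeds / shares_sold
print(f"implied average sale price: ${implied_price:,.0f} per share")  # ~$214
```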

Besides the stock offering, the lack of disclosure could be a problem for a proposed merger deal with SolarCity. The solar energy company, of which Musk is also a controlling shareholder, said in June it was evaluating a $2.8 billion acquisition offer from Tesla. It is not clear whether Musk and Tesla disclosed the crash to SolarCity’s board. There was no mention of the accident in a conference call with investors following the announcement of the deal. And no prospectus has been filed on the deal, which shareholders have yet to approve through a vote.

Companies are legally required to disclose any facts they know that may affect the price of their stock. Tesla has marketed the autopilot feature vigorously as safe and important to its customers.

John Coffee, a law professor at Columbia University and an expert in securities law, says he believes the company should have disclosed news of the death earlier, and that the stock’s failure to fall on news of the crash does not prove the event was immaterial or that it shouldn’t have been disclosed. Since Tesla’s shares are under the influence of a large controlling shareholder, namely Musk, he says, you shouldn’t read too much into its stock movements. “I think it is material as that death has changed both the public’s and the insurance industry’s perception of self-driving cars,” wrote Coffee in an email to Fortune.

Fortune has reached out to Musk and Tesla PR for comment and will update this story if either responds.

Report: SEC is investigating Tesla for possible violation of securities law

The US Securities and Exchange Commission is investigating Tesla for a possible securities-law violation, according to The Wall Street Journal.

The inquiry, still in its early stages, concerns Tesla’s failure to disclose to investors a fatal crash involving one of the company’s Autopilot-equipped cars, the newspaper said, citing a person familiar with the matter.

“Tesla has not received any communication from the SEC regarding this issue,” a company representative told Business Insider on Monday.

The company has been criticized for not telling investors of the death of a Tesla Model S driver.

Tesla knew of the death, which occurred on May 7, before it raised $2 billion in a stock sale on May 18.

News of the fatal crash became public only at the end of June after the National Highway Traffic Safety Administration’s Office of Defects Investigation indicated that it is looking into the performance of Tesla’s automated-driving system that was in use at the time of the crash.

Since last week, Tesla and CEO Elon Musk have been defending the lack of disclosure, saying that it is not “material nonpublic information.”

Under SEC rules, companies are required to disclose to shareholders any material nonpublic information they pass along to other entities. Regulation Fair Disclosure, however, is mostly targeted at information made available to some shareholders, investors, or individuals with an ability to profit off of this information before the broader markets know.

The SEC declined to comment.

In the wake of the Tesla crash, the National Transportation Safety Board announced last week that it is probing the incident to see if there are any systemic issues with the automated driving systems on the road today.

Shares of Tesla fell less than 2% in after-hours trading after rising 3.7% during the regular trading session.