‘A very big deal’: Federal safety regulator takes aim at Tesla Autopilot
After four years of laissez-faire treatment under the Trump administration, the nation’s top auto safety regulator is making it clear to Elon Musk and Tesla that there’s a new sheriff in town.
In June, the National Highway Traffic Safety Administration ordered automakers to cough up data on every crash that involves automated driving systems, such as Tesla’s Autopilot. Last month it launched an investigation into a dozen crashes in which Teslas on Autopilot plowed into parked emergency vehicles.
Then, on Tuesday, NHTSA’s Office of Defects Investigation sent an 11-page letter instructing Tesla to provide the agency with an enormous volume of detailed data on each Tesla vehicle sold or leased in the United States from 2014 to 2021. “This could be a very big deal,” said Bryant Walker Smith, a professor at the University of South Carolina, one of the legal field’s foremost experts in automated motor vehicle law.
Back in 2016, when automated driving systems first drew broad public attention, the agency published enforcement guidelines making clear that it could enforce safety regulations governing software systems, not just traditional components such as carburetors, air bags or ignition switches.
Subsequently, however, the Trump administration took a lax approach to NHTSA enforcement. As many as 30 investigations into Tesla were launched, delving into Autopilot and other safety concerns, but the vast majority were either closed without action or remain open.
The agency’s new activism is bad news for Tesla, whose electric car revenues have been boosted in part by the popularity of its Autopilot driver assist system, and by the $10,000 it receives from buyers of its Full Self-Driving system (which in fact is not a full self-driving system).
If NHTSA finds Autopilot or FSD defective in a way that jeopardizes public safety, the features could be recalled, a prospect that could force changes to the systems and potentially lead to a ban while safety concerns are addressed, legal experts say.
Even a finding that Tesla has promoted what NHTSA calls “predictable abuse” could cause problems for Autopilot. Tesla legal language says human drivers must pay attention at all times with Autopilot engaged, but Tesla marketing, including videos of Musk driving Teslas without using his hands, has seemingly contradicted the warnings. A growing library of YouTube videos shows Tesla drivers misusing the system, some of them crawling into the back seat while the car “drives itself.”
Although NHTSA has put Tesla on a tight deadline to submit its data, which are due Oct. 22, Smith said any recall or other enforcement wouldn’t be immediate, and by the time NHTSA acted, new software iterations or a change in Tesla marketing could make the matter moot.
“This is also a very long and potentially hidden process that will depend on how Tesla responds,” Smith said. “I could see this potentially taking a very long time.”
If the past is any indication, Musk may prove less than cooperative. He once hung up on the head of the National Transportation Safety Board. As the Securities and Exchange Commission sought information from Tesla in 2020, Musk sent out a tweet obliquely inviting the commission to perform oral sex on him.
Tesla could attempt to forestall the inquiry by declaring much of the data in question to be proprietary business information, Smith said. The company could obscure the data using formats that make it hard to pull out useful information. The agency itself may not have sufficient expertise to analyze the data dump, he said, and there could be legal challenges.
But that hardly means the attempt is fruitless, Smith added. “This is NHTSA trying to unlock the doors to a whole lot of information, and it will be fascinating to see what jumps out.”
Autopilot crashes have led to injuries and deaths, Smith said, and NHTSA is clearly attempting to better understand where the technology is heading across the industry, not just to investigate past crashes. “It’s an effort to get out ahead of new crashes and understand what’s happening in those cars and in those companies.”
The letter to Tesla seeks data that the company has collected on crashes, consumer complaints, lawsuits, injury claims, property damage claims, accident reports, descriptions of how Tesla technology is intended to work, performance metrics, warranty claims, vehicle safety testing processes and dozens of other subjects.
Tesla critics have accused the company of conducting “stealth recalls” that attempt to solve safety problems while keeping them out of the public eye. One aim of NHTSA’s data request seems to be to uncover evidence of issues Tesla successfully hushed up or had reason to know about, Smith said. A company that closely monitors its vehicles can identify incidents and deal with them quickly enough that they never become broadly public, he said.
Tesla did not respond to a request for comment, and Musk has not directly addressed NHTSA’s brightening spotlight on the company. On Wednesday, however, he tweeted that “Safety is always paramount at Tesla.”