Sarah Holder is a staff writer at CityLab covering local policy, housing, labor, and technology.
The Department of Transportation’s new set of voluntary safety guidelines might get more AVs on the road, and fast. But what happens if automakers don’t follow them?
Human error is to blame for 94 percent of all fatal motor vehicle crashes, and the death toll has been climbing: About 40,000 people were killed on U.S. highways in 2016. That’s one reason the U.S. Department of Transportation and the National Highway Traffic Safety Administration hope to get more self-driving vehicles on the road, and fast.
But autonomous vehicles (AVs) could be a threat to both operators and others, too, if they turn out to be glitch-ridden machines. (That’s assuming they don’t achieve robot overlord status and actively seek to kill us.) Just this week, the National Transportation Safety Board concluded that insufficient system controls contributed to a fatal 2016 crash involving a driver using Tesla’s semi-autonomous Autopilot feature.
The 2016 Federal Automated Vehicles Policy, released under the Obama administration, was the first federal guidance to shape a centralized AV safety doctrine, with the intention of avoiding a chaotic patchwork of requirements from state to state. As my colleague Laura Bliss wrote last September, that policy guidance amounted to “a federal endorsement of full autonomy” that was engineered to speed the industry’s rapid development.
The Trump administration took its first swing at the same issue this week, with Transportation Secretary Elaine Chao announcing the release of “A Vision for Safety.” Unlike its hefty 114-page predecessor, this version is a mere 30-page set of “suggestions.” It’s designed to make “[d]epartment regulatory process more nimble,” encourage “new entrants and ideas,” and “remove barriers to innovation” by thinning out some of what this administration considered the more restrictive parameters outlined in 2016.
The bulk of the new document outlines the structure of “Voluntary Safety Self-Assessments,” which encourage companies to submit documents to the DOT that address 12 aspects of vehicle safety and accountability. Manufacturers are prompted to describe vehicle designs and the technologies that make them tick (all of which should adhere to federal, state, and local laws), and lay out plans for consumer education and training. Cybersecurity protections should be robust, and data recording comprehensive. “The purpose of this Voluntary Guidance is to help designers of AVs analyze, identify, and resolve safety considerations prior to deployment using their own, industry, and other best practices,” the Vision states.
All this information, once compiled, should be detailed, accurate, and released for public consumption. The emphasis, however, is on the “should.” While the guidance encourages car makers to publicly disclose their Voluntary Safety Self-Assessments, it does not require them to, nor does it introduce enforcement mechanisms.
The 2016 FAVP wasn’t mandatory either, but automakers often interpreted that relatively lenient document more strictly, according to Greg Rogers, policy analyst at the Eno Center for Transportation. It, too, had a list of safety areas (15 to the Vision’s 12), and recommended that states require safety assessments to be submitted before introducing AV testing. “It created back-door regulations by having states act as [regulatory] proxies,” he says.
The 2017 Vision for Safety is much more hands-off: The word “voluntary” is repeated more than 50 times in the new document, and other language encourages states not to impose regulations of their own on top of DOT’s.
Inviting private businesses to regulate themselves is in keeping with the Trump administration’s ongoing interest in dismantling environmental and safety regulations. Auto industry groups have stated that they are largely pleased with the new guidance. But some experts caution that this document is essentially inviting the car industry to cut corners. “Do [automakers] truly have the incentive to ensure public safety?” asks Greg Rodriguez, a lawyer who specializes in implementing new transportation technologies. “There’s already a lot of pressure for them to get [self-driving] vehicles on the road, because they have investors who want to see a return.”
The fear is that, without clear consequences for failing to meet certain baseline safety thresholds, car companies might push AVs into the market and onto the highway prematurely. “What the new version does is it says to manufacturers, you can develop [automated vehicles] and you don’t have to submit the letter,” says Rogers, referring to the submission of safety guidance write-ups. “But in order to help the public be more comfortable, how about you consider publishing information on these 12 different aspects?” It puts the onus for making vehicles safe—and communicating that safety—back on the manufacturer.
It’s no secret that automakers can find ways to skirt even the most clearly defined federal rules: Witness Volkswagen’s recent Dieselgate scandal, in which the company deployed an elaborate technological cover-up to defeat emissions testing. And that happened under the relatively hawkish eye of federal and state oversight.
There’s also the question of the stated purpose of this voluntary guidance—to “support the automotive industry and other key stakeholders”—and the very voluntariness of it. Officially, the self-professed role of the NHTSA is quite different: It’s to “keep people safe on America’s roadways.” Jacob Mason, a transportation researcher at the Institute for Transportation and Development Policy, worries that the federal government is aligning itself with companies whose interests are with the bottom line. “The purpose of the guidance should be to ensure safety,” he says. “It seems like the public interest isn’t even in there. Isn’t the role of government to serve the public interest?”
Automotive safety advocates are also concerned about bills moving through Congress that would preempt state-specific AV regulation. In early September, the House passed the SELF DRIVE Act with (rare) sweeping bipartisan support, giving NHTSA jurisdiction over states in regulating AV design, performance, and safety requirements.
Now the Senate is developing its own version of the federal legislation. Senator Gary Peters, one of the bill’s co-sponsors, told Politico that it will end up looking quite different, which might make for a tough reconciliation process. The Senate’s draft, called the AV START Act, has been received with alarm by the National Association of City Transportation Officials and Transportation for America, who fear state and local governments will be entirely shut out of the decision-making process. “The bill’s requirement of a safety report is just an exercise,” they wrote in a statement. “The bill strips states and local governments of the authority to manage the vehicles on their roadways and leaves them without the tools to deal with problems already arising during the testing and deployment of automated vehicles.”
Such preemptive strikes from Congress seem to run counter to the position the feds have laid out for themselves. While the DOT issues “suggestions” in lieu of regulation, wary of putting the brakes on a technology that’s still in its developing phases, states have had to legislate for automated vehicles piecemeal. In California, self-driving cars need to publicly report their crashes and reveal how often humans have to take over control of the wheel; in Arizona, all you need is a standard vehicle registration. Michigan’s AVs can only take to Michigan roads if high-tech manufacturers collaborate with the traditional automotive industry (because, Detroit).
To adapt to different regulations, companies need to spend time and money making vehicles that adhere to each state’s preferences—and as technology quickly evolves, the rules will change, too. To avoid regulatory whiplash, the industry has put pressure on Congress to preempt the ability of states to implement safety rules.
Avoiding patchwork regulation is a worthy goal. As laws exist now, a person traveling in an AV from California into Nevada would need to hop out and put on a special red license plate after crossing the state boundary. In Europe, navigating the dizzying switch between regulations at each border has already proven difficult for the global AV industry, says Hod Lipson, professor of engineering at Columbia University and coauthor of Driverless: Intelligent Cars and the Road Ahead. “If the [U.S.] government can handle this uniformly, it would be an incredible boost to the entire field,” he says.
But any regulation enforced from the top down also needs to complement existing transportation systems, says Rodriguez. “It seems like [the Vision is] more guidance for the private side, and not involving or giving enough credit for the role that local governments play,” he says. And in turn, the Senate and House bills seem to angle toward stripping state and local governments of regulatory power even before they’ve exerted it.
Preemption is no mere rhetorical concern; its implications for local communities are serious. What happens if one brand of AVs consistently runs red lights? Or if another can’t read stop signs obscured by graffiti, or has trouble distinguishing the green lines of one city’s bike lanes from the white ones of its bus stops? State and local authorities might be unable to intervene, according to the same statement by the National Association of City Transportation Officials and Transportation for America. And the federal guidance introduced on Tuesday suggests few, if any, safeguards in place at the top of the governmental food chain, at least for now.
For AV advocates, the hope embedded in the new DOT guidelines is that simply putting more autonomous vehicles on the road faster will save more lives by reducing human error. NHTSA predicts cars will be running on full highway autopilot by 2025; Lipson thinks that it shouldn’t take that long.
“I’d like to see DOT say something crisp and simple, like: If a manufacturer can provide data that their car can drive twice as safely [as a human-operated one], they can have fully autonomous vehicles on the road,” Lipson says. “If you have the data, we’ll give you the license—period.” In other words, DOT could operate more like the FDA, he says: If you have a new drug that saves more lives than an existing one, it should go on the market.
But others urge more caution: A wave of AV-caused crashes could very swiftly erode public support for this emerging technology. If speedy deployment comes at the cost of deliberate safety requirements, then the U.S. will have only gone backwards, says Rodriguez. “Making safety voluntary is not in the interests of the long-term success of the technology,” he says. “Innovation and public safety should be on equal footing.”