Feds preview rules of the road for self-driving cars

Tenor of guidance signals government has embraced autonomous driving

By Joan Lowy and Justin Pritchard
Associated Press

WASHINGTON (AP) — Obama administration officials are previewing long-awaited guidance that attempts to bring self-driving cars to the nation’s roadways safely — without creating so many roadblocks that the technology can’t make it to market quickly.

Traditional automakers and tech companies have been testing self-driving prototypes on public roads for several years, with a human in the driver’s seat just in case. The results suggest that what once seemed like a technology perpetually over the horizon is fast approaching, especially as car companies have announced a string of investments and acquisitions in recent months.

Federal officials have been struggling with how to capitalize on the technology’s promised safety benefits — the cars can react faster than people, but don’t drink or get distracted — while making sure they are ready for widespread use. The new guidance represents their current thinking, which they hope will bring some order to what has been a chaotic rollout so far.

Self-driving cars have the potential to save thousands of lives lost on the nation’s roads each year and to change the lives of the elderly and the disabled, President Barack Obama said in an op-ed published Monday by the Pittsburgh Post-Gazette.

“Safer, more accessible driving. Less congested, less polluted roads. That’s what harnessing technology for good can look like,” Obama wrote. But he added: “We have to get it right. Americans deserve to know they’ll be safe today even as we develop and deploy the technologies of tomorrow.”

One self-driving technology expert said the overall tenor of the guidance signaled that the federal government truly has embraced autonomous driving. “In terms of just attitude, this is huge,” said Bryant Walker Smith, a law professor at the University of South Carolina who closely tracks the technology. He also cautioned that many details remain unclear.

The government did make clear that the National Highway Traffic Safety Administration will seek recalls if semi-autonomous systems don’t make drivers pay attention.

The agency, which is part of the Transportation Department, released guidelines showing how NHTSA can use its recall authority to regulate new technology. “It emphasizes that semi-autonomous driving systems that fail to adequately account for the possibility that a distracted or inattentive driver-occupant might fail to retake control of the vehicle in a safety-critical situation may be defined as an unreasonable risk to safety and subject to recall,” the department said in a statement.

NHTSA says the guidelines aren’t aimed at electric car maker Tesla Motors. But the bulletin would address events like a fatal crash in Florida that occurred while a Tesla Model S was operating on the company’s semi-autonomous Autopilot system. The system can brake when it spots obstacles and keep cars in their lanes. But it failed to spot a crossing tractor-trailer and neither the system nor the driver braked. Autopilot allows drivers to take their hands off the steering wheel for short periods.

Tesla has since announced modifications so Autopilot relies more on radar and less on cameras, which it said were blinded by sunlight in the Florida crash. The company has maintained that Autopilot is a driver assist system and said it warns drivers they must be ready to take over at any time.

Under the overall guidelines, federal transportation regulators, rather than states, should be in charge of regulating self-driving cars, since the vehicles are essentially controlled by software, not people, administration officials said.

States have historically set the rules for licensing drivers, but when the driver becomes a computer “we intend to occupy the field here,” Transportation Secretary Anthony Foxx said. States, he said, should stick to registering the cars and dealing with questions of liability when they crash.

Automakers should also be allowed to self-certify the safety of autonomous vehicles by following a 15-point checklist for safe design, development, testing and deployment, said officials who briefed reporters. Though companies are not required to follow the guidance — it is voluntary and does not carry the force of formal regulation — Foxx said he expects compliance.

“It’s in their vested interest to go through the rigors that we’re laying out here” to gain the confidence of both regulators and the public, Foxx said.

In somewhat contradictory fashion, officials also said the National Highway Traffic Safety Administration is examining whether it should have “pre-market approval” authority, in which the government inspects and approves new technologies like autonomous vehicles. That would be a departure from the agency’s historic self-certification system and might require action from Congress.

Officials spoke to reporters ahead of a news conference scheduled for Tuesday at which they plan to provide more detail about their guidance to automakers and states, as well as about new powers and resources that NHTSA may require.

NHTSA has been striving to make the guidelines a concise framework, rather than a lengthy set of detailed standards and regulations. The agency’s administrator, Mark Rosekind, has said he wants the guidelines to be flexible to keep pace with innovation.

Some consumer advocates have objected to voluntary guidelines instead of safety rules that are legally enforceable. But the rulemaking process is often laborious and can take years to complete.

Automakers sought the NHTSA guidance in part because they fear a patchwork of state laws will slow or complicate deployment of self-driving cars. Some state lawmakers see the advent of autonomous cars as a way to attract technology companies and spur economic growth, and are proposing laws friendly to the technology.

The Michigan legislature, for example, is considering bills that would allow the testing of self-driving cars without brakes or pedals on state roads. New York, on the other hand, has a longstanding law that requires drivers to keep one hand on the wheel at all times, which undermines the rationale for self-driving technology.