A self-driving delivery service this week received the first-ever federal safety exemption for driverless vehicles – the latest sign of the Trump administration's hands-off approach to autonomous cars and trucks.

Nuro, founded by a pair of ex-Google engineers, began delivering groceries with a handful of autonomous vehicles in Houston and Scottsdale, Arizona, in 2018. The exemption granted Thursday by the U.S. Department of Transportation, though, allows the company to introduce its second-generation vehicle, which will lack the steering wheel, transparent windshield, mirrors, and gas and brake pedals otherwise required in every car or truck – and generally preferred by drivers – on U.S. roads.

The exemption is at once a significant mile-marker for the autonomous vehicle industry and a widely anticipated, arguably long-overdue, step. Already, across 36 states and the District of Columbia, more than 80 companies – Nuro, as well as Tesla, Uber, Waymo, and others – are testing over 1,400 self-driving vehicles on America's public roads, from driverless golf carts tootling around town to autonomous ride-share taxis to long-haul tractor-trailers bearing freight from Arizona to California.

All are doing so without explicit approval from federal regulators – chiefly because no federal regulation yet addresses, or even contemplates, driverless vehicles. Instead, the rules on the books still have drivers in mind, requiring the rudimentary navigation and safety features that motorists have needed for years to drive safely – or at all – from here to there.

"A company that wants to deploy an automated vehicle, provided they comply with federal safety standards about mirrors and brake pedals and steering wheels, can do it today, There is no regulatory process to speak of," said Bryant Walker Smith, a law professor at the University of South Carolina and expert in transport technologies. "So I wouldn't characterize that as a slow or a fast regulatory process. It's a nonexistent process. If a company develops a vehicle that checks off these arbitrary requirements because they assume that drivers are human, they can deploy."

Hence, to a certain extent, Nuro's exemption is a formality – the company and many others have already been testing vehicles without even a driver behind the wheel. Now the company can do so without the sorts of driver-centric safety features that were already superfluous in an AV.

Companies that hope to rapidly deploy their driverless vehicles or services at commercial scale, however, will ultimately need the kind of federal exemption that Nuro received this week – one that took far longer to receive than developers anticipated, though perhaps not as long as some consumer-safety advocates would have liked.

The Trump administration invited AV developers to apply, promising a remarkably short six-month review process. Nuro submitted its application in October 2018; General Motors, which is developing the Cruise Origin autonomous vehicle and is seeking to deploy an autonomous version of Chevrolet's battery-powered Bolt, had applied months earlier, in January 2018.

"Then they waited, and they waited, and they waited. And this is now NHTSA finally acting," Walker Smith said.

Nuro's application was simpler than GM's: Notably, the delivery vehicle will drive no faster than 25 mph and remain on local roads.

However, each new AV announcement has also renewed safety concerns about the sector – and about the extent to which regulators are, or are not, ensuring that the vehicles won't slam into pedestrians, cyclists, and other vehicles, or that they are sufficiently protected against being hijacked through cyberattacks.

In lieu of explicit laws and regulations, oversight at the state and federal level has instead mostly relied on self-reporting by AV developers – an approach that has come under intense scrutiny since deadly crashes involving automated flight-control technology in another transportation sector, air travel, exposed major shortcomings in how federal regulators had depended on Boeing to self-report design flaws in its new 737 Max.

Self-driving technology has been involved in at least five fatal collisions in the U.S. since 2016: four involving Teslas driven with Autopilot engaged and one involving a self-driving test vehicle operated by the ride-hailing service Uber.

Investigations into the crashes yielded vastly different findings and underscored the challenges in deploying the technology: Authorities looking into the Tesla ($TSLA) collisions cited driver inattention or faulty highway safety equipment – one driver in Florida, for example, was apparently watching a movie just before slamming into a tractor-trailer, and numerous videos have captured drivers sleeping behind the wheel of their semi-autonomous cars as they speed down the highway.

However, the probes have also discovered that warning systems designed to alert such inattentive drivers often did not activate as Tesla and other automakers had promised, including in the deadly Florida collision. Similarly, the probe of the 2018 Uber ($UBER) crash in Arizona, where the vehicle fatally struck a person crossing the street, revealed that the Uber vehicles were not programmed to recognize that people often cross the street outside a crosswalk. (As Wired put it: "Uber's Self-Driving Car Didn't Know Pedestrians Could Jaywalk.") Uber shut down the self-driving pilot program but tentatively restarted it last year.

"The Center for Auto Safety is very supportive of advancing automotive innovation through technology, particularly when it comes to increasing safety for everyone on the road. This is why we were disappointed when Nuro's petition failed to sufficiently demonstrate it had addressed a variety of safety concerns that could arise as a result of the company's requested exemptions," Jason Levine, executive director of the Center for Automotive Safety, said in a statement. "We are flabbergasted that of all of the dozens of safety rules, pending petitions, and defect investigations on the National Highway Traffic Safety Administration's plate, they chose to prioritize reviewing whether 5,000 vehicles which were never intended to carry humans requires side view mirrors."

He added: "Based on the current Administration's record we have no faith in their willingness to effectively recall the vehicle should it prove to be a danger to the public."

The rapid acceleration of AV development, which has far outstripped the more languid pace of traditional car and truck innovation, has made it difficult to write regulations that keep up. Industry experts emphasize that regulators have found an alternative: using their agencies' soft power – the threat of investigations and forced recalls – to ensure compliance.

"By the time that you have any kind of safety standard promulgated, which normally takes a couple of years at least, technology has outrun the safety standard," said Nicholas Wittner, a professor at the Michigan State University College of Law specializing in automotive technology and engineering. "I don't remember an instance where you have the technology developing this quickly."

But there remain troubling shortcomings that point to wider problems: Many insiders, for example, remain flabbergasted that state and federal regulators have allowed Tesla to call its enhanced cruise-control feature "Autopilot," a name that, combined with the automaker's marketing efforts, has misled consumers and regulators about the feature's capabilities. German regulators in 2016 refused to allow Tesla to use the term there.

There also remain deep philosophical questions about how best to gauge a driverless vehicle's safety: Should it merely be safer than the typical human driver – responsible for nearly 37,000 motor-vehicle deaths in 2018 – or should it be close to perfect? And should an AV developer be penalized more harshly when one of its vehicles fails to prevent a crash that a human driver would have easily avoided – a semi truck with a white trailer that, to the computer's cameras, blended into the sky, or a street sweeper that, to a computer, makes an unexpected mid-block U-turn?

"You don't get to make the same crash twice," Walker Smith said. "If you miss a bridge because it's been painted for St. Patrick's Day, or a deer jumps for the bridge, maybe you couldn't have foreseen that, but now you don't get to make that same mistake."

The incidents and disagreements have produced sharply different outlooks on the AV industry: Some figures, like Transportation Secretary Elaine Chao and Cruise CEO Dan Ammann, expect rapid deployment of driverless vehicles in the near term, while others, especially legacy automakers, predict a much longer rollout.

"Too often companies are still using regulation as an excuse: 'Why don't you have the vehicles that you promised or at least hinted would be available by now?,' they say, 'Oh, regulation,'" Walker Smith said. "It's not regulation, it's fundamentally technological – it's the technology that remains challenging."
