A Google Car Can Qualify As A Legal Driver

A robotic car still can't vote or order a drink, though. Not yet


Illustration: iStockphoto; Car: Google

The U.S. highway safety agency, the National Highway Traffic Safety Administration (NHTSA), has determined that a computer system can qualify as the legal driver of a car, Reuters reports.

The opinion is expressed in a letter, dated 4 February, from NHTSA Chief Counsel Paul A. Hemmersbaugh to Chris Urmson, head of Google’s self-driving car project. The letter, which appears on NHTSA’s Web site, comes in response to Urmson’s request three months earlier that the government allow for the possibility of a car that truly drives itself.

“As a foundational starting point for the interpretations below, NHTSA will interpret ‘driver’ in the context of Google’s described motor vehicle design as referring to the SDS [self-driving system], and not to any of the vehicle occupants,” Hemmersbaugh writes. “We agree with Google its [self-driving vehicle] will not have a ‘driver’ in the traditional sense that vehicles have had drivers during the last more than 100 years.”

Accepting an AI as a legal driver eases the government’s rule-writing process and takes a clear step toward Google’s stated goal of bypassing the human role in driving altogether. Among other things, the ruling means that Google—and any other company—may design the various parts of an automatic driving system to deal directly with the artificial pilot without first clearing things with the primate who may be sitting in the front seat.

Google has long advocated going to full robotic mode in one fell swoop, jettisoning the steering wheel, the accelerator, and the brake pedal, so that no other human-operated control remains besides an on-off switch. But the major auto manufacturers have preferred adding what they term “advanced driver assistance systems,” such as lane-keeping and self-parking programs, with the goal of helping the human driver rather than supplanting him.

Google calls such half-measures dangerous because they may lull people into a false sense of security. Suppose you’re on the highway, daydreaming about lunch, when your car abruptly informs you that it will hand control back to you in, say, five seconds. You may well be flummoxed, particularly if you’re in the back seat, rummaging about for a misplaced pencil or whatever.

What the letter makes clear is that a Google car, whatever the legal standing of its robotic system, must still conform to the letter of the law—and that means having a steering wheel, a brake pedal, and so forth. It notes that a lot of government rules will have to be rewritten before anyone can drive—or rather, be driven by—a robotic car. But by saying so, it raises the possibility that such revisions are under consideration.
