How do you train a driver not to drive? That’s a question officials in California are wrestling with. The U.S. state furthest along the road to self-driving vehicles is drawing up regulations for the operation of autonomous vehicles by the general public—and it may require motorists to undergo additional instruction or evaluation before they can be chauffeured by robots.
Self-driving cars promise a future where you can watch television, sip cocktails, or snooze all the way home. But what happens when something goes wrong? Today’s drivers have not been taught how to cope with runaway acceleration, unexpected braking, or a car that wants to steer into a wall.
“Driver training or driver readiness is a component that we are actively discussing,” says Bernard Soriano, deputy director for California’s Department of Motor Vehicles. “Some of the elements that the manufacturers have in their test-driver training programs could be something that we could consider.”
These include classroom lessons on the abilities and limitations of autonomous technologies, computer simulations of failures, and real-world driving sessions. However, carmakers’ training programs vary considerably. (See our investigation of robocar test-driver certification.) Google requires that its test drivers complete weeks of in-depth lessons and rigorous exams, while Audi’s entire program lasts just a couple of hours.
One problem is that regulators do not know whether self-driving technologies will arrive in production vehicles as optional features in luxury cars or as the master control of fully autonomous robo-taxis. Ryan Calo, who teaches a robotics law and policy class at the University of Washington, believes the distinction is crucial. “For an autonomous vehicle without a steering wheel, I’m not sure you need any more training than you’d get for a dishwasher,” he says. “But for a vehicle primarily meant to be driven by a human driver and that has an autonomous mode, I could imagine some additional degree of certification.”
Today’s experimental autonomous cars occasionally need to hand control back to their human operators, either because of a bug in the system or for something as innocuous as the car leaving a well-mapped area. These “disengagements” may require the driver to take action quickly. California takes disengagements so seriously that it requires manufacturers testing self-driving cars to log each one. “Today, drivers are not trained or tested for that change in control,” says Patrick Lin, director of the ethics and emerging sciences group at California Polytechnic State University. “Humans aren’t hardwired to sit and monitor a system for long periods of time and then quickly react properly when an emergency happens.”
Drivers might also need help setting up an autonomous vehicle, training in how to deactivate its systems in situations that no self-driving car could anticipate, such as an approaching dust storm, and guidance on handling errors the system makes while driving.
However, not everyone believes that self-driving technology presents drivers with any special challenges. In a document called “The Pathway to Driverless Cars” [pdf], which was released 11 February, the British government said a normal driver’s license would be sufficient to operate cars with an autonomous mode in the United Kingdom, even for test drivers of experimental vehicles. It also anticipates that fully automated vehicles would require no driver’s license at all. But it acknowledges that those views might change once self-driving cars take to the roads: “Emergent properties of the way automated systems interact…may potentially [require] changes to driver training, testing, and licensing.”
The Swedish authorities have a similarly flexible attitude. A report from the Swedish Transport Agency [pdf] last year said, “As things stand at present, it is too early to determine what authorization requirements would be appropriate” for fully autonomous cars.
Soriano doesn’t have the luxury of such a wait-and-see attitude. California has already missed a 1 January deadline to establish regulations for the public use of self-driving cars. When it comes to the issue of driver training and certification, he admits, “We haven’t made a decision on it yet.”
Contributing Editor Mark Harris is an investigative science and technology reporter based in Seattle, with a particular interest in robotics, transportation, green technologies, and medical devices. In 2012, he wrote an in-depth article for IEEE Spectrum on failures in AED defibrillators that won the Grand Neal Award from American Business Media. In 2014, he was a Knight Science Journalism Fellow at MIT, and in 2015 he won the AAAS Kavli Science Journalism Gold Award.