California has strict rules about who can pilot the dozens of experimental autonomous vehicles cruising its public roads. Prospective test drivers have to pass a defensive driving course, have near-spotless records, and have gone at least a decade without a drunk-driving conviction. Crucially, they must also complete a special training program for autonomous vehicles, some of which can be as buggy as any Silicon Valley prototype.
But an investigation by IEEE Spectrum has uncovered that these autonomous training programs vary considerably in content, intensity, and duration. Drivers hoping to operate one of Google’s autonomous Lexus SUVs will spend at least five weeks on classroom lessons, in-car observations, hands-on sessions, and evaluations. Those itching to get behind the wheel of a computer-controlled Audi A7, however, could complete the carmaker’s training program in less than 2 hours.
| Company | Extra Time for Robocar Training |
| --- | --- |
| Tesla | Undisclosed (likely half a day) |
| Bosch | Undisclosed (likely one day) |
| Mercedes-Benz | Undisclosed (likely one day) |
| Volkswagen/Audi | About 2 hours |
This is because manufacturers are allowed to design and conduct their own autonomous training programs. California law [pdf] requires the courses to feature behind-the-wheel lessons and information about automated technologies, including their limitations. What the regulations do not mention are specific procedures to teach or goals to meet, nor how long any such training must last.
Documents obtained by IEEE Spectrum under California's Public Records Act show that the seven companies currently holding experimental self-driving car-testing permits for California have interpreted the law very differently.
Google, which pays its autonomous safety drivers US $20 an hour, has a comprehensive autonomous training program in place. The company’s five-week course trains test drivers in both software operation (from the passenger seat) and driving, with separate modules for highways and urban streets. “Freeway and surface-street driving are very different, and thus require different skills,” says a Google document outlining the program.
Students are given video tutorials and instruction from an experienced driver before gradually being allowed to observe and then perform supervised drives. There are reflex tests with fake system failures on simulated cars and hours-long written and practical examinations to pass. Even once they’re certified, Google safety drivers are subject to random checks and must refresh their defensive driving skills every two years.
However, the company initially also sought permission for untrained dignitaries to sit in the driver’s seat. Last year, Ron Medford, the company’s driverless-car safety director, complained to the DMV:
We request that the rule provide…flexibility for manufacturers to demonstrate their autonomous technology to policymakers, regulators, and other key stakeholders who have not completed a full driver-training program and received a testing permit.
The department disagreed.
Nissan crams much of the same training into a single day. Drivers are taught how to deal with errors in autonomous systems causing abrupt braking or acceleration, deactivation of the brakes, or faulty steering actuation. Delphi’s course takes about the same amount of time, with a morning of orientation and instruction in power-up, shutdown, and emergency procedures. A 2-hour ride-along to observe the car is followed by two hours of supervised driving, after which the driver is certified.
Bosch, Mercedes-Benz, and Tesla do not specify how long their autonomous training sessions last, but none sounds very challenging. Bosch and Mercedes conduct their programs on closed tracks, and Bosch’s is limited to transitioning between manual and automated modes, using the emergency stop button, and the rather vague “controlling the autonomous vehicle in case of a potential malfunction.” Tesla’s training seems equally brief, with a quick overview of normal operation and a 30-minute test. The toughest section tests whether drivers can overcome errors deliberately introduced into the steering system. To be fair, the Model S “autopilot” system is currently limited to highway use.
The autonomous training program run by Audi, part of Volkswagen Group, looks to be the briefest and easiest of all. It starts with a 15- to 30-minute classroom session that Volkswagen promises “will include information relevant to the safe operation of any highly automated vehicle.” The training then rushes through vehicle-specific instruction, in-car familiarization, observation, and practice operation. It concludes with a 15- to 30-minute evaluation that is “much like a driver test.” A speedy candidate could complete the entire program in a little over an hour and a half.
“Our interest is in ensuring not only that manufacturers have a robust training program but also that test drivers are well versed in the technology and are able to gain safe control of that vehicle,” says Bernard Soriano, deputy director for California’s Department of Motor Vehicles. “We had to go back and forth with some manufacturers in terms of getting more information about their training program.” Ultimately, however, none of the programs were rejected, even Audi’s whistle-stop course.
That concerns some experts. David Mindell, an MIT professor specializing in the history of automation, says, “Today’s ‘autonomous’ cars still require a great deal of human judgment and skill to operate safely, and that’s unlikely to change for some time.” Patrick Lin, director of the ethics and emerging sciences group at California Polytechnic State University, adds, “Allowing manufacturers to have variable training times may be useful in determining the proper amount of training ordinary drivers should have. But if government or a consortium of carmakers were to establish minimum standards of safety and training, that may give us more confidence than letting each manufacturer decide what’s best.”
Given the disparity between the carmakers’ approaches today, don’t expect that consortium to emerge anytime soon.
This post was updated and corrected on 25 February 2015 to clarify Google’s training and objections to the California DMV.
Contributing Editor Mark Harris is an investigative science and technology reporter based in Seattle, with a particular interest in robotics, transportation, green technologies, and medical devices. In 2012, he wrote an in-depth article for IEEE Spectrum on failures in AED defibrillators that won the Grand Neal Award from American Business Media. In 2014, he was Knight Science Journalism Fellow at MIT, and in 2015 he won the AAS Kavli Science Journalism Gold Award.