Volvo president Håkan Samuelsson caused a stir earlier this week when he said that Volvo would accept full liability whenever its cars are in autonomous mode. Samuelsson went further, urging lawmakers to solve what he called “controversial outstanding issues” over legal liability in the event that a self-driving car is involved in a crash.
“If we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer,” says Erik Coelingh, senior technical leader for safety and driver support technologies at Volvo. “We say to the customer, you can spend time on something else, we take responsibility.”
This, says Samuelsson, makes Volvo “one of the first car makers in the world to make such a promise.” Google and Mercedes-Benz have recently made similar assurances. But does that mean if your future self-driving Tesla or Volkswagen gets into a crash instead, you’re going to be on the hook for all the damages?
Not at all, says John Villasenor, professor of electrical engineering and public policy at the University of California, Los Angeles, and the author of a paper titled “Products Liability and Driverless Cars.” According to Villasenor, “Existing liability frameworks are well positioned to address the questions that will arise with autonomous cars.” He told IEEE Spectrum that “if an autonomous car causes an accident, then the manufacturer was already going to be squarely in the liability chain.”
The University of Washington’s Technology Law and Policy Clinic agrees. In a submission earlier this year to the Uniform Law Commission, a body that aims to standardize laws between U.S. states, the group said, “Product liability theories are highly developed, given the advance of technology in and out of cars for well over a century, and are capable of covering autonomous vehicles.”
IntelliSafe Auto Pilot interface. Photo: Volvo
As cars have become increasingly automated, with antilock brakes, electronic stability control, crash-prevention radars, and lane-keeping assistance, legal precedents have naturally developed in step. U.S. law now provides multiple routes for anyone seeking redress for a defective product, whether a simple toaster or a fully autonomous SUV.
For a start, manufacturers must exercise a reasonable degree of care when designing their products. It makes sense that any company selling a self-driving car that, for instance, was not tested in bad weather, might be sued for negligence if one crashed during a snowstorm. But that is, perhaps, a poor example. “Snow is difficult because it limits visibility,” says Volvo’s Coelingh. “And it is low friction and so limits braking ability. Snow’s not impossible but it’s really difficult.”
Volvo intends to be one of the first carmakers to get self-driving vehicles into the hands of real customers, with a fleet of a hundred autonomous XC90 SUVs planned for the roads of Gothenburg, Sweden, by 2017. Initially, none will be allowed to drive in snowy conditions.
Even if not pronounced negligent, manufacturers can still be found “strictly liable” for any problems discovered in their final products, or can be sued for design or manufacturing defects. They can also be held liable if they fail to warn consumers of the risks of using (or misusing) products or services.
To reduce the chance of any mishaps, Volvo intends to give its first customers special training for their self-driving cars. The company will seek out a diverse range of drivers representative of its customer base, including older motorists and those suspicious of new technology. “One of the really interesting things is to see if people who are skeptical in the beginning will change their minds once they have used it for a while,” says Coelingh.
Judging by Volvo’s latest video featuring its Drive Me autonomous XC90s, motorists will be encouraged to watch television or do some work while the car is in charge. The video does not mention that the car might occasionally need to hand control back to the driver, as most experimental vehicles do today. For its pilot program in Gothenburg, says Coelingh, Volvo can even remotely disable the autonomous technology. “We might want to make the technology unavailable if something really critical occurs,” he says. But if a production self-driving car actually required more human oversight than a manufacturer claimed, and this led to an accident, the driver might have a legal case for misrepresentation.
“There are grey areas, involving disputes regarding whether an accident was caused by a failure of autonomous technology, an error by the human driver, or some combination,” says Villasenor. “But these are not areas that [Volvo’s] pronouncement will resolve. It’s still good to see manufacturers stepping up and recognizing their liability obligations.”
The takeaway? While carmakers’ promises to accept liability are probably unnecessary, they’re not a signal to steer your old wreck into an autonomous Volvo in the hope of a fat payout. “We do not take responsibility for all potential crashes with a self-driving car,” warns Coelingh. “If a customer misuses the technology or if there is another road user that causes an accident, it’s not we or our customer who are to blame, it’s the third party.”
Mark Harris is an investigative science and technology reporter based in Seattle, with a particular interest in robotics, transportation, green technologies, and medical devices. He’s on Twitter at @meharris and can be reached by email at mark(at)meharris(dot)com. Email or DM for a Signal number for sensitive/encrypted messaging.