Apportioning Blame When Robocars Have Accidents

The death of a pedestrian in Tempe, Ariz., sparks a blame game

This March 19, 2018, still image, taken from video provided by ABC-15, shows investigators at the scene of a fatal accident involving a self-driving Uber car on the street in Tempe, Ariz.
Image: ABC-15.com/AP

When a self-driving car is involved in an accident, the rush to judgment begins before the hubcaps stop rolling.

Was it the fault of the car’s programmers? The fleet operator? The person in the car, or in the car in front of it, or on the bicycle next to it? How about the road regulators—federal, state, and local? 

The blame game played out on Monday, as news spread of an accident the night before that had claimed the life of a 49-year-old woman walking her bike across a street in Tempe, Ariz.

At first it was unclear who, if anyone, was at fault. Then, later in the day, Tempe police chief Sylvia Moir told the San Francisco Chronicle that the car’s own video record of the accident had led her to conclude, preliminarily, that the automated system and its professional minder probably did nothing wrong.

“It’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

Then the wheel turned again, with Reuters reporting that police had backed away from Moir’s statement on the grounds that determining fault is properly the job of the county prosecutor. The final say, though, goes to the National Transportation Safety Board (NTSB), which has opened an investigation. It may take quite some time before any results are in.

“I don’t think anyone outside the NTSB is qualified to look at the facts and make conclusions,” says Alex Roy, an editor at large at thedrive.com and an award-winning rally race driver. “I do think it’s outrageous that self-driving cars aren’t being treated the same way as commercial aircraft, with transparency of data.”

Roy declined to cast blame on anyone involved in the accident. But take a cursory look at social media, and you’ll find plenty of observers who were quick to assert that self-driving cars had been given too free a rein on the open road. The charge has been most commonly leveled against Arizona, which has been among the leaders in robocar testing.

The state began by allowing Waymo to offer a driverless ride-hailing service in a suburb near Phoenix, initially requiring a safety driver behind the wheel, the same arrangement Uber used until the weekend’s fatality led it to suspend its trials. Then, last November, Arizona allowed Waymo to move the safety driver from the front seat to the back. Finally, this year, Waymo took the driver out of the car altogether.

On the other side, there’s the argument that self-driving cars promise to save tens of thousands of lives, and that we need real-world experimentation to bring such cars to market. Consider the case of the first robocar fatality. In May 2016, the driver of a Tesla Model S running on Autopilot smashed into a truck and died. The NTSB’s final report apportioned some of the blame to Tesla’s automation for having allowed the driver to stop paying attention to the road. But, separately, the agency also found that Autopilot has had the net effect of reducing crashes. 

You could say that self-driving technology costs real lives while saving statistical lives. It’s easy to say which makes for a better news story.

“Yes, we are still in a testing phase, but by allowing these tests, we are going to approach a way of determining the safest method,” argues Jennifer Huddleston Skees, a legal researcher for the Mercatus Center at George Mason University. She adds that the courts had the legal tools to manage the transition from horse-drawn carriages to cars, and that those same tools will help them manage the transition from conventional cars to self-driving ones.

So far, no self-driving car company has defended itself in a criminal liability action. A civil case, however, is being brought in California by Oscar Nilsson, a motorcyclist from San Francisco, who in December collided with a Chevrolet Bolt that was operating in self-driving mode. Both vehicles were moving at low speed, but Nilsson says he suffered neck and back injuries.

No argument from statistical lives saved, however, can work until we have enough statistics to work with. And though Waymo has logged 4 million driverless miles and Uber has logged perhaps half as many, we are far from the hundreds of millions needed to make solid comparisons of fatality rates.
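To see why, here is a rough back-of-the-envelope sketch, not from the article itself: it assumes the widely cited U.S. figure of roughly 1.2 fatalities per 100 million vehicle miles for human drivers, takes Waymo's 4 million miles from the article, and treats Uber's "half that much" as about 2 million miles. The point is only that, at these fleet sizes, even a single expected fatality at human-driver rates would be statistically surprising.

```python
# Back-of-the-envelope sketch (assumptions noted below, not from the article):
# how many miles are needed before fatality rates can even be compared?

# Assumed human-driver baseline: ~1.2 fatalities per 100 million vehicle miles.
HUMAN_RATE = 1.2 / 100_000_000  # fatalities per mile

def expected_fatalities(miles: float, rate: float = HUMAN_RATE) -> float:
    """Expected number of fatalities over a given mileage at a given rate."""
    return miles * rate

# Fleet mileages: Waymo's 4 million miles is cited in the article;
# Uber's ~2 million is an illustrative reading of "half that much".
for fleet, miles in [("Waymo", 4_000_000), ("Uber", 2_000_000)]:
    print(f"{fleet}: ~{expected_fatalities(miles):.2f} expected fatalities "
          f"at the human-driver rate over {miles:,} miles")

# To expect even ~5 events (a bare minimum for any rate comparison),
# a fleet would need on the order of hundreds of millions of miles:
print(f"Miles for ~5 expected events: {5 / HUMAN_RATE:,.0f}")
```

Under those assumptions, the sketch puts the expected fatality count at the current fleet mileages well below one, and the mileage needed for even a handful of expected events at over 400 million, which is why single incidents tell us so little about relative safety.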

“Luckily, software systems can be reengineered; even if self-driving cars aren’t currently safer than a human driver, there’s good reason to expect that one day they will be,” writes Megan McArdle in the Washington Post. “But we’ve done that future no service by talking as if it had definitely already arrived. If we tell people that self-driving cars are perfectly safe, and that turns out not to be true, the backlash could drive these vehicles off the road before they’re able to deliver on their promise.”
