Who’s at Fault in Uber’s Fatal Collision?

With a full investigation underway, we can start to untangle the strands of responsibility—and ask what it means for the future of self-driving cars

Photo: National Transportation Safety Board/Handout via Reuters. NTSB investigators examine a self-driving Uber vehicle involved in a fatal accident in Tempe, Arizona, U.S.

This is a guest post. The views expressed in this article are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.

Uber's accident earlier this week—the first fatality involving a pedestrian and an autonomous car—already has fingers pointing to various parties, though it's still too early to tell who's responsible. The plot thickens every day with new rumors and information.

A woman, jaywalking at night. Walking a bicycle. Possibly homeless. On a busy street known for partying. In Arizona, with permissive laws. Involving Uber, with permissive ethics. Which had hired an ex-felon to be the safety driver in the autonomous vehicle (AV).

As we wait for a full investigation, we can start untangling the strands of responsibility, which include the following possibilities.

1. The victim

First, it should be clear that whether Elaine Herzberg, the 49-year-old victim, was homeless or not is irrelevant, even if social prejudices against the homeless are exploited in placing blame. Hypothetically, if she had been drinking at one of the many bars nearby (next to Arizona State University, which was once Playboy's #1 party school), that could be relevant. Jaywalking, despite nearby crosswalks, would also be relevant.

But even if the Uber car had the right of way, it can't just run over whatever is in front of it. Imagine if it were a toddler who darted into the road: is there really no obligation to avoid the crash, just because a robot car has the right of way? (No.) Even if it were an unavoidable crash, one can still ask whether the car should have driven more slowly, so as not to put itself in such a scenario in the first place.

2. Uber

Tempe's police chief Sylvia Moir said, “Preliminarily, it appears that the Uber would likely not be at fault in this accident.” Even if true, this speaks only to traffic laws and not to other liability. All technologies have their limits; if Uber hadn't properly accounted for those limits (for example, the possibility that a pedestrian might be hidden behind an object, like a tree or a parked truck), the company could share some responsibility.

Even if the victim had stepped out from dark shadows, poor lighting shouldn't be an issue given the lidar (a laser-based sensor, roughly analogous to radar) used by Uber, which supplies its own illumination and doesn't depend on ambient light. Though as Uber admits, some lidars are better than others. If lighting were an actual issue, then darker-skinned people might want to be extra wary, given that even the best computer-vision technologies today have a hard time detecting their faces.

AVs also have a hard time recognizing bicycles. If this turns out to be a factor, then it's a known limitation that should have been accounted for. These and other technology limits, especially foreseeable errors, may mean that AV technologies were not ready for public streets and that Uber oversold their capabilities. For many, this wouldn't be out of character for Uber, given its perceived disregard for laws and regulations meant to protect public safety.

3. Volvo

Uber's vehicle was a Volvo XC90 SUV. Given that it was modified and not running Volvo's original software, Volvo perhaps escapes direct responsibility in this case. But it could be criticized for risking its strong reputation by partnering with a company of lesser standing and allowing that company to modify its products. In the court of public opinion, Volvo may be guilty by association.

4. Uber's driver/operator

It was soon discovered that Rafaela Vasquez—the Uber employee inside the vehicle, monitoring its operation—had served a few years in prison for attempted armed robbery. Now, this criminal history by itself doesn't make her responsible for the accident; but if she's found to have been inattentive while she was supposed to be watching for pedestrians, she may become an easy scapegoat, given social prejudices against ex-felons in the workplace. Still, Uber may be responsible for not properly vetting, training, or setting expectations with its employees—for instance, if its employees are lulled into a false sense of confidence about the vehicle's capabilities.

5. Society—or no one

At minimum, government exists to protect its citizens. So it's always unsettling to see a city or state rush to embrace commercial projects at the apparent expense of public safety, especially when other cities have rejected those projects as unsafe. Arizona is already known for one of the nation's highest rates of pedestrian deaths; if city planning, law enforcement, and other factors are implicated in that trend and remain unaddressed, the state may bear some degree of responsibility. The current federal administration, too, might share in that blame, with a similar bias toward commercial interests; it rolled back AV safety guidelines that the previous administration had put in place.

Finally, at the highest level, society may bear some responsibility for creating urban spaces that privilege cars over people, and for electing political leaders who act against the public interest. Or maybe no one is responsible: accidents are an inherent risk in an imperfect world, and all driving is a tradeoff between safety and mobility anyway.

Enter robot law

As the first of its kind, this case takes us into uncharted territory, though some legal experts and ethicists have been thinking about these issues. Stephen Wu, an attorney at Silicon Valley Law Group, has authored papers about liability in scenarios such as this one; he agrees that even if the pedestrian was entirely at fault and a human driver couldn't have done better, it's still complicated:

“It may be, however, that while a human driver could not have stopped in time, an automated system could have,” suggests Wu. “Was there anything defective in the system such that this system failed to stop and another one would have, even if a human driver would be incapable of stopping? That to me is the interesting question. If so, we could hold Uber responsible, even though a human driver could not have stopped in time.”

Right on cue, Wu and the Center for Automotive Research at Stanford (CARS) are organizing a mock trial about a fictional AV accident in a pre-workshop at the We Robot conference next month at Stanford Law School. With this Uber case, what was fiction is now fact.
