The Big Problem With Self-Driving Cars Is People
The engineers who built routers for the fledgling ARPANET in 1969 never dreamed that networking technology would upend journalism. Nor did anyone guess that cellular phones would lead people to ignore one another at the dinner table. Early users of email never imagined spam. Henry Ford did not foresee the traffic jam.
Technology has unintended consequences. Sometimes they are large and tumultuous. It is often well worth the trouble of trying to figure them out ahead of time.
Right now, the new technology with the biggest buzz is the self-driving car. Are there any likely unintended consequences of the widespread adoption of self-driving cars? You bet there are! I can think of two: Such cars will be pariahs, and their owners will act obnoxiously.
Both difficulties will emerge months or perhaps years after truly self-driving cars have been brought to market. Before then, engineers have a great deal of work ahead of them to make the cars safer, more capable, and more foolproof and to convince regulators to allow them onto the roads. These objectives are going to take longer than many proponents of automated driving realize or are prepared to admit.
I am confident we will eventually get to fully self-driving cars, but my concern is that during trial deployments we will run into many unexpected consequences that will delay mass deployment for many years. As a robotics researcher and entrepreneur, I have made it my business to imagine and visualize how automation will work in certain environments and situations. I’ve been doing that lately with autonomous cars. What is my conclusion? To paraphrase Bette Davis in the film All About Eve: Fasten your seat belts. It’s going to be a bumpy ride.
If I were walking on a moonless night along a country road and heard a car approaching, I’d get off the road, climbing into the bushes if necessary, until the car had passed. I’d do that because I wouldn’t know whether the driver had seen me. In such a setting, we willingly give cars the right-of-way.
But in the daytime, in an urban area, I might step in front of a car at a stop sign without a second thought. Alternatively, I might linger on the curb a moment while indicating that I am about to step off it. Or, if I’m behind the wheel, I might just blow through the intersection, oblivious to the sign. Two questions arise: If self-driving cars can’t handle such examples of human caprice, how will people feel about sharing space with these new aliens? And how much will self-driving cars have to be slowed down, or otherwise constrained, to share the roads smoothly with cars driven entirely or primarily by humans?
Consider first a residential neighborhood, such as my own part of Cambridge, Mass., where modest houses share the streets with triple-decker apartment buildings. The streets are narrow, in many cases one-way, with very few marked pedestrian crossings. People expect to be able to cross a street at any point, but they know that there is give-and-take between drivers and pedestrians, often mediated by subtle cues of eye contact or body language. Cars and people are viewed as equals, quite unlike the situation you’d find on a narrow country road at night.
In this neighborhood, cars and people interact in three ways. First, on the longer main roads the cars mostly travel without interruption, but there are stop signs mediating access to these through roads from the smaller streets that cross them. People walking along these main roads assume that they, too, have the right-of-way, expecting that drivers who have stopped on a side street will let them walk in front if they are about to step off the curb. Moreover, these people usually want the driver to acknowledge their presence before they step in front of the car.
Second, when people want to cross a street between intersections or on a main road without stop signs, they wait for a gap to show between cars. Only then do they step out cautiously and confirm that the car is slowing down before they move into the middle of the road. And third, the sidewalks here are narrow, and when snow has made them hard or impossible to traverse, people often choose to walk along the roads instead, trying to provide room for the cars to pass but nevertheless expecting the cars to be respectful of them.
Now consider the very different conditions in Central Square, also in Cambridge. It has shops, bars, and restaurants, with the upper floors occupied by MIT spin-off startups. There are marked pedestrian crossings, and people usually cross at those designated places. They do so because the drivers are a little less civil here, perhaps because a larger proportion of the people driving through are not local residents.
People step out tentatively into the marked crosswalks and visually check whether oncoming drivers are slowing down or indicate in some way that they have seen the pedestrians. It’s easy to see into the cars and get an idea of what the driver is paying attention to, even at night, given the ample lighting. Pedestrians and drivers mostly engage in this kind of brief social interaction, and any lack of interaction is usually an indicator to pedestrians that the driver has not seen them. And, this being the Boston area, when such a driver barrels through the crossing, the pedestrians get angry and yell at the driver.
In yet more hostile areas, such as parts of New York City, pedestrians and drivers often play even more contentious games, such as purposefully avoiding eye contact so as to force the other party to yield. The upshot is that an autonomous car able to drive in one area may be poorly equipped to function in another.
The complexity is not limited to contentious behavior, either. In Central Square, many pedestrians “reward” good behavior by drivers. When the main road is busy, getting on or off it can require a lot of patience on the part of a driver. Pedestrians who have seen signs of such patience will sometimes voluntarily defer to such drivers, waving them through.
These are the sorts of nuances that typically elude artificial intelligence. What if cars trying for full autonomy can’t handle them? The short answer, of course, is that they will not be able to accommodate pedestrians as smoothly as human drivers do.
This is not just a matter of social nicety. Consider the challenges posed by a snowy day: Cars will have to be able to perceive people walking along—or in—the street, and then they’ll have to make a decision. Should they pass these people, as most human drivers would, or should they follow slowly, avoiding the risk of passing on the treacherous roads? The latter tactic would slow traffic for both the occupant of the driverless car and for any human drivers behind it. Obviously, some of those human drivers would become annoyed at being stuck behind driverless cars. Driverless cars would then be a nuisance.
Even in good weather, an intersection could vex a robotic car. Let’s say the car is stopped at a stop sign on a side street and identifies two people standing at the corner. These folks might be about to cross, but then again, they could just be chatting. Or maybe it’s a parent and child waiting for the school bus. A human driver would assess the situation effortlessly. How long should the driverless car wait? And won’t some bored jerks try to spoof such cars by standing at the side of the road and gesticulating as though they’re about to jump off the curb? People don’t try that with human drivers because there would be repercussions. Driverless cars, on the other hand, wouldn’t be allowed to try to retaliate.
How will a driverless car let you know it has seen you and is trying to figure out whether you’re about to cross in front of it? It could just inch forward, then stop if you made a move toward the road. Without such social interaction, we would be back in the situation of the dark country road, where the driverless car must simply be granted the right-of-way over pedestrians and human-driven cars. That won’t endear these cars to people, who are unlikely to welcome machines that act as if they own the road. So what’s likely to happen instead is that driverless cars will be very wimpy drivers, slowing down—and angering—everybody.
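That inch-forward-and-stop negotiation can be pictured as a tiny state machine. The sketch below is purely illustrative: the states, the two boolean inputs, and the transition rules are all my own assumptions, not taken from any real autonomy stack.

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical states for an inch-forward negotiation at a crossing."""
    WAITING = auto()      # stopped, pedestrian's intent unknown
    INCHING = auto()      # creeping forward to signal intent to proceed
    YIELDING = auto()     # stopped, letting the pedestrian cross
    PROCEEDING = auto()   # driving through the crossing

def step(state, pedestrian_moving_toward_road, pedestrian_clear):
    """One tick of the negotiation. Inputs are assumed perception outputs."""
    if state is State.WAITING:
        return State.INCHING          # signal intent by creeping forward
    if state is State.INCHING:
        if pedestrian_moving_toward_road:
            return State.YIELDING     # pedestrian claimed the gap: stop
        if pedestrian_clear:
            return State.PROCEEDING   # pedestrian stayed put: go
        return State.INCHING          # still ambiguous: keep creeping
    if state is State.YIELDING:
        if pedestrian_clear:
            return State.WAITING      # renegotiate once they have crossed
        return State.YIELDING
    return State.PROCEEDING
```

Even this toy version shows the problem: every threshold behind `pedestrian_moving_toward_road` has to be tuned against exactly the kind of ambiguous body language described above, and a conservative tuning collapses into the wimpy-driver behavior.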
Indeed, a report from the British Department for Transport predicts that traffic on highways will slow down somewhat because of timid autonomous systems until some threshold of autonomous density is reached. But I believe that the dynamics of pedestrian interaction will make the problem much more serious than that.
Consider that there will, for years, be a range of self-driving cars sharing the road with pedestrians and human-driven cars. The self-driving cars will themselves range from semiautonomous ones, with level-2 or -3 autonomy, to fully autonomous ones, at levels 4 and 5 [see sidebar, “5 Levels of Autonomy”]. If a semiautonomous car is not playing by the unwritten rules, bystanders will probably blame the person using the car. But they won’t have that choice if the car is fully autonomous. So in that case, they will blame the car.
It’s not hard to see how this could lead to real contempt for cars with level-4 and level-5 autonomy. It will come from pedestrians and human drivers in urban areas. And people will not be shy about expressing that contempt. In private conversations with me, at least one manufacturer is afraid that human drivers will bully self-driving cars operating with level-2 autonomy, so the engineers are taking care that their level-3 test cars look the same as conventional models.
Bullying can go both ways, of course. The flip side of socially clueless autonomous cars is the owners of such cars taking the opportunity to be antisocial themselves.
Up from Central Square toward Harvard Square in Cambridge is a stretch of Massachusetts Avenue that mixes residential and commercial buildings, with metered parking. One day I needed to stop at the UPS store there to ship a heavy package, and as there were no free parking spots I found myself cruising up and down a 100-meter stretch as I waited for a spot to open up. The thought occurred to me that if I’d had a level-4 or -5 self-driving car I could have left it to do that circling while I dropped into the store. Such is the root of antisocial behavior: convenience for me versus inconvenience for everyone else.
People will be tempted to take many other little shortcuts with their autonomous cars. I’m sure the owners will be more creative than I can be, but here are three additional examples:
- People will jump out of their cars at a Starbucks to run in and pick up their orders, leaving them not in legal parking spots but blocking others, knowing that the cars will take care of getting out of the way if some other car needs to get by. That may well work, but only by slowing everything down for other people. And perhaps the owners will be able to set the tolerance on how uncomfortable things have to get before the cars move. I can’t see that ending well.
- Suppose someone is going to an evening event without much parking nearby. And suppose autonomous cars are always prowling neighborhoods waiting for their owners to summon them, so it takes a while for any particular car to get through the traffic to the pickup location. Then the members of a two-car family may send one of their cars earlier in the day to find the closest parking spot that it can, then rely on their second car to drop them at the event and send it home immediately. When the event is over, their first autonomous car is right there waiting for them. The cost is foisted off on the commons, in the form of a parking spot occupied all day. (Oh yeah, and by the way, with double the greenhouse gases emitted.)
- In the various suburban schools that my kids went to there was a pickup ritual: Mothers, mostly, would drive up just before dismissal time and line up in the order of arrival; when school let out, the teachers would bring out the kids, and the parents and teachers would cooperate to get the kids into their car seats. Then off would go the cars, one at a time. When the first few families have fully driverless cars, one can imagine them sending their cars to wait in line first, so that their kids get picked up first and brought home. There will be a contest to see whose robotic car can get to school first. Teachers, too, will be inconvenienced, but people will still try it.
Early on in the transition to driverless cars, the rich will have a whole new way to alienate the rest of society. If you’re doubtful, take a drive south from San Francisco on U.S. Route 101 in the morning and see the Teslas speeding down the left-hand lane.
Here’s another reason why I’m skeptical about autonomous cars: The United States and most other countries haven’t even managed to fully automate their mass-transit systems. So how are we supposed to achieve the far more difficult task of completely automating cars?
True, there are many driverless train systems in the world, but they mostly operate in very restricted environments—in the United States most are found in airports and span just a few kilometers of track, all of it completely segregated from the vehicles and people that are outside the system. Such systems closely correspond to level-4 autonomy for cars, but in extremely restricted geographical environments. Level-5-autonomy trains would move on tracks with level crossings or function as streetcars that share space with automobiles and pedestrians. No one is testing level-5 train autonomy or even proposing to do it.
Note how much simpler navigation is for a train than for a car. Trains have rails that physically restrict where the trains can go. And note further that all train systems are run by teams of specialists. Individual consumers do not buy and operate trains, yet that is precisely what we are expecting will happen in the coming market for self-driving cars.
I believe that self-driving cars will be limited, at first, to applications similar to that of today’s self-driving airport trains. We’ll see autonomous trucks convoying behind a single human-piloted truck in designated lanes. But once that convoy is off the highway, we’ll demand that a human driver be behind the wheel in each truck.
And, just like trains in airports, we’ll see level-4 cars driving themselves in limited, pedestrian-free domains—in garages, for instance, where drivers can drop off cars and let them park themselves with only inches to spare on each side.
Somewhat later we might see level-4 autonomy for ride-hailing services in limited areas of major cities, with the ride beginning and ending within a well-defined area where well-observed “walk” signals keep pedestrians and cars apart. Some parts of San Francisco might work for this. Indeed, Uber has experimented there with such cars, although each still comes with a human minder behind the wheel, who takes over in case the software fails.
We might also see level-4 autonomy on some delivery vehicles in dense urban environments. But they will need to be ultradeferential to pedestrians, as well as avoiding choke points used by other cars during peak commuting periods.
That is where we are today. People are overestimating how quickly level-5 autonomy will come and overestimating how widespread level-4 autonomy will become in the near future. They see only the technical possibilities, not the resistance that will come when autonomous agents invade human spaces, be they too rude or overly deferential.
Certainly this new way of driving will eventually come. It will creep up on us, finally reducing manual driving to a recreation confined to specialized entertainment zones. The day of the robocar is inevitable, but that day will not come soon.
And flying cars? Forget about ’em.
This article appears in the August 2017 print issue as “The Self-Driving Car’s People Problem.”