Communication is a sixth sense that lets us see through other people’s eyes and thus through walls, over the horizon, and around corners. “Look out for that lion,” someone said to our ancient ancestor, and that’s why we’re standing here today.
Birds do it, with their calls; bees do it, with their waggle dance. Now cars are doing it, with wireless systems.
I saw it at first hand yesterday, at MCity, a mock cityscape—complete with Hollywood-style street facades—that the University of Michigan, in Ann Arbor, has set up to test self-driving cars. The place is littered with antennas, power outlets, and cameras, including an eye in the sky—a quadcopter from a local aerial photography company.
I sit in a Kia Soul, along with Gabor Orosz, a professor of mechanical engineering, and a student, as we head into a bend shrouded in shrubbery. Suddenly we come upon a car parked right in the middle of the road, and Orosz hits the brakes with a screech. “If we’d had icy weather, there’s no way I could have stopped the car,” he says.
We do it again, but this time the car drives itself and has its communication system turned on. Well before we make the turn, the car slows to a stop: It has received a signal from that car in the road. “Instead of getting into trouble, and asking the car to get us out of it, we prevent the trouble in the first place,” Orosz says.
Such communication—from car to car, and from a car to nearby traffic signals—has been mandated by U.S. safety regulators. When the details of the emerging system are approved, possibly this year, the rule could take effect by 2020. General Motors has already equipped its 2017 Cadillac CTS with the system.
The technology is perfect for robocars, but it can help even in today’s manually driven cars, as I’m shown in another test. This time we’re driving in the third of three cars when the first one brakes hard. The second one doesn’t, but that’s okay, because a buzzer goes off in our cabin, alerting our driver to brake. With this technology around, you can kiss multicar pileups goodbye.
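The logic behind that in-cabin alert can be sketched in a few lines. Everything here—the message fields, the warning threshold, the function name—is a hypothetical illustration, not the actual DSRC message format:

```python
# Sketch: a forward-collision warning built on V2V broadcasts.
# Assumes each car broadcasts (car_id, position_m, speed_mps, hard_braking);
# all field names and thresholds are illustrative, not the real standard.

def should_warn(own_position_m, own_speed_mps, messages, horizon_s=4.0):
    """Warn if any car ahead is hard-braking within our stopping horizon."""
    for msg in messages:
        gap_m = msg["position_m"] - own_position_m
        if msg["hard_braking"] and 0 < gap_m < own_speed_mps * horizon_s:
            return True
    return False

# The third car in the convoy hears the lead car brake, even though the
# middle car blocks its line of sight and hasn't braked yet.
messages = [
    {"car_id": "lead",   "position_m": 80.0, "speed_mps": 5.0,  "hard_braking": True},
    {"car_id": "middle", "position_m": 40.0, "speed_mps": 25.0, "hard_braking": False},
]
print(should_warn(own_position_m=0.0, own_speed_mps=25.0, messages=messages))  # → True
```

The key point is that the warning depends only on the radio message, not on seeing the braking car—which is exactly what made the third-car demo work.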
The full system has yet to be rolled out, but wireless sharing of data is already under way in the streets of Ann Arbor, a largish college town. Some 1,500 people have signed up for the research project, enough to constitute 2 to 5 percent of the traffic during rush hour. Next year, the program expects to add another 1,000 cars to the fleet.
The data they gather is already helping the city’s traffic control system optimize its stoplights; soon it should be possible to prioritize a given traffic light for a particular vehicle—say, a city bus that needs a few more seconds to make the light because it’s behind schedule. Another possible application is a curb-mounted station that broadcasts a signal telling your car what the speed limit is and whether you’re exceeding it. One such station is already in operation in front of a Wendy’s restaurant about 5 minutes down the road from the MCity test track.
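A signal-priority decision of the kind described above could be sketched like this. The message fields and the decision rule are illustrative assumptions, not Ann Arbor’s actual system:

```python
# Sketch: transit signal priority—a late bus asks the light for extra green.
# Field names, the 60-second lateness rule, and the 8-second cap are all
# hypothetical choices for illustration.

def grant_priority(request, max_extension_s=8):
    """Extend the green phase only for a transit vehicle behind schedule."""
    if request["vehicle_type"] == "bus" and request["schedule_delay_s"] > 60:
        return min(request["seconds_needed"], max_extension_s)
    return 0

print(grant_priority({"vehicle_type": "bus", "schedule_delay_s": 120, "seconds_needed": 5}))  # 5
print(grant_priority({"vehicle_type": "car", "schedule_delay_s": 0,   "seconds_needed": 5}))  # 0
```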
Could the curbside station, or others like it, one day email a speeding ticket to your car? Or maybe it could just take the money out of your checking account and tell you about it afterwards.
It takes 20 to 30 minutes to fit a car with the necessary hardware: a GPS sensor and a wireless transceiver. Here in the MCity compound, at least, the GPS system uses a repeater to enhance its accuracy down to the centimeter level—good enough to locate a car precisely and to allow other cars to figure out its trajectory and measure its speed. The wireless transceiver uses DSRC (dedicated short-range communications), a protocol that operates in the 5.9-gigahertz band.
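The actual over-the-air format is the Basic Safety Message defined in SAE J2735, a compact binary encoding that cars broadcast on the order of ten times a second. The JSON sketch below only illustrates the kind of fields such a message carries; the names are hypothetical:

```python
import json
import time

# Sketch of the kind of message a DSRC transceiver broadcasts repeatedly.
# The real format is SAE J2735's binary Basic Safety Message; JSON here
# is purely for readability.

def make_safety_message(car_id, lat, lon, speed_mps, heading_deg):
    return json.dumps({
        "id": car_id,
        "lat": lat,                  # centimeter-level with GPS correction
        "lon": lon,
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "timestamp": time.time(),
    })

msg = make_safety_message("kia-soul-01", 42.2995, -83.6996, 13.4, 270.0)
decoded = json.loads(msg)
print(decoded["speed_mps"])  # 13.4
```

A receiving car that collects a few of these in a row has everything it needs to reconstruct the sender’s trajectory—which is how the parked-car warning in the earlier demo worked.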
To get data from actual self-driving vehicles, the University of Michigan will inaugurate a shuttle service from North Campus to the main campus, a 3-kilometer (2-mile) circuit. The shuttle, made by Navya, a French company, follows a set path at the pace of a trot, around 10 kilometers per hour (6 mph).
I sit on a bench inside the sensor-festooned shuttle as it plies a curving gravel road, and I ask one of our minders whether the shuttle’s sensors had ever recorded anything strange. Indeed they had. A fawn once stepped in front of one, and the thing politely stopped for the baby deer. But nobody retrieved the video feed, so the YouTube viral moment was lost forever.
Fun fact: The Navya is mostly leased to organizations, like MCity, that are partners more than customers. But if you’d like to buy one outright, it’ll cost you well over US $200,000, and maintenance will run you around $40,000 a year.
The point of all this is to generate critically useful experience. We’ll need a whole lot of it to cut the accident rate down to the parts-per-million level that safe driving requires. It’s no surprise that Waymo, the company that’s been testing robocars the longest, has the lowest rate of unplanned driver interventions. Yet even Waymo hasn’t covered the billions—yes, billions—of kilometers you need to traverse in order to encounter all the hard problems that are lurking out there. These are the edge cases, as engineers call them.
Edge cases tend to be rare, as indeed severe accidents are. In the United States, only about one person dies for every 100 million miles driven (the highest rate is South Carolina’s, which has 1.89 deaths per 100 million miles, according to the Insurance Institute for Highway Safety). A fleet of a few thousand self-driving cars won’t log that many miles anytime soon.
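A back-of-envelope calculation makes the point. The fleet size and daily mileage below are illustrative guesses, not Ann Arbor’s actual numbers:

```python
# Back-of-envelope: how long would a test fleet take to log 100 million miles,
# the distance over which the average U.S. driver population suffers one
# fatality? Fleet size and daily mileage are illustrative assumptions.

fleet_size = 2_500           # cars in the fleet
miles_per_car_per_day = 40   # a typical day's driving
miles_per_year = fleet_size * miles_per_car_per_day * 365
years_to_100m = 100_000_000 / miles_per_year

print(miles_per_year)            # 36500000 miles a year
print(round(years_to_100m, 1))   # 2.7 years per expected fatality
```

Even this generously sized fleet would need years to cover the distance behind a single data point—and the billions of kilometers needed to flush out the edge cases would take decades.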
That’s why engineers at the University of Michigan have concluded that in order to get enough experience, they must go beyond the real world to the virtual one. They are pouring data from their tests here into a system that constructs a huge number of variations, then beams that modeled world out to the car on the MCity track. That way, it can confront a lot of hair-raising problems without endangering anyone. Call it augmented reality for cars.
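Conceptually, the augmented-reality trick amounts to merging virtual objects into the car’s live perception feed. The object fields in this sketch are hypothetical; the actual MCity pipeline isn’t described in that detail:

```python
# Sketch of augmented reality for cars: the car's perception input is the
# union of real detections and virtual objects streamed from the simulator.
# Object fields are illustrative assumptions.

def augmented_scene(real_detections, virtual_objects):
    """Merge simulator objects into the live scene, tagging provenance for logs."""
    scene = [dict(obj, source="real") for obj in real_detections]
    scene += [dict(obj, source="virtual") for obj in virtual_objects]
    return scene

real = [{"kind": "pedestrian", "range_m": 30.0}]
virtual = [{"kind": "red-light-runner", "range_m": 45.0}]
scene = augmented_scene(real, virtual)
print([obj["kind"] for obj in scene])  # ['pedestrian', 'red-light-runner']
```

Because the car treats virtual objects exactly like real ones, it can be made to brake for a red-light-runner that exists only in software—with no one actually in harm’s way.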
At one point during one of our drives, our car approached an intersection when another car zoomed through, running a red light. Of course, our driver knew it was going to happen—the lawyers would have insisted on it—and he braked in good time.
Later, in a small control room with four big screens, I see the event playing out again—first, as it actually happened, then again as a virtual variation of the event.
“It’s difficult for us to test in a dangerous environment, like running a red light,” says Henry Liu, a professor of civil and environmental engineering at the university. “Augmented reality addresses this barrier by creating a simulated world, running in parallel with the real MCity.”
He gestures toward one of the screens, and I see why our car had stopped in front of a mocked-up train crossing. A virtual train—the Hogwarts Express?—had been passing through, but only the car had been able to see it.
Philip E. Ross became a senior editor at IEEE Spectrum in June 2006. His interests include transportation, energy storage, artificial intelligence, natural-language processing, and the economic aspects of technology. He has reported on solar towers in Spain, cloud seeding in Nevada, telescopes atop a mountain in the Canaries, and robotic cars in California and Germany. He blogs mainly for Cars That Think, which won a 2015 Neal Award. Earlier in his career he worked for Red Herring, Forbes, Scientific American, and The New York Times. He has a master's degree in international affairs from Columbia University and another, in journalism, from the University of Michigan.