Superaccurate GPS may soon solve three robocar bugbears—blurred lane markings, bad weather, and over-the-horizon blind spots. These are things lidar and radar often can’t see, see through, or see around.
A group led by Todd Humphreys, an aerospace engineer at the University of Texas at Austin, has just tested a software-based system that can run on processors in today’s cars, using data from scattered ground stations, to locate a car to within 10 centimeters (4 inches). That’s good enough to keep you smack in the middle of your lane all the time, even in a blizzard.
“When there’s a standard deviation of 10 cm, the probability of slipping into the next lane is low enough, meaning 1 part in a million,” he says. Today’s unaided GPS gives meter-plus accuracy, which gives you maybe 1 part in 10, if that, he adds.
That’s not a great percentage, particularly if you’re driving a semi. Lane-keeping discipline is non-negotiable for a robocar.
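Humphreys’s figures can be sanity-checked with a Gaussian tail calculation. A minimal sketch, assuming the car’s lateral position error is zero-mean Gaussian and that roughly 0.5 meters of lateral margin separates the car from the lane line (the margin figure is an assumption for illustration, not from the source):

```python
import math

def lane_departure_probability(margin_m: float, sigma_m: float) -> float:
    """Two-sided Gaussian tail: P(|lateral error| > margin).

    Assumes zero-mean Gaussian positioning error with standard
    deviation sigma_m, and a lane-keeping margin of margin_m meters.
    """
    return math.erfc(margin_m / (sigma_m * math.sqrt(2)))

# Hypothetical 0.5 m of margin, with 10 cm vs. meter-level positioning:
print(lane_departure_probability(0.5, 0.10))  # on the order of 1e-6
print(lane_departure_probability(0.5, 1.00))  # well over 1 in 10
```

The half-meter margin is the kind of slack a car has before crossing a lane line; at a 10-cm standard deviation that margin is five sigmas away, while at a meter it is half a sigma — which is the gulf between one-in-a-million and coin-flip territory.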
The team, which was backed by Samsung, began with the idea of giving smartphones super-GPS positioning power. But though that idea worked, it was limited by inadequate antennas, which neither Samsung nor any other vendor is likely to improve unless some killer app should come along to justify the extra cost.
“We pivoted then, to cars,” Humphreys says.
Humphreys works on many aspects of GPS; just last month he wrote for IEEE Spectrum on how to protect the system from malicious attack. He continues to do basic research, but he also serves as the scientific advisor to Radiosense, a firm his students recently founded. Ford has recently contacted the company, as has Amazon, which may be interested in using the positioning service in its planned fleet of cargo-carrying drones. Radiosense is already working with its own drones—“dinnerplate-size quadcopters,” Humphreys says.
Augmented GPS has been around since the 1980s, when it finally gave civilians the kind of accuracy that the military had jealously reserved for itself. Now the military uses it too, for instance to land drones on aircraft carriers. It works by using not just satellites’ data signals but also the carrier signals on which the data are encoded. And, to estimate distances to satellites without being misled by the multiple pathways a signal may take, these systems use a range of sightings—say, taken while the satellite moves in the sky. They then use algorithms to locate the receiver on a map.
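The precision gain from tracking the carrier rather than the data code comes down to the size of the feature being measured. A sketch using the published GPS L1 signal parameters, and assuming the common rule of thumb that a receiver resolves roughly 1 percent of the feature it tracks (the 1 percent figure is an assumption, not from the source):

```python
# Why carrier-phase tracking beats ordinary code tracking: the
# measurable "feature" is about 1,500 times shorter.
C = 299_792_458.0        # speed of light, m/s
L1_FREQ = 1_575.42e6     # GPS L1 carrier frequency, Hz
CA_CHIP_RATE = 1.023e6   # C/A code chipping rate, chips/s

carrier_wavelength = C / L1_FREQ   # ~0.19 m per carrier cycle
chip_length = C / CA_CHIP_RATE     # ~293 m per code chip

# Assumed rule of thumb: resolve ~1% of the tracked feature.
print(f"code-phase ranging precision:    ~{0.01 * chip_length:.1f} m")
print(f"carrier-phase ranging precision: ~{0.01 * carrier_wavelength * 1000:.1f} mm")
```

The catch is that carrier phase only gives range modulo one 19-cm wavelength; resolving that whole-cycle ambiguity is what the sightings and algorithms above are for, and it is why convergence has historically taken time.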
But until now it worked only if you had elaborate antennas, powerful processing, and quite a lot of time. It could take 1 to 5 minutes for the algorithm to “converge,” as the jargon has it, onto an estimate.
“That’s not good, I think,” Humphreys says. “My vision of the modern driver is one who’s impatient, who wants to snap into 10-cm-or-better accuracy and push the ‘autonomy’ button. Now that does require that the receiver be up and running, but once it’s on, when you exit a tunnel, boom, you’re back in.” And in your own lane.
Another drawback of existing systems is cost. “I spoke with Google—they gave me a ride in Mountain View, Calif., in November—and I asked them at what price point this would be worth it to them,” Humphreys says. “They originally had this Trimble thing, $60,000 a car, but they shed it, thinking that that was exorbitant. They want a $10,000 [total] sensor package.”
The Texas student team keeps the materials cost of the receiver system at just US $35 per car, running their software-defined system entirely on a $5 Raspberry Pi processor. Of course, the software could piggyback, almost unnoticed, on the powerful robocar processors that are coming down the pike from companies like Nvidia and NXP.
Just as important as the receivers is the ground network of base stations, which the Texas team has shown must be spaced within 20 kilometers (12 miles) for full accuracy. And, because the students’ solar-powered, cellphone-network-connected base stations cost only about $1,000 to build, it wouldn’t be too hard to pepper an entire region with them.
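Those two numbers — 20-km spacing and $1,000 per station — invite a back-of-envelope coverage estimate. A minimal sketch, assuming a simple square grid (one station per 20-by-20-km cell) and using Texas’s roughly 696,000 km² as the example region (the grid geometry and region choice are assumptions for illustration):

```python
import math

STATION_SPACING_KM = 20.0   # spacing the Texas team reports is needed
STATION_COST_USD = 1_000    # reported build cost per base station

def stations_needed(area_km2: float) -> int:
    """Square-grid approximation: one station per spacing^2 cell."""
    return math.ceil(area_km2 / STATION_SPACING_KM**2)

texas_km2 = 695_662  # approximate area of Texas
n = stations_needed(texas_km2)
print(f"~{n} stations, ~${n * STATION_COST_USD:,} in hardware")
```

Even covering a state the size of Texas comes out to on the order of 1,700 stations and a couple of million dollars in hardware — which is what makes “peppering an entire region” sound plausible.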
You’d need more stations per unit of territory where satellite signals get bounced around or obscured, as they are in cities, particularly heavily built-up parts. It’s tough, Humphreys admits, in the canyons of Manhattan. Conveniently, though, it is in just such boxed-in places that the robocar’s cameras, radar, and lidar work the best, thanks to the many easily recognized buildings there that can serve as landmarks.
“Uber’s engineers hate bridges, because there are not a lot of visual features,” Humphreys notes. “They would do well to have 10-cm precise positioning; it can turn any roadway into a virtual railway.”
What’s next, after cars? Humphreys is still looking for the killer app to justify superaccurate GPS in handheld systems.
“We’re looking into outdoor virtual reality,” Humphreys says. “You could put on a visor and go on your favorite running trail, and it would represent it to you in a centimeter-accurate way, but artistically enhanced—maybe you’d always have a blue sky. You could craft the world to your own liking.” While staying on the path, of course.