Someday, U.S. soldiers fighting in the streets of a sprawling megacity will need an airdrop of ammunition, food, or water that can’t be safely delivered by ground convoy or helicopter. But the supplies parachuting from the skies won’t have to rely on GPS signals that suffer from inaccuracy in cluttered city environments or can be disrupted by enemies. The U.S. military has been testing new supply airdrops that can automatically aim for a precise landing based on images of the target area.
Recent tests of the U.S. Army’s Joint Precision Airdrop System (JPADS) have been trying new navigational software—developed by the Draper Laboratory in Cambridge, Mass., and other companies—to achieve GPS-style accuracy with images alone. The software figures out its current location by comparing ground terrain features, such as trees or buildings seen by onboard cameras, with the latest satellite or drone images of the target area in its database. That allows the software to accurately guide the descent of the parafoil-equipped cargo as it glides toward the ground. It’s all part of a broader effort by the U.S. military to test computer-driven versions of old-fashioned navigation by sight.
“It’s what we humans have been using since the beginning of time, vision-based navigation,” said Gary Thibault, supervisory mechanical engineer for the Airdrop/Aerial Delivery program in the office of the U.S. Army’s Product Manager Force Sustainment Systems.
Moving away from the modern U.S. military’s reliance on GPS has big advantages. Anyone who has tried using GPS directions on their smartphone while walking or driving in a city knows how GPS accuracy can suffer at times. The current reliance on GPS-guided airdrops could prove challenging for troops who will inevitably find themselves patrolling or fighting within huge cities in the future. Enemy jamming of GPS signals or possibly even direct attacks on the satellites forming the GPS constellation could also deny crucial positional information.
That’s why the JPADS testing is just one possible use of vision-based navigation for the U.S. military. Similar systems could guide the descent of paratroopers jumping out of aircraft, robotic drones flying surveillance or strike missions, military aircraft piloted by humans, or possibly even vehicles on the ground.
“This camera-based navigation can conceivably be extended to any platform that has a need to know where it is, whether it be an autonomous vehicle, manned aircraft, unmanned aircraft or something else,” said Chris Bessette, JPADS program manager at Draper Laboratory.
The JPADS program has been testing Draper Laboratory’s vision-based software called “Lost Robot.” For the latest JPADS tests, commercial off-the-shelf cameras attached to the airdropped cargo provide images of the ground below. The software compares what the camera sees with the latest satellite images of the ground target area—satellite images that can be loaded into the JPADS database just before the mission takes off. That means the vision-based JPADS can function as “drop and forget” cargo delivery that automatically steers itself to the target without requiring outside signals or information.
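The core idea of matching what an onboard camera sees against a stored satellite image is a classic image-registration problem. As a rough illustration only—Draper has not published the details of its “Lost Robot” algorithm—a minimal sketch using normalized cross-correlation (a common template-matching technique) shows how a small camera view could be located within a larger reference map; the function name and synthetic data below are hypothetical:

```python
import numpy as np

def locate_in_reference(camera_patch, satellite_map):
    """Find the (row, col) offset in satellite_map where camera_patch
    best matches, scored by normalized cross-correlation.

    This is a textbook registration method, not the actual JPADS software.
    """
    ph, pw = camera_patch.shape
    mh, mw = satellite_map.shape
    patch = camera_patch - camera_patch.mean()
    best_score, best_pos = -np.inf, (0, 0)
    # Slide the camera view over every possible position in the map.
    for r in range(mh - ph + 1):
        for c in range(mw - pw + 1):
            window = satellite_map[r:r + ph, c:c + pw]
            w = window - window.mean()
            denom = np.sqrt((patch ** 2).sum() * (w ** 2).sum())
            if denom == 0:
                continue  # featureless window: no usable correlation
            score = (patch * w).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy demo: embed the "camera view" inside a synthetic satellite image.
rng = np.random.default_rng(0)
sat = rng.random((60, 60))
patch = sat[20:30, 35:45].copy()  # what the onboard camera "sees"
pos, score = locate_in_reference(patch, sat)
print(pos)  # → (20, 35)
```

Note the `denom == 0` guard: a perfectly uniform window gives no correlation signal at all, which mirrors the real system’s trouble with flat, featureless terrain discussed below.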
Ideally, the Army wants to see if the vision-based version of JPADS can maintain the current GPS-guided system’s accuracy. One of the current GPS-guided systems can deliver 2,000 pounds of airdropped cargo to a target area with an accuracy that is significantly better than its threshold accuracy of 150 meters. A larger version of JPADS has the requirement of delivering 10,000 pounds of cargo to a target within 250 meters.
Experimental versions of JPADS have also tinkered with both much lighter and much heavier cargo. But the heavier the cargo, the trickier it becomes to rapidly steer the parafoil and keep the airdrop on target. It’s like the difference between “parking a massive Mack Truck versus a bicycle,” said Richard Benney, director of the Aerial Delivery Directorate at the U.S. Army Natick Soldier Research, Development and Engineering Center.
The vision-based approach does have its limits. Very flat, featureless terrain, such as a snow-covered field, could pose a challenge, Benney said. Heavy cloud cover can also give the software a harder time by preventing the onboard cameras from getting a good look at the ground below. For now, the JPADS testing, conducted with the cooperation of both the Army and Air Force, has been going well in the “cactus-ridden desert” of Arizona.
Eventually, the Army may consider infrared cameras or other video camera equipment to help deal with the system’s current sight limitations. Higher-end sensors than off-the-shelf cameras could also improve the software’s performance by capturing more detailed image data of the ground. But the JPADS team wants to keep the added costs relatively low so that supply airdrops remain cost-effective in the future.
“We’re trying to keep it a very cheap, reliable delivery vehicle to get the guys on the ground what they want,” Benney said.
Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.