
JPL's AI-Powered Racing Drone Challenges Pro Human Pilot

It's human vs. machine in this racing drone test

NASA JPL is developing AI-powered drones that are fully autonomous.
Photo: NASA JPL

As drones and their components get smaller, more efficient, and more capable, we’ve seen an increasing amount of research toward getting these things flying by themselves in semistructured environments without relying on external localization. The University of Pennsylvania has done some amazing work in this area, as has DARPA’s Fast Lightweight Autonomy program.

At NASA’s Jet Propulsion Laboratory, they’ve been working on small drone autonomy for the past few years as part of a Google-funded project. The focus is on high-speed dynamic maneuvering, in the context of flying a drone as fast as possible around an indoor race course using only onboard hardware. For the project’s final demo, JPL raced their autonomous drones through an obstacle course against a professional human racing drone pilot.

The AI-powered drone is fully autonomous, meaning that there’s no external localization or off-board computer control. A Qualcomm Snapdragon Flight board is used for real-time flight control. The drone has a 3D map of the course that it constructs itself using its two wide field-of-view cameras: one pointing forward and the other pointing downward, resulting in a 250-degree-plus FOV with a persistent horizon. The two cameras generate a depth map from motion stereo, and in flight, the cameras plus an IMU localize to the map and perform visual-inertial odometry for motion tracking.
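The core geometry here, recovering depth by triangulating a feature seen from two camera positions, can be sketched in a few lines. This is a simplified illustration, not JPL's pipeline: the focal length, baseline, and disparity values below are hypothetical, and in motion stereo the "baseline" is the distance the drone itself has traveled between frames (estimated from the IMU and odometry) rather than a fixed separation between two cameras.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole-camera stereo triangulation: Z = f * B / d.

    disparity_px -- pixel shift of a feature between the two views
    focal_px     -- camera focal length, in pixels
    baseline_m   -- separation between the two viewpoints, in meters
                    (for motion stereo, the distance flown between frames)
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: a 400 px focal length, 0.5 m of travel between
# frames, and features that shifted 10 and 20 pixels.
depths = depth_from_disparity([10.0, 20.0], focal_px=400.0, baseline_m=0.5)
```

Note the inverse relationship: features that barely move between frames (small disparity) are far away, which is also why depth estimates get noisier with distance.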


While the drones are capable of straight-line speeds of over 120 km/h, JPL’s warehouse isn’t quite large enough for them to go flat out, sadly. The constrained track proved especially tricky for the professional human drone racing pilot, Ken Loo, who found the density of the course mentally fatiguing. Once Loo learned the course, though, he could complete it in an average of just over 11 seconds, while the autonomous drone took an average of 3 seconds longer. The time difference mostly came down to aggression: while the autonomous drone was smoother and more consistent, clocking nearly the same time every lap, Loo accelerated and decelerated more quickly, and was able to dynamically improvise maneuvers and shortcuts that the autonomous system couldn’t.

The project’s manager at JPL is Rob Reid, who helped develop that nifty robotic space hedgehog back in 2015. We spoke with Reid to find out why the heck they let a human win this race, and how they’re going to stop that from ever happening again.

IEEE Spectrum: Can you describe the drone autonomy research that JPL has been involved in that led to this demonstration?

Rob Reid: JPL has been researching camera-based navigation techniques for spacecraft and micro aerial vehicles (drones) for decades. Since 2013, it has collaborated with Google on Project Tango, and over the last two years, it has integrated Tango into a drone to demonstrate novel navigation algorithms. The team has explored various trajectory optimization techniques that account for effects such as aerodynamics and camera motion blur.

JPL’s AI-powered racing drone. Photo: NASA JPL

Why was a drone race an ideal way for you to demonstrate progress in this area?

The goal was to demonstrate high-performance autonomous flight among obstacles—an indoor drone race provides a complex track full of obstacles, along with a compelling reason to fly fast through them!

Were you expecting that the human pilot would win?

I wasn’t surprised by the outcome; we were confident that our drone system was going to be competitive; however, we weren’t sure who was going to learn an optimal trajectory (i.e., racing line) the fastest! With only one afternoon of flying, Ken was able to shave seconds off his lap time much faster than our algorithms could. In the weeks since, we have sped up our optimization approach considerably.

What are the limitations of the hardware that the drones are using to navigate, and how did that affect their performance in the race?

The biggest performance limitation for fast indoor flight comes from the shutter speed of the onboard cameras that are used to track the drone’s motion—flying too fast while too close to the ground, or rolling or pitching too quickly can cause the image to blur and the drone to become lost. We addressed this in two ways: First, by using two wide field-of-view cameras—by pointing one forward and the other downward, the >250-degree field of view allows the drone to always see the horizon. Second, we adjusted trajectories to cap rotation rates and speed-to-height ratio.
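The speed-to-height cap Reid describes falls out of simple geometry: for a downward-facing camera, ground features sweep through the image at an angular rate of roughly v/h, so blur during an exposure scales with focal length, speed-to-height ratio, and shutter time. A back-of-envelope sketch, with entirely hypothetical parameter values:

```python
def blur_pixels(speed_mps, height_m, exposure_s, focal_px):
    # Downward camera over flat ground: features move past at about
    # v/h radians per second, streaking f * (v/h) * t_exp pixels
    # across the sensor during one exposure.
    return focal_px * (speed_mps / height_m) * exposure_s

def max_speed_for_blur(max_blur_px, height_m, exposure_s, focal_px):
    # Invert the relation: for a given blur budget, the allowable
    # speed grows linearly with altitude -- hence capping the
    # speed-to-height ratio rather than speed alone.
    return max_blur_px * height_m / (exposure_s * focal_px)

# Hypothetical: 10 m/s at 2 m altitude, 5 ms exposure, 400 px focal length
streak = blur_pixels(10.0, 2.0, 0.005, 400.0)
```

The same reasoning applies to fast rolls and pitches, which spin the whole scene across the sensor regardless of altitude, so rotation rates get capped too.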

What will it take before drones like these are competitive with human expert pilots in structured environments?

For a typical drone race, the hardware is ready to beat human experts: Our drones are “race spec” and can pull a few g’s. We couldn’t fly a nighttime race, or on a track with lots of visual repetition.

Are you continuing this project? If so, what can we look forward to?

The work is ongoing. Unfortunately, I can’t say much about what’s next! But you can look forward to drones with the ability to sense obstacles and update their own trajectories online.

This area of robotics is progressing rapidly; things like event-based cameras could potentially solve the issue of motion blur to some extent and enable even more dynamic autonomous maneuvers. And Reid is definitely right that drone hardware is poised to surpass human performance, although that’s the case with robotics in general—we’re at the point where, with a few exceptions, robotics is much more of a software challenge than a hardware challenge. This doesn’t mean that it’s necessarily any easier to solve, though, and we’re excited to see how JPL’s drones evolve.

[ JPL ]


How Robots Can Help Us Act and Feel Younger

Toyota’s Gill Pratt on enhancing independence in old age

Illustration: Dan Page

By 2050, the global population aged 65 or more will be nearly double what it is today. The number of people over the age of 80 will triple, approaching half a billion. Supporting an aging population is a worldwide concern, but this demographic shift is especially pronounced in Japan, where more than a third of Japanese will be 65 or older by midcentury.

Toyota Research Institute (TRI), which was established by Toyota Motor Corp. in 2015 to explore autonomous cars, robotics, and “human amplification technologies,” has also been focusing a significant portion of its research on ways to help older people maintain their health, happiness, and independence as long as possible. While an important goal in itself, improving self-sufficiency for the elderly also reduces the amount of support they need from society more broadly. And without technological help, sustaining this population in an effective and dignified manner will grow increasingly difficult—first in Japan, but globally soon after.
