Event Camera Helps Drone Dodge Thrown Objects

Watch this drone not get hit by a soccer ball

This drone uses an event camera, which responds to changes in a scene on a per-pixel basis in microseconds, to nimbly avoid obstacles.
Image: University of Zurich via YouTube

Davide Scaramuzza’s Robotics and Perception Group at the University of Zurich pioneered the use of event cameras on drones. We first wrote about event cameras back in 2014: These are sensors that aren’t good at interpreting a scene visually the way a regular camera is, but they’re extremely sensitive to motion, responding to changes in a scene on a per-pixel basis in microseconds. A regular camera that detects motion by comparing one frame with another takes milliseconds to do the same thing. That might not seem like much, but for a fast-moving drone it could easily be the difference between crashing into something and avoiding it successfully.
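To make the contrast concrete, here’s a minimal toy model (our illustration, not the sensor’s actual circuitry): an event camera effectively fires an independent event at each pixel whose log brightness changes by more than a threshold, instead of shipping whole frames. The function name and threshold below are hypothetical.

```python
import numpy as np

def generate_events(prev, curr, threshold=0.2):
    """Toy event-camera model: emit an event at every pixel whose
    log-intensity change between two instants exceeds `threshold`.
    A real sensor does this asynchronously, per pixel, in microseconds."""
    delta = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A static scene produces no events; only the pixel that changed fires.
prev = np.zeros((4, 4))
curr = np.zeros((4, 4))
curr[1, 2] = 5.0  # a bright object enters one pixel
events = generate_events(prev, curr)  # -> [(2, 1, 1)]
```

The key property is in the output: the data volume scales with how much the scene changes, not with the frame size, which is why a fast-moving obstacle shows up almost instantly while a static background contributes nothing.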

In a paper recently accepted to IEEE Robotics and Automation Letters, Davide Falanga and Suseong Kim from Scaramuzza’s group take a look at exactly how much of a difference it can make to use an event camera on drones moving at high speeds. And to validate their research, they hurl soccer balls at a drone as hard as they can, and see if it can dodge them.

Impressive, right? As far as the drone is concerned, this is a clever way of mimicking obstacle encounters in high-speed flight, since it’s relative velocity that’s important. Also, the researchers say that in each case, motion capture data confirms that “the ball would have hit the vehicle if the avoidance maneuver was not executed.”

The time it takes a robot (of any kind) to avoid an obstacle is constrained primarily by perception latency, which includes perceiving the environment, processing those data, and then generating control commands. Depending on what sensor you’re using, what algorithm you’re using, and what computer you’re using, typical perception latency is anywhere from tens of milliseconds to hundreds of milliseconds. The sensor itself is usually the biggest contributor to this latency, which is what makes event cameras so appealing—they can spit out data with a theoretical latency measured in nanoseconds.
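The stakes of that latency range are easy to quantify with back-of-the-envelope arithmetic (our sketch; the numbers are illustrative, not from the paper): during the perception latency, the drone simply keeps flying blind.

```python
def blind_distance(speed_mps, latency_s):
    """Distance the drone covers before it can even begin to react."""
    return speed_mps * latency_s

# At 10 m/s, a 100 ms frame-based pipeline means a full meter of blind
# travel; a few-millisecond event pipeline shrinks that to centimeters.
frame_cam = blind_distance(10.0, 0.100)  # -> 1.0 m
event_cam = blind_distance(10.0, 0.003)  # -> 0.03 m
```

Against a thrown soccer ball closing at several meters per second, that meter of blind travel is exactly the margin that decides between a dodge and a hit.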

The quadrotor used by the University of Zurich researchers featured a (1) Insightness SEEM1 sensor; (2) Intel Upboard computer, running the detection algorithm; and a (3) Lumenier F4 AIO flight controller, which received commands from a ground station. Image: University of Zurich

The question that the University of Zurich researchers want to answer is how much the perception latency actually affects the maximum speed at which a drone can move while still being able to successfully dodge obstacles. Comparing the kind of traditional vision sensors that you’d find on a research-grade quadrotor (both mono and stereo cameras) with an event camera, it turns out that the difference is actually not all that significant, as long as you’re dealing with a quadrotor that’s not moving too quickly. As the speed of the quadrotor increases, though, event cameras start to make a difference—a quadrotor with a thrust-to-weight ratio of 20, for example, could achieve maximum safe obstacle avoidance speeds about 12 percent higher than with a traditional camera. Quadrotors this powerful don’t exist yet (maximum thrust-to-weight ratios are closer to 10), but we’re getting there.
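The intuition behind that result can be captured with a back-of-the-envelope kinematic model (ours, not the paper’s actual analysis, and every number below is hypothetical): the drone must finish a sideways dodge before it covers its sensing range, and the latency eats into the time available.

```python
import math

def max_safe_speed(sensing_range_m, latency_s, lateral_accel_mps2, dodge_dist_m):
    """Crude upper bound on safe forward speed: the obstacle is first seen
    at sensing_range_m, and the drone must displace itself laterally by
    dodge_dist_m before closing that gap."""
    # Time to shift sideways from rest at constant lateral acceleration:
    # d = 0.5 * a * t^2  ->  t = sqrt(2 * d / a).
    dodge_time = math.sqrt(2 * dodge_dist_m / lateral_accel_mps2)
    # The drone closes on the obstacle during both latency and dodge.
    return sensing_range_m / (latency_s + dodge_time)

# Hypothetical numbers: 8 m sensing range, 0.3 m dodge, a very agile platform.
frame_cam = max_safe_speed(8.0, 0.050, 100.0, 0.3)  # ~50 ms frame pipeline
event_cam = max_safe_speed(8.0, 0.003, 100.0, 0.3)  # ~3 ms event pipeline
```

In this toy model, shaving the latency matters little when the dodge itself is slow, but the more lateral acceleration (i.e., thrust-to-weight) the platform has, the more the latency term dominates—matching the paper’s qualitative finding that event cameras pay off most on very agile drones.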

It’s perhaps a little surprising that event cameras don’t offer a more significant latency benefit for quadrotors at lower speeds, but it’s important to remember that event cameras are pretty cool for other reasons as well: They don’t suffer from motion blur, and they’re much more resilient to challenging lighting conditions, able to work just fine in the dark as well as when you’re dealing with high dynamic range, like looking into the sun. As the speed and agility of drones increase, and especially if we want to start using them in unstructured environments for practical purposes, it sure seems like event cameras will be the way to go.

“How Fast is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid,” by Davide Falanga, Suseong Kim, and Davide Scaramuzza from the University of Zurich, has been accepted to IEEE Robotics and Automation Letters.

[ RPG ]


The Bionic-Hand Arms Race

The prosthetics industry is too focused on high-tech limbs that are complicated, costly, and often impractical


The author, Britt Young, holding her Ottobock bebionic bionic arm.

Gabriela Hasbun. Makeup: Maria Nguyen for MAC cosmetics; Hair: Joan Laqui for Living Proof

In Jules Verne’s 1865 novel From the Earth to the Moon, members of the fictitious Baltimore Gun Club, all disabled Civil War veterans, restlessly search for a new enemy to conquer. They had spent the war innovating new, deadlier weaponry. By the war’s end, with “not quite one arm between four persons, and exactly two legs between six,” these self-taught amputee-weaponsmiths decide to repurpose their skills toward a new projectile: a rocket ship.

The story of the Baltimore Gun Club propelling themselves to the moon is about the extraordinary masculine power of the veteran, who doesn’t simply “overcome” his disability; he derives power and ambition from it. Their “crutches, wooden legs, artificial arms, steel hooks, caoutchouc [rubber] jaws, silver craniums [and] platinum noses” don’t play leading roles in their personalities—they are merely tools on their bodies. These piecemeal men are unlikely crusaders of invention with an even more unlikely mission. And yet who better to design the next great leap in technology than men remade by technology themselves?
