A Drone with Bug Vision

What do you do when you don't have an inertial guidance system? Use your eyes.

Photo: AMU/CNRS

Almost anything that flies, be it a plane, a spacecraft, or a drone, has an inertial navigation system whose accelerometers and gyroscopes control yaw, pitch, and roll, and thus the flight path. Flying insects like bees, however, don't have inertial systems to guide them; they rely exclusively on what they see. This has inspired two researchers at Aix-Marseille University in France to build a drone that imitates the way these insects navigate. Their goal was to design it to fly and circumvent obstacles by relying solely on visual cues. In a paper published in the 26 February issue of Bioinspiration & Biomimetics, Fabien Expert and Franck Ruffier describe how their drone, which they call BeeRotor, was able to traverse a circular tunnel while avoiding crashes and obstacles.

The tiny craft, guided by panoramic optic flow (OF) sensors rather than by inertial navigation, weighs 80 grams and is 47 centimeters long. It was attached to the end of a freely rotating arm at the center of a circular tunnel, with two rotors keeping it aloft. The ceiling and the floor of the tunnel were covered with photographs of natural surfaces, and the speed at which the details of these photographs passed above or below the drone was continuously monitored by the optic flow sensors' 26 photodiodes.
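The idea behind such a sensor can be pictured as a "time of travel" measurement: two neighboring photodiodes see the same piece of texture a short time apart, and the angular speed of the scene is the angle between them divided by that delay. The sketch below illustrates the idea in Python; the names, the 4-degree photodiode spacing, and the cross-correlation delay estimate are illustrative assumptions, not the BeeRotor sensor code.

```python
# Illustrative sketch of how a pair of photodiodes can yield an optic flow
# reading: the same texture detail is seen by two neighboring photodiodes a
# short time apart, and angular speed = angle between them / that delay.
# All names, angles, and the cross-correlation delay estimate are assumptions
# for illustration, not the BeeRotor sensor code.
import numpy as np

PIXEL_ANGLE = np.radians(4.0)  # assumed angular separation of two photodiodes


def optic_flow_from_photodiodes(sig_a, sig_b, dt):
    """Estimate optic flow (rad/s) from two photodiode traces sampled every dt seconds."""
    # Cross-correlate the two traces to find how long a contrast takes to
    # travel from photodiode A to photodiode B.
    corr = np.correlate(sig_b - sig_b.mean(), sig_a - sig_a.mean(), mode="full")
    lag = int(corr.argmax()) - (len(sig_a) - 1)  # delay of B behind A, in samples
    delay = max(lag, 1) * dt                     # seconds (guard against zero lag)
    return PIXEL_ANGLE / delay


# Synthetic example: photodiode B sees the same contrast edge 5 ms after photodiode A.
t = np.arange(0.0, 0.2, 0.001)
edge = lambda shift: 1.0 / (1.0 + np.exp(-200.0 * (t - 0.1 - shift)))
flow = optic_flow_from_photodiodes(edge(0.0), edge(0.005), dt=0.001)
print(f"estimated optic flow: {flow:.1f} rad/s")
```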

The principle of the optic flow guiding mechanism is quite simple, explains Ruffier. The robot does not measure its own speed or its distance from the floor; instead, if the details of the surface below pass by too fast, the optic flow sensor triggers a feedback system that increases the speed of the rotors, moving the craft away from the floor, he says. The "eye" has four optic flow sensors, two directed toward the front of the craft and two looking backward. Each sensor is equipped with a lens that focuses the image on six photodiodes, which record the speed of a passing element as its image moves from one pixel to the next. The eye keeps itself aligned with the nearest surface. "Orienting its eye allows the robot to avoid slopes of up to 30 degrees," says Ruffier.
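As a concrete illustration of the feedback Ruffier describes, here is a minimal control-loop sketch: the measured optic flow is compared with a setpoint, and the rotor command is nudged so the craft climbs when the ground streams past too quickly. The setpoint, gain, and thrust variable are assumed values for illustration, not the authors' controller.

```python
# Minimal sketch of the optic-flow-regulation idea described above: the craft
# never computes its speed or altitude, it only keeps the measured optic flow
# near a setpoint. Setpoint, gain, and thrust values are assumptions for
# illustration, not the authors' controller.
FLOW_SETPOINT = 2.5   # desired optic flow, rad/s (assumed)
GAIN = 0.05           # proportional gain (assumed)


def update_thrust(thrust, measured_flow):
    """One control step: raise thrust when the ground streams past too fast.

    A flow above the setpoint means the surface is too close (or the craft is
    too fast), so the rotors spin up and the craft climbs away from the floor;
    a flow below the setpoint lets the craft sink back toward the surface.
    """
    return thrust + GAIN * (measured_flow - FLOW_SETPOINT)


# Example: texture passing faster than the setpoint -> thrust command rises.
thrust = 0.50
thrust = update_thrust(thrust, measured_flow=3.2)
print(f"new thrust command: {thrust:.3f}")   # 0.535
```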

The team is now working on a rotorcraft that will be able to leave its perch and fly around freely. This will require two additional optic flow sensor systems, which will control roll and yaw so that the craft can veer right or left as it approaches an obstacle.

Why would anyone need drones guided by optic flow sensors when inertial navigation systems have done the same job well for decades? Drone manufacturers are asking that question, but there is interest from the aerospace industry, especially in Europe, says Ruffier. The probes that land on the Moon, Mars, or comets carry inertial systems that account for about 20 percent of their weight. Space agencies, Ruffier says, are interested in visual navigation systems because they would be much lighter. And even if they don't replace inertial systems on spacecraft, they could act as backup systems poised to save space missions, he adds.
