Swiss researchers have used a fruit fly to steer a mobile robot through an obstacle course in the lab. They call it the Cyborg Fly.
Chauncey Graetzel and colleagues at ETH Zurich's Institute of Robotics and Intelligent Systems started by building a miniature IMAX movie theater for their fly. Inside, they glued the insect in place facing an LED screen that flashed different patterns. These patterns visually stimulated the fly to beat its left or right wing faster or slower, and a vision system translated the wing motion into commands that steered the robot in real time.
The fly, in other words, believed it was airborne when in reality it was fixed to a tether ("A" in the image below), watching LEDs blink ("B") while remote controlling a robot ("C") from a virtual-reality simulation arena ("D"). Is this The Matrix, or Avatar, for flies?
Graetzel tells me the goal of the project was to study low-level flight control in insects, which could help design better, bio-inspired robots. "Our goal was not to replace human drivers with flies," he quips.
The key component in their setup was a high-speed computer vision system that captured the beating of the fly's wings. It extracted parameters such as wing beat frequency, amplitude, position, and phase. This data, in turn, was used to drive the mobile robot. Closing the loop, the robot carried cameras and proximity sensors; an algorithm transformed this data stream into the light patterns displayed on the LED screen.
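The closed loop described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual implementation: the function names, gains, and the simple differential-drive mapping are all assumptions made for clarity.

```python
# Hypothetical sketch of the Cyborg Fly control loop. The mappings and
# gain values below are illustrative assumptions, not the actual code
# used in the experiments.

def steering_from_wings(left_hz, right_hz, gain=0.05, base_speed=1.0):
    """Map the fly's left/right wing-beat frequencies (Hz), as measured
    by the vision system, to differential-drive wheel speeds."""
    # A difference in wing-beat frequency indicates a turn, so steer
    # the robot proportionally to that difference.
    turn = gain * (left_hz - right_hz)
    return base_speed + turn, base_speed - turn  # (left wheel, right wheel)

def led_offset_from_sensors(proximity_left, proximity_right):
    """Close the loop: translate the robot's proximity readings (0..1)
    into a horizontal shift of the pattern shown on the LED screen."""
    # An obstacle on the robot's left shifts the visual pattern so the
    # fly is stimulated to steer away from it.
    return proximity_right - proximity_left  # -1 (shift left) .. +1 (shift right)
```

In a real setup these two functions would run inside a loop at the camera's frame rate, with the LED offset feeding back into the fly's visual stimulus on every iteration.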
In a paper in the July 2010 issue of IEEE Transactions on Automation Science and Engineering, they describe the vision system's latest version. It uses a camera that focuses on a small subset of pixels of interest (the part of the fly's wings responsible for most lift, for instance) and a predictive algorithm that constantly reevaluates and selects this subset. The researchers report that their system can sample the wings at 7 kilohertz -- several times as fast as other tracking techniques.
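The idea of reading out only a small, constantly re-predicted window of pixels can be sketched roughly as follows. This is a minimal illustration assuming a simple constant-velocity predictor and a fixed window size; the paper's actual algorithm is more sophisticated.

```python
# Illustrative sketch of predictive region-of-interest (ROI) tracking.
# The constant-velocity model and 32x32-pixel window are assumptions
# for illustration, not the parameters used in the paper.

def predict_next(prev, curr):
    """Constant-velocity prediction of the tracked feature's next
    (x, y) position from its last two observed positions."""
    return (2 * curr[0] - prev[0], 2 * curr[1] - prev[1])

def roi_around(center, half=16, width=640, height=480):
    """Return a small pixel window (x0, y0, x1, y1) around the predicted
    position, clamped to the sensor. Reading out only this window,
    instead of the full frame, is what enables kilohertz sampling."""
    x = min(max(center[0], half), width - half)
    y = min(max(center[1], half), height - half)
    return (x - half, y - half, x + half, y + half)
```

Each frame, the tracker updates the prediction from the newest wing position and moves the readout window accordingly, so the camera never wastes bandwidth on pixels far from the wing.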
"As autonomous robots get smaller, their size and speed approach that of the biological counterparts from which they are often inspired," they write in the paper, adding that their technique could "be relevant to the tracking of micro and nano robots, where high relative velocities make them hard to folow and where robust visual position feedback is crucial for sensing and control."
The main Cyborg Fly experiments took place about two years ago as part of a research effort led by professor Steven Fry at the Fly Group at ETH/University of Zurich. That work was a collaboration with ETH's Institute of Robotics and Intelligent Systems, directed by professor Bradley Nelson. Tufts University's Center for Engineering Education and Outreach, in Boston, directed by mechanical engineering professor Chris Rogers, was also involved.
The Cyborg Fly is not the only "flight simulator" for bugs, and other research groups have used insects to control robots. Still, the ETH project stands out because of its high-speed vision component. This system could be useful not only for biology research, to study insect flight and track fast movements of appendages or the body, but also for industrial applications -- monitoring a production line or controlling fast manipulators, for example.
Graetzel says they tested two different "movie theater" configurations. One used two parallel LED panels, with the fly in the middle. They later upgraded it to a cylindrical LED panel. They also used two types of robot. The first was an e-puck, a small wheeled robot designed for use in research projects. Later the researchers built a robot using Lego NXT.
The Cyborg Fly project was a finalist in the robotics category at this year's Graphical System Design Achievement Awards, an event organized by National Instruments, in Austin, Tex.
Graetzel has since received his PhD degree and moved on to other things -- that do not involve flies.
Images and videos: ETH Zurich/Institute of Robotics and Intelligent Systems