
Eye See You: Composites of hard and soft materials and circuits make up an electronic version of an insect's compound eye. Photo: University of Illinois and Beckman Institute

New “insect eye” cameras could someday help flying drones see into every corner of a battlefield or give tiny medical scopes an all-around view inside the human body. A team of researchers from the United States has constructed such a camera, which offers an almost 180-degree field of view using hundreds of tiny lenses.

The centimeter-wide digital camera has 180 microlenses—roughly what fire ants or bark beetles have in their compound eyes—placed on a hemispherical array. Researchers hope their design will eventually lead to insect-eye cameras that exceed even nature’s blueprints, according to a report in the 2 May issue of the journal Nature.

“We think of the insect world as an inspiration for design, but we’re not constrained by it,” says John Rogers, a physical chemist and materials engineer at the University of Illinois at Urbana-Champaign. “It’s not biomimicry; it’s bioinspiration.”

Biological insect eyes consist of hundreds or thousands of tiny units, each with a lens, pigment, and photoreceptors. Each unit's lens is mounted on a transparent crystalline cone that pipes light down to the photoreceptors. Black pigment isolates each of the eye units and screens out background light.

Biomimicry: The 160-degree, 180-pixel eye is inspired by an insect's compound eye. Photo: University of Illinois and Beckman Institute

Nature’s design offers two huge advantages over that of ordinary cameras. First, the hemispherical shape allows for extremely wide-angle fields of view. Second, the hemispherical array of tiny lenses has an almost infinite depth of field, which keeps objects in focus regardless of their distance from the camera.

But camera chips aren’t usually shaped like fly eyes. Researchers faced the tricky task of bending the camera into a hemispherical shape without distorting the image created by each lens or ruining the electronics beneath the tiny lenses. Their solution “relies on composites of hard and soft materials in strategic layouts that allow stretching and bending and flexing to go from planar [flat] to hemispherical form,” Rogers says.

Rogers and his colleagues put the tiny lenses on top of columns connected to a flexible base membrane—all made from elastomeric polydimethylsiloxane material, which is also used in contact lenses. Each supporting cylindrical post protected its lens from any bending or stretching in the base membrane.

The array of tiny lenses sat on a second layer of stretchable silicon photodiodes that converted the focused light from the lenses into current or voltage. Tiny serpentine wires connected the array of photodiodes with the other electronics.

A third, “black matrix” layer sat on top of both the lens layer and the photodiode layer to act as the shield against background light. The black pigment of real insect eyes can adjust in real time to changing light conditions, but the artificial camera version must use software to make such adjustments.

The design allowed researchers to freely inflate the flat layers into the final hemispherical shape—a camera with a 160-degree field of view. (The prototype camera’s array of lenses didn’t quite stretch all the way to the edge of the hemispherical shape.)

A next step could involve figuring out how to dynamically “tune” the inflated shape of the camera, says Rogers. He has also challenged his team to try inflating the camera shape into an almost full spherical shape—he envisions flexible camera designs based on the different compound eyes of other creatures, such as lobsters and shrimp (reflecting superposition eyes), moths and lacewings (refracting superposition eyes), and houseflies (neural superposition eyes).  

In the insect-eye camera, each individual lens-and-photodiode unit contributes 1 pixel of resolution. A 180-pixel camera may not do much right now, but the design can scale up its resolution by adding more units to the array. Rogers anticipates making camera designs with better resolution than the eyes of praying mantises (15 000 eye units) and dragonflies (28 000 eye units).
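Because each unit maps to one pixel, the scale of the challenge is easy to estimate. A minimal sketch (the unit counts come from the article; the comparison itself is just arithmetic, not from the paper):

```python
# One lens/photodiode unit = one pixel, so resolution scales linearly
# with the number of units in the hemispherical array.
EYE_UNITS = {
    "prototype camera": 180,
    "praying mantis": 15_000,
    "dragonfly": 28_000,
}

def scale_factor(target_units: int, current_units: int = 180) -> float:
    """How many times more units an array needs to match a target eye."""
    return target_units / current_units

for name, units in EYE_UNITS.items():
    print(f"{name}: {units} units, {scale_factor(units):.0f}x the prototype")
```

Matching a dragonfly's eye, for instance, would mean packing roughly 156 times as many units onto the same kind of stretchable array.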

The technology won’t likely be used in consumer digital cameras any time soon. But the insect-eye cameras could be used in medical devices, such as endoscopes, which give physicians a look inside the human body. Alexander Borst, director of the Max Planck Institute of Neurobiology, in Germany, envisions commercial versions of the cameras within the next year or two.

Such cameras may also prove useful for small drones to explore disaster areas such as those left behind by the Chernobyl and Fukushima nuclear disasters, Borst says. He was not involved in the latest research but hopes to work with Rogers and his colleagues to put the insect-eye camera to use in a robo-fly developed at his institution.

About the Author

Jeremy Hsu is a New York City–based freelance writer. In April 2013, he reported on a big step toward a silicon quantum computer.

