Realistic Retinas Make Better Bionic Eyes

Following nature’s example more closely could lead to better visual sensors

New visual sensors inspired by the human eye could help the blind see again and provide powerful new ways for machines to sense the world around them. Recent research shows that more faithfully copying nature’s hardware could be the key to replicating its powerful capabilities.

Efforts to build bionic eyes have been underway for several decades, with much of the early research focused on creating visual prostheses that could replace damaged retinas in humans. But in recent years, there’s been growing recognition that the efficient and adaptable way in which the eye processes information could prove useful in applications where speed, flexibility, and power constraints are major concerns. Among these are robotics and the Internet of Things.

Now, a pair of new research papers describe significant strides toward replicating some of the eye's capabilities by more closely imitating the function of the retina—the collection of photoreceptors and neurons at the back of the eye responsible for converting light into visual signals for the brain. “It’s a very exciting extension from where we were before,” says Hongrui Jiang, a professor of engineering at the University of Wisconsin–Madison. “These two papers [explain research aimed at] trying to mimic the natural visual system’s performance, but on the retina level right at the signal-filtering and signal-processing stage.”

One of the most compelling reasons to do this is the retina’s efficiency. Most image sensors rely on components called photodiodes, which convert light into electricity, says Khaled Salama, professor of electrical and computer engineering at King Abdullah University of Science and Technology (KAUST), but photodiodes constantly consume electricity, even when they’re on standby, which leads to high energy use. In contrast, the photoreceptors in the retina are passive devices that convert incoming light into electrical signals that are then sent to the brain. In an effort to recreate this kind of passive light-sensing capability, Salama’s KAUST team turned to an electrical component that doesn’t need a constant source of power—the capacitor.

“The problem is that capacitors are not sensitive to light,” says Salama. “So we decided to embed a light-sensitive material inside the capacitor.” The team sandwiched a layer of perovskite—a material prized for its electrical and optical properties—between two electrodes to create a capacitor whose ability to store energy, or capacitance, changed in proportion to the intensity of the light to which it was exposed. The researchers found that the resulting device mimicked the characteristics of the rod-cell photoreceptors found in the retina.

To see if the devices they created could be used to make a practical image sensor, the team fabricated a 100-by-100 array of them, then wired them up to simple circuits that converted the sensors’ change in capacitance into a string of electrical pulses, similar to the spikes of neural activity that rod cells use to transmit visual information to the brain. In a paper published in the February issue of Light: Science & Applications, they showed that a special kind of artificial neural network could learn how to process these spikes and recognize handwritten numbers with an accuracy of roughly 70 percent.
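The idea of turning a light-dependent capacitance into neuron-like pulses can be illustrated with a simple model. The sketch below is not the KAUST team's actual circuit: it assumes a made-up linear relationship between light intensity and capacitance, and uses a basic integrate-and-fire rule to convert the capacitance change into a spike train whose rate tracks brightness.

```python
# Illustrative model only: constants and the linear capacitance law are
# assumptions for demonstration, not measured device parameters.

def capacitance(intensity, c0=1.0, k=0.5):
    """Model capacitance (arbitrary units) rising linearly with light intensity."""
    return c0 * (1.0 + k * intensity)

def encode_spikes(intensity, baseline=0.0, steps=100, threshold=1.0):
    """Integrate-and-fire encoding: accumulate the capacitance change at each
    time step and emit a spike whenever the accumulator crosses the threshold."""
    delta_c = capacitance(intensity) - capacitance(baseline)
    spikes = []
    accumulator = 0.0
    for t in range(steps):
        accumulator += delta_c
        if accumulator >= threshold:
            spikes.append(t)          # record the time step of this spike
            accumulator -= threshold  # reset, keeping any leftover charge
    return spikes

dim = encode_spikes(0.1)     # sparse spike train
bright = encode_spikes(1.0)  # dense spike train
```

A spiking neural network of the kind the paper describes would then learn to classify patterns directly from such pulse trains, rather than from conventional pixel values.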

An array of the bio-inspired sensors produces strings of electrical pulses in response to light, which are then processed by a spiking neural network. Dr. Mani Teja Vijjapu/King Abdullah University of Science and Technology

Thanks to its incredibly low energy requirements, Salama says future versions of the KAUST team’s bionic eye could be a promising solution for power-constrained applications like drones or remote camera systems. “The best application for something like this is security, because often nothing is happening,” he says. “You are wasting a lot of power to take images and take videos and process them to figure out that there is nothing happening.”

Another powerful capability of our eyes is the ability to rapidly adapt to changing light conditions. Image sensors can typically operate only within a limited range of illuminations, says Yang Chai, an associate professor of materials science at the Hong Kong Polytechnic University. Because of this, they require workarounds like optical apertures, adjustable exposure times, or complex postprocessing to deal with varying real-world light conditions. By contrast (pun intended), when you transition from a dark cinema hall to a brightly lit lobby, it takes only a short while for your eyes to adjust automatically. That’s thanks to a mechanism known as visual adaptation, in which the sensitivity of photoreceptors changes automatically depending on the level of illumination.

In an effort to mimic that adaptability, Chai and his colleagues designed a new kind of image sensor whose light sensitivity can be modulated by applying different voltages to it. In a paper in Nature Electronics, his team showed that an array of these sensors could operate over an even broader range of illuminations than the human eye. They also paired the array with a neural network and showed that the system’s ability to recognize handwritten numbers improved drastically as the sensors adapted, going from 9.5 percent to 96.1 percent accuracy as it adjusted to bright light and from 38.6 percent to 96.9 percent as it adjusted to darkness. These capabilities could be very useful for machines that have to operate in a wide range of lighting conditions. One application for which it will be quite helpful, says Chai, is in a self-driving car, which has to keep track of its position with respect to other objects on the road as it enters and exits a dark tunnel.
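The basic feedback loop behind visual adaptation can be sketched in a few lines. This is a hedged toy model, not the device physics from the Nature Electronics paper: it assumes a sensor gain that is gradually re-tuned toward the reciprocal of the ambient light level, so the output settles back to a fixed operating point after a sudden change in illumination. The adaptation rate and target level are arbitrary.

```python
# Toy model of visual adaptation: all parameters are illustrative assumptions.

def adapt(ambient_levels, gain=1.0, rate=0.5, target=1.0):
    """Return the sensor response at each time step while the gain adapts.

    ambient_levels: sequence of incident light intensities over time.
    rate: fraction of the remaining gain error corrected per step.
    target: the operating point the adapted response converges toward."""
    responses = []
    for light in ambient_levels:
        responses.append(gain * light)     # instantaneous response
        ideal_gain = target / light        # gain that maps this light to target
        gain += rate * (ideal_gain - gain) # move part-way toward the ideal gain
    return responses

# Walking from a dark cinema (intensity 0.1) into a bright lobby (10.0):
trace = adapt([0.1] * 20 + [10.0] * 20)
# The response jumps right after the transition, then settles back toward 1.0.
```

In the same spirit as the dark-cinema example, the model's output briefly saturates when the light changes a hundredfold, then recovers as the gain re-equilibrates, which is roughly the behavior a recognition network downstream would benefit from.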

While there’s still a long way to go before bionic eyes approach the capabilities of their biological cousins, Jiang says the kinds of in-sensor adaptation and signal processing achieved in these papers show why researchers should be paying more attention to the finer details of how the retina achieves its impressive capabilities. “The retina is an amazing organ,” he says. “We’re only just scratching the surface.”
