Brain Power

Borrowing from biology makes for low-power computing

Opening image credits: BRAIN: GREGOR SCHUSTER/GETTY IMAGES; GAUGE: PETER DAZELEY/GETTY IMAGES

Read this aloud and your inner ear, by itself, will be carrying out at least the equivalent of a billion floating-point operations per second, about the workload of a typical game console. The inner ear together with the brain can distinguish sounds that have intensities ranging over 120 decibels, from the roar of a jet engine to the rustle of a leaf, and it can pick out one conversation from among dozens in a crowded room. It is a feat no artificial system comes close to matching.

But what's truly amazing is the neural system's efficiency. Consuming about 50 watts, that game console throws off enough heat to bake a cookie, whereas the inner ear uses just 14 microwatts and could run for 15 years on one AA battery. If engineers could borrow nature's tricks, maybe they could build faster, better, and smaller devices that don't literally burn holes in our pockets. The idea, called neuromorphic engineering, has been around for 20 years, and its first fruits are finally approaching the market.

The likely first application is bionics--the use of devices implanted into the nervous system to help the deaf, blind, paralyzed, and others. There are two reasons for this choice: the biological inspiration carries over naturally to the application, and in implanted devices the premium on energy efficiency is especially high.

Bionic ears are a case in point. Today's device, called a cochlear implant, consists of an implanted electrode array; a bulky, power-hungry digital-signal processor worn outside the ear; and a wireless link that conveys data and power to the implanted electrodes. In the near future, these devices will be fully implanted inside the body so that deaf people will be indistinguishable from everyone else in both appearance and, we hope, ability to hear. In the past year, my lab at the Massachusetts Institute of Technology has completed work on a bionic-ear processor that does the job of the digital-signal processor, is small enough to be implanted, and could run on a 2-gram battery needing a wireless recharge only every two weeks [see illustration, "Mimicking the Ear"]. As the best batteries currently available can be recharged about 1000 times, this device is the first to permit 30-year operation without surgery to replace the battery. Last year, a deaf woman replaced her conventional processor with ours, though it was not implanted, and afterward she could understand speech easily and well.

Neuromorphic engineering and, more generally, biologically inspired electronics are still in their infancy, but practitioners have already accomplished amazing things [see table, "Leading Labs"]. These include the attempt to understand biological systems, such as the retina of the human eye and the sonar systems of bats, by modeling them in microchips. Some of the lessons learned have been turned to practical purposes--for instance, applying the principles of vision in the housefly to the control of robotic motion and designing radio-frequency spectrum analyzers that mimic the architecture of the human inner ear. Some devices now measure oxygen saturation in the blood with sensors and processors inspired by the photoreceptors in our eyes; others employ pattern-recognition circuits that rely on the mix of analog and digital features found in the brain.

One of biology's big power-saving secrets is that it relies on the physics of special-purpose structures, such as ears and eyes, to do a lot of analog computing. Ears, for example, are complex structures that by their inherent physics alone perform filtering, frequency-spectrum analysis, and signal compression--all before the signals are transmitted to the brain. Many of the initial insights into biology's computing efficiency originated with Carver Mead, professor emeritus at the California Institute of Technology, in Pasadena--the founding father of neuromorphic engineering.

But ears, eyes, and even individual brain cells also have a digital aspect. Brain cells, or neurons, can be viewed as special-purpose analog-to-digital converters. They recognize particular patterns of voltage inputs from other neurons, integrate these signals in an analog manner, and then output a digital-like signal: a voltage spike (1) or its absence (0). Output spikes from one neuron act as inputs to the next neuron. And this simple process, amplified and repeated by billions of interconnected neurons, leads to movement, hearing, thought, and everything else under our brain's control [see the sidebar accompanying the online version of this article].

Analog devices in the ear, such as the eardrum and the cochlea, process sound. The ear then digitizes the processed sound signal by encoding it as spikes of voltage that travel down the auditory nerve to the brain, which interprets the spikes to distinguish a jazz tune from an oncoming train or a whisper. Because the ear has already done a great deal of analog computation on the sound, the information it provides the brain is more compact and far better suited than raw sound to human tasks, such as understanding what a child is whispering in a crowded movie theater. This scheme of low-power analog processing followed by digitization is one of the most important lessons biology has to teach designers of electronics.

By now you might be wondering: if analog computing is so marvelously efficient, why is almost every electronic system you come across digital? One obvious reason is digital's relative immunity to noise. With only two possible values, 1 and 0, noise and other vagaries of the electronic world are unlikely to alter a signal. What's more, digital systems are robust; they are insensitive to changes in temperature, to noise from their power supplies, and to variations among their main constituent parts--transistors. Digital systems also tend to be relatively easy to program and scale up.

The advantage analog computers have is in how they use their transistors and other electronic components. Whereas digital devices use transistors as simple switches, analog systems recognize that transistors are complicated things with physical properties that you can compute with. The use of a transistor's real physical relationship to current and voltage, essentially all the shades of gray between digital's black and white, should let analog computers calculate with much greater efficiency than digital ones. But there are two important limits.

An analog system beats a digital one in efficiency only if the analog system doesn't have to do very precise calculations or output its answer at a high bandwidth. Precision basically means being able not only to compute 2 + 2 to get 4 but also to calculate 1.9866235 + 2.0133765 to get 4.0000000. For an analog system, precision is related to how a device's performance varies with temperature fluctuations, power-supply noise, slight differences among individual transistors, and the inherent, wholly unavoidable fluctuations in current flow known as thermal noise.

Analog computing's problem with precision is best illustrated by comparing an analog adder to a digital adder. To sum using an analog device, just add the currents entering a particular part of a circuit. To add 4 to 5, put 4 milliamperes on one line, 5 milliamperes on another, and join the two. But notice that both the inputs and the output would have to be accurate to within a milliampere to get the right answer.

A digital adder needs more circuitry, and thus more power, to operate, but it does not require such high accuracy. An 8-bit number would be represented by eight wires, each carrying either a digital 1 or 0. The logic circuits that add the numbers need only be precise enough to tell a 1 from a 0. Adding bigger or more precise numbers, say 16-bit numbers, in a digital adder means just doubling the number of 1-bit wires leading to a similarly enlarged set of logic circuits. Such an upgrade is much harder with analog signals: the circuit would have to go from milliampere accuracy to submicroampere accuracy.
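
A quick back-of-the-envelope sketch makes that scaling concrete. The 10-milliampere full scale below is an assumed, illustrative figure, not a number from any particular circuit:

```python
# Why analog precision gets expensive: to represent an N-bit result on a
# single current line, the circuit must resolve full_scale / 2**N.

full_scale_ma = 10.0  # assumed 10-milliampere full-scale signal

for bits in (8, 16):
    step_ma = full_scale_ma / 2**bits      # smallest distinguishable current
    print(f"{bits}-bit analog adder: must resolve "
          f"{step_ma * 1000:.3f} microamperes per step")

# 8-bit:  ~39 microamperes per step
# 16-bit: ~0.153 microamperes per step -- submicroampere accuracy,
# matching the jump described in the text
```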

Analog computing's other problem, bandwidth, is caused by that unavoidable buzz, thermal noise. In order to eliminate its effect on, say, the answer to an addition problem, analog computers average the answer to a calculation over a period of time. Trying to compute more quickly means less time for averaging and more chance the answer will be corrupted by thermal noise.
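
A toy simulation illustrates the trade-off. The noise level and sample counts here are arbitrary assumptions, chosen only to show that the error falls roughly as one over the square root of the averaging time:

```python
import numpy as np

# An analog sum corrupted by white "thermal" noise becomes more accurate
# as the averaging window grows -- but a longer window means lower bandwidth.

rng = np.random.default_rng(0)
true_sum = 4.0 + 5.0                     # the quantity the circuit computes
noise_rms = 0.5                          # assumed per-sample noise (arbitrary)

for samples in (1, 100, 10_000):
    readings = true_sum + noise_rms * rng.standard_normal(samples)
    estimate = readings.mean()           # more averaging = less bandwidth
    print(f"averaged over {samples:>6} samples: "
          f"estimate = {estimate:.4f}, error = {abs(estimate - true_sum):.4f}")
```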

So digital processing certainly has its advantages. It is common today, when working with signals from the real world, to convert the signal immediately into a torrent of digital bits using a fast and highly precise analog-to-digital converter and then to do all the subsequent processing with lots of watt-munching digital computations. The processor then spits out a smaller stream of bits that are meaningful to the computer or other device whose job it is to interpret the signal. For example, a typical 16-bit audio converter in a digital voice recorder might churn out 352 kilobits per second of digital data. After lots of digital processing, the signal might result in just 5 kb/s of useful speech data.
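
The arithmetic behind those figures is easy to check. The 22,050-sample-per-second rate below is our assumption, inferred from the quoted bit rate rather than stated in the text:

```python
# A 16-bit converter sampling at 22,050 samples per second (half the CD
# rate, an assumed figure) produces roughly the 352 kb/s quoted above.

bits_per_sample = 16
sample_rate_hz = 22_050
raw_kbps = bits_per_sample * sample_rate_hz / 1000
print(f"raw digitized audio: {raw_kbps:.1f} kb/s")   # ~352 kb/s
print(f"useful speech data:  ~5 kb/s, a {raw_kbps / 5:.0f}x reduction")
```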

What these numbers demonstrate is the inefficiency of turning analog signals into digital bits and running digital processing algorithms on them. The fewer bits that need to be converted and processed, the better. As we noted earlier, nature's solution is to first process the incoming analog information efficiently with interconnected, special-purpose analog devices--eardrums, cochleas, and sensory cells, for instance--and delay the analog-to-digital conversion until after this processing has reduced the amount of information needing to be digitized. For example, rather than report just the intensity of the light falling on each of millions of cells in your retina, interconnected neurons in your eye use analog processes to calculate where the edges in an image lie and encode that data as spikes of voltage on your optic nerve.
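
Here is a minimal sketch of the kind of data reduction that edge encoding buys. The one-dimensional "image" and threshold are made up for illustration; real retinal circuitry is of course far richer:

```python
import numpy as np

# Report only where intensity changes -- the "edges" -- rather than every
# raw value, the way the retina compresses what it sends down the optic nerve.

intensity = np.array([5, 5, 5, 5, 40, 40, 40, 12, 12, 12], dtype=float)

# Crude lateral-inhibition-style computation: difference between neighbors.
edges = np.abs(np.diff(intensity))
spike_positions = np.nonzero(edges > 1.0)[0]   # "spikes" only at the edges

print("raw samples sent:", intensity.size)
print("edge spikes sent:", spike_positions.size, "at positions", spike_positions)
```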

So how do organs such as the inner ear deal with the imprecision of analog computing? Indeed, many biological components, such as neurons in the brain, are by themselves low-precision components. But in complete organs such as an ear or eye, a great many imprecise analog processing elements interconnect and deliver amazingly accurate information. For a bat to find a bug by echolocation, its auditory system must discern differences in sound arrival time on the order of tens to hundreds of nanoseconds, yet the neurons involved in that process are accurate only down to the millisecond. Just as the digital adder discussed earlier combines many 1-bit calculations to get an 8-bit answer, biological processors tie together many imprecise analog computational units with a combination of analog and digital interactions.
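
One way such pooling can work, sketched below with made-up numbers, is simple averaging: the pooled error shrinks roughly as one over the square root of the number of units. Real auditory circuits are certainly cleverer than a plain average; this shows only the statistical principle:

```python
import numpy as np

# Precision from imprecision: pooling many millisecond-accurate timing
# estimates yields an aggregate estimate with far smaller error.

rng = np.random.default_rng(1)
true_delay_us = 100.0        # true arrival-time difference, microseconds
jitter_us = 1000.0           # each neuron is only accurate to ~1 millisecond

for n_neurons in (1, 1_000, 1_000_000):
    estimates = true_delay_us + jitter_us * rng.standard_normal(n_neurons)
    pooled = estimates.mean()
    print(f"{n_neurons:>9} neurons: pooled estimate {pooled:8.1f} us "
          f"(error {abs(pooled - true_delay_us):6.1f} us)")
```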

One of the strengths of digital systems is that they tend to be robust, right down to the transistors that constitute them. We can't say that about biological components, but our ears, eyes, and everything else work perfectly well in all kinds of situations. Rather than all of the system's components being robust as they would be in a digital system, living things become more robust through a combination of adaptation and learning. For sensors such as the eye, robustness means constantly adapting using feedback from the incoming information. When a light brightens, for example, our pupils contract to let less light in. At the same time, the eye's photoreceptors can quickly recalibrate their own sensitivity to compensate for the brightness. Finally, the system that includes the brain and eyes can learn. For example, we learn to compensate for the way our heads move when we walk, by moving our eyes in the opposite direction, so the world isn't just a big blur.

The starting point for making low-power circuits that compute like your eyes and ears is the transistor itself. There was no need to come up with something new, as the metal oxide semiconductor (MOS) transistor--the kind found in most chips today--has an analog personality.

In normal use, MOS transistors are treated as simple switches that are either all on or all off. But in fact, the current passing through the transistor is actually a smooth but very steep exponential function of the control voltage. Even when the transistor is off, you see tiny amounts of what is called subthreshold current. And this tiny current can be controlled and used in computation.
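
The textbook form of that exponential relationship is I_D = I_0 exp(V_GS / (n V_T)). The parameter values in this sketch are typical assumed numbers, not measurements of any particular device:

```python
import numpy as np

# Standard subthreshold MOS model: drain current grows exponentially
# with gate-to-source voltage.

I0 = 1e-15        # assumed leakage prefactor, amperes
n = 1.5           # assumed subthreshold slope factor
VT = 0.0259       # thermal voltage kT/q at room temperature, volts

for vgs in (0.2, 0.3, 0.4):
    i_d = I0 * np.exp(vgs / (n * VT))
    print(f"V_GS = {vgs:.1f} V -> I_D = {i_d:.3e} A")

# The exponential is steep: a 0.1-V change in gate voltage scales the
# current by exp(0.1 / (1.5 * 0.0259)) ~ 13x -- which is why these
# circuits are so sensitive to voltage and temperature.
```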

Subthreshold circuits were pioneered largely by Professor Eric Vittoz of CSEM (the Swiss Center for Electronics and Microtechnology), in Neuchâtel, and are in widespread use in the watch and pacemaker industries, where saving power is paramount. The trouble with such circuits is their sensitivity: a small change in either voltage or temperature can drastically alter the current they transmit. Engineers solve the problem by carefully designing the circuits to regulate the currents going through the transistors.

Subthreshold transistors share an interesting characteristic with brain cells that makes them even more attractive for building neuromorphic chips. The relationship between subthreshold current and the voltage controlling it resembles the current-to-voltage relationships neurobiologists see in molecular structures on the surface of brain cells. These structures, called ion channels, are the main means by which brain cells communicate. Channels on one cell open in response to a voltage-controlled chemical signal from an adjacent cell, allowing ions to flow into or out of the cell. This flow ultimately changes the cell's voltage.

Using subthreshold circuits and the biological manner of computing efficiently and robustly, my laboratory has constructed a bionic ear to restore hearing in deaf people [again, see illustration, "Mimicking the Ear"]. To understand how the processor works, it helps to first understand how a real ear works.

Sound enters the ear and causes the eardrum to vibrate. The vibration is transferred by a set of tiny bones to the cochlea, a coiled, tapered tube divided by a membrane along its length into two main fluid-filled compartments. The vibration begins in the membrane at the wider end of the tube and travels toward the narrow end. Because of the stiffness of the cochlea's tapered membrane, high-frequency sounds resonate at the start of the tube and low-frequency ones resonate at its end. So, by purely mechanical means, the cochlea divides sound into its spectrum of component frequencies.
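
The exponential taper can be captured in a simple place-to-frequency map, of the same form as the well-known Greenwood map. The endpoint frequencies and 35-millimeter length below are rounded, assumed values:

```python
import numpy as np

# High frequencies resonate near the base (wide end) of the cochlea,
# low frequencies near the apex (narrow end), on an exponential scale.

base_freq_hz = 20_000.0     # assumed resonance at the wide (basal) end
apex_freq_hz = 20.0         # assumed resonance at the narrow (apical) end
length_mm = 35.0            # approximate uncoiled cochlear length

def resonant_frequency(x_mm: float) -> float:
    """Resonant frequency at distance x_mm from the base (exponential taper)."""
    decay = np.log(base_freq_hz / apex_freq_hz) / length_mm
    return base_freq_hz * np.exp(-decay * x_mm)

for x in (0.0, 10.0, 20.0, 35.0):
    print(f"{x:4.0f} mm from base: ~{resonant_frequency(x):8.0f} Hz")
```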

Sensory organs, called outer and inner hair cells, sway with the resonance. The outer hair cells amplify the vibration electromechanically. The inner hair cells, as a consequence of their motion, secrete chemicals that cause neighboring neurons to generate voltage spikes, which finally convey a filtered and compressed version of the sound to the brain. In a sense, the neurons are performing analog-to-digital conversion.

The hair cells and neurons of the ear also have a gain-control function. That is, even though the incoming sound varies by over 12 orders of magnitude--from a whisper to a jet engine--the rate of voltage spikes to the brain varies by only 3 or 4 orders of magnitude. It is, all together, a marvelous combination of fluid mechanics and neural electronics.
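
A logarithmic mapping shows how such compression can work. The spike-rate limits and intensity range in this sketch are assumed, illustrative values, not physiological measurements:

```python
import numpy as np

# Gain control as log compression: 12 orders of magnitude of sound
# intensity squeezed into ~3 orders of magnitude of spike rate.

def spike_rate(intensity: float, min_rate=1.0, max_rate=1000.0,
               min_intensity=1e-12, max_intensity=1.0) -> float:
    """Map intensity (W/m^2, spanning 12 decades) to a spike rate (3 decades)."""
    frac = np.log10(intensity / min_intensity) / np.log10(max_intensity / min_intensity)
    return min_rate * (max_rate / min_rate) ** frac

for label, i in [("rustling leaf", 1e-11), ("speech", 1e-6), ("jet engine", 1.0)]:
    print(f"{label:>13}: {i:.0e} W/m^2 -> ~{spike_rate(i):7.1f} spikes/s")
```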

Cochlear implants, or bionic ears, are not exact copies of the ear, but they do restore hearing in profoundly deaf people. Like the ear, they break sound into its component frequencies, then electrically stimulate the neurons in the cochlea that are specialized to pick up those frequencies. The responsibility for gain control is shifted from the hair cells and neurons to the device.

The analog bionic ear we've developed at MIT electronically mimics some aspects of how the ear processes sound and follows biology's general low-power strategy of delaying digitization until it's both necessary and energy efficient. First, a microphone and preamplifier turn sound into an analog electronic signal. That signal passes through an automatic gain-control circuit, which narrows the range of intensity. A 16-channel spectrum analyzer then parses the signal into frequency components. Each channel of the analyzer has three parts--a band-pass filter that lets only a predetermined chunk of frequencies through, an envelope detector circuit that calculates the signal energy at each frequency, and a converter that computes the logarithm of the energy and turns it into a digital value. The output bits are then used to control the intensity of the current passed into each of 16 electrodes implanted in the nerves of the cochlea.
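
For readers who want to see the signal chain end to end, here is a purely digital Python stand-in for a single channel; it is not the MIT analog circuitry, and the sample rate, band edges, test tones, and decibel scaling are all assumptions for illustration:

```python
import numpy as np

# One channel of a bionic-ear-style chain: band-pass filter,
# envelope detector, then a logarithm quantized to a digital code.

fs = 16_000                               # assumed sample rate, Hz
t = np.arange(fs) / fs                    # one second of audio
sound = np.sin(2 * np.pi * 500 * t) + 0.3 * np.sin(2 * np.pi * 3000 * t)

# 1. Band-pass: keep only this channel's chunk of spectrum (400-600 Hz here).
spectrum = np.fft.rfft(sound)
freqs = np.fft.rfftfreq(sound.size, 1 / fs)
spectrum[(freqs < 400) | (freqs > 600)] = 0
band = np.fft.irfft(spectrum, n=sound.size)

# 2. Envelope detector: rectify, then smooth with a moving average.
kernel = np.ones(200) / 200
envelope = np.convolve(np.abs(band), kernel, mode="same")

# 3. Log converter: compress, then quantize to 7 bits (0..127),
#    matching the modest precision the text says suffices.
energy_db = 20 * np.log10(envelope.mean() + 1e-12)
code = int(np.clip((energy_db + 80) / 80 * 127, 0, 127))
print(f"channel energy {energy_db:.1f} dB -> 7-bit code {code}")
```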

All of that processing draws a frugal 250 microwatts of power, leaving 750 microwatts of the total 1-milliwatt power budget for stimulating the nerves. The processing power is one-twentieth of that in today's cochlear implants.

The design compensates for analog processing's twin problems of precision and bandwidth by compressing the sound signal so much that the analog-to-digital converters need only handle data of a modest 7-bit precision and at most a few kilohertz of bandwidth.

Making the circuit robust required several innovations in the chip's building-block analog circuits. These allowed the circuits to operate over a wide range of frequencies at very low power while remaining immune to the effects of temperature, variations among individual transistors, and power-supply noise. Again, as in the biological model, the chip tolerates variations in its environment because the overall system compensates for the sensitivity of its many parts, constantly recalibrating them with feed-forward and feedback signals.

Our bionic ear chip uses several circuits we built into an earlier device, a simplified electronic model of the cochlea made of silicon. The silicon cochlea has also led to the design of a new algorithm that can improve the performance of ordinary cochlear implants and other speech processors.

The algorithm mimics one of the cochlea's more interesting behaviors: strong signals in one frequency band tend to suppress weaker signals in neighboring bands. Thus, your inner ear quashes interfering noise before it reaches your brain. However, as with the real cochlea, the algorithm ensures that weak signals in frequency bands that are distant from strong signals are still audible. Other researchers have programmed our algorithm into the cochlear-implant processors of a few deaf people and have shown that it helped improve the accuracy of their understanding of speech in the presence of background noise. It also improved speech recognition in systems that are meant to help users understand what's said in cars and other noisy places.
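
The published algorithm is detailed in the paper cited under "To Probe Further"; the sketch below is merely one plausible form of such lateral suppression, with made-up weights, not the actual implementation:

```python
import numpy as np

# Lateral suppression: each band's output is reduced by strong activity
# in nearby bands, with the suppression fading with distance.

def suppress(band_energies: np.ndarray, strength=0.2, reach=1.0) -> np.ndarray:
    """Attenuate each band by the distance-weighted energy of its neighbors."""
    n = band_energies.size
    out = np.empty(n)
    for i in range(n):
        dist = np.abs(np.arange(n) - i).astype(float)
        weights = np.exp(-dist / reach)
        weights[i] = 0.0                       # a band does not suppress itself
        inhibition = strength * np.dot(weights, band_energies)
        out[i] = max(band_energies[i] - inhibition, 0.0)
    return out

energies = np.array([0.1, 5.0, 0.3, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1])
print(np.round(suppress(energies), 2))
# Bands next to the strong one are quashed; the strong band itself and
# weak bands far from it survive, as the text describes.
```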

Taking another lesson from the cochlea, we are designing an ultra-wide-band spectrum analyzer that can simultaneously tune in radio signals all the way from the FM radio band (around 100 MHz) to the Wi-Fi bands (below 10 GHz). In designing the device, we're trying to do with silicon what the cochlea does with its exponentially tapered membrane--namely, perform spectral analysis over a hundredfold range of frequencies significantly faster than a conventional spectrum analyzer can.
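
The channel spacing that follows from this idea is exponential, like the cochlea's own frequency map. The 16-channel count below is an assumed figure for illustration:

```python
import numpy as np

# Cochlea-inspired coverage: exponentially spaced channel center
# frequencies spanning the hundredfold range from 100 MHz to 10 GHz.

n_channels = 16
centers_hz = np.geomspace(100e6, 10e9, n_channels)
for i, f in enumerate(centers_hz):
    print(f"channel {i:2d}: {f / 1e9:7.3f} GHz")
```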

While we've taken the lesson of low-power analog processing prior to digital conversion to heart, there are three fundamental things that we still need to discover about biological systems to make engineering marvels that rival them. First, we need a better understanding of how they perform efficient, reliable computations with noisy, unreliable devices in large-scale systems.

Second, we want to learn how biological systems operate across many timescales and length scales. The brain, for example, is a network of neurons with positive and negative feedback loops that operate on timescales from milliseconds to days and over connection distances ranging from micrometers to centimeters. An equivalent system built with today's best engineering strategies for complex system design would likely be highly unstable.

Finally, we need to replicate the ability of a cell to process a great many converging inputs and to produce output that influences a great many other cells. Advanced digital and analog architectures today are just starting to copy the massively parallel architectures of neurobiology. But the parallelism we can achieve today is limited by the paltry number of interconnections we can fit on a chip.

Neuromorphic researchers have so far picked biology's low-hanging fruit. There are many systems vastly more complex than ears and eyes that do amazing computation on very little power. In fact, the network of chemical interactions in just a single human cell forms an awe-inspiring computer that regulates the cell's growth, structure, repair, and reproduction. The organization of such a system may one day serve as inspiration to create complex networks of computers that perform tasks we cannot even conceive of yet. As Richard Feynman, a great physicist of the previous century, once said, "The imagination of nature is far, far greater than the imagination of man."

About the Author

Rahul Sarpeshkar is an associate professor in electrical engineering at the Massachusetts Institute of Technology, where he heads the Analog VLSI and Biological Systems lab.

To Probe Further

For an in-depth description of the bionic ear, see "An Ultra-Low-Power Programmable Analog Bionic Ear Processor," by Rahul Sarpeshkar et al., IEEE Transactions on Biomedical Engineering, April 2005, pp. 711-727.

Carver Mead, neuromorphic engineering's founding father, described its goals and approach in "Neuromorphic Electronic Systems," Proceedings of the IEEE, October 1990, pp. 1629-1636. More recently, Proceedings devoted a July 2001 article to neuromorphic engineering.
