
Mastering the Brain-Computer Interface

At Johns Hopkins University, engineers are learning to translate between the neural signals of the brain and the machine language of a prosthetic arm


The first human clinical trials of a brain implant intended to control a prosthetic arm are slated for 2009 at the Johns Hopkins University Biomedical Instrumentation and Neuroengineering Laboratory, in Baltimore. In one brightly lit room, a young volunteer named Rob Rasmussen sits with his head strapped into a tight-fitting cap, inside which 64 electrodes monitor his brain waves. The electrodes detect the electrical activity caused by neurons firing inside the motor areas of his brain and send the raw impulses to a nearby instrument to be digitized. The digitized signals are translated into real-time traces that scrawl across two wide-screen monitors. One of the monitors shows the 64 simultaneous channels of brain-wave recordings. The other, larger monitor is devoted to two entirely different traces--those of the mu bands. These are the keys to controlling a prosthetic arm with the mind.

Mu bands are an abstract feature of the brain waves picked up by the electrodes: they provide a broad reflection of what's happening in the motor areas of the brain. In this case, they characterize what Rasmussen is thinking about doing with his hands. The mu bands maintain a regular rhythm that desynchronizes in the left side of the brain when you wiggle a finger or arch your foot on the right side of your body (and vice versa). That rhythm also responds the same way--and this is key--to merely thinking about doing those things. So to disturb the waves of his mu bands, Rasmussen thinks about moving his hands. To let the waves return to their natural rhythm, he stops thinking about moving his hands. His actual hands are resting lightly on the arms of his chair. They don't even twitch.
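To make the idea of desynchronization concrete, here is a minimal sketch of how mu-band power might be estimated from a single EEG channel and compared against a resting baseline. The sampling rate, band edges, and desynchronization ratio are illustrative assumptions, not parameters reported by the Hopkins team.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def mu_band_power(eeg, fs=256.0, band=(8.0, 13.0)):
    """Estimate mu-band power for one EEG channel.

    eeg : 1-D array of voltage samples from an electrode over motor cortex
    fs  : sampling rate in Hz (256 Hz is an assumed, typical value)
    """
    # Band-pass the signal to the mu rhythm (roughly 8-13 Hz).
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    mu = filtfilt(b, a, eeg)
    # Power is the mean squared amplitude of the filtered signal.
    return np.mean(mu ** 2)

def is_desynchronized(eeg_window, baseline_power, fs=256.0, ratio=0.5):
    # Imagined movement shows up as a drop in mu power relative to a
    # resting baseline; the 0.5 ratio is an arbitrary illustrative threshold.
    return mu_band_power(eeg_window, fs) < ratio * baseline_power
```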

For all their vagueness and abstraction, the mu bands are a tremendously useful tool. They change predictably and reliably, and a volunteer can be trained to manipulate them at will. Rasmussen has been training for almost four years to control his mu bands with the help of a virtual environment, which provides a window into his mind: he can see his mu bands' rhythm on the screen in front of him as two perpetually cresting waves--one for his left hand, the other for his right. When he thinks about moving his hands, he desynchronizes the mu bands, and the waves on the screen flatten. When he stops thinking about moving his hands, the mu bands' rhythm is synchronized again, and the waves on the screen rise. Next to the graphs, the animated figure of a man wearing blue shorts opens and closes its hand in concert with Rasmussen's intentions. A mechanical hand on the desk nearby opens and closes noisily in tandem with the hand on the screen. This combination of hardware and software, which amplifies and processes Rasmussen's thoughts in real time and uses them to control both the virtual and the mechanical hand, is called a brain-computer interface.
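The closed loop described above (acquire a short window of EEG, score the mu rhythm, update the on-screen wave and the hand) could be sketched roughly as follows. The acquisition and output callbacks are placeholders, and the logic reuses the mu_band_power helper sketched earlier; the article does not describe the real system's software.

```python
import time

def bci_feedback_loop(acquire_window, baseline_power, update_display,
                      set_hand_state, fs=256.0, threshold=0.5):
    """Toy feedback loop: read a window of EEG, score the mu band, and
    drive both the on-screen wave and the virtual or mechanical hand.

    acquire_window, update_display, and set_hand_state stand in for
    whatever acquisition and output hardware a real BCI would use.
    """
    while True:
        window = acquire_window()                     # ~1 s of samples
        power = mu_band_power(window, fs)
        desynchronized = power < threshold * baseline_power
        update_display(power)                         # the "cresting wave"
        # Imagined movement (desynchronized mu rhythm) opens the hand;
        # a restored rhythm closes it, as in the training sessions described.
        set_hand_state("open" if desynchronized else "closed")
        time.sleep(0.1)
```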

The brain-computer interface (BCI) is the key to controlling a mechanical arm being developed by the Defense Department in a kind of “Manhattan Project” to create the next generation of neurally controlled prostheses. The Revolutionizing Prosthetics 2009 effort, funded by the Defense Advanced Research Projects Agency, spans the United States, Canada, and Europe, including brain-penetrating electrodes developed at the University of Utah, nerve surgery at the Rehabilitation Institute of Chicago, and muscle-injectable electrodes produced at Sigenics, also in Chicago.

But it is in Nitish Thakor's Baltimore lab at Johns Hopkins University that researchers are working on arguably the most important piece of this neuroprosthetics puzzle: the bridge from the mind to the mechanical arm. For their part of the DARPA program, Thakor's team is looking at different ways to interface with an artificial limb, including invasive brain-penetrating electrodes. Researchers agree that the degree of control amputees can expect to have depends on the invasiveness of the methods they are willing to tolerate. Right now the only way into the brain is to literally go inside it with penetrating electrodes. But brain surgery is risky, and within a year of electrode implantation, the brain's defensive mechanisms will kick into gear, and a team of protective astrocytes and glial cells will seal off the foreign object inside a thick white capsule. Not a bad strategy for the brain, but it completely blocks access to the neuronal spikes that could control a prosthetic limb. So in addition to their DARPA-funded work with invasive electrodes, Thakor's team is also looking independently at tapping surface electrical impulses to control mechanical devices.

What all brain-computer interfaces have in common, regardless of their level of invasiveness, are the algorithms that translate between brain and microchip, turning analog intentions into binary machine language. To get the virtual man's hand to open, for example, Rasmussen thinks of playing the piano. But even a broad signal, such as what appears when Rasmussen is thinking of moving his arm, is extremely difficult to derive. Think of a surface EEG's electrodes as 64 microphones in a symphony hall, recording an enormous 700-piece orchestra that's playing the song of the brain. The movement of your hands is equivalent to a part played by two of the violinists. Distinguishing the neural signals that accompany the intention to move your arm from hundreds of other, simultaneous neural functions is like trying to isolate those violinists from the rest of the orchestra. The researchers aren't interested in the entire song; in fact, they're not even interested in what the violinists are playing. They're just looking for the violinists' pitch, which represents the mu bands Rasmussen can control.
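In signal-processing terms, finding the violinists' pitch amounts to throwing away nearly everything in the 64 channels and keeping only the narrow-band mu power over the motor areas. A hedged sketch, again reusing the mu_band_power helper above; the electrode indices are hypothetical, since the article does not give the montage.

```python
import numpy as np

# Hypothetical indices of electrodes over the left and right motor cortex
# (for example, around C3 and C4 in a 10-20 layout); the real montage may differ.
LEFT_MOTOR = [28, 29, 36]
RIGHT_MOTOR = [32, 33, 40]

def mu_features(eeg_block, fs=256.0):
    """Reduce a 64-channel EEG window (channels x samples) to two numbers:
    mu-band power over the left and right motor areas."""
    left = np.mean([mu_band_power(eeg_block[ch], fs) for ch in LEFT_MOTOR])
    right = np.mean([mu_band_power(eeg_block[ch], fs) for ch in RIGHT_MOTOR])
    return left, right
```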

But Rasmussen's not thinking of some vague, abstract concept, says biomedical engineering graduate student Soumyadipta Acharya. “He's actually thinking of moving his hand. It's quite intuitive.” The gentle slope on the computer screen flexes tentatively down until Rasmussen manages to hold it in a quivering flat line. The animated man in blue shorts opens his hand, and the mechanical hand on the desk springs open.

When Acharya instructs him to close the hand, Rasmussen must now raise his mu bands. To do that, he must stop thinking of playing the piano. The fluorescent-lit room is silent except for Acharya's rhythmic commands: “Open...close...open...close.” Rasmussen, silent and concentrating, watches his mu band until it slowly rises, only to flatten defiantly. He is starting to look tired; operating the BCI is strenuous. The hand stays open. “Close,” Acharya reminds him. Nothing happens. “Close the hand,” Acharya prods. “Get that mu band up.” Rasmussen stares at the screen again and then at the ceiling, and finally the band rises into a peak. The hand closes.

To Probe Further

A detailed look at the brain-machine interface now being developed at Johns Hopkins University: /video?id=360
