21 July 2009—Our brains have a remarkable ability to assimilate motor skills that allow us to perform a host of tasks almost automatically—driving a car, riding a bicycle, typing on a keyboard. Now add another to the list: operating a computer using only thoughts.
Researchers at the University of California, Berkeley, have demonstrated how rhesus monkeys with electrodes implanted in their brains used their thoughts to control a computer cursor. Once the animals had mastered the task, they could repeat it proficiently day after day. The ability to repeat such feats is unprecedented in the field of neuroprosthetics. It reflects a major finding by the scientists: A monkey’s brain is able to develop a motor memory for controlling a virtual device in a manner similar to the way it creates such a memory for the animal’s body.
The new study, which should apply to humans, provides hope that physically disabled people may one day be able to operate advanced prosthetics in a natural, effortless way. Previous research in brain-machine interfaces, or BMIs, had already shown that monkeys and humans could use thought to control robots and computers in real time. But subjects weren’t able to retain the skills from one session to another, and the BMI system had to be recalibrated every session. In this new study, monkey do, monkey won’t forget.
"Every day we just put the monkeys to do the task, and they immediately recalled how to control the device," says Jose Carmena, an IEEE senior member and professor of electrical engineering, cognitive science, and neuroscience who led the study. "It was like 'plug and play.'"
Carmena and Karunesh Ganguly, a postdoc in Carmena’s lab, describe their work in a paper today in PLoS Biology.
The findings may "change the whole way that people have thought about how to approach brain-machine interfaces," says Lena Ting, a professor of biomedical engineering at Emory University and the Georgia Institute of Technology, in Atlanta. Previous research, she explains, tried to use the parts of the brain that operate real limbs to control an artificial one. The Berkeley study suggests that an artificial arm may not need to rely on brain signals related to the natural arm; the brain can assimilate the artificial device as if it were a new part of the body.
Krishna Shenoy, head of the Neural Prosthetic Systems Laboratory, at Stanford University, says the study is "beautiful," adding that the "day-over-day learning is impressive and has never before been demonstrated so clearly."
At the heart of the findings is the fact that the researchers used the same set of neurons throughout the three-week-long study. Keeping track of the same neurons is difficult, and previous experiments had relied on varying groups of neurons from day to day.
The Berkeley researchers implanted arrays of microelectrodes in the primary motor cortex, about 2 to 3 millimeters deep into the brain, tapping 75 to 100 neurons. The procedure was similar to that of other groups. The difference was that here the scientists carefully monitored the activity of these neurons using software that analyzed the waveform and timing of the signals. When they spotted a subset of 10 to 40 neurons that didn't seem to change from day to day, they'd start the experiment; several times, one or more neurons would stop firing, and they'd have to restart from scratch. But the persistence paid off.
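The core of that monitoring step, comparing each unit's spike waveform across days to decide whether it is the same neuron, can be sketched roughly as follows. This is a simplified illustration, not the study's actual software; the function name, data layout, and correlation threshold are all assumptions for the example.

```python
import numpy as np

def stable_units(waveforms_day1, waveforms_day2, corr_threshold=0.95):
    """Return indices of units whose mean spike waveform is nearly
    unchanged between two recording days.

    Each argument is a dict mapping a unit index to a 1-D array: that
    unit's mean spike waveform on the given day (same sample length
    on both days).
    """
    stable = []
    for unit, w1 in waveforms_day1.items():
        w2 = waveforms_day2.get(unit)
        if w2 is None:
            continue  # unit lost between sessions
        # Pearson correlation between the two mean waveforms; a high
        # value suggests the electrode is still seeing the same cell.
        r = np.corrcoef(w1, w2)[0, 1]
        if r >= corr_threshold:
            stable.append(unit)
    return stable
```

In practice such checks also compare firing statistics and inter-spike intervals, since two different cells can have similar waveform shapes; the waveform correlation here stands in for that richer comparison.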
Monitoring the neurons, the scientists placed the monkey’s right arm inside a robotic exoskeleton that kept track of its movement. On a screen, the monkey saw a cursor whose position corresponded to the location of its hand. The task consisted of moving the cursor to the center of the screen, waiting for a signal, and then dragging the cursor onto one of eight targets in the periphery. Correct maneuvers were rewarded with sips of fruit juice. While the animal played, the researchers recorded two data sets—the brain signals and corresponding cursor positions.
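Those two paired data sets are what let the researchers build a decoder: a mapping from firing rates to cursor position. As a minimal illustration of the idea, the sketch below fits an ordinary least-squares linear decoder to synthetic data; the dimensions are invented, the data are simulated, and the study's actual decoding algorithm was more sophisticated than plain least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 500 time bins of firing rates from
# 40 recorded units, paired with 2-D cursor positions (x, y).
n_bins, n_units = 500, 40
rates = rng.poisson(lam=5.0, size=(n_bins, n_units)).astype(float)

# Simulate the cursor position as a noisy linear readout of the rates.
true_weights = rng.normal(size=(n_units, 2))
cursor = rates @ true_weights + rng.normal(scale=0.1, size=(n_bins, 2))

# Fit a linear decoder: the least-squares solution to rates @ W ~ cursor.
W, *_ = np.linalg.lstsq(rates, cursor, rcond=None)

# Decoded cursor positions from neural activity alone.
predicted = rates @ W
```

Once such a mapping is fixed, the arm can be removed from the loop: the decoder's output drives the cursor directly, which is the "brain control" condition the study kept stable across weeks.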