What will it take for brain implants to become standard-issue tools for people who are paralyzed? When will they be able to use neural commands to type words or drive motorized wheelchairs?
Research published today in the journal Science Translational Medicine might point the way. Scientists and engineers who are part of the BrainGate project reveal that they have designed a better decoder to make sense of electrical signals from the brain. Their crucial advance: software that compensates for the irregular nature of those neural signals.
The team is working to make their neural implant not only a functional mind-reading device, but also a practical one that paralyzed people could use in their homes. The system’s electrodes are implanted in the motor cortex, where they pick up electric signals from neurons involved in issuing movement commands to the body. In experiments over the last decade, the project’s volunteers have imagined moving their paralyzed arms to control external devices like a robotic arm and a computer cursor.
In the newest set of experiments, the researchers showed off their improved decoding software that turns the brain’s electrical signals into commands. Previously, researchers had to stop their experiments frequently to recalibrate the software, because the electrical signals that the electrodes pick up can vary dramatically over the course of an hours-long session.
In prior sessions using the old software, the researchers would spend the first 10 to 30 minutes calibrating the system, essentially teaching it which neural signals translated into which movement commands.
“Then we’d let the participant use it for something practical for 30 minutes or maybe an hour, but then the signal would degrade,” explains Beata Jarosiewicz, lead author of the new paper and an assistant professor at Brown University.
The researchers would then have to make a decision: Should they spend another 10 to 30 minutes recalibrating the system, or call a halt? For BrainGate to become a practical home-use technology, clearly it can’t require users to stop what they’re doing every half hour for recalibration, says Jarosiewicz.
The electrical signals change during a session for two main reasons. Here’s the first reason in highly technical terms: “The brain is kinda squishy,” Jarosiewicz says. Neural tissue shifts slightly when people move their bodies and even as their hearts beat, so stiff electrodes implanted in the tissue come into contact with different brain cells, which produce different electrical signals. “Even movements on the order of a few microns is enough to change the signal that we’re recording,” says Jarosiewicz.
Signal instability also stems from the environment in which recording takes place. The BrainGate team often conducts experiments in participants’ homes to see how their gear functions in real-world settings, so the system can pick up electromagnetic noise from nearby electronics. “Someone might turn on the vacuum cleaner in the other room,” Jarosiewicz says. Suddenly, a signal that used to indicate a certain cursor movement could be obscured.
The primary trick behind the improved decoding software: Each time the user pauses—say at the end of a sentence—the system recalibrates itself, matching the words and letters selected in the sentence to the set of neural recordings from that time span. With this technique, called “retrospective target inference,” it’s constantly relearning which signals translate into which commands. As the signals change, it adjusts accordingly. The video below gives a brief explanation and demonstration.
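To make the idea concrete, here is a minimal toy sketch of that self-recalibration loop. It is not the BrainGate team’s actual decoder: the class name, the linear model, and the least-squares refit are all illustrative assumptions. The only point it demonstrates is the pattern described above: buffer neural samples during use, then at a natural pause refit the signal-to-command mapping against the retrospectively inferred targets (the keys the user actually selected).

```python
import numpy as np


class RetrospectiveDecoder:
    """Toy linear decoder illustrating retrospective recalibration.

    Hypothetical simplification: the real system is far more
    sophisticated, but the update pattern is the same -- decode
    continuously, buffer samples, refit at natural pauses.
    """

    def __init__(self, n_features, rng=None):
        self.rng = rng or np.random.default_rng(0)
        # Rough initial calibration (stands in for the start-of-session step).
        self.W = self.rng.normal(size=(n_features, 2)) * 0.1
        self._signals = []  # neural feature vectors since the last pause
        self._targets = []  # intended directions inferred for those samples

    def decode(self, features):
        """Map a neural feature vector to a 2-D cursor velocity."""
        return features @ self.W

    def observe(self, features, inferred_direction):
        """Buffer one sample; its intended direction is inferred later
        from which on-screen key the user ultimately selected."""
        self._signals.append(features)
        self._targets.append(inferred_direction)

    def recalibrate(self):
        """At a pause (e.g. end of a sentence), refit the mapping by
        least squares against the retrospectively inferred targets,
        so the decoder tracks drifting neural signals."""
        X = np.asarray(self._signals)
        Y = np.asarray(self._targets)
        self.W, *_ = np.linalg.lstsq(X, Y, rcond=None)
        self._signals.clear()
        self._targets.clear()
```

Because the refit happens only at pauses the user takes anyway, the decoder keeps adapting to signal drift without interrupting typing, which is the practical gain the paper reports.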
One participant with Lou Gehrig’s disease used this improved decoder with the typing interface, and showed that it provided good control over the course of six sessions spaced out over 42 days. Jarosiewicz says the next step is to use the decoder not just to control a cursor for the typing program, but to control a computer mouse. With that ability, users could control just about anything that’s connected to the internet. They could find autonomy thanks to the Internet of Things.
There’s still one big stumbling block before someone who’s locked-in can use the BrainGate system to communicate freely or operate robotic assistants. The current implant must be physically connected via cables to a computer, so a technician has to help the user get jacked in. But Jarosiewicz notes that another neural engineer at Brown is now working on a wireless system. “We want people to have the system available 24-7,” she says.
Senior Editor Eliza Strickland joined IEEE Spectrum in March 2011 and was initially assigned the Asia beat. She got down to business several days later when the Fukushima Daiichi nuclear disaster began. Strickland shared a Neal Award for news coverage of that catastrophe and wrote the definitive account of the accident’s first 24 hours. She next moved to the biomedical engineering beat and managed Spectrum’s 2015 special report, “Hacking the Human OS.” That report spawned the Human OS blog about emerging technologies that are enabling a more precise and personalized kind of medicine. The blog reports on wearable sensors, big-data analytics, and neural implants that may turn us all into cyborgs. Over the years, Strickland watched as artificial intelligence (AI) technology made inroads into the biomedical space, reporting on crossovers between AI and neuroscience research and IBM Watson’s ill-fated efforts in AI health care. These days she oversees Spectrum’s coverage of all things AI. Strickland has reported on science and technology for nearly 20 years, writing for such publications as Discover, Nautilus, Sierra, Foreign Policy, and Wired. She holds a master’s degree in journalism from Columbia University.