What will it take for brain implants to become standard-issue tools for people who are paralyzed? When will they be able to use neural commands to type words or drive motorized wheelchairs?
Research published today in the journal Science Translational Medicine might point the way. Scientists and engineers who are part of the BrainGate project reveal that they have designed a better decoder to make sense of electric signals from the brain. Their crucial advance: software that compensates for the irregular nature of those neural signals.
The team is working to make their neural implant not only a functional mind-reading device, but also a practical one that paralyzed people could use in their homes. The system’s electrodes are implanted in the motor cortex, where they pick up electric signals from neurons involved in issuing movement commands to the body. In experiments over the last decade, the project’s volunteers have imagined moving their paralyzed arms to control external devices like a robotic arm and a computer cursor.
In the newest set of experiments, the researchers showed off their improved decoding software that turns the brain’s electrical signals into commands. Previously, researchers had to stop their experiments frequently to recalibrate the software, because the electrical signals that the electrodes pick up can vary dramatically over the course of an hours-long session.
In prior sessions using the old software, the researchers would spend the first 10 to 30 minutes calibrating the system, essentially teaching it which neural signals translated into which movement commands.
“Then we’d let the participant use it for something practical for 30 minutes or maybe an hour, but then the signal would degrade,” explains Beata Jarosiewicz, lead author of the new paper and an assistant professor at Brown University.
The researchers would then have to make a decision: Should they spend another 10 to 30 minutes recalibrating the system, or call a halt? For BrainGate to become a practical home-use technology, clearly it can’t require users to stop what they’re doing every half hour for recalibration, says Jarosiewicz.
The electrical signals change during a session for two main reasons. Here’s the first reason in highly technical terms: “The brain is kinda squishy,” Jarosiewicz says. Neural tissue shifts slightly when people move their bodies and even as their hearts beat, so stiff electrodes implanted in the tissue come into contact with different brain cells, which produce different electrical signals. “Even movements on the order of a few microns are enough to change the signal that we’re recording,” says Jarosiewicz.
Signal instability also stems from the environment in which recording takes place. The BrainGate team often conducts experiments in participants’ homes to see how their gear functions in real-world settings, so the system can pick up electromagnetic noise from nearby electronics. “Someone might turn on the vacuum cleaner in the other room,” Jarosiewicz says. Suddenly, a signal that used to indicate a certain cursor movement could be obscured.
The primary trick behind the improved decoding software: Each time the user pauses—say at the end of a sentence—the system recalibrates itself, matching the words and letters selected in the sentence to the set of neural recordings from that time span. With this technique, called “retrospective target inference,” it’s constantly relearning which signals translate into which commands. As the signals change, it adjusts accordingly.
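The core idea can be sketched in a few lines of code. This is a simplified illustration, not the BrainGate team’s actual implementation: it assumes that once the user has finished a sentence, every time step can be relabeled with the key the user ultimately selected, and a linear decoder can then be refit against those inferred targets. The function names and data shapes here are hypothetical.

```python
import numpy as np

def retrospective_recalibrate(neural_features, selected_targets, cursor_positions):
    """Refit a linear decoder using targets inferred after the fact.

    neural_features:  (T, n_channels) neural signal features recorded while typing
    selected_targets: (T, 2) screen positions of the keys the user eventually chose
    cursor_positions: (T, 2) cursor position at each time step

    Returns decoder weights W of shape (n_channels, 2).
    """
    # Retrospective target inference (simplified): assume that at each time
    # step the user intended to move toward the key they ultimately selected.
    intent = selected_targets - cursor_positions
    norms = np.linalg.norm(intent, axis=1, keepdims=True)
    intent = intent / np.where(norms == 0, 1.0, norms)  # unit intended directions

    # Least-squares refit: map neural features to the inferred movement intent.
    # Running this at every pause lets the decoder track drifting signals.
    W, *_ = np.linalg.lstsq(neural_features, intent, rcond=None)
    return W

def decode(W, features):
    """Predicted cursor-movement command for new neural features."""
    return features @ W
```

Because the relabeling uses only what the user already typed, this kind of refit can run silently at every pause, with no explicit calibration block interrupting the session.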
One participant with Lou Gehrig’s disease used this improved decoder with the typing interface, and showed that it provided good control over the course of six sessions spaced out over 42 days. Jarosiewicz says the next step is to use the decoder not just to control a cursor for the typing program, but to control a computer mouse. With that ability, users could control just about anything that’s connected to the internet. They could gain autonomy thanks to the Internet of Things.
There’s still one big stumbling block before someone who’s locked-in can use the BrainGate system to communicate freely or operate robotic assistants. The current implant must be physically connected via cables to a computer, so a technician has to help the user get jacked in. But Jarosiewicz notes that another neural engineer at Brown is now working on a wireless system. “We want people to have the system available 24-7,” she says.
Eliza Strickland is a senior editor at IEEE Spectrum, where she covers AI, biomedical engineering, and other topics. She holds a master’s degree in journalism from Columbia University.