This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.
Brain-chip technology is advancing quickly. In one of the latest developments, researchers have designed a new chip that monitors larger groups of neurons and uses less power to detect when a user wants to initiate a given behavior—for example, reaching for an object. The new approach, if it translates to humans, could give users more autonomy in initiating movement.
Implanted systems known as intracortical brain-computer interfaces (iBCIs) are a game changer for many people with paralysis, providing them with a means to regain some movement control. iBCIs work by inserting electrode arrays into the brain to record neural activity. Because our neurons naturally communicate with one another using electrical pulses, the implanted electrodes can pick up these signals.
“The detected [signals] are then used by BCI applications to interpret neural activity and translate it into commands, such as controlling a computer cursor or a robotic limb,” explains Daniel Valencia, a researcher at San Diego State University.
Current iBCIs monitor individual neurons in the brain. However, doing so continuously is energy intensive, and it can be difficult to discern whether a signal is coming from the neuron being monitored or from a neighboring neuron with a similar firing pattern. Analyzing the data, sifting through all the “background noise,” and pinpointing true neural firings consumes a lot of power. Because of this, most current iBCIs are switched on manually only during predefined periods, such as clinical or laboratory sessions.
Valencia and his colleagues were interested in creating a different kind of system that passively monitors brain activity and automatically switches on when needed. Instead of monitoring individual neurons, their proposed chip monitors the general activity of a cohort of neurons, or their local field potentials (LFPs).
An Efficient Solution
This approach involves a simpler process: detecting the frequency at which a collection of neurons in a given region of the brain is firing. When certain thresholds of neural activity are crossed, the brain chip is switched on. For example, when people are sleeping, the LFPs exhibit increased activity in the 30-to-90-hertz range, but when people are preparing to move, there is an increase in activity in the 15-to-35-Hz range. The chip proposed by Valencia and his colleagues would therefore activate only when a user’s brain activity indicates an intent to move.
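The gating idea described above—compute the power of the LFP signal in the movement-related frequency band and wake the decoder when it crosses a threshold—can be sketched in a few lines of Python. This is a minimal illustration, not the researchers’ implementation: the `band_power` and `movement_intent` functions, the sampling rate, and the threshold value are all assumptions made here for demonstration.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average spectral power of `signal` between f_lo and f_hi (Hz),
    computed with a naive DFT (fine for short LFP windows)."""
    n = len(signal)
    total, count = 0.0, 0
    for k in range(1, n // 2):
        f = k * fs / n  # frequency of DFT bin k
        if f_lo <= f <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(-x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            total += (re * re + im * im) / n
            count += 1
    return total / count if count else 0.0

def movement_intent(window, fs, threshold=1.0):
    """Gate: report True (wake the decoder) when power in the
    movement-preparation band (15-35 Hz) exceeds a threshold.
    The threshold here is arbitrary; a real system would calibrate it."""
    return band_power(window, fs, 15.0, 35.0) > threshold
```

In a real device this check would run continuously on low-power circuitry, with the full spike-sorting decoder powered up only after `movement_intent` fires.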
In a study published in the February print issue of IEEE Transactions on Biomedical Circuits and Systems, the researchers tested their new LFP approach using previously recorded datasets of neural activity from animals performing movement tasks. They used the data and modeling to determine how much energy is required in their LFP approach compared to conventional brain chips that monitor individual neurons.
The results show that the two approaches are comparable at determining a user’s intentions—conventional brain chips slightly outperformed the LFP approach—but the LFP approach uses significantly less power, which Valencia notes is a key advantage. “Additionally, the recording circuitry needed for LFPs is much simpler compared to [conventional] methods, which reduces hardware complexity,” he says. For instance, brain chips based on LFPs may not require deeply penetrating microelectrodes, significantly reducing the chance of tissue scarring in the brain and potentially increasing the longevity of the device.
Importantly, this new proposed system would allow users to complete tasks autonomously and more easily, without having to manually activate their brain chip. Many scientists in the field of iBCI design are interested in developing these more advanced, “self-paced” iBCIs. “Our work is a step toward developing these systems, allowing users to control their engagement independently,” says Amir Alimohammad, a professor at San Diego State University who was involved in the study.
Alimohammad adds that his team is currently working on integrating their LFP approach that predicts a user’s intentions within a broader iBCI system that also uses data from single neurons firing. Whereas the LFP data could be used to activate the system, detailed data from individual neurons could be used to execute more precise motor control, he says.
Michelle Hampson is a freelance writer based in Halifax. She frequently contributes to Spectrum's Journal Watch coverage, which highlights newsworthy studies published in IEEE journals.