
Enhanced Imagination Drives Brain-Computer Interface

Brain activity indicates that humans quickly engage and adapt to BCIs by imagining the movements of the devices they control.


It's been clear since brain-computer interfaces were first developed that customizing these devices would require learning on the part of both the machine and the human. New research in the Proceedings of the National Academy of Sciences provides evidence that humans adapt to BCIs quickly.

A team of neurologists and computer scientists at the University of Washington recruited epilepsy patients awaiting surgery and recorded their brain activity with electrocorticography (electrodes placed on the surface of the brain) before and after the patients manipulated a simple BCI. You can find the full article linked here, alongside the press release.

First, here's what they did. They recorded during three conditions: when patients imagined moving their hand, when they actually moved it, and when they moved a computer cursor by manipulating a BCI. The activity during the imagined task mapped roughly onto the recordings from the actual movement, but was weaker. When the patients were hooked up to the BCI, the pattern was again similar, but the signal was much stronger than in either of the other recordings.
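The kind of comparison described above can be sketched in a few lines of Python. This is a purely hypothetical illustration with synthetic signals — the sampling rate, frequency band, and amplitudes are assumptions for demonstration, not values from the paper:

```python
# Hypothetical sketch: estimate spectral power in one frequency band for
# each recording condition and compare magnitudes. All signals here are
# synthetic white noise whose amplitudes are chosen to mimic the reported
# ordering (imagined < actual < BCI-driven); nothing is real ECoG data.
import numpy as np
from scipy.signal import welch

fs = 1000  # assumed sampling rate in Hz
rng = np.random.default_rng(0)

def band_power(x, fs, lo, hi):
    """Mean power spectral density of x within the [lo, hi] Hz band."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

# Stand-ins for one electrode's recordings under each condition.
conditions = {
    "imagined": 0.5 * rng.standard_normal(10 * fs),
    "actual":   1.0 * rng.standard_normal(10 * fs),
    "bci":      1.5 * rng.standard_normal(10 * fs),
}

# Compare high-frequency band power (band edges are illustrative).
powers = {name: band_power(x, fs, 76, 100) for name, x in conditions.items()}
```

With these synthetic amplitudes, the resulting band powers reproduce the ordering the study reports: imagined movement is weakest, actual movement stronger, and BCI control strongest of all.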

The press release pitched this as evidence that BCIs are a "workout" for the brain. I don't completely buy this. The brain isn't a muscle and more activity doesn't necessarily mean it's operating at a higher level. What it does indicate (to me), and what I find far more interesting, is that people can quickly change their brain activity to accommodate BCIs. It also shows how important visual feedback is to people who are manipulating these devices. Experiments like this seem like a good way to maximize the level of feedback a user is getting and to test out different ways of delivering it.

It's also strong evidence that the brain activity produced when we imagine a movement or task can effectively drive BCIs. Every group developing BCIs right now is doing it slightly differently, and so far there is no clear consensus on which brain signals should be used.

This is the first paper I've seen that focuses entirely on what brain activity looks like while a person is manipulating a BCI. The output of the setup was a cursor moving on a screen. The experiment is a good indication that BCIs have become well enough understood that we can use them in experiments as tools to once again study the brain itself.

That being said, there are also some really interesting things to be learned from this article about the brain in general and the difference between imagining and actuating movement. Here are a couple points that may get you to read it and some questions you can respond to if you do.

1. During both tasks, high-frequency signals increase while low-frequency signals decrease. Does this mean that part of attending to a task is muting some of the competing activity?

2. Of these two, it is the decreasing low-frequency signal that maps similarly in both imagery and movement. Does this mean you could further localize an area that controls movement imagery in the high-frequency signals?
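The spectral pattern in point 1 — high-frequency power rising while low-frequency power falls during a task — can be illustrated with synthetic signals. Everything below is an assumption for demonstration: the band edges (8–32 Hz for the low band, 76–100 Hz for the high band) and the signal construction are mine, not the paper's:

```python
# Illustrative sketch of the task-related spectral shift: a strong
# low-frequency rhythm at "rest" is suppressed during the "task", while
# broadband high-frequency activity increases. Signals are synthetic.
import numpy as np
from scipy.signal import welch

fs = 1000  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
noise = rng.standard_normal(t.size)

def band_power(x, fs, lo, hi):
    """Mean power spectral density of x within the [lo, hi] Hz band."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

# "Rest": strong 20 Hz rhythm, little high-frequency activity.
rest = 2.0 * np.sin(2 * np.pi * 20 * t) + 0.5 * noise
# "Task": the 20 Hz rhythm is suppressed; broadband activity increases.
task = 0.5 * np.sin(2 * np.pi * 20 * t) + 1.5 * noise

changes = {}
for lo, hi, label in [(8, 32, "low-frequency"), (76, 100, "high-frequency")]:
    changes[label] = band_power(task, fs, lo, hi) - band_power(rest, fs, lo, hi)
```

Running this yields a negative change in the low-frequency band and a positive change in the high-frequency band — the same qualitative signature of task engagement described above.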

Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

A photo collage showing a man wearing an EEG headset while looking at a computer screen.
Nadia Radic

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
