A Soft, Wearable Brain–Machine Interface

Imperceptible micro-needles and flexible circuits improve neural signal recording

3 min read
Two cross straps meet at the crown of a head, where a circuit board is placed. Another strap circles the head.
Georgia Institute of Technology

Even though brain–machine (or brain–computer) interfaces (BMIs/BCIs) have come a long way since Hans Berger discovered electrical activity in the human brain in 1924, seamless communication between our brains and machines remains a holy grail of computer science. The past few years have seen remarkable advances in non-invasive wearable BMI research: helping disabled or paralyzed people move again, control robotic prosthetics, or command computers by the power of thought, and giving blind people bionic vision.

Now, in a new study, an international team of scientists has designed a BMI system that can control a robotic arm or a wheelchair when the wearer simply imagines the action, using a detection system that is easier to wear than its predecessors. The system is built around a soft, wireless scalp-mounted electronic device that uses electroencephalography (EEG) to read neural signals from the brain and translate them into action.

The soft wearable scalp device used in the BMI system is something Woon-Hong Yeo of the Georgia Institute of Technology and his colleagues have been working on for two years. Unlike conventional EEG devices, Yeo says, this one doesn't have a bunch of wires, metal electrodes, and so on. "It has miniaturized, imperceptible micro-needle electrodes, and flexible circuits with stretchable interconnectors." This, he adds, gives their system a better form factor and better signal acquisition.

Being both flexible and soft, the EEG scalp can be worn over hair and requires no gels or pastes to stay in place. The improved signal recording is largely down to the micro-needle electrodes, invisible to the naked eye, which penetrate the outermost layer of the skin. "You won't feel anything because [they are] too small to be detected by nerves," Yeo says. In conventional EEG setups, he adds, any motion by the wearer, such as blinking or teeth grinding, causes signal degradation. "But once you make it ultra-light, thin, like our device, then you can minimize all of those motion issues."

The team used machine learning to analyze and classify the neural signals received by the system and to identify when the wearer was imagining motor activity. That, says Yeo, is the essential component of a BMI: distinguishing between different types of inputs. "Typically, people use machine learning or deep learning… We used convolutional neural networks." This type of deep learning is typically used in computer vision tasks such as pattern recognition or facial recognition, and "not exclusively for brain signals," Yeo adds. "We are just getting the benefits of the deep learning mechanism itself."
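The study's own network architecture isn't described here, but the idea of convolving over a multi-channel EEG window and mapping it to a handful of motor-imagery classes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the channel count, sampling window, filter sizes, and the four-class output are all assumptions, and the weights are random stand-ins for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: 8 electrodes, a 1-second window at 250 Hz,
# and 4 motor-imagery classes (e.g. left hand, right hand, feet, rest).
N_CHANNELS, N_SAMPLES, N_CLASSES = 8, 250, 4
N_FILTERS, KERNEL = 16, 25

# Random weights stand in for parameters a real system would learn.
conv_w = rng.normal(0.0, 0.1, (N_FILTERS, N_CHANNELS, KERNEL))
fc_w = rng.normal(0.0, 0.1, (N_CLASSES, N_FILTERS))

def classify(eeg):
    """eeg: (channels, samples) array -> class probabilities."""
    # Temporal convolution: each filter spans all channels at once.
    T = eeg.shape[1] - KERNEL + 1
    feat = np.empty((N_FILTERS, T))
    for t in range(T):
        window = eeg[:, t:t + KERNEL]                 # (channels, kernel)
        feat[:, t] = np.tensordot(conv_w, window, axes=([1, 2], [0, 1]))
    feat = np.maximum(feat, 0.0)                      # ReLU nonlinearity
    pooled = feat.mean(axis=1)                        # global average pooling
    logits = fc_w @ pooled                            # linear classifier head
    probs = np.exp(logits - logits.max())             # stable softmax
    return probs / probs.sum()

# A random signal stands in for a recorded EEG window.
probs = classify(rng.normal(size=(N_CHANNELS, N_SAMPLES)))
print(probs)
```

In practice such a network would be trained end to end on labeled motor-imagery trials, and the predicted class would drive the robotic arm or wheelchair command.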

A ten-by-ten array of pyramid-shaped micro-needle electrodes
Georgia Institute of Technology

The researchers also used virtual reality (VR) to simulate action. Since the system is based on motor imagery, the VR component acts as a visual cue and "is sort of helping a user to imagine better, by showing hands or feet," Yeo says. The data showed that this, in fact, enhanced signal quality as well.

The portable BMI system was able to record real-time, high-quality motor imagery activity, and the four human subjects—all able-bodied people—were able to complete their VR exercises by thinking about them. Even with an accuracy rate of 93.22 ± 1.33 percent, Yeo says, there are still many challenges ahead.

"The major limitation [of non-invasive BMIs] is that we are measuring signals on the skin, through the skull, through the tissues," he says. "So I believe we have to continuously improve our device quality to get better signals. And at the same time, we have to also continuously improve our data analysis…to have a better accuracy rate." Also, in the current experiment, the researchers worked with only four classes of inputs. "I'd love to expand it to more than 10 inputs." The team is also awaiting authorization to test the system on disabled human subjects.


The Lies that Powered the Invention of Pong

A fake contract masked a design exercise, and started an industry

4 min read
Pong arcade game in yellow cabinet containing black and white TV display, two knobs are labeled Player 1 and Player 2, Atari logo visible.
Roger Garfield/Alamy

In 1971 video games were played in computer science laboratories when the professors were not looking—and in very few other places. In 1973 millions of people in the United States and millions of others around the world had seen at least one video game in action. That game was Pong.

Two electrical engineers were responsible for putting this game in the hands of the public: Nolan Bushnell, who with Ted Dabney founded Atari Inc. in Sunnyvale, Calif., and Allan Alcorn, the company's first design engineer. Mr. Bushnell told Mr. Alcorn that Atari had a contract from General Electric Co. to design a consumer product, and suggested a Ping-Pong game, with a ball, two paddles, and a score, that could be played on a television.
