How We Won Gold in the Cyborg Olympics’ Brain Race

With our brain-computer interface, paralyzed athletes sped their avatars across the finish line


Gold Medalist: Numa Poujouly took top honors in the Cybathlon’s brain-computer interface race. Photo: Nicolas Brodard

In October 2016, inside a sold-out arena in Zurich, a man named Numa Poujouly steered his wheelchair up to the central podium. As the Swiss national anthem played, organizers of the world’s first cyborg Olympics hung a gold medal around Poujouly’s neck. The 30-year-old, who became paralyzed after a bicycle accident in his teens, had triumphed in the tournament’s most futuristic event: a video-game-like race in which the competitors controlled their speeding avatars with just their minds.

The Cybathlon was created as a bionic version of the Paralympic Games. In six different events, people with disabilities skillfully employed high-tech assistive devices to make their way around racetracks or through obstacle courses. The arena crowd cheered as paralyzed competitors walked in robotic exoskeleton suits and amputees manipulated objects using powered prosthetic limbs.

The visionaries behind the Cybathlon, led by Robert Riener of the Swiss Federal Institute of Technology (ETH), in Zurich, had several objectives. First, taking a page from the XPrize Foundation, they wanted to use competition to accelerate the development of assistive technologies. The organizers also wanted end users of the technology to work alongside engineers, clinicians, and entrepreneurs in designing the gear. Finally, they wanted to raise awareness of cutting-edge technologies that may soon give people with disabilities remarkable abilities.

Our group in the brain-machine interface lab at the other Swiss Federal Institute of Technology (EPFL), in Geneva, fielded the Brain Tweakers team to compete in the Cybathlon’s Brain-Computer Interface (BCI) Race. Gold medalist Poujouly was one of our two pilots, both of whom propelled their avatars through a racecourse by mind power alone. As with any BCI system, signals recorded from the pilot’s brain were fed into a decoding algorithm that translated the signals into commands. In this case, the commands were used to control the pilot’s avatar, but other BCI systems could be used by people with severe motor impairments to control a variety of devices—including wheelchairs, prosthetic limbs, and computer cursors.

The Cybathlon criteria allowed for only noninvasive BCI systems, which use electrodes placed on the scalp to pick up neural signals; experimental systems that use electrodes implanted in the brain tissue weren’t permitted. Noninvasive systems pick up noisier neural signals, but paired with the right signal processing software, they can yield information that is good enough to work with—and they don’t bring along the medical risks of brain surgery.

Teamwork was key to our success. We developed the technology with feedback from Poujouly and our other pilot, Eric Anselmo; they helped us determine the mental commands that gave them intuitive and reliable control of their avatars. Our system also created a symbiotic relationship between man and machine, in which the pilots and the BCI software adapted to each other during the training process. Both of our pilots made it to the final race, and Poujouly’s gold medal offers definitive proof that we had a winning strategy.

The Cybathlon’s organizers designed the BCI race to be an entertaining spectacle that would get the audience cheering for the competitors. Yet it was also carefully planned to assess those BCI features that must be improved to make truly practical brain-controlled technology, whether it’s steerable telepresence robots or communication software that lets people type by mentally moving a computer cursor to pick out letters.

Any BCI requires sophisticated signal processing and statistical machine-learning techniques to extract information from brain signals, and this software must be paired with a calibration and training process that helps the user improve the signal-to-noise ratio. To become more than a lab novelty, a BCI must also decode the user’s intentions accurately and quickly, avoid unintentional commands, and maintain the same level of performance for long periods of time.
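One standard way to meet those competing demands of speed and accuracy is to accumulate evidence over successive EEG windows before committing to a command. The sketch below illustrates the idea with exponential smoothing of per-window class probabilities; the parameter values and function names are illustrative assumptions, not the team's actual algorithm.

```python
# Sketch of an evidence-accumulating BCI decoder: per-window class
# probabilities are smoothed over time, and a command is emitted only
# when accumulated evidence crosses a confidence threshold.
# All parameter values here are illustrative, not the team's actual ones.

COMMANDS = ["run", "jump", "slide"]

def accumulate(prob_stream, alpha=0.7, threshold=0.8):
    """Exponentially smooth per-window class probabilities; return the
    first command whose accumulated evidence exceeds the threshold, or
    None if no command ever becomes confident enough."""
    evidence = [1.0 / len(COMMANDS)] * len(COMMANDS)  # start uninformed
    for probs in prob_stream:
        evidence = [alpha * e + (1 - alpha) * p
                    for e, p in zip(evidence, probs)]
        best = max(range(len(COMMANDS)), key=lambda i: evidence[i])
        if evidence[best] >= threshold:
            return COMMANDS[best]
    return None  # never confident enough to act
```

A stream of windows that consistently favors one class triggers its command after a few windows, while an ambiguous stream never crosses the threshold; raising `alpha` or `threshold` trades response speed for fewer unintentional commands.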

Our lab has been working for years on BCIs that allow people to control devices and interact with the environment, and our pilots were eager to help accelerate the development of this technology. The 48-year-old Anselmo has been paralyzed since the age of 21, when a car crash injured his spinal cord. Like Poujouly, he is tetraplegic, with no sensation or muscle control below the waist and limited control of his arms. To navigate their daily lives, the two men use wheelchairs and other assistive devices, including smartphones. But they’re frustrated with even state-of-the-art technologies and eager for new devices that would enable them to type faster and manipulate objects with more dexterity.

In the Cybathlon’s BrainRunners race, the racetrack displayed on screen consisted of 16 color-coded and randomly arranged segments. As an avatar reached a new segment, the pilot had to send one of three distinct mental commands: a “run faster” command on greenish-blue sections; a “jump” command on purple sections, which were filled with protruding blocks; and a “slide” command on yellow sections, which contained electric fences. Sending the wrong command caused the avatar to slow down. A fourth type of section, colored gray, required pilots to issue no command at all; the pilots had to control their thoughts, and the BCI software had to avoid interpreting any stray neural signals as a command.

The race cleverly tested the BCI systems’ ability to provide an adequate number of different commands and to deliver them with precise timing. The challenge was made greater by the racing conditions: Pilots had to maintain concentration in a noisy arena and cope with the psychological pressure of performing in front of a crowd. If the pilots could manage the technology in that hectic environment, the organizers hypothesized, they could probably handle similar systems in daily life.

Finish Line: Eric Anselmo celebrates after powering his avatar through the BrainRunners race. Photo: Nicola Pitaro/ETH Zürich

To capture our pilots’ brain signals, we used a lightweight EEG (electroencephalography) cap studded with electrodes, along with a specialized biosignal amplifier. These components are commercially available and often used in neuroscience research. The EEG cap measures the electrical activity of millions of brain cells, but the signals it records through skin, bone, and tissue are fairly messy. So we worked with our pilots to find strong signals that they could generate reliably, and we customized our BCI signal processing software to use their signals effectively in the racing application.

To find robust neural signals, our pilots imagined performing specific movements with their limbs. During these mental motions, the EEG electrodes picked up changes in the rhythmic patterns of electrical activity in the motor cortex, the region of the brain that controls voluntary movement. Each person’s brain pattern is as unique as a fingerprint, so our pilots’ signals originated in different locations in the motor cortex and varied in signal frequency (typically between 8 and 30 hertz). Our BCI software had to recognize each pilot’s distinct EEG patterns so that we could associate their motor-imagery tasks with commands for their avatars.
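A simple way to quantify that 8-to-30-hertz motor-cortex activity is to measure signal power in the band, for instance with the Goertzel algorithm. The sketch below shows the kind of spectral feature a motor-imagery BCI relies on; it is an illustration, not the team's actual feature-extraction pipeline.

```python
import math

# Estimating signal power in the 8-30 Hz motor-cortex band with the
# Goertzel algorithm -- a sketch of the kind of spectral feature a
# motor-imagery BCI relies on (not the team's actual pipeline).

def goertzel_power(samples, fs, freq):
    """Power of `samples` (sampled at `fs` Hz) at a single frequency."""
    w = 2.0 * math.pi * freq / fs
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s2 * s2 + s1 * s1 - coeff * s1 * s2

def band_power(samples, fs, lo=8, hi=30):
    """Summed power over integer frequencies in [lo, hi] Hz."""
    return sum(goertzel_power(samples, fs, f) for f in range(lo, hi + 1))
```

Given one second of data sampled at 256 Hz, a 12 Hz oscillation (inside the motor-imagery band) yields far more band power than a 40 Hz one (outside it), which is exactly the contrast a decoder exploits when imagined movement suppresses or enhances these rhythms.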

To tweak our signal processing software for the Cybathlon, we started by filtering out electric signals from the eyes and head muscles, which the EEG electrodes also pick up. This filtering both refined the neural signal and complied with the Cybathlon’s rules; the organizers stressed that controlling the BCI with eye or muscle signals would be considered cheating. We also had to rework our algorithm to manage three commands (our previous experiments used only two) and to recognize an “idle” state, which is often overlooked in experimental systems but is vital for real-world applications.
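A common way to implement such an idle state is rejection: the classifier still picks its most likely command, but if that pick is not confident enough, the system deliberately does nothing. The threshold below is an illustrative assumption.

```python
# One common way to support an "idle" state: reject any classification
# whose confidence falls below a threshold, so ambiguous brain activity
# produces no command at all. The threshold value is illustrative.

def classify_or_idle(class_probs, reject_below=0.6):
    """Return (command, prob) for the most likely command, or
    ("idle", prob) when no command is confident enough."""
    command = max(class_probs, key=class_probs.get)
    p = class_probs[command]
    return (command, p) if p >= reject_below else ("idle", p)
```

On a gray racecourse section, where any output is penalized, it is better for a borderline classification like `{"run": 0.4, "jump": 0.35, "slide": 0.25}` to come out as `"idle"` than to guess.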

The most important component of our victory, however, was our “mutual learning” approach for pilot training—a methodology in which the user and the BCI adapt to each other to achieve optimal performance. This is when feedback from our pilots became most essential.

To read our racers’ brain signals, we used a cap with 16 scalp electrodes. Photo: Erik Tham

Our research team had previously worked on BCI experiments with pilot Anselmo, and he began training with the racing BCI six months before the Cybathlon. Poujouly had never used a BCI when he began training three months before the event. We conducted the training sessions in the pilots’ homes, starting with weekly 2-hour sessions and increasing the frequency as the competition approached.

Our first task was to calibrate the BCI system to recognize Anselmo’s and Poujouly’s neural patterns. We asked both pilots to imagine moving their right hand, left hand, or both feet while the BCI software made models of the associated brain patterns and linked them to the three commands that would be used in the BrainRunners game. But after a few sessions, we realized that these three imagined movements didn’t provide sufficiently distinct and robust signals. So we tried another tactic. We calibrated the BCI system to recognize just two brain patterns: the imagined movement of both hands and both feet. To create a third identifiable signal, the pilots imagined a quick sequence of movements, moving first both hands and then both feet.
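That trick of getting three commands out of two detectable patterns can be pictured as a small state machine: a "hands" detection followed quickly by a "feet" detection maps to the third command, while either pattern alone maps to its own command. The sketch below assumes a 1.5-second pairing window, a value the article does not specify.

```python
# Sketch of mapping two detectable brain patterns ("hands", "feet") onto
# three commands: hands -> run, feet -> jump, and hands followed quickly
# by feet -> slide. The 1.5 s pairing window is an assumption.

def to_commands(detections, pair_window=1.5):
    """detections: list of (timestamp_seconds, pattern) tuples.
    Returns the command sequence, pairing hands-then-feet into 'slide'."""
    commands = []
    pending_hands = None  # timestamp of an unconsumed "hands" detection
    for t, pattern in detections:
        if pending_hands is not None:
            if pattern == "feet" and t - pending_hands <= pair_window:
                commands.append("slide")   # hands-then-feet sequence
                pending_hands = None
                continue
            commands.append("run")         # lone hands -> run
            pending_hands = None
        if pattern == "hands":
            pending_hands = t
        else:
            commands.append("jump")        # lone feet -> jump
    if pending_hands is not None:
        commands.append("run")
    return commands
```

So a "feet" detection 0.8 seconds after a "hands" detection becomes one `"slide"`, while the same pair separated by 3 seconds becomes `"run"` followed by `"jump"`: the decoder disambiguates the two base patterns from their sequence in time.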

After the initial calibration, it was time for the pilots to train. They started with a simple nonracing task, in which they looked at a computer screen and tried to move a bar up and down using their neural signals. With this continuous visual feedback about their performance, they gradually learned to produce more distinctive patterns. As the pilots got better at generating clear signals, we periodically recalibrated the BCI system to reflect each pilot’s altered brain activity.

Once the pilots and software were working in harmony, we focused on getting the pilots accustomed to the BrainRunners game. We linked the both-hands movement to the “run faster” command, the both-feet movement to “jump,” and the hands-then-feet movement to the “slide” command.

Brain to Game: In the BCI system we devised for the Cybathlon’s BrainRunners game, the racers imagined moving both their hands, both their feet, or their hands and feet in quick sequence. Those visualizations provided three distinct neural signals that the BCI software linked to three game commands (run faster, jump, and slide) that had to be employed during specific sections of the racecourse. The game also included sections where the racers had to carefully control their thoughts and avoid sending commands to their avatars. Illustration: James Provost

In the initial training races, our pilots competed against a single opponent controlled by the computer. Anselmo completed 182 such races, with an average completion time of 127 seconds and an all-time record of 83 seconds. Poujouly did 57 training races with an average time of 130 seconds and a record of 86 seconds. At this stage we further personalized the BCI for each pilot by finding the optimal parameters that balanced accuracy and speed of command. Finally, in the month before the Cybathlon, we conducted two joint training sessions in our lab. In these two face-off events, our pilots raced against each other while we surrounded them and cheered them on, trying to emulate the feel of a real competition.

When we got to the Zurich arena in October, it became clear that our training protocol was effective. Our pilots were able to replicate their excellent performances in the official Cybathlon races, where they were matched up against other determined competitors, and kept their cool as hundreds of spectators looked on. Anselmo and Poujouly won their qualification races to advance to the final with the two best performances in the BCI competition: 90 and 123 seconds, respectively. Although Anselmo had a performance lapse during the final race that cost him a medal, Poujouly raced at his regular pace to capture the gold—and to capture the glory for the whole Brain Tweakers team. He won with a time of 125 seconds, finishing 31 seconds ahead of the next competitor!

While winning the Cybathlon race was a great accomplishment, our team knows that the greater challenge will be transforming this experimental technology into devices that can be of real use to people with disabilities.

Our pilots have helped us understand the limitations of today’s BCI technologies. They note that the EEG cap takes at least 10 minutes to set up and isn’t exactly inconspicuous. A typical EEG system requires the application of gel under each electrode to facilitate conductivity through the scalp, so the user needs help to get everything in place. But researchers are making advances with dry electrodes that don’t require gel, which should lead to more user-friendly systems. Some experts have also suggested using permanent electrodes inserted just under the skin of the scalp—though any system that requires surgery will face regulatory hurdles.

The calibration process, which Anselmo and Poujouly found long and tedious, also seems ill-suited for an everyday technology. So our research team is now working on calibration software that could be easily run by a rehab therapist who isn’t an expert in BCI technology, or even by the users themselves. We’ve already experimented with a program for a BCI typing system that guides the user through calibration in a series of simple steps and adapts the BCI decoder without the user having to take any action. Another possibility is to make BCI training less tedious by designing a game that incorporates calibration.

The Cybathlon took place in an arena in Zurich, Switzerland, and included events that used six different types of cyborg technology. Photo: Erik Tham/Alamy

Despite the current limitations, our BCI for the Cybathlon demonstrated how satisfying and intuitive the technology can be. Both our pilots say they felt an immediate sense of control, even in the earliest training sessions. “The first time the avatar responded to my thoughts made me really happy,” says Anselmo. As the pilots grew accustomed to the simple control system, manipulating their avatars became instinctual. “I was not thinking of moving my limbs but simply about what the avatar should do,” says Poujouly.

While a racing BCI may not appear to have immediate and obvious applications in the real world, the essence of any BCI system is the translation of brain signals into commands. That means the fundamental technology that powered our winning system can potentially find use in a wide range of practical devices, including systems to control a computer cursor or robotic limbs.

To transfer this technology from the lab to the home, we’ll have to consider several more factors. Performing well with our BCI still required intense concentration; Poujouly even meditated before his races to focus his mind. Will such concentration be possible for a user who’s trying to employ a BCI in the routine tasks of daily life? What’s more, our BCI’s ability to manage three commands (plus the idle state) was a great technical achievement, but real-world users may want more capabilities.

To make BCIs that control devices in more naturalistic ways, we may be able to draw lessons from how humans use their bodies. For example, we could make BCI decoders that recognize not only specific imagined movements but also other aspects of these motions, such as direction and velocity, and use that information to modulate the outgoing command. Our lab is also investigating BCIs that respond to cognitive signals such as the user’s recognition of error and the anticipation of critical decision points. For the latter, we’ve experimented with a BCI for cars that could pick up drivers’ neural signals when they anticipate braking and accelerating.

Both of our Cybathlon pilots are enthusiastic about the future of BCIs and imagine a wide range of uses that will give them more autonomy in their daily lives. We, the Brain Tweakers engineers, are dedicated to making the technology that will be one part of the equation. But we won’t forget that our team won gold based on a mutual-learning approach, in which man and machine adapted to each other and finally formed one symbiotic system. 

This article appears in the September 2017 print issue as “Brain Racers.”

About the Authors

José del R. Millán is the Defitech Foundation Chair in Brain-Machine Interface at the Swiss Federal Institute of Technology (EPFL) in Geneva. Serafeim Perdikis and Luca Tonin are postdocs in his group.
