Samsung has begun testing mind-controlled tablets and smartphones as the next step toward freeing people from tapping on keyboards or screens. Much of the early research into mind control has focused on helping people with disabilities, and the South Korean company's efforts will likewise probably benefit disabled gadget users before the average electronics consumer.
Early experiments have shown how people can use thoughts alone to launch an app, find and select a contact, choose to play songs from favorite playlists and power a tablet up or down, according to a story in MIT Technology Review. Samsung's Emerging Technology Lab teamed up with Roozbeh Jafari, an electrical engineer at the University of Texas at Dallas, to carry out the research on a Samsung Galaxy Note 10.1 tablet.
Such achievements sound less impressive when you consider that users can carry out mind-control actions only about once every five seconds, with an accuracy of 80 to 95 percent. Users must also wear a cap covered with electrodes, each connected by wire to a recorder; like an electroencephalograph (EEG), the setup picks up patterns in the brain's electrical signals.
The Samsung approach interprets well-known brain activity patterns as mind-control commands: the steady, measurable responses the brain produces when a person watches a repeating visual stimulus. Researchers found that users can carry out certain actions on a tablet by mentally focusing on an icon that blinks at a particular frequency.
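The idea above can be sketched in code. The following is a minimal illustration, not Samsung's actual system: each on-screen icon flickers at its own frequency, and a classifier picks whichever candidate frequency dominates the EEG spectrum over a short window. The sample rate, flicker frequencies, and command names are all illustrative assumptions.

```python
import numpy as np

FS = 256           # assumed EEG sample rate in Hz
WINDOW_SEC = 5     # roughly the ~5 seconds per command reported

# Hypothetical mapping of icon flicker frequencies to tablet commands
COMMANDS = {8.0: "launch app", 10.0: "select contact",
            12.0: "play playlist", 15.0: "power off"}

def classify(signal, fs=FS):
    """Return the command whose flicker frequency carries the most power."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def power_near(f):
        # Sum spectral power in a narrow band around the candidate frequency
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        return spectrum[band].sum()

    return COMMANDS[max(COMMANDS, key=power_near)]

# Synthetic "EEG": a 10 Hz oscillation buried in noise, standing in for
# the brain's response to an icon flickering at 10 Hz.
rng = np.random.default_rng(0)
t = np.arange(FS * WINDOW_SEC) / FS
eeg = np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)
print(classify(eeg))  # prints "select contact"
```

A real system would face far messier signals, which is consistent with the modest speed and accuracy figures reported above.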
Similar mind-control technologies relying upon EEG readings have shown up in commercial headsets meant for gaming or high-tech amusements, including the NeuroSky MindSet and Emotiv "neuroheadsets." Labs have even experimented with using the Emotiv headset to drive cars.
Samsung doesn't expect to put out mobile devices using the technology anytime soon, given how imperfect current mind-control systems remain. Kevin Brown, a senior inventor at IBM's emerging technology lab, told BBC News that testers in one IBM experiment needed 20 minutes just to send an e-mail with mind control. That's a far cry from the 25 words in 83 seconds clocked by a quadriplegic man using a head-tracking system a couple of years ago.
Still, Brown and other researchers expect that even today's mind-control technology could help disabled people whose conditions prevent them from effectively using the touch, voice, gesture or eye-movement controls commonly found in everyday consumer gadgets.
Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.