A Brainy Approach to Image Sorting

DARPA project reads the brain waves of image analysts to speed up intelligence triage

4 min read

3 April 2008: We may need computers to tell us the square root of 529,679, but for now, at least, they still need us to recognize a kitten napping in a box of yarn. The point goes to the humans for our keen sense of the relationships between objects, our eye for texture, and our understanding of emotional relevance, but we don't wield these abilities with great speed. This slowness, unfortunately, has caused intelligence agencies a good deal of distress: they collect surveillance images from satellites, infrared sensors, and airborne cameras so quickly that analysts struggle to keep up.

But what if we could combine the speed of a computer with the sensitivity of the human brain? Teams of researchers at Honeywell, Teledyne Scientific and Imaging, and Columbia University are busy hooking image analysts up to EEG machines, reading their brain activity, and speeding up data sorting sixfold. Their research is for a Defense Advanced Research Projects Agency (DARPA) program called Neurotechnology for Intelligence Analysts, which began its second of three phases this year. Each phase whittles down the number of participating research teams, and by the end, DARPA expects to have one team with a superior system.
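The article doesn't detail the teams' algorithms, but the basic triage idea is easy to illustrate: flash images to an analyst in rapid succession, score the EEG recorded just after each image for a target-recognition response, and push the highest-scoring images to the front of the review queue. The Python sketch below is only a toy under that assumption; the 250-to-500-millisecond window, the amplitude-based score, and every function name here are illustrative stand-ins, not the contractors' actual methods, which rely on trained classifiers rather than a fixed rule.

```python
# Toy sketch of EEG-based image triage (illustrative only, not the DARPA teams' pipelines).
# Assumes images are flashed rapidly while EEG is recorded, and that each image has an
# EEG epoch (channels x samples) time-locked to its onset.
import numpy as np

SAMPLE_RATE_HZ = 256               # assumed sampling rate
RESPONSE_WINDOW_S = (0.25, 0.50)   # post-stimulus window where a recognition response often peaks

def target_score(epoch: np.ndarray) -> float:
    """Crude proxy score: mean amplitude in the post-stimulus window, averaged over channels.
    A real system would train a classifier on labeled examples instead of using a fixed rule."""
    start = int(RESPONSE_WINDOW_S[0] * SAMPLE_RATE_HZ)
    stop = int(RESPONSE_WINDOW_S[1] * SAMPLE_RATE_HZ)
    return float(epoch[:, start:stop].mean())

def triage(image_ids, epochs):
    """Rank image IDs so the ones that elicited the strongest response come first."""
    scores = [target_score(e) for e in epochs]
    order = np.argsort(scores)[::-1]   # highest score first
    return [image_ids[i] for i in order]

# Demo with synthetic data: 100 images, 32 EEG channels, 1-second epochs.
rng = np.random.default_rng(0)
ids = [f"img_{i:03d}" for i in range(100)]
epochs = [rng.normal(size=(32, SAMPLE_RATE_HZ)) for _ in ids]
print(triage(ids, epochs)[:5])   # candidate images an analyst would inspect first
```

In such a scheme, the EEG score only prioritizes the queue; the flagged images would still go to the analyst for a full look, which is where the claimed sixfold speedup in sorting would come from.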

This Implant Turns Brain Waves Into Words

A brain-computer interface deciphers commands intended for the vocal tract

10 min read

A paralyzed man who hasn’t spoken in 15 years uses a brain-computer interface that decodes his intended speech, one word at a time.

Image credit: University of California, San Francisco

A computer screen shows the question “Would you like some water?” Underneath, three dots blink, followed by words that appear, one at a time: “No I am not thirsty.”

It was brain activity that made those words materialize—the brain of a man who has not spoken for more than 15 years, ever since a stroke damaged the connection between his brain and the rest of his body, leaving him mostly paralyzed. He has used many other technologies to communicate; most recently, he used a pointer attached to his baseball cap to tap out words on a touchscreen, a method that was effective but slow. He volunteered for my research group’s clinical trial at the University of California, San Francisco, in hopes of pioneering a faster method. So far, he has used the brain-to-text system only during research sessions, but he wants to help develop the technology into something that people like himself could use in their everyday lives.
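This excerpt doesn't describe the decoder itself, but the word-at-a-time output can be pictured as a classification problem: extract a feature vector from the neural activity recorded during each attempted word and map it to the closest entry in a fixed vocabulary. The nearest-centroid sketch below is a deliberately simple stand-in under that framing; the vocabulary, feature dimensions, and function names are hypothetical, and the actual system described in this article is far more sophisticated than this toy.

```python
# Toy nearest-centroid word decoder (illustrative only; not the UCSF system's method).
# Assumes each attempted word yields one feature vector derived from recorded brain activity.
import numpy as np

VOCAB = ["no", "i", "am", "not", "thirsty", "yes", "water"]  # hypothetical toy vocabulary

def fit_centroids(features: np.ndarray, labels: np.ndarray, n_words: int) -> np.ndarray:
    """Average the training feature vectors for each word in the vocabulary."""
    return np.stack([features[labels == w].mean(axis=0) for w in range(n_words)])

def decode_word(centroids: np.ndarray, feature_vec: np.ndarray) -> str:
    """Return the vocabulary word whose centroid lies closest to the new feature vector."""
    distances = np.linalg.norm(centroids - feature_vec, axis=1)
    return VOCAB[int(np.argmin(distances))]

# Synthetic demo: 20-dimensional features, 30 training attempts per word.
rng = np.random.default_rng(1)
word_templates = rng.normal(size=(len(VOCAB), 20))
train_x = np.concatenate([t + 0.3 * rng.normal(size=(30, 20)) for t in word_templates])
train_y = np.repeat(np.arange(len(VOCAB)), 30)

centroids = fit_centroids(train_x, train_y, len(VOCAB))
attempt = word_templates[VOCAB.index("thirsty")] + 0.3 * rng.normal(size=20)
print(decode_word(centroids, attempt))  # almost certainly prints "thirsty"
```

The nearest-centroid rule is chosen here purely for brevity; it keeps the sketch self-contained without pretending to match the trial's approach or accuracy.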
