The Mind-Reading Machine

Veritas Scientific is developing an EEG helmet that may invade the privacy of the mind

Image: Mike Agliolo/Photo Researchers/Getty Images

9 July 2012—Memories and thoughts are private—or at least they used to be. A new company, Veritas Scientific, is developing a technology that promises to peek into a person’s brain to reveal some of their secrets. “The last realm of privacy is your mind,” says Veritas CEO Eric Elbot. “This will invade that.”

Elbot’s device belongs in a Philip K. Dick novel: It’s a futuristic motorcycle-type helmet containing metal brush sensors that will read brain activity as images of, say, bomb specs or Osama bin Laden’s face flash quickly across the inside of the visor. Scientists have shown that familiar images prompt spikes of electrical brain activity that indicate recognition. Recognition indicates memory, and memory implies knowledge. Veritas’s goal is to create an electroencephalogram (EEG) helmet with a slideshow of images that could reliably help to identify an enemy.

But whose enemy? Veritas would provide the U.S. military with the device first, as a way to help them pick friend from foe among captured people. But Elbot imagines that the brain-spying, truth-telling technology will also be useful for law enforcement, criminal trials, and corporate takeovers. Eventually, it will even make its way into cellphone apps for civilians, he says.

“Certainly it’s a potential tool for evil,” says Elbot. “If only the government has this device, it would be extremely dangerous.”

In laboratory experiments, EEG tests built around mock terrorism plots have successfully identified participants and detected details of the pretend crimes. Veritas wants to put its helmets on real suspected terrorists. According to Elbot, the U.S. military used an earlier Veritas device called BrainTruth to test the thoughts of suspected Iranian agents crossing the Mexican border into the United States.

Elbot envisions a scenario in which troops in an Afghan village round up all the men, put helmets on them, and classify them as friend or foe almost instantly. He hopes to have a prototype ready for the U.S. military’s war games this fall and is pursuing a military contract.

Veritas draws heavily on the work of J. Peter Rosenfeld, a professor of psychology and neuroscience at Northwestern University, in Evanston, Ill. Rosenfeld develops EEG tests that ferret out lies; the U.S. military sponsors some of his research.

Rosenfeld’s tests—and Veritas’s work—are based on a class of brain activity known as event-related potentials (ERPs). When the brain recognizes someone, it produces a specific, well-documented response called a P300. A person sees a face and then identifies it as John, Mary, or Mom. As the brain puts a name to the face, a sharp spike appears in the EEG between 200 and 500 milliseconds after the face is first seen. That spike reveals that the subject recognizes the person. The same reaction occurs with a photo of an object, a place, or even a name.
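The timing logic behind such a test can be sketched in a few lines. This is a toy illustration only, with synthetic data, a 1000 Hz sampling rate, and invented amplitudes and thresholds; it does not reflect Veritas’s actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 30 one-second epochs sampled at 1000 Hz: background noise
# plus, for a "recognized" image, a positive deflection peaking about
# 300 ms after stimulus onset -- a toy stand-in for the P300.
t = np.arange(1000) / 1000.0
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # microvolts
epochs = rng.normal(0.0, 10.0, size=(30, 1000)) + p300

# Averaging across repeated trials suppresses uncorrelated noise,
# leaving the event-related potential.
erp = epochs.mean(axis=0)

# Score the 200-500 ms window against an early (0-100 ms) baseline,
# before the simulated P300 has developed.
baseline = erp[:100].mean()
score = erp[200:500].mean() - baseline
recognized = score > 1.0  # illustrative threshold, not a validated one
```

Real systems use far more elaborate statistics, but the core idea is the same: look for extra amplitude in a fixed window after the stimulus.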

It sounds simple, but it isn’t. For each test, there is a probe image—the one the subject may recognize. It has to be a surprise, so it is mixed into a series of dummy images, some related to the probe, some not. Sometimes there’s an image that prompts a physical response, such as pressing a button, to show the subject is paying attention.
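The test design described above can be pictured with a short sketch. The filenames and counts here are made up; in a real deployment the stimuli and their ordering would be the sensitive part:

```python
import random

random.seed(42)

# Hypothetical stimulus set: one probe the subject may recognize,
# dummy images to hide it among, and a target that demands a button
# press to confirm the subject is paying attention.
probe = "suspect_face.png"
irrelevants = [f"unknown_face_{i}.png" for i in range(1, 5)]
target = "press_button_now.png"

def make_block(probe, irrelevants, target, reps=6):
    """One test block: each image shown `reps` times, order shuffled
    so the probe arrives as a surprise, not at a predictable slot."""
    trials = ([probe] + irrelevants + [target]) * reps
    random.shuffle(trials)
    return trials

block = make_block(probe, irrelevants, target)
print(len(block))  # 6 images x 6 repetitions = 36 trials
```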

It will be hard to avoid reacting inside Veritas’s helmet. Fitted tightly to the head without being painful, it will be soundproofed against the outside world, says Elbot. The visor will display images only centimeters from the eyes. The metal brush sensors, still in development, are being designed to go easily through hair and conduct brain signals without the conductive gel used in hospitals.

Veritas isn’t the first company to try to commercialize ERPs. Behavioral neuroscientist Lawrence A. Farwell founded Brain Fingerprinting Laboratories, also based on ERPs, with a goal similar to that of Veritas. Lauded by the media but denounced by peers, Farwell’s venture has so far not succeeded.

P300s are tricky signals, says Paul Sajda, an associate professor of biomedical engineering at Columbia University. Sajda conducts research on the P300 response, but to a very different end: to aid in image recognition. Sajda has also offered his work to intelligence agencies, but as a way for image analysts to spot more of whatever they’re looking for, not as interrogation technology. This is a situation where a false positive won’t hurt anyone, and there are false positives with ERPs, he says.

The trouble with the P300 response is that it’s related to more than recognition. Loud noises, arousal, surprises, and suddenly focused attention can all cause P300s. Stress and depression can alter the intensity or timing as well. “It’s an interesting signal, but it’s also complicated,” says Sajda. What’s worse, EEG readings are noisy and messy and must be interpreted carefully using computer algorithms. “It would have to be a situation where false positives and negatives don’t matter that much,” says Sajda. Which brings up the question: When a person’s life or freedom is at stake, what is an acceptable margin of error?
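Sajda’s worry about false positives is easy to demonstrate with synthetic numbers. In this toy simulation (every parameter is invented), a naive single-trial threshold fires frequently on pure noise, while averaging many trials before scoring largely suppresses the problem:

```python
import numpy as np

rng = np.random.default_rng(1)

def score(epoch):
    """Mean amplitude in the 200-500 ms window, minus an early
    0-100 ms baseline (assumes a 1000 Hz sampling rate)."""
    return epoch[200:500].mean() - epoch[:100].mean()

# 900 pure-noise epochs: no recognition signal is present at all.
noise = rng.normal(0.0, 10.0, size=(900, 1000))

threshold = 1.0  # microvolts; illustrative only
single_trial_hits = int((np.apply_along_axis(score, 1, noise) > threshold).sum())

# Averaging 30 epochs per "subject" before scoring shrinks the noise
# by roughly sqrt(30), so far fewer spurious detections survive.
averaged = noise.reshape(30, 30, 1000).mean(axis=1)
averaged_hits = int((np.apply_along_axis(score, 1, averaged) > threshold).sum())

print(single_trial_hits, averaged_hits)
```

Averaging buys reliability at the cost of time and repeated stimuli, which is exactly the tension in a field setting where decisions must be made quickly.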

Veritas claims it is devoted to extremely high accuracy and doesn’t intend its device to be the only factor in whatever scenario it’s used in, says Peter Lauro, head of Veritas’s neuroscience research. Decisions and interpretation would ultimately fall to human beings. The company is at the “very beginning of testing, testing, testing” to find the right combination of ERPs, questions, and patterns of images for a reliable deception test, says Lauro. They’re also adding functional near-infrared spectroscopy (fNIRS) to the helmet, a brain-imaging technology that measures blood flow.

Using ERPs requires a delicate combination of psychology and neuroscience, Rosenfeld says, including an understanding of how and why a person will react and what that reaction will look like on an EEG. But that doesn’t mean it isn’t possible to use them, he says; it’s just difficult.

The helmet isn’t ready yet, but mind-reading tech is inevitable—even if it’s far in the future, experts say. Whether this technology should be used seems the bigger question. “Once you test brain signals, you’ve moved a little closer to Big Brother in your head,” says Sajda.
