This is part of IEEE Spectrum's special report: Winners & Losers VII
The U.S. Department of Homeland Security (DHS), which operates airport security checkpoints in the United States, is spending upward of US $7 million a year trying to develop technology that can detect the evil intent of the terrorists among us. Yes, you read that correctly: They plan to find the bad guys by reading their minds.
Dozens of researchers across the country are in the middle of a five-year program contracted primarily to the Charles Stark Draper Laboratory, in Cambridge, Mass. They've developed a psycho-physiological theory of "malintent"—basically, a hodgepodge of behaviorism and biometrics according to which physiological changes can give away a terrorist's intention to do immediate harm. So far, they've spent $20 million on biometric research, sensors, and a series of tests and demonstrations.
This is no mere fantasy, DHS officials insist. And it isn't: It's a noble fantasy. And it's destined to be a noble failure. It's called the Future Attribute Screening Technology, or FAST.
"We're not reading minds," says Robert P. Burns, a deputy director of innovation at the Homeland Security Advanced Research Projects Agency and the FAST project manager. "We're just screening."
The underlying theory is that your body reacts, in measurable and largely involuntary ways, to reveal the nature of your intentions. So as you wait in line at the airport checkpoint, thermal and other types of cameras and laser- and radar-based sensors will try to get a fix on the baseline parameters of your autonomic nervous system—your body temperature, your heart rate and respiration, your skin's moistness, and the very look in your eyes. Then, as a security officer asks you a few questions, the sensors will remeasure those parameters so that the FAST algorithms can figure out whether you're naughty or nice, all on the spot, without knowing anything else about you. It's a bit like asking one of those RFID-based highway toll systems—the ones that automatically deduct the price of the toll from your credit card account—to determine whether the purpose of your car trip is business or pleasure.
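The baseline-then-remeasure idea can be sketched in a few lines of code. This is purely illustrative: the signal names, sampling, and z-score math are my assumptions about how such a comparison could work, not FAST's actual (and undisclosed) algorithms.

```python
import statistics

def baseline(readings):
    """Per-signal mean and standard deviation from the waiting-in-line period.

    `readings` maps a signal name (e.g. "heart_rate") to a list of samples.
    These names are hypothetical; FAST's real signal set is not public.
    """
    return {name: (statistics.mean(vals), statistics.stdev(vals))
            for name, vals in readings.items()}

def deviation_scores(base, during_questions):
    """Z-score of each signal during questioning, relative to that same
    person's own baseline rather than any population norm."""
    scores = {}
    for name, vals in during_questions.items():
        mean, sd = base[name]
        current = statistics.mean(vals)
        scores[name] = (current - mean) / sd if sd > 0 else 0.0
    return scores

# Toy example: heart rate (beats/min) and skin temperature (deg C)
# while waiting in line, then while being questioned.
waiting = {"heart_rate": [72, 74, 73, 71, 74],
           "skin_temp":  [33.1, 33.0, 33.2, 33.1, 33.0]}
questioned = {"heart_rate": [95, 98, 97, 96, 99],
              "skin_temp":  [33.6, 33.7, 33.8, 33.7, 33.6]}

scores = deviation_scores(baseline(waiting), questioned)
```

A spike in heart rate during questioning produces a large z-score here, but note what the sketch deliberately leaves out: deciding whether that spike means malintent, a missed connection, or a fear of flying, which is exactly the hard part.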
The algorithms will scrutinize the output of the sensors, looking for several specific changes. For example, your pupils may dilate when someone asks you questions about matters on which you have malevolent intent, your heart may skip a beat, or you may suck in your breath. Of course, you may be having bad thoughts that have nothing to do with terrorism—if you're getting on a plane to meet your paramour, for example. But in that case, "you won't get noticed," maintains Daniel Martin, a psychologist and independent contractor who developed the initial malintent theory and is the director of research for FAST. "It only measures the signs of malintent in this specific context, in this situation," he says.
Martin and other backers say the system can tell whether a racing heart and sweaty skin are those of a nervous terrorist or merely a person who had to run to catch a plane. They say it can even distinguish among terrorists, garden-variety smugglers, and anxious travelers. But they won't reveal just how the system manages to discern the infinite and wondrous varieties of guilt and remorse that lurk in the hearts of men and women.
"If we laid out specifically [how the system works], the first thing someone will do is say, 'How do I counter that?' " Martin says.
Some of the research will eventually appear in peer-reviewed journals, says Burns. But he declines to say when and where. At an April 2009 meeting, 30 scientists met at Draper for a private peer review of the theory of malintent and the experimental protocols. Martin says the consensus was that the theory is generally correct. The DHS has also set up privacy panels to try to ensure that its system's operation won't violate passengers' rights; the program doesn't tie data to an individual's identity, nor does it store any of the information collected. "We dump the data once you're through security," says Burns.
Crucially, FAST will not measure you against some theoretical norm of, say, average heart rates for adult males. "Each person serves as his or her own baseline," says Martin. The system "will measure how people's signals change in response to stimuli." To top it all off, the profile of symptoms can't be faked.
"The signals are uncontrollable, even for a trained terrorist—you still give them off," says Burns. Even if there's a complete lack of signals—in other words, no change from the baseline—"that's also a sign" of something suspicious: that the person has been trained to suppress those signals.
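Burns's logic is two-sided: a large involuntary change from baseline is suspicious, but so is an unnaturally flat response. A minimal sketch of that decision rule, with invented threshold values (FAST's actual criteria are classified):

```python
def flag(z_scores, high=3.0, flat=0.2):
    """Return a reason string if a set of per-signal z-scores looks
    suspicious under the two-sided rule, else None.

    `high` and `flat` are illustrative thresholds, not real parameters.
    """
    # Large deviation from the person's own baseline: flagged.
    if any(abs(z) > high for z in z_scores.values()):
        return "large involuntary change from baseline"
    # Near-zero change across every signal: also flagged, on the theory
    # that the response has been deliberately suppressed.
    if all(abs(z) < flat for z in z_scores.values()):
        return "suspiciously suppressed response"
    return None
```

Written out this way, the rule's weakness is visible: only a middling band of responses passes, so an ordinary calm traveler and an ordinary flustered one both risk tripping one side or the other.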
At a September press conference at Draper, DHS researchers screened 30 volunteers with the FAST system, offering some of them $100 bonuses for acting in a way that might "involve something malicious." They were told there was a mock explosive that would "cause a loud noise but won't cause too much harm."
Thermal cameras read each subject's body temperature. BioLidar tracked respiration and heart rate (think of a police radar gun that uses laser light instead of radio waves, and is sensitive enough to measure changes in the surface motion of the skin, particularly over the large arteries of the neck). An eye tracker determined a subject's pupil size, rate of blinking, and "gaze vector." A "fidget detector" had the subject stand on a Nintendo Wii balance board, the kind people normally use to exercise at home. All the sensors but one were off-the-shelf; the exception was the government-developed, eye-safe BioLidar.