Loser: Bad Vibes

A quixotic new U.S. government security system seeks to look into your soul

Illustration: MCKIBILLO

This is part of IEEE Spectrum’s special report: Winners & Losers VII

The U.S. Department of Homeland Security (DHS), whose Transportation Security Administration operates airport security checkpoints in the United States, is spending upward of US $7 million a year trying to develop technology that can detect the evil intent of the terrorists among us. Yes, you read that correctly: They plan to find the bad guys by reading their minds.

Dozens of researchers across the country are in the middle of a five-year program contracted primarily to the Charles Stark Draper Laboratory, in Cambridge, Mass. They’ve developed a psycho-physiological theory of “malintent”—basically, a hodgepodge of behaviorism and biometrics according to which physiological changes can give away a terrorist’s intention to do immediate harm. So far, they’ve spent $20 million on biometric research, sensors, and a series of tests and demonstrations.

This is no mere fantasy, DHS officials insist. And it isn’t: It’s a noble fantasy. And it’s destined to be a noble failure. It’s called the Future Attribute Screening Technology, or FAST.

“We’re not reading minds,” says Robert P. Burns, a deputy director of innovation at the Homeland Security Advanced Research Projects Agency and the FAST project manager. “We’re just screening.”

The underlying theory is that your body reacts, in measurable and largely involuntary ways, to reveal the nature of your intentions. So as you wait in line at the airport checkpoint, thermal and other types of cameras and laser- and radar-based sensors will try to get a fix on the baseline parameters of your autonomic nervous system—your body temperature, your heart rate and respiration, your skin’s moistness, and the very look in your eyes. Then, as a security officer asks you a few questions, the sensors will remeasure those parameters so that the FAST algorithms can figure out whether you’re naughty or nice, all on the spot, without knowing anything else about you. It’s a bit like asking one of those RFID-based highway toll systems—the ones that automatically deduct the price of the toll from your credit card account—to determine whether the purpose of your car trip is business or pleasure.
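To make the baseline-then-delta idea concrete, here is a minimal sketch in Python of what such a screening decision could look like. It is purely illustrative: DHS has not published the FAST algorithms, and every signal name, weighting, and threshold below is an assumption invented for this example, not the agency's method.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    """One snapshot of the autonomic signals the article describes."""
    heart_rate_bpm: float
    breaths_per_min: float
    skin_temp_c: float
    pupil_diameter_mm: float

def flag_for_secondary(baseline: Vitals, during_questions: Vitals,
                       threshold: float = 0.15) -> bool:
    """Flag a passenger if any signal deviates from his or her OWN
    baseline by more than `threshold` (as a fraction). The threshold
    and the any-signal rule are invented for illustration only."""
    pairs = [
        (baseline.heart_rate_bpm, during_questions.heart_rate_bpm),
        (baseline.breaths_per_min, during_questions.breaths_per_min),
        (baseline.skin_temp_c, during_questions.skin_temp_c),
        (baseline.pupil_diameter_mm, during_questions.pupil_diameter_mm),
    ]
    return any(abs(after - before) / before > threshold
               for before, after in pairs)

# A passenger whose heart rate jumps 26 percent under questioning gets flagged:
print(flag_for_secondary(Vitals(70, 14, 34.0, 4.0),
                         Vitals(88, 15, 34.2, 4.1)))  # True
```

Note what even this toy version makes obvious: everything hinges on where the threshold is set, which is exactly the false-positive question taken up below.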

The algorithms will scrutinize the output of the sensors, looking for several specific changes. For example, your pupils may dilate when someone asks you questions about matters on which you have malevolent intent, your heart may skip a beat, or you may suck in your breath. Of course, you may be having bad thoughts that have nothing to do with terrorism—if you’re getting on a plane to meet your paramour, for example. But in that case, “you won’t get noticed,” maintains Daniel Martin, a psychologist and independent contractor who developed the initial malintent theory and is the director of research for FAST. “It only measures the signs of malintent in this specific context, in this situation,” he says.

Martin and other backers say the system can tell whether a racing heart and sweaty skin are those of a nervous terrorist or merely a person who had to run to catch a plane. They say it can even distinguish among terrorists, garden-variety smugglers, and anxious travelers. But they won’t reveal just how the system manages to discern the infinite and wondrous varieties of guilt and remorse that lurk in the hearts of men and women.

“If we laid out specifically [how the system works], the first thing someone will do is say, ‘How do I counter that?’” Martin says.

Some of the research will eventually appear in peer-reviewed journals, says Burns, though he declines to say when and where. In April 2009, 30 scientists gathered at Draper for a private peer review of the theory of malintent and the experimental protocols; Martin says the consensus was that the theory is generally correct. The DHS has also set up privacy panels to try to ensure that its system’s operation won’t violate passengers’ rights: the program doesn’t tie data to an individual’s identity, nor does it store any of the information collected. “We dump the data once you’re through security,” says Burns.

Crucially, FAST will not measure you against some theoretical norm of, say, average heart rates for adult males. “Each person serves as his or her own baseline,” says Martin. The system “will measure how people’s signals change in response to stimuli.” To top it all off, its backers insist, the profile of symptoms can’t be faked.

“The signals are uncontrollable, even for a trained terrorist—you still give them off,” says Burns. Even if there’s a complete lack of signals—in other words, no change from the baseline—“that’s also a sign” of something suspicious: that the person has been trained to suppress those signals.

At a September press conference at Draper, DHS researchers screened 30 volunteers with the FAST system, offering some of them $100 bonuses for acting in a way that might “involve something malicious.” They were told there was a mock explosive that would “cause a loud noise but won’t cause too much harm.”

Thermal cameras read each subject’s body temperature. BioLidar tracked respiration and heart rate (think of a police radar gun that uses laser light instead of radio waves, and is sensitive enough to measure changes in the surface motion of the skin, particularly in the large arteries of the neck). An eye tracker determined a subject’s pupil size, rate of blinking, and “gaze vector.” A “fidget detector” had the subject stand on a Nintendo Wii’s balance board, which people normally use to exercise at home. All the sensors but one were off-the-shelf. The exception was the government-developed eye-safe BioLidar.

In fact, the test subjects that journalists watched at the Draper press conference were actors; the real subjects had gone through the mock screening beforehand. In a control room, for the benefit of the assembled press, computer monitors showed faces overlaid with neon lines and dots representing readings of temperature, moistness, and the like (what the squiggly graphs on a different screen were actually measuring remained a mystery to the journalists). Time charts also tracked each subject’s voice and facial expressions as each question was asked.

Paul Ekman, the author of Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage (W.W. Norton, 1992) and other books on behavioral psychology, notes that a critical problem for research intended to identify malevolent intent is “designing an experimental test in which the stakes are anywhere near as high as they are for the terrorist.”

Does it work? Officials didn’t give a success rate for the September demonstration but have said the system is generally about “87 percent accurate.”

Such a system would be useless in the field, says Bruce Schneier, an expert on applying technology to security problems and a frequent critic of airport screening systems. He notes that more than 700 million people board airliners every year in the United States, and there have been very few attacks. “Imagine an attack every five years,” he says, “and a system with a very good 0.1 percent false-positive rate,” meaning that of every 1000 people screened, one gets stopped but isn’t a bad guy. “Over those five years, the system will still have 3.5 million false alarms.” And that’s for a system with far better performance numbers than the “87 percent accurate” system that DHS says it now has.
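Schneier’s arithmetic is easy to check. The only inputs are figures quoted in the article, plus his hypothetical 0.1 percent false-positive rate:

```python
# Reproducing Schneier's back-of-the-envelope numbers from the article.
passengers_per_year = 700_000_000  # U.S. airline boardings per year (article's figure)
years = 5                          # "imagine an attack every five years"
false_positive_rate = 0.001        # the hypothetical "very good" 0.1 percent

false_alarms = passengers_per_year * years * false_positive_rate
print(f"{false_alarms:,.0f} false alarms")  # 3,500,000 false alarms
```

That is 3.5 million innocent travelers sent to secondary screening for each genuine plot; and if “87 percent accurate” implies anything like a 13 percent false-positive rate, the false-alarm count grows by two orders of magnitude.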

Kenneth R. Foster, a bioengineering professor at the University of Pennsylvania, who studies biotechnologies, says, “Mammography, which has about the same accuracy as that quoted for FAST, has the same problem. For every woman it finds with breast cancer, it scares the heck out of a dozen or more who do not. And breast cancer is a lot more common than terrorists in our airports.”
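Foster’s point is the classic base-rate problem, and Bayes’ rule makes it concrete. The numbers below are assumptions chosen for illustration: “87 percent accurate” is read as both 87 percent sensitivity and 87 percent specificity (DHS never defined the figure), and the 1-in-100 prevalence is mammography-scale, vastly higher than any plausible rate of terrorists at checkpoints.

```python
# Base-rate arithmetic behind Foster's mammography analogy (illustrative numbers).
sensitivity = 0.87  # assumed: P(test flags you | you are a true positive)
specificity = 0.87  # assumed: P(test clears you | you are innocent)
prevalence = 0.01   # assumed 1-in-100 base rate -- mammography-scale, not airports

# Bayes' rule: P(true positive | flagged)
p_flagged = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_flagged
print(f"P(true positive | flagged) = {ppv:.1%}")                  # about 6.3%
print(f"False alarms per real detection: {(1 - ppv) / ppv:.0f}")  # about 15
```

About 15 false alarms per real case, matching Foster’s “dozen or more.” Drop the prevalence to anything like a realistic rate of attackers among air travelers and the same test generates on the order of a million false alarms for every true one.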

EXPERT CALLS
“In screening large populations for exceedingly rare occurrences, false positives dominate outcomes; any researcher engaging in a modicum of quantitative analysis would reject the hypothesis immediately.”
Nick Tredennick

“Surely this is a joke! They should equip the Transportation Security Administration screeners with Ouija boards, and if your pointer moves to the wrong square, they’ll pull you out of line and imprison you.”
Robert W. Lucky

Burns says an acceptable level of false positives “depends on the operational application.” He also notes that no one would be arrested or even kept off a plane for failing the test. Rather, passengers who seem to have bad intent would be sent on to secondary screening, just as happens now when a passenger’s behavior arouses the suspicion of frontline security officers. The goal is a rate “less than the current go-to-secondary-screening rate,” Burns says, though he and other DHS officials wouldn’t reveal what that rate is. “It’s an additional tool. Screening is not going away, so how do we make it better?”

DHS officials acknowledge the extent of the challenge they have set for themselves—and the likelihood of failure. “There’s a high degree of technical risk,” Burns says. “But if this works, it could be a game changer in terms of security.”

“The theory of malintent is still being developed and vetted in peer review,” Martin acknowledges. And, he admits, “we’re still a number of years away from deploying it.” Ah. Maybe that number has two digits. Or three.

Nope. DHS is determined to have a prototype by 2011. “We have to do it,” Burns says. “We have a lot of people in line, and we have to get them through quicker. We have to identify the people of interest.”

You can start by throwing out the entire FAST system. “FAST will never achieve the extremely low rate of false positives it needs,” says Schneier. “But even if it did, it’s useless against passengers who are unwittingly carrying bombs. Better to screen for bombs directly.”

For all of 2010’s Winners & Losers, visit the special report.
