A new computerized system is better than humans at telling a genuine expression of pain from a fake one.
Researchers who developed the system say it could be used to detect when someone is feigning illness or pain to escape work. It could also spot attempts to minimize or mask pain, which could be useful during, say, interrogations or health assessments.
According to a report in Wired, the new system is based on the Facial Action Coding System, developed by psychologist Paul Ekman. The system is used by animators to give their computerized characters realistic human expressions. The idea is that any facial expression can be mapped to a specific group of muscles in the face.
When people attempt to fake pain, they use the exact same facial muscles that are contracted during real pain. What distinguishes a deliberate expression from a spontaneous one is the dynamics: things like when, how much, and how quickly the muscles move. Humans, it turns out, aren’t great at picking up on this subtle difference in the dynamics of facial motion.
To test this, researchers at the University of California at San Diego and the University of Toronto first recorded videos of volunteers' facial expressions as they experienced real pain by dipping their arm in icy water and also as they faked pain while putting their arm in warm water. Then the researchers showed the videos to 170 people and asked them to tell real pain expressions from fake ones. The observers could only guess correctly about half the time. With training, their accuracy went up to only 55 percent.
The researchers’ computer vision and machine learning system, on the other hand, was much better at spotting the difference in the dynamics of muscle movement. It could distinguish between real and fake expressions with 85 percent accuracy.
The study, which was published today in the journal Current Biology, shows that the single most predictive feature of fake expressions is the mouth, and how and when it opens. The researchers found that when people fake pain, their mouth-opening action during grimaces is too regular. Both the interval between mouth openings and the duration of each opening are too consistent.
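The study's actual features come from automated computer-vision tracking of facial muscle movements, but the "too regular" finding can be illustrated with a rough sketch: one simple way to quantify regularity is the coefficient of variation (standard deviation divided by mean) of the intervals between mouth openings and of the opening durations. The function name and the timing values below are hypothetical, purely for illustration.

```python
# Illustrative sketch only -- NOT the researchers' method or code.
# Quantifies how regular a sequence of mouth openings is, using the
# coefficient of variation (std/mean): lower = more machine-like timing.
import statistics

def regularity_score(opening_times, opening_durations):
    """Return (cv of inter-opening intervals, cv of opening durations)."""
    intervals = [b - a for a, b in zip(opening_times, opening_times[1:])]
    cv = lambda xs: statistics.pstdev(xs) / statistics.mean(xs)
    return cv(intervals), cv(opening_durations)

# Hypothetical faked-pain grimaces: suspiciously even timing (seconds)
fake = regularity_score([0.0, 1.0, 2.0, 3.0], [0.5, 0.5, 0.5, 0.5])

# Hypothetical genuine-pain grimaces: irregular timing
real = regularity_score([0.0, 0.6, 2.1, 2.9], [0.3, 0.8, 0.4, 0.9])

# The perfectly even fake sequence scores 0 on both measures,
# while the irregular real sequence scores higher.
```

In the real system, such timing statistics would be only a few of many dynamic features fed into a machine-learning classifier, rather than a standalone test.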
In a press release, Marian Bartlett, research professor at UC San Diego's Institute for Neural Computation and an author of the study, said: "Our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers' expressions of sleepiness, students' expressions of attention and comprehension of lectures, or responses to treatment of affective disorders."