
Does AI in Healthcare Need More Emotion?

Emotion AI, also known as affective computing, remains rare in healthcare applications. Should emotion be included as a parameter?


Illustration of an eye with a heart over a medical symbol. Illustration: Getty Images

For the last 25 years, researchers have sought to teach computers to measure, understand, and react to human emotion. Technologies developed in this wave of emotion AI—also called affective computing or artificial emotional intelligence—have been applied in a variety of ways: capturing consumer reactions to advertisements; measuring student engagement in class; and detecting customer moods over the phone at call centers, among others.

Emotion AI has been less widely applied in healthcare, as can be seen in a recent literature review. In a meticulous analysis of 156 papers on AI in pregnancy health, a team in Spain found only two papers in which emotions were used as inputs. Their review, published in the journal IEEE Access, concluded that expanded use of affective computing could help improve health outcomes for pregnant women and their infants.

“There is a lot of evidence that stress, anxiety and negative feelings can make [outcomes] worse for pregnant women,” says study co-author Andreea Oprescu, a PhD student at the University of Seville. An app or wearable that takes these feelings into account could better help detect and monitor certain conditions, she notes.

Other researchers are more hesitant to apply emotion AI to healthcare. “I caution people about trying to use a variable that is so difficult to understand and estimate in anything that has to do with medical diagnosis or interpretation,” says Aleix Martinez, director of the Computational Biology and Cognitive Science Lab at Ohio State University, which uses machine learning and computer vision to study emotion and intent.

Within healthcare, affective computing has been applied to autism. Affective computing pioneer Rosalind Picard, a professor at MIT, began designing headsets and minicomputers that displayed emotional cues in the 1990s, and others have more recently developed wearables that rely on AI to help children with autism interpret the facial reactions of people around them.

In Spain, a team of doctors at the Virgen del Rocío University Hospital in Seville was interested in developing a mobile application or wearable for detecting pregnancy health conditions. The doctors reached out to computer scientist María del Carmen Romero-Ternero at the University of Seville to explore what was already in development in the field.

Using a recognized methodological framework for reviewing health research evidence, Romero-Ternero’s team identified 156 papers that used artificial intelligence in pregnancy studies. They found that over the last 12 years, AI has been used for detecting and monitoring fetal health, congenital birth defects, risk of preterm birth, and gestational diabetes. Most of the studies were in early stages and not yet supported by clinical testing, notes Oprescu. “For an algorithm to be used in a medical field, it needs to go through a lot of tests before physicians can actually use it,” she adds.

Because emotional stress, anxiety, and depression have been associated with complications during pregnancy and birth, the researchers also searched the literature for AI studies that include emotional parameters. They found only two: one about the relationship between pre-eclampsia and worry, and one concerning patient reactions to prenatal tests. “The majority [of the studies] didn’t look into the emotional aspects of the patient,” says Oprescu.

This is perhaps not surprising for the relatively young field. While affective computing has worked well in areas such as image searches for emotions and teaching robots to interact socially with humans, computers in general are not great at interpreting human emotions, says Martinez, who was not involved in the pregnancy study.


That is largely because we humans aren’t good at it ourselves. Study after study shows that people are pretty bad at knowing what they feel internally, much less at understanding how other people feel, let alone teaching a computer to do so. “If I don’t know what I am experiencing, how am I going to tell you what I am experiencing to collect a good dataset that can be used to train a machine learning algorithm?” says Martinez. “There’s no hope for that.” Understanding emotion also requires accounting for individual variation, culture, and context, he notes. For example, is a person screaming because he is mad or because he just scored the winning goal?

While AI continues to be applied widely to healthcare, there is still a long way to go before emotional states can be automatically recorded and applied to health tools. But there is no doubt that in many areas of healthcare, including pregnancy, “emotion is important,” says Oprescu. “We think that should not be overlooked.”

This article appears in the January 2021 print issue as “Health Care Needs Empathic AI.”
