Wouldn’t You Like Alexa Better if It Knew When It Was Annoying You?

Affectiva’s Rana el Kaliouby says our devices need to get a lot more emotionally intelligent


An image of the Mona Lisa, with facial features labeled for decoding emotion. Illustration: Affectiva

What could your computer, phone, or other gadget do differently if it knew how you were feeling?

Rana el Kaliouby, founder and CEO of Affectiva, is considering the possibilities of such a world. Speaking at the Computer History Museum last week, el Kaliouby said that she has been working to teach computers to read human faces since 2000, when she began as a PhD student at Cambridge University.

“I remember being stressed,” she says. “I had a paper deadline, and Clippy [Microsoft’s ill-fated computer assistant] would pop up, do a little twirl, and say, ‘It looks like you are writing a letter.’ I would think, ‘No, I’m not!’”

(“You may,” Computer History Museum CEO John Hollar interjected, “be one of the few advanced scientists inspired by Clippy.”)

That was a piece of what led her to think about making computers more intelligent. Well, that, plus the fact that she was homesick. And the realization that, because she was spending more time with her computer than any human being, she really wanted her computer to understand her better.

Computer History Museum CEO John Hollar interviews Affectiva founder Rana el Kaliouby. Photo: Tekla Perry

Since then, she’s been using machine learning, and more recently deep learning, to teach computers to read faces, spinning Affectiva out of the MIT Media Lab in 2009 to commercialize her work. The company’s early customers are not exactly changing the world—they are mostly advertisers looking to better craft their messages. But that, she says, is just the beginning. Soon, she says, “all of our devices will have emotional intelligence”—not just our phones, but “our refrigerators, our cars.”

Early on, el Kaliouby focused on building smart tools for individuals with autism. She still thinks emotional intelligence technology—or EI—will be a huge boon to this community, potentially providing a sort of emotional hearing aid.

It’ll also be a mental healthcare aid, el Kaliouby predicts. She sees smartphones with EI as potentially able to regularly check a person’s mental state, providing early warning of depression, anxiety, or other problems. “People check their phones 15 times an hour. That’s a chance to understand that you are deviating from your baseline.”
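The idea she sketches, sampling a person's emotional state at each phone check and flagging departures from their personal baseline, can be illustrated with a toy example. This is purely hypothetical (not Affectiva's software), assuming each check yields a single numeric valence score:

```python
from statistics import mean, stdev

def deviates_from_baseline(history, latest, z_threshold=2.0):
    """Flag a mood reading that falls far outside the user's baseline.

    history: past valence scores (e.g., -1.0 very negative, +1.0 very positive)
    latest: the newest score from a phone check
    """
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return latest != baseline
    # Standard score: how many standard deviations from the user's norm
    z = (latest - baseline) / spread
    return abs(z) > z_threshold

# A run of mildly positive readings, then a sharp negative one
past = [0.2, 0.3, 0.25, 0.35, 0.3, 0.28, 0.22, 0.31]
print(deviates_from_baseline(past, -0.6))  # the sharp drop is flagged: True
```

A real system would need far richer signals and per-user calibration, but the core notion, a personal baseline plus deviation detection, is what regular phone checks make possible.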

Affectiva’s software decodes Computer History Museum CEO John Hollar’s expressions in real time. Photo: Tekla Perry

Cars, she says, will need to have emotional intelligence as they transition to being fully automated; in the interim, they will sometimes need to hand control back to a human driver, and will need to know whether the driver is ready to take it.

Smart assistants like Siri and Alexa, she says, “need to know when [they] gave you the wrong answer and you are annoyed, and say ‘I’m sorry.’”

Online education desperately needs emotional intelligence, she indicated, to give it a sense of when students are confused or engaged or frustrated or bored.

And the killer app? It just might be dating. “We have worked with teenagers who just want to have a girlfriend, but couldn’t tell if girls were interested in them,” el Kaliouby says. A little computer help reading their expressions could help with that. (Pornography and sex robots will likely be a big market as well, el Kaliouby says, but her company doesn’t plan on developing tools for this application. Nor for security, because that violates Affectiva’s policy of not tracking emotions without consent.)

While Affectiva is focusing on the face for its clues about emotions, el Kaliouby admits that the face is just part of the puzzle—gestures, tone of voice, and other factors need to be considered before computers can be completely accurate in decoding emotions.

And today’s emotional intelligence systems are still pretty dumb. “I liken the state of the technology to a toddler,” el Kaliouby says. “It can do basic emotions. But what do people look like when inspired, or jealous, or proud? I think this technology can answer these basic science questions—we’re not done.”

The full recording of el Kaliouby’s talk is below.
