Are Your Students Bored? This AI Could Tell You

A new software program attempts to recognize students’ emotions to aid teachers, but faces aren’t so easy to read


[Image: Students in a classroom with green squares around their faces, each labeled with a detected emotion such as Happy or Neutral. Image: Hong Kong University of Science and Technology]

A professor finishes a lecture and checks his computer. A software program shows that most students lost interest about 30 minutes into the lecture—around the time he went on a tangent. The professor makes a note to stop going on tangents.

The technology for this fictional classroom scene doesn’t yet exist, but scientists are working toward making it a reality. In a paper published this month in IEEE Transactions on Visualization and Computer Graphics, researchers described an artificial intelligence (AI) system that analyzes students’ emotions based on video recordings of the students’ facial expressions.

The system “provides teachers with a quick and convenient measure of the students’ engagement level in a class,” says Huamin Qu, a computer scientist at the Hong Kong University of Science and Technology, who co-authored the paper. “Knowing whether the lectures are too hard and when students get bored can help improve teaching.”

Qu and his colleagues tested their AI system in two classrooms: one of toddlers in Japan and one of university students in Hong Kong. During their lectures, the teacher of each class received a readout of the emotions of individual students as well as the collective emotions of the class.
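The kind of readout described above — a summary per student plus a class-wide picture — could be aggregated from per-frame emotion labels roughly like this. This is a minimal illustrative sketch, not the authors' actual system; the label names and data layout are assumptions:

```python
from collections import Counter

def summarize_emotions(frames_by_student):
    """Aggregate per-frame emotion labels into a per-student dominant
    emotion and a class-wide distribution. Illustrative only: the real
    system's emotion categories and data format are not public."""
    # Most frequent label per student.
    per_student = {
        name: Counter(labels).most_common(1)[0][0]
        for name, labels in frames_by_student.items()
    }
    # Pooled label counts across the whole class.
    class_counts = Counter()
    for labels in frames_by_student.values():
        class_counts.update(labels)
    total = sum(class_counts.values())
    class_share = {emo: n / total for emo, n in class_counts.items()}
    return per_student, class_share

frames = {
    "student_a": ["happy", "happy", "neutral"],
    "student_b": ["neutral", "neutral", "sad"],
}
per_student, class_share = summarize_emotions(frames)
```

In practice a per-frame classifier would produce the labels; the aggregation step shown here is just how individual and collective summaries can be derived from them.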

The visual analytics system did a good job of detecting obvious emotions such as happiness. But the model often incorrectly reported anger or sadness when students were actually just focused on the lectures. (The frown that often washes over our faces when we listen closely can, taken out of context, easily be mistaken for anger, even by humans.) “To address this issue, we need to add new emotion categories, relabel our data and retrain the model,” says Qu.

The focus frown and other confusing facial expressions are a challenge for just about everyone working in the field of emotion recognition, says Richard Tong, chief architect at Squirrel AI Learning, who was not involved in the paper. “We have had similar problems in our own experiments,” he says, referring to the multimodal behavioral analysis algorithms his company is developing with its partners.   

Lots of groups are working on some kind of behavior or emotion recognition technology for the classroom, says Tong, who is also the chair of the IEEE Learning Technology Standards Committee. But he says this kind of analysis is of limited use for teachers in traditional classroom settings.

“Teachers are overwhelmed already, especially in the public schools,” Tong says. “It’s very hard for them to read analytical reports on individual students because that’s not what they’re trained for and they don’t have time.”

[Image: A group of toddlers with black bars over their eyes and green boxes around their faces. Photo: Viv Limited]

Instead, Tong envisions using emotion recognition and other means of behavioral analysis for the development of AI tutors. These one-on-one computer-based teachers will be trained to recognize what motivates a student and spot when a student is losing interest, based on their physical or behavioral cues. The AI can then adjust its teaching strategy accordingly.
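The adjustment loop Tong describes — spot fading interest from behavioral cues, then change tack — can be sketched as a toy policy. Everything here (the 0-to-1 engagement score, the threshold, the action names) is an assumption for illustration, not how any real AI tutor works:

```python
def next_action(engagement_history, low=0.4, window=3):
    """Toy tutoring policy: if the student's last `window` engagement
    scores (0.0-1.0, however estimated) all fall below `low`, switch
    to a different activity; otherwise keep going. Purely illustrative."""
    recent = engagement_history[-window:]
    if len(recent) == window and all(score < low for score in recent):
        return "switch_activity"
    return "continue"
```

A real tutor would feed such a policy with scores inferred from physical and behavioral cues, and would choose among many strategies rather than a single switch.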

In this world of AI tutors, Tong says he envisions human teachers taking a role as head coach over the AI agents, which would work one-on-one with students. “But that requires a much more capable AI” than what we have now, he says.

Putting video cameras in the classroom also creates privacy issues. “The disclosure of the analysis of an individual’s emotion in a classroom may have unexpected consequences and can cause harm to students,” says Qu.

And it could backfire on educators. “It may distract students and teachers and could be harmful to learning, since students and teachers may feel like someone could be watching them and might not freely express their opinions,” Qu says. “The privacy issue is important for everyone, and needs to be carefully considered.”

This article appears in the March 2020 print issue as “The Boredom Detector.”
