"Charles, I think this is the beginning of a beautiful friendship," Peter Robinson says to the passenger sitting in the car next to him.
The passenger is a robot head that Robinson, a professor of computer technology at Cambridge University in England, is using to explore the role of emotions in human-machine interaction.
Can computers understand emotions? Can computers express emotions? Can they feel emotions?
These are the questions that Robinson and his team at Cambridge's Computer Laboratory want to answer.
When people talk to each other, they express their feelings through facial expressions, tone of voice, and body postures. The interesting thing is that humans do the same even when they are interacting with machines.
So could we build better computers, robots, and other machines if they could understand and respond to these hidden signals?
Robinson's team believes the answer is yes. They are developing systems that analyze faces, gestures, and speech to infer emotions, and they hope these systems could improve human-machine interactions in real-world situations.
Charles is a robotic head modeled on Charles Babbage. (Am I the only one who didn't notice the similarity? And is Charles a Hanson Robotics creation?) It's one of the research tools Robinson uses in his experiments, which include driving a car simulator with the robot acting as a GPS assistant.
"The way that Charles and I can communicate," Robinson says in a short movie called "The Emotional Computer" [watch below], "shows us the future of how people will interact with machines."
Do you agree? Would you replace your car GPS with Charles the robot head?
Image and video: "The Emotional Computer"/Cambridge University
Erico Guizzo is the digital product manager at IEEE Spectrum. An IEEE Member, he is an electrical engineer by training and has a master’s degree in science writing from MIT.