It's easy to argue that R2-D2 from Star Wars has more personality than many robots twice its size, but without a face or limbs to speak of, where does it all come from? The answer, of course, is sound. Now U.K. researchers are trying to do the same with real robots, teaching them to communicate information and emotions to humans using beeps, boops, and squeaks.
Robin Read and Tony Belpaeme from Plymouth University's Centre for Robotics and Neural Systems, in the U.K., are investigating the relationship between things like the pitch and rhythm of sounds and their perceived emotional connotations. Funded by the ALIZ-E Project, a European effort to create robots that can form meaningful bonds with humans in a hospital setting, the researchers asked several dozen 6- to 8-year-old kids to try to match sounds with expressions (as performed by Nao):
So, how well did it work out? Here's a summary from the paper, which you can read in full at the link below.
It is striking how children show strong categorical perception when interpreting the robot's utterances. There is no subtlety in their interpretation: the robot is -- in their words -- either sad, happy, angry, scared, surprised or tired, but they seldomly interpret utterances in more subtle emotions. We believe that upon closer inspection categorical perception will be observed in other modalities also, having a significant impact on the design of HRI [human-robot interaction] for younger children: any effort to convey subtlety might be a lost effort.
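To get a feel for the kind of pitch-and-rhythm parameterization at play here, consider this toy sketch. The parameter values and the emotion-to-sound mappings below are invented for illustration — they are not the mappings Read and Belpaeme actually tested — but they show how a handful of acoustic knobs (base pitch, beep count, pitch contour) can be turned into distinct "utterances":

```python
import math

# Illustrative only: a toy mapping from emotion labels to acoustic
# parameters. These values are made up for demonstration and are not
# drawn from the Read and Belpaeme study.
EMOTION_PARAMS = {
    "happy":  {"pitch_hz": 660, "beeps": 3, "contour": +1.2},  # high, rising
    "sad":    {"pitch_hz": 220, "beeps": 2, "contour": -1.2},  # low, falling
    "angry":  {"pitch_hz": 330, "beeps": 4, "contour":  0.0},  # flat, rapid
    "scared": {"pitch_hz": 880, "beeps": 5, "contour": +0.5},  # high, jittery
}

def synthesize_utterance(emotion, sample_rate=8000, beep_sec=0.15):
    """Return raw audio samples (floats in [-1, 1]) for a beep sequence."""
    p = EMOTION_PARAMS[emotion]
    samples = []
    n = int(sample_rate * beep_sec)
    for b in range(p["beeps"]):
        # Shift each successive beep's frequency along the pitch contour.
        freq = p["pitch_hz"] * (1.0 + 0.1 * p["contour"] * b)
        for i in range(n):
            samples.append(math.sin(2 * math.pi * freq * i / sample_rate))
        samples.extend([0.0] * (n // 2))  # short silent gap between beeps
    return samples

utterance = synthesize_utterance("happy")
```

Feeding these samples to any audio output would produce a recognizably different beep pattern per emotion — which, given the categorical perception the researchers observed, may be all the expressive resolution young children pick up on anyway.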
Non-verbal communication, whether or not subtlety is involved, is going to be a critical skill for human-robot interaction in the short term, since it doesn't require any complex hardware or software to implement and it's generally language- and age-independent. Roomba owners, for example, will immediately recognize their robots' "I'm charged!" tune. And communication isn't limited to audio, either: remember PR2's needy behaviors from back in 2009?
If we're lucky, we should be hearing a lot more about all this human-robot interaction stuff at the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2012, if you like to breathe while speaking), which takes place in Boston starting today.
[ Paper (PDF) ]
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.