Asimo, the Honda humanoid, one of the world's most loved robots, was showing off its dance moves this week at the IEEE International Conference on Intelligent Robots and Systems in San Francisco.
The robot was here to demonstrate some new tricks it's been learning from scientists at the Honda Research Institute in Mountain View, Calif.
Victor Ng-Thow-Hing, Behzad Dariush, and colleagues work with Asimo to develop robotics technologies that can assist people, especially with mobility.
In one demonstration, the scientists showed how Asimo can mimic a person's movements in real time. The researchers use Microsoft's Kinect 3D sensor to track selected points on a person's upper body, and their software uses an inverse kinematics approach to generate the control commands that make Asimo move. The software prevents self-collisions and excessive joint motions that might damage the robot, and it is integrated with Asimo's whole-body controller to maintain balance. The researchers say the ability to mimic a person in real time could find applications in robot programming and interactive teleoperation, among other things.
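To give a flavor of the inverse kinematics idea, here is a minimal sketch in Python: a damped-least-squares IK step for a planar two-link arm that chases a tracked target point, with joint angles clamped to limits (loosely analogous to preventing excessive joint motions). The link lengths, gains, and limits are illustrative assumptions, not details of Honda's system.

```python
import math

# Assumed link lengths for a toy planar 2-link arm (not Asimo's geometry).
L1, L2 = 1.0, 1.0

def forward(q1, q2):
    """End-effector (x, y) position for joint angles q1, q2."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def ik_step(q1, q2, tx, ty, damping=0.1, limit=2.5):
    """One damped-least-squares IK update toward target (tx, ty):
    dq = J^T (J J^T + lambda^2 I)^-1 * error, with joint limits clamped."""
    x, y = forward(q1, q2)
    ex, ey = tx - x, ty - y
    # Jacobian of (x, y) with respect to (q1, q2).
    a = -L1 * math.sin(q1) - L2 * math.sin(q1 + q2)
    b = -L2 * math.sin(q1 + q2)
    c = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    d = L2 * math.cos(q1 + q2)
    # Solve the damped 2x2 system (J J^T + lambda^2 I) v = error by Cramer's rule.
    m11 = a * a + b * b + damping ** 2
    m12 = a * c + b * d
    m22 = c * c + d * d + damping ** 2
    det = m11 * m22 - m12 * m12
    v1 = (ex * m22 - ey * m12) / det
    v2 = (ey * m11 - ex * m12) / det
    dq1 = a * v1 + c * v2
    dq2 = b * v1 + d * v2
    # Clamp joint angles, a crude stand-in for limiting excessive joint motion.
    q1 = max(-limit, min(limit, q1 + dq1))
    q2 = max(-limit, min(limit, q2 + dq2))
    return q1, q2

# Iterate toward one target point, as if streamed from a motion-tracking sensor.
q1, q2 = 0.3, 0.3
target = (1.2, 0.8)
for _ in range(100):
    q1, q2 = ik_step(q1, q2, *target)
```

In a real tracking loop the target would update every sensor frame, so each frame only runs one or a few of these updates; the robot's joints continuously chase the person's tracked points.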
In another demo, the scientists showed how they're using gestures to improve Asimo's communication skills. They're developing a gesture-generating system that takes any input text and analyzes its grammatical structure, timing, and choice of word phrases to automatically generate movements for the robot. To make the behavior more realistic, the scientists used a vision system to capture humans performing various gestures, and then they incorporated these natural movements into their gesture-generating system.
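The text-to-gesture pipeline can be caricatured in a few lines of Python. This is a toy rule-based sketch, not Honda's system: where their software analyzes grammatical structure and timing, this version simply keys gestures off individual cue words and estimates timing from word position. The cue table and speech rate are invented for illustration.

```python
# Hypothetical mapping from cue words to named gesture animations.
GESTURE_CUES = {
    "hello": "wave",
    "this": "point",
    "big": "arms_wide",
}

def schedule_gestures(text, words_per_second=2.5):
    """Return a list of (start_time_seconds, gesture_name) pairs for the text.

    Timing is crudely estimated from word index at an assumed speech rate;
    a real system would derive it from speech synthesis timing.
    """
    schedule = []
    for i, word in enumerate(text.lower().split()):
        w = word.strip(".,!?")
        if w in GESTURE_CUES:
            schedule.append((i / words_per_second, GESTURE_CUES[w]))
    return schedule

plan = schedule_gestures("Hello everyone, this is a big day")
```

Here `plan` comes out as `[(0.0, 'wave'), (0.8, 'point'), (2.0, 'arms_wide')]`: each gesture is anchored to the moment its cue word would be spoken, which is the basic synchronization problem any gesture generator has to solve.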
Here's a video showing these two demos:
This was my first face-to-face encounter with Asimo, and upon close inspection I noticed something on Asimo's face that I didn't know was there. Take a look at the photo below. Can you see it?
Photos: Evan Ackerman; video: Erico Guizzo and Evan Ackerman