16 July 2008— Robot researcher Atsuo Takanishi, a professor of engineering at Waseda University, in Tokyo, is driven by a vision that would probably appall many musicians. Takanishi wants to create a humanoid robot orchestra. So far he and his group of researchers have developed a pretty good flute-playing robot and have begun work on a saxophone player. Though he has spent many years perfecting the flutist, Takanishi expects things will move much faster now, because he tackled one of the hardest instruments to play first.
"Anyone can play some notes the first time he tries a reed instrument like the saxophone. But getting even a sound out of the flute is very difficult," says Takanishi.
The seated robot is essentially made up of two acrylic cylinders and bellows for the lungs, a vibrato mechanism to imitate human vocal cords, an artificial tongue and lips made of a thermoplastic rubber called Septon, two CCD cameras for the eyes, and flexible arms and fingers that can open and close. Together, these "organs" have 41 degrees of freedom and are driven by complex mechanical systems of motorized levers and pulleys under the control of actuators and a computer.
Getting the robot to produce a melody turned out to be a monumental task. First, the researchers worked with professional players to create a performance index of what constitutes the best flute sounds, translating these sounds into mathematical formulations that the robot uses as references. They then programmed the robot's organs to create a sound. Once a sound was produced, they took the parameter settings that produced it as a baseline and adjusted them repeatedly until the sound improved and eventually approximated a target sound in the performance index.
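The tuning procedure the article describes, perturbing control parameters and keeping only the changes that bring the sound closer to a target in the performance index, resembles simple hill climbing. The sketch below is a toy illustration of that idea, not the team's actual method: the "sound" is reduced to a few abstract feature numbers, and `sound_error` stands in for whatever acoustic comparison the researchers used.

```python
import random

def sound_error(params, target):
    """Toy stand-in for comparing a produced sound's features to a
    target sound. Each 'feature' here is just a number; the robot's
    real performance index is far richer."""
    return sum((p - t) ** 2 for p, t in zip(params, target))

def tune(params, target, steps=2000, step_size=0.05):
    """Repeatedly perturb one control parameter (lip position, air
    pressure, etc.) and keep the change only if the resulting sound
    is closer to the target -- a hill-climbing sketch of the
    trial-and-error process described above."""
    best = sound_error(params, target)
    for _ in range(steps):
        i = random.randrange(len(params))
        old = params[i]
        params[i] += random.uniform(-step_size, step_size)
        err = sound_error(params, target)
        if err < best:
            best = err          # improvement: keep the new setting
        else:
            params[i] = old     # the sound got worse: revert
    return params, best
```

With enough iterations the error shrinks toward zero, which also hints at why the manual version was, in Takanishi's words, "a very slow process."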
"We had to teach it everything," says Takanishi. "The different positions of the lips and fingers, the strength of the air pressure, everything. There are any number of parameters [making it] almost impossible to engineer.… It was a very slow process."
To make the procedure less laborious and more autonomous, audio feedback control has been added to help the robot make its own adjustments. Also, more computer intelligence has been incorporated so that the robot can now "read" Musical Instrument Digital Interface (MIDI) data and translate it into the parameter controls needed to transform the data into flute playing. "We can now download virtually any MIDI file into the robot's computer and it can reproduce the music unaided," says Takanishi. "It may not play perfectly yet, but it plays well."
Other robot researchers have been chasing similar orchestral dreams. A Honda robot conducted the Detroit Symphony Orchestra in May, and a Toyota humanoid played the trumpet for crowds at the SAE World Congress in April.