When robots talk to each other, they generally don't use language as we think of it, with words that communicate both concrete and abstract concepts. Now Australian researchers are teaching a pair of robots to communicate linguistically like humans: the robots invent new spoken words, building a lexicon that can be taught to other robots to generate an entirely new language.
Ruth Schulz and her colleagues at the University of Queensland and Queensland University of Technology call their robots the Lingodroids. The robots consist of a mobile platform equipped with a camera, laser range finder, and sonar for mapping and obstacle avoidance. The robots also carry a microphone and speakers for audible communication between them.
To understand the concept behind the project, consider a simplified case of how language might have developed. Let's say that all of a sudden you wake up somewhere with your memory completely wiped, not knowing English, Klingon, or any other language. And then you meet some other person who's in the exact same situation as you. What do you do?
What might very well end up happening is that you invent some random word to describe where you are right now, then point at the ground and say the word to the other person, establishing a connection between this new word and a place. And this is exactly what the Lingodroids do. If one of the robots finds itself in an unfamiliar area, it makes up a word to describe it, choosing a random combination from a set of syllables. It then communicates that word to other robots it meets, thereby establishing the name of that place.
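The word-invention step can be sketched in a few lines of Python. This is a minimal illustration, not the actual Lingodroid implementation: the syllable inventory and two-syllable word length here are assumptions for the example.

```python
import random

# Hypothetical syllable inventory; the real Lingodroid set may differ.
SYLLABLES = ["ku", "zo", "vu", "pe", "hi", "za", "re", "ya"]

def invent_word(lexicon, n_syllables=2):
    """Invent a new word by randomly combining syllables,
    retrying until the word is not already in the lexicon."""
    while True:
        word = "".join(random.choice(SYLLABLES) for _ in range(n_syllables))
        if word not in lexicon:
            return word

lexicon = {}  # maps word -> (x, y) location on the robot's map
word = invent_word(lexicon)
lexicon[word] = (3.0, 1.5)  # name the robot's current location
```

Once a word is bound to a location, the robot can use it in conversation with any other robot it encounters.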
From this fundamental base, the robots can play games with each other to reinforce the language. For example, one robot might tell the other robot “kuzo,” and then both robots will race to where they think “kuzo” is. When they meet at or close to the same place, that reinforces the connection between a word and a location. And from “kuzo,” one robot can ask the other about the place they just came from, resulting in words for more abstract concepts like direction and distance:
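A round of this location game might look like the following sketch. It assumes each robot keeps its own word-to-coordinate map and that meeting within a fixed tolerance counts as success; the actual game mechanics in the paper are more involved.

```python
import math

def play_go_to_game(speaker_lexicon, hearer_lexicon, word, tolerance=0.5):
    """One round of a location game: the speaker says a word, both
    robots travel to where they believe that word refers to, and the
    round succeeds if they end up close together."""
    speaker_goal = speaker_lexicon[word]
    hearer_goal = hearer_lexicon.get(word)
    if hearer_goal is None:
        # The hearer has never heard this word: adopt the speaker's meaning.
        hearer_lexicon[word] = speaker_goal
        return False
    distance = math.dist(speaker_goal, hearer_goal)
    return distance <= tolerance  # success reinforces the shared meaning

# Example: two robots with slightly different beliefs about "kuzo".
robot_a = {"kuzo": (3.0, 1.5)}
robot_b = {"kuzo": (3.2, 1.4)}
print(play_go_to_game(robot_a, robot_b, "kuzo"))  # close enough -> True
```

Repeated rounds like this are what drive the robots' private word-place associations toward a shared vocabulary.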
This image shows what words the robots agreed on for direction and distance concepts. For example, “vupe hiza” would mean a medium long distance to the east.
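Phrases like "vupe hiza" are compositional: one word encodes distance, the other direction. A tiny sketch of how such a phrase could be interpreted — only "vupe" (medium-long distance) and "hiza" (east) come from the article; any further entries would be invented placeholders.

```python
# Only "vupe" and "hiza" are taken from the article; the lexicons are
# otherwise hypothetical single-entry stand-ins for the agreed vocabulary.
distance_words = {"vupe": "medium-long"}
direction_words = {"hiza": "east"}

def interpret(phrase):
    """Interpret a two-word distance + direction phrase."""
    dist_word, dir_word = phrase.split()
    return f"{distance_words[dist_word]} distance to the {direction_words[dir_word]}"

print(interpret("vupe hiza"))  # -> medium-long distance to the east
```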
After playing several hundred games to develop their language, the robots agreed on directions within 10 degrees and distances within 0.375 meters. And using just their invented language, the robots created spatial maps (including areas that they were unable to explore) that agree remarkably well.
In the future, researchers hope to enable the Lingodroids to "talk" about even more elaborate concepts, like descriptions of how to get to a place or the accessibility of places on the map. Ultimately, techniques like this may help robots to communicate with each other more effectively, and may even enable novel ways for robots to talk to humans.
Schulz and her colleagues -- Arren Glover, Michael J. Milford, Gordon Wyeth, and Janet Wiles -- describe their work in a paper, "Lingodroids: Studies in Spatial Cognition and Language," presented last week at the IEEE International Conference on Robotics and Automation (ICRA), in Shanghai.