Bandit, a caregiving humanoid robot developed at USC's Interaction Lab
Last Thursday, I headed out to the University of Southern California campus in Los Angeles for an open house at the Center for Robotics and Embedded Systems (CRES). It was a great opportunity to see some amazing research on humanoid robots, robots learning from humans, machine learning, and biologically inspired robots. Some highlights:
Let's start at the Interaction Lab, led by Dr. Maja J. Mataric, a professor of computer science, neuroscience, and pediatrics and the director of CRES. Her lab focuses on human-robot interaction, specifically on developing "socially assistive systems" to help in convalescence, rehabilitation, training, education, and emergency response. (Spectrum recently ran a profile of Mataric; read it here.)
Ross Mead, a graduate student in Mataric's group, is currently working with children with autism through USC's Center for Autism Research in Engineering (CARE). Children with autism tend to interact more easily with robots than with humans, so Dr. Mataric's group has been exploring the use of socially assistive robots, in conjunction with speech processing technology, to help improve the children's social communication skills.
Image courtesy of Dr. Maja J. Mataric and USC Interaction Lab
Current results have shown improved speech and interaction skills in autistic children when they interact with robots such as Bandit, the group's caregiving robot. Bandit has 6-DOF arms, a head that can pan and tilt, a face with a movable mouth and eyebrows, and stereo cameras for eyes.
In another application, Bandit serves as a social and cognitive aid for the elderly. It not only instructs the user to perform certain movements but also motivates the person and ensures that each movement is performed correctly.
Below is a video of Bandit showing off USC colors and interacting with graduate student Juan Fasola (and here's a video with an overview of the project).
Video courtesy of Dr. Maja J. Mataric and USC Interaction Lab
Mead is also studying what aspects of robotic design create a more humanlike appearance and improve the acceptance of robots by humans. This work involves Sparky (below), a "minimatronic figure" developed by Walt Disney Imagineering Research and Development. The robot has 18 degrees of freedom and uses small servos and tendon-driven mechanisms to reproduce humanlike motions.
One possible application for Sparky is as a lab tour guide. Equipped with a mobile base, it could stop at various parts of the lab and use speech and gestures to describe the projects.
Watch the video below to see how Sparky uses its tendons and a spring as a spine to try to achieve natural movements:
Next up is the Computational Learning and Motor Control Lab headed by Dr. Stefan Schaal, a professor of computer science and neuroscience.
As part of the DARPA Learning Locomotion program, Schaal and his colleagues are investigating legged locomotion with the quadruped robot Little Dog, developed by Boston Dynamics, whose other robots include the quadruped Big Dog, the LS3 robotic mule, and the biped PETMAN.
Legged robots have the potential to navigate more diverse and complex terrain than wheeled robots, but the limitations of current control algorithms hold them back. So Schaal's group is using Little Dog as a platform for learning locomotion: learning algorithms developed on Little Dog should enable robots to traverse large, irregular, and unexpected obstacles.
I had the opportunity to speak with Dr. Jonas Buchli and Peter Pastor of Dr. Schaal’s group following a demonstration of Little Dog. They discussed potential applications that include survivor location and recovery after a disaster, prosthetic limbs, and space exploration.
Watch the video below to see Little Dog in action (and watch this other video to see the little bot performing even more maneuvers).
Finally, at USC's iLab, Dr. Laurent Itti, a professor of computer science, is investigating how to make robots interact more naturally with humans and more effectively integrate into our lives. For that to happen, it will be important to create robots with humanlike qualities. In other words, robots will have to demonstrate humanlike locomotion, facial expressions, and eye movement. In addition, as robots gradually leave controlled environments, such as factory floors, and enter environments populated by humans, they’ll need enhanced cognitive abilities that enable them to autonomously navigate in an unstructured environment. One way of achieving that is by looking at biology.
One of the lines of research Itti and his students are pursuing involves monitoring the gaze of human participants as they watch a movie or play a video game. Such research provides a window into how the brain functions, as well as how it may become altered in diseased states. Furthermore, insights into brain function gleaned from the research have applications in machine vision, image processing, robotics, and artificial intelligence. Dr. Itti is also investigating the application of biologically inspired visual models to automatic target detection in cluttered environments, driver alertness monitoring, autonomous robotic navigation, and video games.
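To give a flavor of what such biologically inspired visual models do, here is a minimal, hypothetical sketch of a center-surround saliency computation of the kind popularized by Itti's lab: a fine-scale "center" view of the image is compared against a coarse-scale "surround" view, and regions that stand out from their surroundings light up in the resulting saliency map. This toy version uses only a single intensity feature and two Gaussian scales (the sigma values are illustrative choices, not parameters from the lab's actual model).

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian blur using plain NumPy (no SciPy dependency).
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def saliency_map(image):
    """Toy center-surround saliency: fine-scale minus coarse-scale intensity."""
    center = gaussian_blur(image, sigma=1.0)    # "center" (fine) scale
    surround = gaussian_blur(image, sigma=8.0)  # "surround" (coarse) scale
    sal = np.abs(center - surround)             # center-surround difference
    return sal / (sal.max() + 1e-9)             # normalize to [0, 1]

# A dark field with one bright blob: the blob should dominate the map.
img = np.zeros((64, 64))
img[30:34, 30:34] = 1.0
sal = saliency_map(img)
peak = np.unravel_index(np.argmax(sal), sal.shape)
print(peak)  # the peak lies in or near the bright blob
```

A full Itti-style model would repeat this across many feature channels (color opponency, orientation, motion) and scales, then normalize and sum the resulting maps; the single-channel sketch above only illustrates the core center-surround idea.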
His group launched the Beobot 2.0 project to create an integrated and embodied artificial intelligence system and, through providing open access to their hardware and software design, enable other research groups to build other robots with diverse capabilities. Below is a picture of Beobot 2.0, and you can watch a video here to see it navigating a corridor.
Image courtesy of Dr. Laurent Itti and USC's iLab
With the robot population expected to grow over the coming decades, robots will become a prevalent presence in our lives, permeating environments well beyond manufacturing, from healthcare and emergency response to personal entertainment and services. While providing many benefits, robots will also raise new and unforeseen social and ethical questions as they become part of society, questions that may, in the end, give us a better understanding of ourselves and what it means to be human.
In the meantime, what's my Roomba doing?
Daniel Garcia is an intern at Lux Capital and is interested in clean technology and innovations in healthcare. He holds a PhD in biomedical engineering from UCLA.