If you've ever gotten stuck in a t-shirt, this robot is for you.
For most people, putting on a t-shirt isn't a chore, but researchers at the Nara Institute of Science and Technology in Japan have identified this everyday task as particularly difficult for elderly or disabled people with limited arm movement.
A cross-laboratory team led by Tomohiro Shibata and Takamitsu Matsubara developed a two-arm robot to slide a shirt over and onto a person's head and torso. Since a person's neck or arms may not be in the exact same position each time, a scripted movement could potentially cause distress.
Enter the team's reinforcement learning approach. Just like a child learning through experience, the robot is taught once how to clothe a human user, and then given several attempts to put the shirt on by itself. Success is measured with a motion-capture system at the end of each trial, which lasts about 10 seconds.
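The team's actual learning algorithm isn't detailed here, so purely as an illustration, here is a toy version of that trial-and-improve loop. The target trajectory, reward function, and noise scale are all made up; in the real system the score would come from the motion-capture data rather than a known target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in reward: how close a rollout comes to a hypothetical target
# trajectory (in the real system, this score comes from motion capture).
TARGET = np.linspace(0.0, 1.0, 20)

def trial_reward(trajectory):
    return -float(np.mean((trajectory - TARGET) ** 2))

# Start from a single demonstrated trajectory (here, simply all zeros).
policy = np.zeros(20)
initial_reward = trial_reward(policy)

for trial in range(3):
    # Explore around the current trajectory; keep the best rollout.
    # Including the current policy guarantees we never get worse.
    candidates = [policy] + [policy + rng.normal(0.0, 0.1, policy.size)
                             for _ in range(10)]
    policy = max(candidates, key=trial_reward)

final_reward = trial_reward(policy)
```

After a handful of trials the best-so-far trajectory drifts toward whatever the reward signal favors, which is the essence of the "try, score, improve" loop described above.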
In the video, we see that after three learning trials, the robot has learned a trajectory that slips the shirt on without any trouble. According to Shibata, Japanese reporters who tested the system gave it a thumbs up, saying the robot made the shirt easy to put on.
For now, the system has been tested with only a couple of different t-shirts and with several subjects including a few patients. The next step, says Shibata, is to try out the system with more subjects and patients, and with different t-shirts. "Our approach could be applied to other types of important clothing tasks such as pulling up/down pants."
PETMAN is an adult-sized humanoid robot developed by Boston Dynamics, the robotics firm best known for the BigDog quadruped.
Today, the company is unveiling footage of the robot's latest capabilities. It's stunning.
The humanoid, which will certainly be compared to the Terminator Series 800 model, can perform various movements and maintain its balance much like a real person.
Boston Dynamics is building PETMAN, short for Protection Ensemble Test Mannequin, for the U.S. Army, which plans to use the robot to test chemical suits and other protective gear used by troops. It has to be capable of moving just like a soldier -- walking, running, bending, reaching, army crawling -- to test the suit's durability in a full range of motion.
Marc Raibert, the founder and president of Boston Dynamics, tells me that the biggest challenge was to engineer the robot, which uses a hydraulic actuation system, to have the approximate size of a person. "There was a great deal of mechanical design we had to do to get everything to fit," he says.
As I said before, this is the first time I've seen a machine perform movements like that -- remarkably human, yet uncanny valley-esque at the same time.
Led by Dr. Robert Playter, Boston Dynamics' VP of engineering, development of PETMAN got its start with a $26.3 million Army program. Two years ago, the company, based in Waltham, Mass., first demonstrated PETMAN's legs by having them walk on a treadmill. This year, the company showed that the robot legs can run at up to 7 kilometers per hour (about 4.4 miles per hour) and announced it had completed a prototype of the body.
But until now, the extent of PETMAN's full capabilities was a mystery.
Raibert says the humanoid and its behavior are still under development. "We plan to deliver the robot to the Army next year."
According to the Army requirements, the robot has to have about the same weight and dimensions as a 50th percentile male (the size of a standard crash-test dummy): a mass of 80 kilograms (about 180 pounds) and a height of about 1.75 meters (roughly 5 feet 9 inches). PETMAN also has to simulate respiration, sweating, and changes in skin temperature based on the amount of physical exertion. Boston Dynamics used motion-capture systems to study the movements of humans as they performed a variety of exercises.
The robot relies on a tether that provides hydraulic power, but its body had to share space with many sensors and other components, and cramming everything together became a big engineering puzzle. And it's not only the legs that had to be strong, Raibert explains, but the upper body too, so the robot can crawl and stand up.
And I know some of you are wondering: Will it have a head? "We were a bit late getting the articulated neck mechanism working," he says, "but it is coming along, and a head along with it."
I also asked Raibert if they could eventually use PETMAN or PETMAN-related technologies in other projects. In other words, are we going to see PETMAN used in applications other than the chemical suit tests?
"You bet," he says. "There are all sorts of things robots like PETMAN could be used for. Any place that has been designed for human access, mobility, or manipulation skills. Places like the Fukushima reactors could be accessed by PETMAN-like robots (or AlphaDogs), without requiring any human exposure to hazardous materials. Perhaps firefighting inside of buildings or facilities designed for human access, like on board ships designed for human crews."
This, of course, will mean another big challenge for his team: transforming the humanoid from a tethered system into a free-standing, self-contained robot. Boston Dynamics, however, has already demonstrated its ability to make that transition with its BigDog project.
One question remains unanswered, though: Will BigDog become PETMAN's best friend?
Late last month, Northrop Grumman's ultra-futuristic X-47B unmanned combat air system (UCAS) performed its first test flight in "cruise mode;" that is, with its landing gear up in its typical flight configuration:
While a robot wingman does sound cool, there are two things wrong with the term "wingman." One is the "man" part: there's no flesh, blood, or any other piece of humanity and/or masculinity inside. The other is that the X-47B is nobody's wingbot. It's entirely capable of running missions on its own, either controlled remotely by a human or completely autonomously.
These missions will eventually include aircraft carrier take-offs and landings, refueling, reconnaissance, and attack missions, which will look uncannily like this:
It's pretty wild how the CG footage looks nearly identical to the real thing: we're totally living in the future right now.
It's not quite an invasion, but in recent years we've seen a small parade of quadruped robots strutting out of labs around the world. In the United States, Boston Dynamics has introduced its now-famous BigDog and, more recently, a bigger bot named AlphaDog. Early this year, we wrote about FROG, a prototype built in China, and just a few weeks ago we described the SQ1 robot, a South Korean project.
Now it's time to unveil the latest addition to this pack: HyQ is a robot developed at the Istituto Italiano di Tecnologia (IIT), in Genoa. The machine, built by a team led by Professor Darwin Caldwell, is a hydraulic quadruped (hy-q, get it?) designed to perform highly dynamic tasks such as running and jumping.
Legged locomotion remains one of the biggest challenges in robotics, and the Italian team hopes that their robot can become a platform for research and collaboration among different groups -- a kind of open source BigDog.
One of HyQ's key design features, says project leader Claudio Semini, is that its legs are actively compliant: the robot can change the stiffness of each limb by rapidly adjusting the hydraulic flow to its leg actuators. This capability allows the robot to run and jump, as well as negotiate rough terrain, with its actuators absorbing shocks and vibrations without damage to the body.
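At the level of a single joint, active compliance boils down to commanding torque through a virtual spring-damper, so software can dial the "stiffness" up or down. The gains below are illustrative numbers, not HyQ's actual controller parameters.

```python
def joint_torque(q, q_dot, q_des, stiffness, damping):
    """Virtual spring-damper law: torque pulls the joint toward q_des.
    Lowering `stiffness` in software makes the same physical leg softer."""
    return stiffness * (q_des - q) - damping * q_dot

# The same 0.1-radian displacement meets very different resistance
# depending on the commanded stiffness:
soft_torque = joint_torque(q=0.1, q_dot=0.0, q_des=0.0, stiffness=50.0, damping=5.0)
stiff_torque = joint_torque(q=0.1, q_dot=0.0, q_des=0.0, stiffness=500.0, damping=5.0)
```

The stiff setting resists a disturbance ten times harder than the soft one, which is exactly the trade-off the robot exploits: soft legs for absorbing impacts, stiff legs for precise foot placement.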
As you can see in the video below, HyQ, which weighs in at 70 kilograms, can walk and trot at speeds up to 6 kilometers per hour. Starting at 0:53, you can see the robot moving its legs very slowly, almost like a giant crawling insect. The robot can also rear like a horse, and even squat jump, getting all four feet off the ground. At the very end of the clip, you can see another trick: The robot can kick!
To achieve the required high joint speeds and torques, each of HyQ's legs uses two hydraulic cylinders and one electric motor, and the legs are built from aerospace-grade aluminum alloy and stainless steel. Hydraulics gives the legs speed and robustness; the DC motor allows the robot to move its hip joints through 120 degrees. There are position and force sensors on each joint, and an inertial measurement unit on the body.
HyQ uses a torque-control approach to move its legs. Its control system, which has a detailed model of the robot's body, uses inverse dynamics to calculate what torque it needs to apply to each joint.
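HyQ's controller uses a full rigid-body model of the machine, but the idea is visible in a one-joint stand-in: treat the leg as a pendulum (mass and length below are assumed values) and invert its equation of motion to get the torque that produces a desired acceleration.

```python
import math

def inverse_dynamics(q, qdd, m=2.0, l=0.3, g=9.81):
    """Torque for a point mass m at distance l from the joint:
    tau = (m * l^2) * qdd + m * g * l * sin(q)
    i.e. an inertia term plus a gravity-compensation term."""
    return (m * l ** 2) * qdd + m * g * l * math.sin(q)
```

At q = 0 with no desired acceleration the required torque is zero, while holding the limb horizontal (q = pi/2) costs the full m*g*l of gravity torque; the real controller does the same bookkeeping simultaneously for every joint in the body.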
And if you're wondering where the robot's head is, Semini said that's one of the next steps. They plan to build a head equipped with a stereo camera and laser range finder for navigation and mapping. Eventually, they also want to add a manipulator arm to the body, so the robot can grasp objects or push away obstacles.
After that, another big challenge is getting rid of the tether, which will mean designing an on-board hydraulic system to power the actuators. (If you want more technical details, visit the project's page and check out the group's publications.)
As for applications, unlike BigDog and AlphaDog, the Italian quadruped wasn't designed for carrying heavy payloads. HyQ is a smaller system, which the IIT researchers say could be used for search and rescue missions in dangerous environments. You could send the robot to navigate autonomously looking for victims, for example, or teleoperate it to investigate a disaster-stricken zone.
The researchers are currently testing HyQ on a special treadmill at their lab, making the robot walk, trot, and jump while attached to the hydraulic tether and a safety harness. Soon, however, they want to let their beast run free. "We need a parking lot to walk this dog," Semini says.
Microsoft has followed up its recent release of Robotics Developer Studio 4 and the Parallax Eddie platform with this demo showing how Eddie can be programmed to be an autonomous party photographer, aka "Roborazzi." I'd tell you all about it myself, but wouldn't you rather hear it from the dude who actually put the project together? Sure you would!
Not a bad demo, really. It's a nice way to leverage what Kinect is best at (people tracking), and the camera integration makes it handy to have around for all of those wild parties that you've never invited me to. Oh well, I guess now I can at least watch it all on Flickr. :(
Okay, so technically, Actroid-F got a "brother," not a boyfriend. Even more technically, Actroid-F got another Actroid-F in a different wig. Yeah, weird. But I mean, when it comes down to it, what's the difference? She/he/it also got some fancy new eyes with cameras in them:
Now, you and I may think that these robots are borderline uncanny, but when they went on duty in a hospital in Japan, patients actually kinda liked them.
"When we tested the robot in a hospital, we asked 70 subjects if having an android there made them feel uneasy. Only 3 or 4 people said they didn't like having it around, and overall, quite a lot of people said they felt this robot itself had an acceptable presence."
Hmmm. My guess is that if Actroid-F were to find itself in a hospital here in the U.S., the reaction would be substantially different. Robots (especially anthropomorphic humanoids) still have a bit of a hill to climb when it comes to public perception, and from what I understand, we don't have the kind of positive history with them that you find in Japanese culture. The researchers themselves seem to agree:
"When this robot went to a hospital for a month during a trial, we felt lonely, as if someone had moved out. Another factor is the sense of immersion this robot gives. When it imitates your movements, you gradually feel it's become your alter ego. When the robot's being photographed, you feel as if you're being photographed. You don't get that kind of feeling of togetherness with other robots."
Hmmmmmm. Yeah, I think a "feeling of togetherness" with an Actroid would be a stretch, at least for me, but then, I haven't had the pleasure (it's pleasure, right?) of spending a lot of time with one.
The robot is powered by a "pneumatic battery," which uses hydrogen peroxide and a catalyst to generate the gas pressure with which the robot sequentially inflates silicone bladders to propel itself. There's a brilliant system inside the battery to self-regulate the reaction so that the robot only ever uses as much of the H2O2 fuel as it needs. To control its motion, the bot relies on a system of electropermanent magnet valves. These valves are just like regular electromagnetic valves -- except they're permanent. You can switch them on and off using a little bit of current, but once the switch is made, they'll stay there without needing any power at all. It's very clever.
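To make the latching behavior concrete, here's a deliberately simple toy model of such a valve (the class and method names are mine, not from the paper): a brief current pulse flips the state, and holding either state costs nothing.

```python
class ElectropermanentValve:
    """Toy model of a latching (electropermanent) valve: a brief current
    pulse flips its state, which then holds indefinitely with zero power.
    An ordinary solenoid valve, by contrast, needs continuous current
    to stay switched."""

    def __init__(self):
        self.is_open = False

    def pulse(self):
        # A short current pulse re-magnetizes the element, toggling the valve.
        self.is_open = not self.is_open

    def holding_power_watts(self):
        # The permanent magnet holds either state passively.
        return 0.0
```

For a robot that inflates bladders in a slow sequence, spending energy only at the moments of switching (rather than continuously) is what makes the design so efficient.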
This research was sponsored by DARPA under the Chembots program and the Programmable Matter program, with help from Boeing. Combinations like that get me all excited, and although there may not be a future for this squishy little guy specifically, the underlying technology (specifically, those nifty little valves), could start popping up in all sorts of (probably less creepy) places.
"Soft Robot Actuators using Energy-Efficient Valves Controlled by Electropermanent Magnets," by Andrew D. Marchese, Cagdas D. Onal, and Daniela Rus from MIT, was presented at IROS 2011 in San Francisco last month.
The last few robot dragons that we've been introduced to have done a pretty good job living up to that whole "dragon" mythos, being giant and dangerous and potentially scary. But dragons can also be cute and fuzzy and cuddly, and researchers at Northeastern University, Harvard, and MIT have gotten together and invented a little robot dragon designed to appeal to preschoolers. Fans of celebrity roboticists might recognize MIT's Cynthia Breazeal in the above picture on the far left; also in the pic are David DeSteno from Northeastern (right) and Paul Harris from Harvard (far right).
The robot they're all fawning over is, believe it or not, a descendant of Nexi, MIT Media Lab’s small humanoid. As you can see, it's a robotic dragon, called (as far as I can tell) "dragon robot." The relation to Nexi comes in the form of research by Northeastern's Social Emotions Group, showing that things like eyes and movements have a very significant impact on how people relate to robots, especially when it comes to trust and communication in learning environments. DeSteno, an associate professor of psychology at Northeastern, explains:
“Certain non-verbal cues like mimicking behavior to improve rapport and social bonding, or changes in gaze direction to guide shared attention, are central. When kids learn from human teachers, these cues enhance the learning. We’re designing our new dragon robots to be able to have these capabilities.”
Specifically, the dragon robot is designed to teach preschoolers language skills. It's furry, extremely emotive, and the intention is that kids will be able to develop an emotional connection with it. And when they trust the robot like they would something that's actually alive, it'll be a much more effective teacher.
At this stage, the dragon robots are going to undergo some preliminary testing with preschoolers at MIT. Once the researchers figure out what social cues are the most crucial to developing those emotional bonds, the robots will venture out into the world as distance-learning tools to help kids in rural areas learn their shapes, colors, numbers, and fantasy animals.
The current generation of bicycle-riding robots (I'm talking about those crazy kids from Murata) are extremely complicated, relying on giant gyroscopes and thick wheels to keep themselves upright even while stationary. This is certainly a neat trick, but it's not something that most humans can pull off. It's not a problem that robots are better at something than we are (by now, we're used to it), but there's something to be said for human emulation, too.
It turns out that getting a robot to ride a bicycle doesn't need to involve much more than a hobby-level humanoid employing a relatively simple gyroscope that sends steering commands to keep things generally upright. This KHR-3HV bipedal robot (which can be yours for about $2,200) has a nifty custom bike that it got from I know not where, and can zip around under remote control at up to 10 kilometers per hour, even making its own starts and stops:
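The balance rule the gyroscope enables is the same one human riders use without thinking: steer into the fall, so the wheels' contact line moves back under the center of mass. As a sketch only (the gains are made up, and this is not the robot's actual firmware):

```python
def steering_command(lean_angle, lean_rate, k_angle=3.0, k_rate=0.8):
    """Steer-into-the-fall rule: a rightward lean (positive angle, in
    radians) produces a rightward steering command, which brings the
    bike back upright. Gains k_angle and k_rate are illustrative."""
    return k_angle * lean_angle + k_rate * lean_rate
```

Because this only needs a lean estimate and a steering servo, it's far simpler than the big reaction gyroscopes that keep Murata's riders balanced even at a standstill -- the trade-off being that this approach, like a human's, only works while moving.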
Robots have ears. They're called microphones, and you usually find them just inside some tiny little hole somewhere. But you have to figure there are good reasons why animals with big ears exist: big ears let an animal hear quieter sounds, and localize those sounds more precisely.
This is the idea behind "active soft pinnae," which is fancy roboticist talk for "ears that wiggle." The robotic ear in the picture above is a reasonably faithful reproduction of a kitty ear, including a fake fur covering on the back and the ability to both rotate side to side and deform downwards. There's a microphone buried down inside the ear, of course, but the external structure is the important part.
So what good is it? I mean, you can ask your cat, but testing has shown that it's possible to pinpoint the direction (azimuth and elevation) to a sound with just two wigglable ears instead of needing a complex microphone array. Furthermore, the ears can be used to localize sounds by moving independently of the head or body of a robot, which is a much more efficient approach. And of course, ears like these are awfully cute, and with the addition of some touch sensors, you could give your robot that friendly scritching that it deserves.
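The azimuth half of that estimate can be sketched from the time difference between the two ears alone (the ear spacing below is an assumed value; in the actual work, the elevation cue comes from how the moving pinna filters the sound, which this toy omits).

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, in air at room temperature

def azimuth_from_itd(itd_seconds, ear_spacing=0.15):
    """Estimate the horizontal angle to a sound source from the
    interaural time difference (ITD): the extra path length to the
    far ear is roughly ear_spacing * sin(azimuth)."""
    s = itd_seconds * SPEED_OF_SOUND / ear_spacing
    s = max(-1.0, min(1.0, s))  # clamp numerical noise to asin's domain
    return math.asin(s)         # radians; 0.0 means straight ahead
```

Two fixed microphones this far apart can only resolve a cone of possible directions; wiggling the ears between measurements is what lets the robot break that ambiguity without a full microphone array.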