Microsoft has followed up its recent release of Robotics Developer Studio 4 and the Parallax Eddie platform with this demo showing how Eddie can be programmed to be an autonomous party photographer, aka "Roborazzi." I'd tell you all about it myself, but wouldn't you rather hear it from the dude who actually put the project together? Sure you would!
Not a bad demo, really. It's a nice way to leverage what Kinect is best at (people tracking), and the camera integration makes it handy to have around for all of those wild parties that you've never invited me to. Oh well, I guess now I can at least watch it all on Flickr. :(
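If you're wondering what "autonomous party photographer" amounts to in code, the core loop is pretty simple. Here's a hedged Python sketch -- the real demo is built in RDS 4 (which is C#/VPL territory), so every function below is a stand-in for the Kinect tracking and camera services, not the actual API:

```python
import random
import time

# Stand-ins for Eddie's Kinect tracking and camera services (hypothetical).
def kinect_people_count():
    """Pretend skeleton tracker: how many people does Kinect currently see?"""
    return random.randint(0, 3)

def drive_toward_nearest_person():
    print("lining up the shot...")

def snap_photo():
    print("click!")

# The core Roborazzi loop: wander the party, and when the skeleton
# tracker finds someone, move in and take the picture.
for _ in range(10):                 # a (mercifully) short party
    if kinect_people_count() > 0:
        drive_toward_nearest_person()
        snap_photo()
    time.sleep(2)                   # don't spam the guests
```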
Okay, so technically, Actroid-F got a "brother," not a boyfriend. Even more technically, Actroid-F got another Actroid-F in a different wig. Yeah, weird. But I mean, when it comes down to it, what's the difference? She/he/it also got some fancy new eyes with cameras in them:
Now, you and I may think that these robots are borderline uncanny, but when they went on duty in a hospital in Japan, patients actually kinda liked them.
"When we tested the robot in a hospital, we asked 70 subjects if having an android there made them feel uneasy. Only 3 or 4 people said they didn't like having it around, and overall, quite a lot of people said they felt this robot itself had an acceptable presence."
Hmmm. My guess is that if Actroid-F were to find itself in a hospital here in the U.S., the reaction would be substantially different. Robots (especially anthropomorphic humanoids) still have a bit of a hill to climb when it comes to public perception, and from what I understand, we don't have the kind of positive cultural history with them that you find in Japan. The researchers themselves seem to agree:
"When this robot went to a hospital for a month during a trial, we felt lonely, as if someone had moved out. Another factor is the sense of immersion this robot gives. When it imitates your movements, you gradually feel it's become your alter ego. When the robot's being photographed, you feel as if you're being photographed. You don't get that kind of feeling of togetherness with other robots."
Hmmmmmm. Yeah, I think a "feeling of togetherness" with an Actroid would be a stretch, at least for me, but then, I haven't had the pleasure (it's pleasure, right?) of spending a lot of time with one.
The robot is powered by a "pneumatic battery," which uses hydrogen peroxide and a catalyst to generate the gas pressure with which the robot sequentially inflates silicone bladders to propel itself. There's a brilliant system inside the battery that self-regulates the reaction so that the robot only ever uses as much of the H2O2 fuel as it needs. To control its motion, the bot relies on a system of electropermanent magnet valves. These are just like regular electromagnetic valves -- except that they latch: a brief pulse of current flips them open or closed, and once flipped, they hold that state without needing any power at all. It's very clever.
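To make that switching trick concrete, here's a toy Python sketch of what driving one of these latching valves might look like; the interface, pulse width, and bladder sequence are all invented for illustration, not taken from the paper:

```python
class ElectropermanentValve:
    """Latching valve: a brief current pulse toggles it, and it then
    holds its state with zero power (hypothetical interface)."""
    def __init__(self, pulse_ms=5):
        self.pulse_ms = pulse_ms
        self.is_open = False   # held magnetically, not electrically

    def _pulse(self, polarity):
        # On real hardware this would dump a capacitor through the EPM
        # coil; the point is that energy is spent only at the switch.
        print(f"{self.pulse_ms} ms pulse, polarity {polarity:+d}")

    def set_open(self, want_open):
        if want_open != self.is_open:
            self._pulse(+1 if want_open else -1)
            self.is_open = want_open
        # No holding current needed once switched.

# Sequentially inflate bladders to drive a rolling gait (sketch):
valves = [ElectropermanentValve() for _ in range(4)]
for i, v in enumerate(valves):
    v.set_open(True)               # pressurize bladder i
    valves[i - 1].set_open(False)  # vent the previous bladder
```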
This research was sponsored by DARPA under the Chembots program and the Programmable Matter program, with help from Boeing. Combinations like that get me all excited, and although there may not be a future for this squishy little guy specifically, the underlying technology (particularly those nifty little valves) could start popping up in all sorts of (probably less creepy) places.
"Soft Robot Actuators using Energy-Efﬁcient Valves Controlled by Electropermanent Magnets," by Andrew D. Marchese, Cagdas D. Onal, and Daniela Rus from MIT, was presented at IROS 2011 in San Francisco last month.
The last few robot dragons that we've been introduced to have done a pretty good job of living up to that whole "dragon" mythos, being giant and dangerous and potentially scary. But dragons can also be cute and fuzzy and cuddly, and researchers at Northeastern University, Harvard, and MIT have gotten together and invented a little robot dragon designed to appeal to preschoolers. Fans of celebrity roboticists might recognize MIT's Cynthia Breazeal in the above picture on the far left; also in the pic are David DeSteno from Northeastern (right) and Paul Harris from Harvard (far right).
The robot they're all fawning over is, believe it or not, a descendant of Nexi, MIT Media Lab's small humanoid. As you can see, it's a robotic dragon, called (as far as I can tell) "dragon robot." The relation to Nexi comes in the form of research by Northeastern's Social Emotions Group showing that nonverbal cues like gaze and movement have a significant impact on how people relate to robots, especially when it comes to trust and communication in learning environments. DeSteno, an associate professor of psychology at Northeastern, explains:
“Certain non-verbal cues like mimicking behavior to improve rapport and social bonding, or changes in gaze direction to guide shared attention, are central. When kids learn from human teachers, these cues enhance the learning. We’re designing our new dragon robots to be able to have these capabilities.”
Specifically, the dragon robot is designed to teach preschoolers language skills. It's furry and extremely emotive, and the intention is that kids will develop an emotional connection with it. And when they trust the robot like they would something that's actually alive, it'll be a much more effective teacher.
At this stage, the dragon robots are going to undergo some preliminary testing with preschoolers at MIT. Once the researchers figure out what social cues are the most crucial to developing those emotional bonds, the robots will venture out into the world as distance-learning tools to help kids in rural areas learn their shapes, colors, numbers, and fantasy animals.
The current generation of bicycle-riding robots (I'm talking about those crazy kids from Murata) is extremely complicated, relying on giant gyroscopes and thick wheels to keep themselves upright even while stationary. This is certainly a neat trick, but it's not something that most humans can pull off. It's not a problem that robots are better at something than we are (by now, we're used to it), but there's something to be said for human emulation, too.
It turns out that getting a robot to ride a bicycle doesn't need to involve much more than a hobby-level humanoid and a relatively simple gyroscope that sends steering commands to keep things generally upright. This KHR-3HV bipedal robot (which can be yours for about $2,200) has a nifty custom bike that it got from I know not where, and can zip around under remote control at up to 10 kph, even making its own starts and stops:
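The control idea here is the classic "steer into the fall": when the bike leans one way, steer that way so the contact patch moves back under the center of mass. A minimal sketch of that loop in Python -- the gains and the gyro interface are made up for illustration, not taken from this robot:

```python
# Steer into the fall: steering command proportional to lean angle,
# plus a damping term on lean rate. Gains are invented.
K_ANGLE = 4.0   # steering response to lean angle
K_RATE = 1.0    # damping on lean rate

def read_gyro():
    """Stand-in for the hobby gyro: returns (lean_angle, lean_rate) in rad, rad/s."""
    return 0.02, -0.01

def set_steering(angle_rad):
    print(f"steering command: {angle_rad:+.3f} rad")

# One iteration of the balance loop (in practice this runs continuously):
lean, lean_rate = read_gyro()
set_steering(K_ANGLE * lean + K_RATE * lean_rate)
```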
Robots have ears. They're called microphones, and you usually find them just inside some tiny little hole somewhere. But you have to figure that there are good reasons why animals like this exist: big ears confer an advantage. Namely, big ears allow animals to hear quieter sounds and to localize those sounds more precisely.
This is the idea behind "active soft pinnae," which is fancy roboticist talk for "ears that wiggle." The robotic ear in the picture above is a reasonably faithful reproduction of a kitty ear, including a fake fur covering on the back and the ability to both rotate side to side and deform downwards. There's a microphone buried down inside the ear, of course, but the external structure is the important part.
So what good is it? I mean, you can ask your cat, but testing has shown that it's possible to pinpoint the direction (azimuth and elevation) to a sound with just two wigglable ears instead of needing a complex microphone array. Furthermore, the ears can be used to localize sounds by moving independently of the head or body of a robot, which is a much more efficient approach. And of course, ears like these are awfully cute, and with the addition of some touch sensors, you could give your robot that friendly scritching that it deserves.
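If you're curious how just two ears can pinpoint a direction at all, the basic ingredient is the interaural time difference: sound reaches the nearer ear slightly earlier. Below is a minimal Python sketch of azimuth estimation from that delay; the ear spacing and test signal are made up, and the actual research, which exploits the ears' movable shape to recover elevation as well, is considerably more sophisticated:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
EAR_SPACING = 0.10       # m between the two microphones (assumed)
FS = 44100               # sample rate, Hz

def azimuth_from_itd(left, right):
    """Estimate source azimuth from the interaural time difference,
    found by cross-correlating the two ear signals."""
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)      # delay of right vs. left, samples
    itd = lag / FS                               # seconds
    s = np.clip(SPEED_OF_SOUND * itd / EAR_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# Fake a broadband source ~30 degrees off-center: the right ear hears it
# a few samples late. Quantizing the delay to whole samples is why the
# recovered angle lands near, rather than exactly at, 30 degrees.
rng = np.random.default_rng(0)
left = rng.standard_normal(4096)
delay = round(EAR_SPACING * np.sin(np.radians(30)) / SPEED_OF_SOUND * FS)
right = np.roll(left, delay)
print(f"estimated azimuth: {azimuth_from_itd(left, right):.1f} degrees")  # ~28
```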
Technically, what this robot uses is hot-melt adhesive, or HMA. This is the stuff that comes out of hot glue guns, and it goes from a solid to a sticky liquid when it's passed through a heating element. As it cools, it solidifies again. The robot uses this property to temporarily bond its limbs to a vertical surface one by one and hoist itself up, unsticking itself as it goes by re-heating the blobs of glue that it sets down:
By now, you've probably spotted several issues that this robot has to deal with. First, it's very, very slow, since it has to wait about 90 seconds for the adhesive to cure every time it takes a step. And second, it leaves a trail of sticky little glue spots on every surface that it climbs, making its usefulness questionable in many (if not most) environments.
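To make that bond-step-rebond cycle concrete, here's a rough Python sketch of a single climbing step; the limb interface is hypothetical, and only the roughly 90-second cure time comes from the article:

```python
import time

CURE_S = 90  # the adhesive takes about 90 seconds to solidify (per above)

class Limb:
    """Stand-in for one glue-tipped limb; the interface is invented."""
    def __init__(self, name):
        self.name = name
    def heat_glue(self):
        print(f"{self.name}: re-melting the old bond to let go")
    def move_up(self):
        print(f"{self.name}: reaching for the next foothold")
    def deposit_and_press(self):
        print(f"{self.name}: pressing a fresh blob of HMA onto the wall")

def climb_step(limb):
    limb.heat_glue()        # unstick this limb from the surface
    limb.move_up()
    limb.deposit_and_press()
    time.sleep(CURE_S)      # can't load the new bond until it cures

for limb in [Limb("front"), Limb("rear")]:
    climb_step(limb)
```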
So yes, a few things need to be addressed, but this technique has a bunch of upsides, too. The biggest one is that glue, being glue, sticks to just about anything. A surface doesn't have to be especially rough, especially smooth, or especially magnetic, which makes this approach more versatile than the current generation of just about every other robot adhesion system that I can think of off the top of my head.
Also, the hot-melt adhesive can support a lot of weight, and it can do so completely passively: you don't need to expend any energy to keep from falling once the adhesive sets. The bonding strength of the HMA in its solid state is such that a little four-square-centimeter patch can hold a staggering 60 kilograms, easily enough to hold this robot plus a fairly gigantic payload, most of which is likely going to have to consist of extra sticks of glue.
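Those numbers are worth a quick sanity check, and they also tell you how little patch area a given payload actually needs. Here's the arithmetic in Python -- only the 60 kilogram and four square centimeter figures come from the article; the example robot mass and safety margin are invented:

```python
G = 9.81                       # m/s^2
HOLD_KG, PATCH_CM2 = 60, 4.0   # figures from the article

# Bond strength in pascals: weight held divided by patch area.
strength_pa = HOLD_KG * G / (PATCH_CM2 * 1e-4)
print(f"bond strength ~ {strength_pa / 1e6:.2f} MPa")   # ~1.47 MPa

# Patch area for a 2 kg robot-plus-payload with a 5x safety margin
# (both of those numbers are made up for illustration):
mass_kg, margin = 2.0, 5.0
area_cm2 = mass_kg * margin / (HOLD_KG / PATCH_CM2)
print(f"patch needed ~ {area_cm2:.2f} cm^2")            # ~0.67 cm^2
```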
Robots are quite good at doing very specific tasks. Arguably, doing very specific tasks is what robots are best at. When you put a robot into an unknown situation, however, odds are you're not going to have a design that's optimized for whatever that situation ends up being. This is where modular robots come in handy, since they can reconfigure themselves on the fly to adapt their hardware to different tasks, and the Modular Robotics Lab at the University of Pennsylvania has come up with a wild new way of dynamically constructing robots based on its CKBot modules: spray foam.
The process starts with a "foam synthesizer cart" that deploys several CKBot clusters, each consisting of a trio of jointed CKBot modules. The clusters can move around by themselves (sort of), and with some helpful nudging from the cart, they can be put into whatever positions are necessary to form the joints of a robot. The cart then sprays insulation foam to connect the clusters into an overall structure -- a quadruped robot, a snake robot, or whatever else you want. Watch:
Having a robot that shoots foam is good for lots more than building other robots; for example, Modlab has used it to pick up hazardous objects and to quickly deploy permanent doorstops. There's still some work to be done with foam control and autonomy, but Modlab is already thinking ahead. Way ahead:
"By carrying a selection of collapsible molds and a foam generator, a robot could form end effectors on a task-by-task basis -- for example, forming wheels for driving on land, impellers and oats for crossing water, and high aspect ratio wings for gliding across ravines. Molds could also be made of disposable material (e.g. paper) that forms part of the final structure. Even less carried overhead is possible by creating ad-hoc molds: making a groove in the ground or placing found objects next to each other."
With this kind of capability, you could (say) send a bunch of modules and foam to Mars, and then create whatever kind of robots you need once you get there. And with foam that dissolves or degrades, you could even recycle your old robots into new robots if the scope of the mission changes. Modular robots were a brilliant idea to begin with, but this foam stuff definitely has the potential to make them even more versatile.
Once a secret project, Google's autonomous vehicles are now out in the open, quite literally, with the company test-driving them on public roads and, on one occasion, even inviting people to ride inside one of the robot cars as it raced around a closed course.
This could be it, folks. The one killer application that the entire robotics world has been waiting for. It's bold, it's daring, it's potentially transformative, and you know you want it.
Ben Cohen and his colleagues from the GRASP Lab at the University of Pennsylvania devoted literally an entire weekend to programming their PR2 robot, Graspy, to handle POOPs. POOPs (Potentially Offensive Objects for Pickup) are managed by the robot using a customized POOP SCOOP (Perception Of Offensive Products and Sensorized Control Of Object Pickup) routine. While POOP can be just about anything that you'd rather not have to pick up yourself, in this particular case, the POOP does happen to be poop, since arguably, poop is the worst kind of POOP.*
Oh yes, there absolutely is video:
While you can't hear it in the video, Graspy begins its task by declaring in a vaguely disappointed robotic monotone, "time for me to scoop some poop." You get the sense that this $400,000 robot is asking itself whether this kind of work is really what it signed up for. Using its color camera, the robot first identifies poops based on their color, navigates to said poop, and then, using a special human tool, performs the scoop. Haptics are employed to verify that each poop scoop is a success, and if not, the robot will give it another try. Failure doesn't happen often, though: Graspy was able to successfully scoop poop about 95% of the time in over 100 trials, at a rate of over one poop per minute.
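Boiled down, the routine is a perceive-navigate-scoop-verify loop with retries. Here's a rough Python sketch of that structure; every function below is a stand-in for illustration, not the actual POOP SCOOP ROS stack API:

```python
import random

def find_poops():
    """Color-threshold the camera image; return target (x, y) positions."""
    return [(1.2, 0.4), (2.0, -0.3)]

def navigate_to(target):
    print(f"navigating to {target}")

def perform_scoop(target):
    print(f"scooping at {target}")

def scoop_succeeded():
    """Haptic check: did the force profile match a successful scoop?"""
    return random.random() < 0.95   # the reported ~95% success rate

for target in find_poops():
    navigate_to(target)
    for _ in range(3):              # retry a failed scoop, as Graspy does
        perform_scoop(target)
        if scoop_succeeded():
            break
```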
There's still some work to be done to get PR2 scooping poop like a pro (or an obedient human). For example, it's currently only able to handle high-fiber poop, although that may be solvable with a different tool. If you think you have a clever way of making PR2 a better poop scooper, you can download the POOP SCOOP ROS stack and contribute to the betterment of humanity through robotics at the link below.
"POOP SCOOP: Perception Of Offensive Products and Sensorized Control Of Object Pickup" was presented at the PR2 workshop at IROS 2011.