Automaton

Wiggly Robotic Cat Ears: Sure, Why Not

Robots have ears. They're called microphones, and you usually find them just inside some tiny little hole somewhere. But you have to figure that there are good reasons why animals like this exist: big ears can confer an advantage. Namely, big ears allow animals to hear quieter sounds, and localize those sounds more precisely.

This is the idea behind "active soft pinnae," which is fancy roboticist talk for "ears that wiggle." The robotic ear in the picture above is a reasonably faithful reproduction of a kitty ear, including a fake fur covering on the back and the ability to both rotate side to side and deform downwards. There's a microphone buried down inside the ear, of course, but the external structure is the important part.

So what good is it? I mean, you can ask your cat, but testing has shown that it's possible to pinpoint the direction (azimuth and elevation) to a sound with just two wigglable ears instead of needing a complex microphone array. Furthermore, the ears can localize sounds by moving independently of the robot's head or body, which is much more efficient than swiveling the entire robot around. And of course, ears like these are awfully cute, and with the addition of some touch sensors, you could give your robot that friendly scritching that it deserves.
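
For a sense of what a robot does with just two ears, here's a minimal sketch (in Python, and not from the paper; the ear spacing and every other detail below are my own assumptions) of the classic two-microphone trick: estimate the interaural time difference and convert it to an azimuth. The point of the active pinnae is that wiggling and deforming them adds enough extra information to recover elevation as well, which a simple fixed-microphone estimate like this can't give you.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
EAR_SPACING = 0.08      # m; assumed distance between the two microphones

def estimate_azimuth(left, right, sample_rate):
    """Estimate source azimuth (degrees) from a stereo recording.

    Finds the interaural time difference (ITD) as the lag that maximizes
    the cross-correlation of the two ear signals, then converts it to an
    angle. Purely illustrative; not the method from the paper.
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # delay in samples
    itd = lag / sample_rate                    # delay in seconds
    max_itd = EAR_SPACING / SPEED_OF_SOUND     # physically possible maximum
    itd = np.clip(itd, -max_itd, max_itd)
    return np.degrees(np.arcsin(itd * SPEED_OF_SOUND / EAR_SPACING))
```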

"Active soft pinnae for robots," by Makoto Kumon and Yoshitaka Noda from Kumamoto University in Japan, was presented at the IEEE International Conference on Intelligent Robots and Systems in San Francisco last month.

Swiss Climbing Robot Hot Glues Itself To Your Walls

Robots use all sorts of clever techniques to climb: we've seen magnets, grippers, gecko feet, electrostatics, and even supersonic jets of air. It's sort of surprising, then, that the idea of using the most stereotypically sticky thing in the universe to climb has been (more or less) ignored until now. Yes, this robot sticks to surfaces with glue.

Technically, what this robot uses is hot-melt adhesive, or HMA. This is the stuff that comes out of hot glue guns, and it goes from a solid to a sticky liquid when it's passed through a heating element. As it cools, it solidifies again. The robot uses this property to temporarily bond its limbs to a vertical surface one by one and hoist itself up, unsticking itself as it goes by re-heating the blobs of glue that it sets down:

By now, you've probably spotted several issues that this robot has to deal with: first, it's very, very slow, since it has to wait for the adhesive to cure every time it takes a step, a 90-second process. And second, it leaves a trail of sticky little glue spots along every surface that it climbs, making its usefulness questionable in many (if not most) environments.
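
To see where all that time goes, here's a rough sketch of a single step of a hot-melt-adhesive climbing gait. The robot object and all of its methods are hypothetical placeholders; only the 90-second cure time comes from the article above.

```python
import time

MELT_TIME_S = 20   # assumption: time to re-heat an old blob until it releases
CURE_TIME_S = 90   # from above: a fresh blob needs about 90 seconds to solidify

def take_step(robot, foot, new_foothold):
    """One step of a hypothetical hot-melt-adhesive climbing gait.

    Only one foot detaches at a time; the other feet stay bonded and
    carry the robot's weight passively while it waits for the glue.
    """
    robot.heat(foot)                    # re-melt the blob under this foot
    time.sleep(MELT_TIME_S)
    robot.lift(foot)                    # peel the foot off the liquefied glue
    robot.move(foot, new_foothold)
    robot.dispense_adhesive(foot)       # lay down a fresh molten blob
    robot.press(foot)
    time.sleep(CURE_TIME_S)             # wait for the blob to cool and bond
```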

So yes, a few things need to be addressed, but this technique has a bunch of upsides, too. The biggest one is that glue, being glue, sticks to just about anything. The surface doesn't have to be especially rough, especially smooth, or especially magnetic, which makes this approach more versatile than the current generation of just about every other robot adhesion system that I can think of off the top of my head.

Also, the hot melt adhesive can support a lot of weight, and it can do it completely passively: you don't need to expend energy once the adhesive sets to keep from falling. The bonding strength of the HMA in its solid state is such that a little four-square-centimeter patch can hold a staggering 60 kilograms, easily enough to hold this robot plus a fairly gigantic payload, most of which is likely going to have to consist of extra sticks of glue.
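
For the record, that works out to a respectable bond strength in more standard units (my arithmetic, not a figure from the paper):

$$\frac{60\ \text{kg}\times 9.81\ \text{m/s}^2}{4\ \text{cm}^2}=\frac{589\ \text{N}}{4\times10^{-4}\ \text{m}^2}\approx 1.5\ \text{MPa}$$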

"A Climbing Robot Based on Hot Melt Adhesion," by Marc Osswald and Fumiya Iida from the Bio-Inspired Robotics Laboratory at ETH Zurich, was presented at the IEEE International Conference on Intelligent Robots and Systems in San Francisco last month.

Watch a Robot Build Other Robots out of Spray Foam

Robots are quite good at doing very specific tasks. Arguably, doing very specific tasks is what robots are best at. When you put a robot into an unknown situation, however, odds are you're not going to have a design that's optimized for whatever that situation ends up being. This is where modular robots come in handy, since they can reconfigure themselves on the fly to adapt their hardware to different tasks, and the Modular Robotics Lab at the University of Pennsylvania has come up with a wild new way of dynamically constructing robots based on their CKBot modules: spray foam.

The process starts with a "foam synthesizer cart" that deploys several CKBot clusters, each consisting of a trio of jointed CKBot modules. The CKBot clusters can move around by themselves, sort of, and combined with some helpful nudging from the cart, they can be put into whatever position is necessary to form the joints of a robot. The overall structure of the robot is created with insulation foam that the cart sprays to connect the CKBot clusters in such a way as to create a quadruped robot, a snake robot, or whatever else you want. Watch:

Having a robot that shoots foam is good for lots more than building other robots; for example, Modlab has used it to pick up hazardous objects and to quickly deploy permanent doorstops. There's still some work to be done with foam control and autonomy, but Modlab is already thinking ahead. Way ahead:

"By carrying a selection of collapsible molds and a foam generator, a robot could form end e ffectors on a task-by-task basis -- for example, forming wheels for driving on land, impellers and oats for crossing water, and high aspect ratio wings for gliding across ravines. Molds could also be made of disposable material (e.g. paper) that forms part of the final structure. Even less carried overhead is possible by creating ad-hoc molds: making a groove in the ground or placing found objects next to each other."

With this kind of capability, you could (say) send a bunch of modules and foam to Mars, and then create whatever kind of robots you need once you get there. And with foam that dissolves or degrades, you could even recycle your old robots into new robots if the scope of the mission changes. Modular robots were a brilliant idea to begin with, but this foam stuff definitely has the potential to make them even more versatile.

[ UPenn Modlab ]

PR2 Scoops Some Poop

This could be it, folks. The one killer application that the entire robotics world has been waiting for. It's bold, it's daring, it's potentially transformative, and you know you want it.

It's POOP.

Ben Cohen and his colleagues from the GRASP Lab at the University of Pennsylvania devoted literally an entire weekend to programming their PR2 robot, Graspy, to handle POOPs. POOPs (Potentially Offensive Objects for Pickup) are managed by the robot using a customized POOP SCOOP (Perception Of Offensive Products and Sensorized Control Of Object Pickup) routine. While POOP can be just about anything that you'd rather not have to pick up yourself, in this particular case, the POOP does happen to be poop, since arguably, poop is the worst kind of POOP.*

Oh yes, there absolutely is video:

While you can't hear it in the video, Graspy begins its task by declaring in a vaguely disappointed robotic monotone, "Time for me to scoop some poop." You get the sense that this $400,000 robot is asking itself whether or not this kind of work is really what it signed up for. Using its color camera, the robot first identifies a poop based on its color, navigates to said poop, and then performs the scoop with a special human tool. Haptics are employed to ensure that each poop scoop is a success, and if not, the robot will give it another try. Failure doesn't happen often, though: Graspy is able to successfully scoop poop about 95% of the time in over 100 trials, at a rate of over one poop per minute.
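
Boiled down to pseudocode, the routine described above looks something like this (all of these method names are my own placeholders, not the actual API of the POOP SCOOP ROS stack):

```python
def scoop_some_poop(robot):
    """Hypothetical sketch of the detect / navigate / scoop / verify loop.

    Each step mirrors the description above: color-based detection,
    navigation to the target, a scoop with the tool, and a haptic check
    with a retry if the scoop didn't take.
    """
    robot.say("Time for me to scoop some poop.")
    while True:
        poop = robot.detect_poop_by_color()   # color-camera segmentation
        if poop is None:
            break                             # nothing left to pick up
        robot.navigate_to(poop)
        robot.scoop(poop)                     # wield the special human tool
        if not robot.scoop_succeeded():       # haptic verification
            robot.scoop(poop)                 # the ~5 percent case: try again
```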

There's still some work to be done in order to get PR2 scooping poop like a pro (or an obedient human). For example, it's currently only able to handle high-fiber poop, although that may be solvable with a different tool. If you think you have a clever way of making PR2 a better poop scooper, you can download the POOP SCOOP ROS stack and contribute to the betterment of humanity through robotics at the link below.

"POOP SCOOP: Perception Of Offensive Products and Sensorized Control Of Object Pickup" was presented at the PR2 workshop at IROS 2011.

[ ROS Wiki ]

[ GRASP Lab ]

*My guess is that this is an IEEE Spectrum record for the most uses of the word "poop" in one single sentence.

Photo Gallery: Asimo, Kilobots, iCub, and More

We have a few more solid weeks' worth of IROS awesomeness to share with you, but since it's Friday and all, we thought it might be a nice time to put together a little gallery of some of the robots from the expo floor of the IEEE International Conference on Intelligent Robots and Systems, which took place last month in San Francisco.

Most of the bots you'll recognize easily, but keep an eye out for some (okay, a LOT) of those little Kilobots, as well as a guest appearance by Disney's swarming display robots. Enjoy!

And just in case that wasn't enough for you, Willow Garage (one of the IROS sponsors) also put together this little video montage:

[ IROS 2011 ]

Robot Masters Jenga, Next the World

Confirming our suspicions that roboticists basically just sit around and invent ways to play games with robots all day, here's a video from Torsten Kröger (the same guy who took us through the JediBot demonstration at the IEEE International Conference on Intelligent Robots and Systems, go figure) detailing how he and a bunch of his friends built themselves a robot that plays Jenga back in 2005:

As with most, uh, "research" projects like this, there's supposedly some larger purpose to it. Something about the potential of multi-sensor integration in industrial manipulation. Or whatever. I don't buy it, of course, but we can certainly applaud the fact that the robot was able to make 29 moves in a row, which means that it added nearly ten solid layers of blocks to the top of the tower without knocking it over. Time to preemptively surrender, folks. Here's one more vid of the robot making a move:

[ Jenga Robot ]

Video: Meka Robotics Talks Up its Anime-Style Expressive Head

Meka Robotics is based in San Francisco, which is lucky for us, since that made it pretty much impossible for them to not show up at the IEEE International Conference on Intelligent Robots and Systems. They're probably best known for their underactuated, compliant hand (and the arm that goes with it) and more recently for their humanoid head. The S2 head is notable because it manages to maintain a high degree of expressiveness (those eyes are amazing) while entirely avoiding the Uncanny Valley effect, thanks to its vaguely cartoonish look. We asked Meka co-founder Aaron Edsinger to take us through it:

The particular robot in this video is called Dreamer, and it belongs to the Human Centered Robotics Lab at the University of Texas, Austin. Dreamer's head was a cooperative effort involving Meka and UT Austin professor Luis Sentis, who came up with the subtle and effective anime look. Part of what helps keep Dreamer's motions so compliant (and lifelike) is its software: called "Whole Body Control," it's a collaboration between UT Austin, Meka, Stanford, and Willow Garage.

Meka is also offering an entirely new system consisting of an arm, gripper, sensor head, and mobile base for $200,000. It's no coincidence that the one-armed PR2 SE costs the exact same amount; the NSF's National Robotics Initiative provides research grants that include up to $200k for research platforms. Yep, the government is basically giving these things away for free; all you have to do is convince them that you deserve one, and then pick your flavor.

[ Meka Robotics ]

[ HCRL ]

Video: SQ1 Quadruped Robot from South Korea

This feisty little guy is a quadruped robot called SQ1. It's a project by South Korean company SimLab, which we met at the IEEE International Conference on Intelligent Robots and Systems last month. Their RoboticsLab simulation software is being used to figure out how to get the quadruped to walk without actually, you know, having to risk a trial-and-error approach on a real robot. And it works! Or rather, it mostly works:

We don't know too much about it, but apparently, there's a much larger (think BigDog/AlphaDog-sized) quadruped in existence, sponsored by the South Korean government. This smaller robot is being used to test out different gaits that have proven themselves in simulation, before the full-sized (and more expensive) version has to try not to fall over on its own.

[ RoboticsLab ]

How JediBot Got Its Sword Fighting Skills

JediBot, which we saw in action back in July, was a brilliant final project conceived by a group of students for an experimental robotics course at Stanford University. Kuka spotted the video on YouTube, and shortly thereafter, JediBot found itself with a new job as the main attraction at Kuka's booth on the expo floor of the IEEE International Conference on Intelligent Robots and Systems last month. We caught up with Stanford roboticist Torsten Kröger, who took us through the programming behind JediBot's unquenchable thirst for the blood of Sith lords:

It's worth mentioning that due to a slight miscalibration, JediBot was not acting as aggressively as it could have been when we shot this demo. I took some whacks at it myself a little later on, and the robot was having a great time going for my throat every time I let my guard down. I have to say, it's really quite an experience to be on the other end of a robot with a sword doing its level best to separate your head from your body, but considering all the dull, dirty, and dangerous tasks that we tend to saddle robots with, can you really blame them for being overly enthusiastic when we ask them to take a few good-natured swings in our direction?

[ Stanford Robotics ]
