Automaton

Clear Your Schedule: National Robotics Week is Just One Month Away

Last year, we had a blast at the first annual National Robotics Week, where we got the world's first look at Stickybot III, got some tasty chocolate from Willow Garage, and tried to best an Adept Quattro pick-and-place robot with a Wiimote (we failed).

Once again, National Robotics Week is much too badass to be constrained by one single week, which is why it's nine days long, running from the 9th to the 17th of April. Sponsors include heavyweights like iRobot, Adept, National Instruments, and Microsoft. As far as what you personally can get out of it, well, just check out this handy map for special events in your area.

Part of the point of National Robotics Week is to spread the word about how robotics is playing an increasingly important role in our lives, and how that makes robotics education even more important. If you're reading this blog, you're probably more familiar with the epic awesomeness of robots than most people you know, so don't just go to an event: take along someone who isn't familiar with robots, and show them why robotics is the future.

[ National Robotics Week ]

Robot's Magic Wheels Transform Into Legs

Wheels are great for moving fast and efficiently, but bad for negotiating terrain. Legs are great for negotiating terrain, but not as good for moving fast and efficiently. To create a robot that can move fast when it needs to but can also adapt to get around complex surfaces, a group from National Taiwan University's Bio-Inspired Robotic Laboratory (BioRoLa) created Quattroped, a robot that can turn its wheels into legs: 

How awesome is that, right?! It would probably be most accurate to say that the bot's wheels transform not into legs but into whegs, several varieties of which we've seen over the last couple of years. Whegs function similarly to legs, except that they move in a circle instead of back and forth, making them more effective at clambering over obstacles. And as you can see in the video, the bot can even "walk" by moving alternate pairs of whegs.
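To make that phasing concrete, here's a minimal sketch of an alternating-pair wheg gait. This is purely our own illustration, not BioRoLa's code (the real robot's software is covered below), and set_wheg_angle() is a hypothetical stand-in for the actual motor command:

```python
import math
import time

# Sketch of an alternating-pair wheg gait: diagonal pairs of whegs rotate
# half a cycle out of phase, so one pair supports the body while the other
# swings up and over an obstacle.

STEP_HZ = 1.0           # wheg rotations per second
PAIR_A = (0, 3)         # front-left and rear-right whegs
PAIR_B = (1, 2)         # front-right and rear-left whegs

def set_wheg_angle(wheg_id, angle_rad):
    """Hypothetical motor command: point the wheg hub at the given angle."""
    print(f"wheg {wheg_id}: {math.degrees(angle_rad) % 360:6.1f} deg")

def gait_step(t):
    """Command all four whegs at time t, with pairs 180 degrees apart."""
    phase = 2 * math.pi * STEP_HZ * t
    for w in PAIR_A:
        set_wheg_angle(w, phase)
    for w in PAIR_B:
        set_wheg_angle(w, phase + math.pi)

for i in range(5):
    gait_step(i * 0.1)
    time.sleep(0.1)
```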

Quattroped is equipped with GPS, a vision system, and a laser rangefinder, and the team is actively working to integrate more sensors to improve the robot's perceptual capabilities. On the software side, it's running National Instruments' LabVIEW, and while a remote PC is involved for control and data logging, most of the processing is done on the robot itself.

This is an amazingly adaptable platform, and besides the additional complexity in the wheel hubs and some minimal compromises on wheel strength, this type of thing seems like an obvious way to give mobile robots significant additional capabilities.

[ National Instruments ]

Navy Wants Robot Swarm That Can Autonomously Build Stuff, Apocalypse Unlikely

The US Navy is soliciting proposals for a program that's intended to develop a swarm of tiny robots capable of manufacturing complex objects, potentially including other robots. If you let your imagination go berserk, this may sound like a precursor to some sort of unstoppable robot uprising, but that's just fiction. And why would we waste our time talking about fictional robot uprisings or whatever when the real robots themselves are so much more interesting? Here's what the US Navy wants:

Develop a swarm of micro-robotic fabrication machines that will enable the manufacture of new materials and components. A micro-robot swarm should be able to perform material synthesis and component assembly, concurrently.  The micro-robots could be designed to perform basic operations such as pick and place, dispense liquids, print inks, remove material, join components, etc. Examples of complex material systems of potential interest include but are not limited to: multi-functional materials, programmable materials, metamorphic materials, extreme materials, heterogeneous materials, synthetic materials, etc.

Basically, it's one of those DARPA-esque "here's some crazy thing we want, now go make it happen" things. And it's actually crazier than it sounds, since "micro" is a bit misleading: what the Navy is really looking for are robots capable of manipulating "nano- and micron-scale building blocks." So these robots would be really, really small, and there'd need to be a whole heap of them cooperating, doing different jobs in the right places and in the right order. All right there, on your desk. You'd just dump out a bunch of these itty bitty robots, tell them you need a new cellphone or whatever, and they'd get busy and whip one up for you while you watch.
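To get a feel for what "right places and in the right order" actually demands, here's a toy sketch of greedy task allocation with dependencies. It's entirely our own illustration, with made-up task names, and has nothing to do with the actual solicitation:

```python
from math import hypot

# Toy coordination problem: fabrication steps with ordering constraints,
# and idle robots that greedily grab the nearest step whose dependencies
# are already done.

tasks = {
    "dispense": {"pos": (0, 0), "deps": []},
    "place":    {"pos": (1, 0), "deps": ["dispense"]},
    "join":     {"pos": (1, 1), "deps": ["place"]},
}
robots = {"r1": (0, 1), "r2": (2, 2)}
done = set()

while len(done) < len(tasks):
    # steps whose prerequisites are all complete
    ready = [t for t, spec in tasks.items()
             if t not in done and all(d in done for d in spec["deps"])]
    for rid, rpos in robots.items():
        if not ready:
            break
        # this robot takes the nearest ready step
        t = min(ready, key=lambda t: hypot(tasks[t]["pos"][0] - rpos[0],
                                           tasks[t]["pos"][1] - rpos[1]))
        ready.remove(t)
        robots[rid] = tasks[t]["pos"]   # robot moves to the task site
        done.add(t)
        print(f"{rid} -> {t}")
```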

Unsurprisingly, we've got a little ways to go before you'll be able to buy your own jar of magic robodust. The Navy solicitation is in three phases, with phase I being a proof of concept, and it's going to take some work to even get that far. But micro, nano, and swarm robots are all a reality already, so now that the government has decided to throw a bunch of money at the problem, it's just a matter of time before all the little pieces get put together and start working for us.

[ Navy Solicitation ] via [ Danger Room ]

Image of Alice swarmbots, from EPFL

TaxiBot Brings Autonomy to Aircraft Taxiing, Almost


This is TaxiBot. TaxiBot is big and strong and is capable of hauling the mighty Boeing 747 and the mightier Airbus A380 around airports, almost autonomously:

If you think about it, an airport is more or less the best possible place outside of a laboratory for an autonomous robotic vehicle to operate. It's tightly controlled, without random people wandering around all over the place or suicidal bicyclists. It's entirely flat. There are extremely well-defined areas in which vehicles can operate. Everything runs on a tight schedule (ideally). And as far as hauling airplanes around, there are huge freakin' yellow lines painted on the ground that a robot can follow anywhere it needs to go.
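Just to show how tractable that last part is, here's a minimal line-following sketch in Python with OpenCV. It's our own illustration, not anything from TaxiBot: threshold the camera image for taxiway yellow, find the line's centroid, and steer proportionally toward it.

```python
import cv2
import numpy as np

def steering_command(frame_bgr: np.ndarray) -> float:
    """Return a steering value in [-1, 1]; negative means steer left."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough HSV bounds for painted yellow; tune for real-world lighting.
    mask = cv2.inRange(hsv, (20, 100, 100), (35, 255, 255))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0                    # no line found: hold course
    cx = m["m10"] / m["m00"]          # x-centroid of the yellow pixels
    width = frame_bgr.shape[1]
    return (cx - width / 2) / (width / 2)   # proportional offset from center

# Example: a synthetic frame with a yellow stripe right of center.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[:, 100:110] = (0, 255, 255)     # yellow in BGR
print(steering_command(frame))        # positive: steer right, toward the line
```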

It's a little disappointing, then, that TaxiBot doesn't actually incorporate much in the way of autonomy. It's basically just a big remote control car that pilots can steer directly from the cockpit, and that's driven around by a human when it's not hauling aircraft. The point? The aircraft don't have to use their engines while taxiing, reducing wear and saving fuel. So that's good and all, I just kinda wish TaxiBot was, you know, a little less taxi and a little more bot. It's something they've got in the works, though: the company says that the vehicle's control architecture is already in place to support autonomous tug operation, so that in the near future no tug driver will be needed for taxiing. Sweet, bring it on!

[ Ricardo TaxiBot ]

Nao Gets Clever New Self-Charger

This is just an engineering prototype, but Nao's new self-charging station looks pretty slick. The robot checks out special marks on the base of the charger to align what looks like a special backpack with a (magnetic?) charging plug, and once it's attached, an extendable cord lets you continue to use the robot while it charges. Or, Nao will just relax a bit until it's topped off. When charging is complete, Nao swipes its arm across its back to detach the plug, which retracts back into the charger:

So, that's neat. It's also a little bit convoluted, if you ask me, but what do you want, a charger Nao could just walk onto that would charge it through its feet or something? Hey, now there's an idea...
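If you're curious how a mark-based docking routine like this might work, here's a rough sketch. It's our guess at the general approach, with a hypothetical detect_marks() step feeding it, and not Aldebaran's actual code:

```python
# Docking by visual servoing on the charger's alignment marks: rotate until
# the marks are centered in the image, then walk forward until they appear
# large enough, meaning the backpack plug is within reach.

def docking_step(offset_px: float, mark_width_px: float,
                 target_width_px: float = 80.0):
    """Return (turn, advance) commands from the marks' pixel measurements."""
    turn = -0.002 * offset_px                 # rotate toward the marks
    if abs(offset_px) < 10 and mark_width_px < target_width_px:
        advance = 0.05                        # centered but still far: approach
    else:
        advance = 0.0                         # either turning or already docked
    return turn, advance

# Marks 60 px left of image center and still small (far away):
print(docking_step(offset_px=-60.0, mark_width_px=30.0))  # turn first, then walk
```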

No info on pricing or availability just yet, but we'll keep you updated.

[ Nao ] via [ Robots-Dreams ]

Top 10 Robotic Kinect Hacks

We love Microsoft's Kinect 3D sensor, and not just because you can play games with it. At a mere $150, it's a dirt-cheap way to bring depth sensing and 3D vision to robots, and while open-source USB drivers have already made it easy, a forthcoming Windows SDK from Microsoft promises to make it even easier.

Kinect, which is actually hardware made by an Israeli company called PrimeSense, works by projecting an infrared laser pattern onto nearby objects. A dedicated IR sensor picks up on the laser to determine distance for each pixel, and that information is then mapped onto an image from a standard RGB camera. What you end up with is an RGBD image, where each pixel has both a color and a distance, which you can then use to map out body positions, gestures, motion, or even generate 3D maps. Needless to say, this is an awesome capability to incorporate into a robot, and the cheap price makes it accessible to a huge audience.
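Here's a minimal sketch of what that RGBD idea looks like in code, using synthetic arrays as stand-ins for a registered Kinect color/depth frame pair:

```python
import numpy as np

# Given a color image and a depth image already registered to it (one depth
# value per color pixel), every pixel carries both appearance and distance,
# so you can segment the scene by range in a single operation.

h, w = 480, 640
rgb = np.random.randint(0, 256, size=(h, w, 3), dtype=np.uint8)
depth_m = np.random.uniform(0.5, 5.0, size=(h, w)).astype(np.float32)

# Stack into a single RGBD array: 4 channels per pixel (R, G, B, distance).
rgbd = np.dstack([rgb.astype(np.float32), depth_m])

# Example use: keep only pixels within arm's reach of the robot (< 1 meter).
near_mask = rgbd[..., 3] < 1.0
print(f"{near_mask.mean():.1%} of the scene is within 1 m")
```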

We've chosen our top 10 favorite examples of how Kinect can be used to make awesome robots. Check them out:

1. Kinect Quadrotor Bolting a Kinect to the top of a quadrotor creates a robot that can autonomously navigate and avoid obstacles, creating a 3D map as it goes.

2. Hands-free Roomba Why actually vacuum when you can just pretend to vacuum, and then use a Kinect plus a Roomba to do the vacuuming for you?

3. iRobot AVA iRobot integrated two (two!) Kinect sensors into their AVA not-exactly-telepresence prototype: one to help the robot navigate and another one to detect motion and gestures.

4. Bilibot The great thing about Kinect is that it can be used to give complex vision to cheap robots, and Bilibot is a DIY platform that gives you mobility, eyes, and a brain in a package that costs just $650.

5. Gesture Surgery If you've got really, really steady hands, you can now use a Kinect that recognizes hand gestures to control a da Vinci robotic surgical system.

6. PR2 Teleoperation Willow Garage's PR2 already has 3D depth cameras, so it's kinda funny to see it wearing a Kinect hat. Using ROS, a Kinect sensor can be used to control the robot's sophisticated arms directly.

7. Humanoid Teleoperation Taylor Veltrop put together this sweet demo showing control over a NAO robot using Kinect and some Wii controllers. Then he gives the robot a banana, and a knife (!).

8. Car Navigation Back when DARPA hosted their Grand Challenge for autonomous vehicles, robot cars required all kinds of crazy sensor systems to make it down a road. On a slightly smaller scale, all they need now is a single Kinect sensor.

9. Delta Robot This Kinect-controlled delta robot doesn't seem to work all that well, which makes it pretty funny (and maybe a little scary) to watch.

10. 3D Object Scanning Robots can use Kinect for mapping environments in 3D, but with enough coverage and precision, you can use it to whip up detailed 3D models of objects (and people) too; the basic math is sketched after this list.
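As promised, here's the depth-to-3D math behind the mapping and scanning hacks above: back-project each depth pixel through a pinhole camera model. The intrinsics below are ballpark numbers for the Kinect depth camera, not calibrated values:

```python
import numpy as np

FX, FY = 580.0, 580.0        # focal lengths in pixels (approximate)
CX, CY = 320.0, 240.0        # principal point for a 640x480 image

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Convert an HxW depth image (meters) to an Nx3 array of XYZ points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX        # pinhole model: X = (u - cx) * Z / fx
    y = (v - CY) * z / FY
    pts = np.dstack([x, y, z]).reshape(-1, 3)
    return pts[pts[:, 2] > 0]    # drop pixels with no depth reading

# Example: a synthetic flat wall 2 meters away.
cloud = depth_to_points(np.full((480, 640), 2.0, dtype=np.float32))
print(cloud.shape, cloud[:2])
```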

Latest Geminoid Is Incredibly Realistic

Geminoid DK is the first Geminoid based on a non-Japanese person, and also the first bearded one.

Okay, I admit it... I found myself wondering whether this was in fact a real robot, or actually a person pretending to be a robot.

It's not a fake. This is the latest iteration of the Geminoid series of ultra-realistic androids, from Japanese firm Kokoro and Osaka University mad scientist, er, roboticist Hiroshi Ishiguro. Specifically, this is Geminoid DK, which was constructed to look exactly like associate professor Henrik Scharfe of Aalborg University in Denmark.

UPDATE: Wow. We've just found a new video that is absolutely amazing:

When we contacted Prof. Scharfe inquiring about the android, he confirmed: "No, it is not a hoax," adding that he and colleagues in Denmark and Japan have been working on the project for about a year now. His Geminoid, which cost some US $200,000, was built by Kokoro in Tokyo and is now at Japan's Advanced Telecommunications Research Institute International (ATR) in Nara for setup and testing.

"In a couple of weeks I will go back to Japan to participate in the experiments," he says. "After that, the robot is shipped to Denmark to inhabit a newly designed lab."

Geminoid DK does look pretty much exactly like the original template:

The Geminoid is on the right. I think.

If you're wondering why on Earth someone would want an exact robotic double of themselves, besides it being TOTALLY AND COMPLETELY AWESOME, the Geminoid is going to be used for studying human-robot interaction, in particular people's emotional responses when they face an android representing another person. Prof. Scharfe wants to find out if the robot can transmit a person's "presence" to a remote location, and whether cultural differences in people's acceptance of robots play a role.

These are some of the same questions that Hiroshi Ishiguro set out to explore when he created his own robot clone, the Geminoid HI-1, and a copy of a twentysomething Japanese model, the Geminoid F.

For his part, Ishiguro, a professor at Osaka University and group leader at ATR, declined to give us more details about his involvement with the Geminoid DK project, saying only that he and Scharfe "are working together."

As with the other Geminoid robots, all of Geminoid DK's movements and expressions are remote controlled by an operator at a computer, using a motion-capture system that tracks facial expressions and head movements. Turn your head and the Geminoid does the same; move your mouth and the android follows suit.
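Here's a toy sketch of what one frame of that teleoperation loop might look like. It's our own illustration with a placeholder actuator command; Kokoro and ATR haven't published their actual pipeline:

```python
# Captured head pose and mouth openness are smoothed and forwarded to the
# android's corresponding actuators every motion-capture frame.

ALPHA = 0.3   # low-pass factor: androids look wrong with jittery motion
state = {"head_yaw": 0.0, "head_pitch": 0.0, "mouth_open": 0.0}

def send_to_android(channel: str, value: float) -> None:
    """Placeholder for the real actuator command."""
    print(f"{channel:>10}: {value:+.3f}")

def teleop_frame(measured: dict) -> None:
    for channel, target in measured.items():
        # exponential smoothing, then clamp to the actuator's safe range
        state[channel] += ALPHA * (target - state[channel])
        send_to_android(channel, max(-1.0, min(1.0, state[channel])))

# Operator turns their head right and opens their mouth slightly:
teleop_frame({"head_yaw": 0.5, "head_pitch": 0.0, "mouth_open": 0.2})
```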

But it's not hard to imagine full autonomy in the not-too-distant future.

Incidentally, according to a note on his website, here's what Prof. Scharfe's wife thinks about his robotic double:

- She prefers body number 1

- She suggests that he should always send body number 2 to conferences and stuff

Prefers body number 1, eh? Does she know that body number 2 is upgradeable?

Here's another video and more (freaky) pics of Geminoid DK in the making to fuel your nightmares, enjoy:


Images and videos: Geminoid DK


Google Shows Us Why We All Need Robot Cars

We're pretty familiar with autonomous cars around here, and we've even been treated to a ride in one of Stanford's robots at the automotive innovation lab the university launched in partnership with Volkswagen. You might also remember Shelley, their autonomous Audi TTS, which raced to the top of Pikes Peak last year. Volkswagen's thinking behind all of this high-performance autonomous car stuff is that at some point, they'll be able to program your car to be a far, far better driver than you could ever be, and it'll have the ability to pull some crazy maneuvers to save you from potential accidents.


AeroVironment's Nano Hummingbird Surveillance Bot Would Probably Fool You

Back in July of 2009, we got our first look at AeroVironment's excessively hummingbirdish nano air vehicle (NAV) as it went through tethered and untethered tests. The more capable Phase II version that DARPA asked for is now complete, and is demonstrating controlled indoor and outdoor flight, endurance flights, and precision hovering:

The Nano Hummingbird may be slightly noisier than a real hummingbird, but it sure does look like one, and be honest: if you saw one of these things buzzing around, would you think it might be a surveillance robot, or would you just assume it was an obnoxiously loud bird?

The bot itself weighs in at 19 grams, with a 16-centimeter wingspan, and it can buzz around at upwards of 11 mph for over 10 minutes. Just like a real hummingbird, the NAV uses only its wings for propulsion and steering, and can move in any direction or rotate in place like a helicopter. It's not yet clear how resilient the robot is when it comes to impacts (a common hazard in the indoor environments where surveillance bots like this can be expected to operate), but in general, winged designs tend to be more capable and forgiving than something with a rotor.

The Nano Hummingbird isn't just a stable platform; it's quite nimble, too. Here it is doing a 360-degree loop:

Note that that wasn't just a flip; it was an autonomous flip, implying that the dude standing there with the controls is more or less redundant.

So what's the future for the NAV? Well, this is obviously just a prototype, but all of the functionality (including the payload and endurance) seems to be there. AeroVironment estimates that it would be about a decade before the Nano Hummingbird would be ready for reconnaissance deployment, but that strikes me as extremely conservative, especially considering the immediate usefulness of the platform. Besides, in a decade this thing will probably be refined and miniaturized to about the size of a flea, and we'll have them crawling over us all the time, everywhere.

On a side note, the Nano Hummingbird was chosen by DARPA to move to Phase II of development over Lockheed Martin's equally cool SAMARAI, which Lockheed was still working on as of October of last year. So pretty soon, it'll be birds, bugs, and trees all spying on you. Yay robots!

[ AeroVironment ] via [ Physorg ]

Climbing Robot Squirts Honey On Its Feet For Sticking Power

We've seen robots use a staggering variety of different techniques to climb things, and some of the most elegant (if not necessarily the most successful) are inspired by biology. Stanford's Stickybot is a good example of this, using nanoscale adhesive pads modeled on gecko feet to cling to smooth surfaces. But there are other animals that can stick to things even better than geckos can: our friends (and occasional enemies), the insects.

Insects climb in a couple of different ways. On rough surfaces, they usually rely on small claws (kinda like Spinybot), but on smooth surfaces, some insects secrete an oily fluid to help turn pads on their feet into little suction cups of a sort. Minghe Li, a roboticist at Tongji University in Shanghai, has created a climbing robot that mimics this capability using pliable silicone feet that squirt a mixture of honey and water onto the climbing surface.

It takes only a tiny bit of liquid for the feet to stick, and while the robot currently can't climb slopes past 75 degrees, this method may ultimately prove to be as effective as gecko-type sticky feet on smooth surfaces and more effective on rough or wet surfaces, which gecko adhesives have trouble with. Also, making artificial gecko feet is tricky and expensive, while making honey just involves being nice to bees.

Li is currently adapting his robot's feet to better emulate those of insects to try to improve its climbing effectiveness. He's also, I imagine, looking for a way to clean up the sticky little footprints that are undoubtedly all over his lab.

Via [ New Scientist ]
