Automaton

Nao Gets Clever New Self-Charger

This is just an engineering prototype, but Nao's new self-charging station looks pretty slick. The robot checks out special marks on the base of the charger to align what looks like a backpack with a (magnetic?) charging plug, and once it's attached, an extendable cord lets you continue to use the robot while it charges. Or, Nao will just relax a bit until it's topped off. When charging is complete, Nao swipes its arm across its back to detach the plug, which retracts back into the charger:

So, that's neat. It's also a little bit convoluted, if you ask me, but what do you want, a charger Nao could just walk onto that would charge it through its feet or something? Hey, now there's an idea...

No info on pricing or availability just yet, but we'll keep you updated.

[ Nao ] via [ Robots-Dreams ]

Top 10 Robotic Kinect Hacks

We love Microsoft's Kinect 3D sensor, and not just because you can play games with it. At a mere $150, it's a dirt-cheap way to bring depth sensing and 3D vision to robots, and while open-source USB drivers made it easy, a forthcoming Windows SDK from Microsoft promises to make it even easier.

Kinect, which is actually hardware made by an Israeli company called PrimeSense, works by projecting an infrared laser pattern onto nearby objects. A dedicated IR sensor picks up on the laser to determine distance for each pixel, and that information is then mapped onto an image from a standard RGB camera. What you end up with is an RGBD image, where each pixel has both a color and a distance, which you can then use to map out body positions, gestures, motion, or even generate 3D maps. Needless to say, this is an awesome capability to incorporate into a robot, and the cheap price makes it accessible to a huge audience.
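The RGBD fusion described above can be sketched in a few lines. Here's a minimal back-projection from a registered color + depth image pair to a colored 3D point cloud, using the standard pinhole camera model (the intrinsic values below are illustrative placeholders, not Kinect's calibrated parameters):

```python
import numpy as np

def rgbd_to_points(rgb, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a registered RGB + depth image pair into a colored
    3D point cloud using the pinhole camera model.

    rgb   : (H, W, 3) uint8 color image
    depth : (H, W) float depth in meters (0 where the sensor has no reading)
    fx, fy, cx, cy : camera intrinsics (illustrative values only)
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    valid = depth > 0                               # drop invalid pixels
    z = depth[valid]
    x = (u[valid] - cx) * z / fx                    # pinhole back-projection
    y = (v[valid] - cy) * z / fy
    points = np.stack([x, y, z], axis=1)            # (N, 3) positions
    colors = rgb[valid]                             # (N, 3) matching colors
    return points, colors
```

Feed it one frame and you get exactly the per-pixel color-plus-distance data the article describes, ready for gesture tracking or 3D mapping.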

We've chosen our top 10 favorite examples of how Kinect can be used to make awesome robots; check them out:

1. Kinect Quadrotor Bolting a Kinect to the top of a quadrotor creates a robot that can autonomously navigate and avoid obstacles, creating a 3D map as it goes.

2. Hands-free Roomba Why actually vacuum when you can just pretend to actually vacuum, and then use a Kinect plus a Roomba to do the vacuuming for you?

3. iRobot AVA iRobot integrated two (two!) Kinect sensors into their AVA not-exactly-telepresence prototype: one to help the robot navigate and another one to detect motion and gestures.

4. Bilibot The great thing about Kinect is that it can be used to give complex vision to cheap robots, and Bilibot is a DIY platform that gives you mobility, eyes, and a brain in a package that costs just $650.

5. Gesture Surgery If you've got really, really steady hands, you can now use a Kinect that recognizes hand gestures to control a da Vinci robotic surgical system.

6. PR2 Teleoperation Willow Garage's PR2 already has 3D depth cameras, so it's kinda funny to see it wearing a Kinect hat. Using ROS, a Kinect sensor can be used to control the robot's sophisticated arms directly.

7. Humanoid Teleoperation Taylor Veltrop put together this sweet demo showing control over a NAO robot using Kinect and some Wii controllers. Then he gives the robot a banana, and a knife (!).

8. Car Navigation Back when DARPA hosted their Grand Challenge for autonomous vehicles, robot cars required all kinds of crazy sensor systems to make it down a road. On a slightly smaller scale, all they need now is a single Kinect sensor.

9. Delta Robot This Kinect controlled delta robot doesn't seem to work all that well, which makes it pretty funny (and maybe a little scary) to watch.

10. 3D Object Scanning Robots can use Kinect for mapping environments in 3D, but with enough coverage and precision, you can use them to whip up detailed 3D models of objects (and people) too.
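Several of the hacks above (the quadrotor, the Roomba, the car) boil down to the same loop: grab a depth frame, find the nearest obstruction, and steer toward free space. A toy version of that policy, assuming a NumPy depth array and made up for illustration rather than taken from any of these projects, might look like:

```python
import numpy as np

def steer_from_depth(depth, too_close=0.8):
    """Toy obstacle-avoidance policy over a Kinect-style depth frame.

    Splits the frame into left/center/right thirds and steers toward the
    third with the most free space whenever the center is blocked.
    Returns one of 'forward', 'left', 'right', 'stop'.
    depth: (H, W) array of distances in meters (0 = no reading).
    """
    h, w = depth.shape
    d = np.where(depth > 0, depth, np.inf)   # treat invalid pixels as empty
    thirds = [d[:, :w // 3], d[:, w // 3:2 * w // 3], d[:, 2 * w // 3:]]
    left, center, right = [np.min(t) for t in thirds]
    if center > too_close:
        return 'forward'
    if max(left, right) <= too_close:
        return 'stop'                        # boxed in on all sides
    return 'left' if left > right else 'right'
```

Real systems build full occupancy maps instead of one nearest-obstacle reading, but the shape of the loop is the same.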

Latest Geminoid Is Incredibly Realistic

Geminoid DK is the first Geminoid based on a non-Japanese person, and also the first bearded one.

Okay, I admit it... I found myself wondering whether this was in fact a real robot, or actually a person pretending to be a robot.

It's not a fake. This is the latest iteration of the Geminoid series of ultra-realistic androids, from Japanese firm Kokoro and Osaka University mad scientist roboticist Hiroshi Ishiguro. Specifically, this is Geminoid DK, which was constructed to look exactly like associate professor Henrik Scharfe of Aalborg University in Denmark.

UPDATE: Wow. We've just found a new video that is absolutely amazing:

When we contacted Prof. Scharfe inquiring about the android, he confirmed: "No, it is not a hoax," adding that he and colleagues in Denmark and Japan have been working on the project for about a year now. His Geminoid, which cost some US $200,000, was built by Kokoro in Tokyo and is now at Japan's Advanced Telecommunications Research Institute International (ATR) in Nara for setup and testing.

"In a couple of weeks I will go back to Japan to participate in the experiments," he says. "After that, the robot is shipped to Denmark to inhabit a newly designed lab."

Geminoid DK does look pretty much exactly like the original template:

The Geminoid is on the right. I think.

If you're wondering why on Earth someone would want an exact robotic double of themselves, besides being TOTALLY AND COMPLETELY AWESOME, the Geminoid is going to be used for studying human-robot interaction, in particular people's emotional responses when they face an android representing another person. Prof. Scharfe wants to find out if the robot can transmit a person's "presence" to a remote location and whether cultural differences in people's acceptance of robots make a difference.

These are some of the same questions that Hiroshi Ishiguro set out to explore when he created his robot clone, the Geminoid HI-1, and a copy of a twentysomething Japanese model, the Geminoid F [see photos, right].

For his part, Ishiguro, a professor at Osaka University and group leader at ATR, declined to give us more details about his involvement with the Geminoid DK project, saying only that he and Scharfe "are working together."

Like with the other Geminoid robots, all of the movements and expressions of Geminoid DK are remote controlled by an operator with a computer, who uses a motion-capture system that tracks facial expressions and head movements. Turn your head and the Geminoid does the same; move your mouth and the android follows suit.
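As a rough illustration of that control path, here's a hypothetical sketch of mapping tracked head angles onto clamped servo targets, with a smoothing filter so motion-capture jitter doesn't make the android's head tremble. The joint names, limits, and filter constant are all invented for the example, not Geminoid internals:

```python
def head_pose_to_servo(yaw_deg, pitch_deg, servo_range=(-45.0, 45.0)):
    """Map an operator's tracked head pose onto android neck servo targets,
    clamping to the mechanism's limits. Angles in degrees; the +/-45 degree
    range is an illustrative guess, not a Geminoid spec."""
    lo, hi = servo_range
    clamp = lambda a: max(lo, min(hi, a))
    return {'neck_yaw': clamp(yaw_deg), 'neck_pitch': clamp(pitch_deg)}

class TeleopFilter:
    """Exponential smoothing of servo targets: alpha closer to 1 means a
    more responsive (but jitterier) android."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = {}

    def update(self, targets):
        for name, target in targets.items():
            prev = self.state.get(name, target)  # first frame: snap to target
            self.state[name] = prev + self.alpha * (target - prev)
        return dict(self.state)
```

Run `head_pose_to_servo` on each motion-capture frame and pass the result through one shared `TeleopFilter`, and you have the skeleton of "turn your head and the Geminoid does the same."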

But it's not hard to imagine full autonomy in the not-too-distant future.

Incidentally, according to a note on his website, here's what Prof. Scharfe's wife thinks about his robotic double:

- She prefers body number 1

- She suggests that he should always send body number 2 to conferences and stuff

Prefers body number 1, eh? Does she know that body number 2 is upgradeable?

Here's another video and more (freaky) pics of Geminoid DK in the making to fuel your nightmares, enjoy:


Images and videos: Geminoid DK


Google Shows Us Why We All Need Robot Cars

We're pretty familiar with autonomous cars around here, and we've even been treated to a ride in one of Stanford's robots at their automotive innovation lab, which they launched in partnership with Volkswagen. You might also remember Shelley, their autonomous Audi TTS, which autonomously raced to the top of Pikes Peak last year. Volkswagen's thinking behind all of this high performance autonomous car stuff is that at some point, they'll be able to program your car to be a far, far better driver than you could ever be, and it'll have the ability to pull some crazy maneuvers to save you from potential accidents.

Read More

AeroVironment's Nano Hummingbird Surveillance Bot Would Probably Fool You

Back in July of 2009, we got our first look at AeroVironment's excessively hummingbirdish nano air vehicle (NAV) as it went through tethered and untethered tests. The more capable Phase II version that DARPA asked for is now complete, and is demonstrating controlled indoor and outdoor flight, endurance flights, and precision hovering:

The Nano Hummingbird may be slightly noisier than a real hummingbird, but it sure does look like one, and be honest: if you saw one of these things buzzing around, would you think it might be a surveillance robot, or would you just assume it was an obnoxiously loud bird?

The bot itself weighs in at 19 grams, with a 16 centimeter wingspan, and it can buzz around at upwards of 11 mph for over 10 minutes. Just like a real hummingbird, the NAV uses only its wings for propulsion and steering, and can move in any direction or rotate in place like a helicopter. It's not yet clear how resilient the robot is when it comes to impacts (something that's common in indoor environments where surveillance bots like this can be expected to be used), but in general, winged designs tend to be more capable and forgiving than something with a rotor.

The Nano Hummingbird isn't just a stable platform, it's quite nimble, too. Here it is doing a 360 degree loop:

Note that that wasn't just a flip -- it was an autonomous flip, implying that the dude standing there with the controls is more or less redundant.

So what's the future for the NAV? Well, this is obviously just a prototype, but all of the functionality (including the payload and endurance) seems to be there. AeroVironment estimates that it would be about a decade before the Nano Hummingbird would be ready for reconnaissance deployment, but that strikes me as extremely conservative, especially considering the immediate usefulness of the platform. Besides, in a decade this thing will probably be refined and miniaturized to about the size of a flea, and we'll have them crawling over us all the time, everywhere.

On a side note, the Nano Hummingbird was chosen by DARPA to move to Phase II of development over Lockheed Martin's equally cool SAMARAI, which they're still working on as of October of last year. So pretty soon, it'll be birds, bugs, and trees that will all be spying on you. Yay robots!

[ AeroVironment ] via [ Physorg ]

Climbing Robot Squirts Honey On Its Feet For Sticking Power

We've seen robots use a staggering variety of different techniques to climb things, and some of the most elegant (if not necessarily the most successful) are inspired by biology. Stanford's Stickybot is a good example of this, using nanoscale adhesive pads modeled on gecko feet to cling to smooth surfaces. But there are other animals that can stick to things even better than geckos can: our friends (and occasional enemies), the insects.

Insects climb in a couple of different ways. On rough surfaces, they usually rely on small claws (kinda like Spinybot), but on smooth surfaces, some insects secrete an oily fluid to help turn pads on their feet into little suction cups of a sort. Minghe Li, a roboticist at Tongji University in Shanghai, has created a climbing robot that mimics this capability using pliable silicone feet that squirt out a mixture of honey and water onto the climbing surface.

It only takes a very little bit of liquid for the feet to stick, and while the robot currently can't climb slopes past 75 degrees, this method may ultimately prove to be as effective as the gecko-type sticky foot on smooth surfaces and more effective on rough or wet surfaces, which the gecko adhesive has trouble with. Also, making artificial gecko feet is tricky and expensive, while making honey just involves being nice to bees.

Li is currently adapting his robot's feet to better emulate those of insects to try to improve its climbing effectiveness. He's also, I imagine, looking for a way to clean up the sticky little footprints that are undoubtedly all over his lab.

Via [ New Scientist ]

Elfoid: A Pocket-Size Fetus-Like Robot Might Be Your Next Cellphone

UPDATE: I've obtained more photos of Elfoid -- see below.

Elfoid, a hybrid of cellphone and robot, transmits voice and motion to convey a person's "presence."

A pocket-size android shaped like a fetus might be your next cellphone.

Meet Elfoid, a miniature anthropomorphic robot unveiled today in Japan that works like a cellphone but is designed to transmit not only voice but also "human presence."

That's right. Next time you call your friends, you might be uploading yourself into this fetus body right into their pockets -- and their hands. Can you feel me now?


The idea is you use a motion-capture system to transmit your face and head movements to the Elfoid, which would reproduce them, plus your voice, on its own little body, thereby conveying your presence.

The contraption is a creation of Japanese roboticist Hiroshi Ishiguro, a professor at Osaka University, famous for creating android clones of himself and of a twentysomething Japanese model, among others. [See Ishiguro, in a black jacket, playing with an Elfoid, photos at the bottom.]

Last August, Ishiguro and his colleagues at the Advanced Telecommunications Research Institute International, known as ATR, where he's a visiting group leader, unveiled the Telenoid [photo right], an infant-size telepresence android that resembles, depending on whom you ask, Casper the Friendly Ghost, an overgrown sperm, or a developing fetus. Talk about embryonic technology.

Now the Japanese researchers have shrunk the Telenoid into a little robot elf you can carry in your pocket. The Elfoid P1, introduced today at a press conference in Tokyo, combines the robotic technology of the Telenoid with cellphone capability, allowing people to interact in a way that they can "feel each other's presence," according to Ishiguro. It seems the Elfoid can't move its face and limbs as the Telenoid does, but the researchers say they're planning to use microactuators to improve the device's movements.

His team received technical support from Qualcomm Japan to use a 3G communication unit on the android, and NTT Docomo assisted the researchers in testing the device.

Ishiguro says cellphones are constantly improving, and smartphones showed they can have superb interface designs, but one thing has remained the same: Voice still plays a big role in how we communicate, and voice has limitations.

Ishiguro says the human body -- capable of displaying and recognizing subtle cues and gestures -- is the most effective and natural interface for communication, so trying to use androids to capture these advantages makes sense. With the Elfoid, the researchers want to create "an innovative communication medium" capable of conveying human presence to remote locations using voice, appearance, motion, and touch (the Elfoid has a "soft, pleasant-to-the-touch exterior," they say).

And why the strange fetus-like looks? They explain that they sought a minimum design that could be recognized as male or female, old or young, and that users would use their imagination to make the robot more personal.

Any early adopters? Are you ready to "Elfoid" your friends?

Below, a video and more images:


Images: Osaka University and Advanced Telecommunications Research Institute International


Robot Seal Plays Basketball Better Than You


The furry creature above? No, it's not Paro, the Japanese therapeutic robot seal. This hoop-shooting mechatronic harp seal is a creation of Taiwanese roboticists at the Industrial Technology Research Institute. The thing is not really a fully-actuated robotic animal; it's more of a manipulator arm disguised as a stuffed plush seal, with its multi-fingered gripper freakishly sticking out of the creature's mouth, for added Uncanny Valley-esque creepiness.

The researchers claim their robot can convert hoops 99 percent of the time, but keep in mind it's shooting a toy basketball at close range (the maximum distance in the experiment was 3 meters). Still, watching this bizarre bot in action is utterly entertaining.

Jwu-Sheng Hu and colleagues described their robot at the IEEE International Conference on Intelligent Robots and Systems, last October, where they presented the paper "A Ball-Throwing Robot with Visual Feedback."

It looks like a simple stunt, but there's a good deal of tech behind it. The robot uses a stereo vision system to compute the position of the hoop in three-dimensional space. Based on that position, an algorithm determines the angle and speed the robot needs to launch the ball to hit the target. After some calibration procedures, the robotic arm does the rest. Watch out, LeBron!
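The angle-and-speed step is ordinary projectile math. Assuming an ideal, drag-free toy ball and a fixed launch angle, the required speed for a hoop `d` meters away and `h` meters above the release point follows directly from the trajectory equation. This is a sketch of the idea, not the paper's actual algorithm:

```python
import math

def launch_speed(d, h, theta_deg=60.0, g=9.81):
    """Speed needed to hit a target d meters away and h meters above the
    release point, launching at a fixed angle (ideal projectile, no drag).

    Derived by solving y = x*tan(t) - g*x^2 / (2*v^2*cos^2(t)) for v.
    Returns None when the target is unreachable at this angle.
    """
    t = math.radians(theta_deg)
    denom = 2.0 * math.cos(t) ** 2 * (d * math.tan(t) - h)
    if denom <= 0:
        return None                      # flat or descending line of fire
    return math.sqrt(g * d * d / denom)
```

In the real system the stereo cameras supply `d` and `h`, and calibration soaks up everything this ideal model ignores (air drag, release timing, ball spin).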

UPDATE: I incorrectly attributed this work to researchers at National Chiao-Tung University. The robot was actually built at Taiwan's Industrial Technology Research Institute. National Chiao-Tung University collaborated in the project.

Image and video: Industrial Technology Research Institute

Boston Dynamics Building Fast-Running Robot Cheetah, New Agile Humanoid

Computer model of Boston Dynamics' Cheetah robot galloping.

Boston Dynamics, best known for its BigDog bionic beast and other agile machines, is developing two new robots: one will be a super fast quadruped called Cheetah, which obviously should've been named BigCat; the other is a beautifully intricate, freakishly scary full-size humanoid called T-800 Atlas.

The Cheetah robot will have a flexible spine, an articulated head and neck, and possibly a tail. Think of BigDog, but rather than a robot mule, Cheetah will be able to accelerate rapidly and make tight turns so it can "chase or evade," the company said in a statement.

In fact, Boston Dynamics says Cheetah will sprint "faster than any existing legged robot and faster than the fastest human runners." That's a bold claim. But seeing what the company has demonstrated with BigDog, we're excited to see this cybernetic cat stepping out of their lab.

The second robot Boston Dynamics is building, the humanoid Atlas, will have a torso, two arms and two legs, and will be capable of climbing and maneuvering in rough terrain. The robot will "sometimes walk upright as a biped, sometimes turning sideways to squeeze through narrow passages," and sometimes crawl, using its hands for extra support and balance. (I don't suppose they'll run the Cheetah running algorithms on it, or will they?)  

Atlas, a new humanoid robot that Boston Dynamics is developing, will rely on hardware built for another of the company's robots, Petman, shown here during initial assembly and testing.

Atlas will be based, in part, on Petman, an anthropomorphic robot Boston Dynamics developed for the U.S. Army. Until recently, only the robot's legs had been made public, but now the company has unveiled its full (well, headless) body [see photo above].

Atlas will be different from existing humanoids that use static techniques to control their movements, relying instead on a dynamic control approach, the company said. "Unlike Honda’s Asimo and most other humanoid robots you’ve seen, Atlas will walk like a man, using a heel-to-toe walking motion, long strides and dynamic transfer of weight on each step," said Rob Playter, the Atlas principal investigator and vice president of engineering at Boston Dynamics.

Another bold claim. So far we've seen that Boston Dynamics can make its Petman humanoid run -- and fast. But I still want to see it walking around in human spaces, negotiating obstacles on the floor, or keeping its balance when someone pokes it on the chest -- things that other humanoids, including, yes, Asimo, have demonstrated a long time ago.

The company says both robots will stand out for their use of dynamic agility, throwing or swinging their legs and arms to maintain balance and overcome obstacles. “For these programs to succeed we must develop robot hardware and software with the speed, flexibility and strength of athletes, and a more fundamental understanding of how legs work,” said Marc Raibert, lead investigator of the Cheetah program and president of Boston Dynamics.

Boston Dynamics, based in Waltham, Mass., will develop the two robots as part of new contracts that the U.S. Defense Advanced Research Projects Agency (DARPA) has awarded the company. 

The company says that, in addition to military applications, Cheetah and Atlas could find uses in emergency response, firefighting, advanced agriculture, and vehicular travel in places that are inaccessible to conventional wheeled and tracked vehicles.

Images: Boston Dynamics

World's First Robot Marathon Ends With Great Finale


Marathon runners require long hours of training, plenty of water, and an iron will. In the world's first bipedal robot marathon, the key ingredients seemed to be line-tracking algorithms, batteries, and lots of compressed air coolant.

The 42.195-kilometer race (the length of a real marathon) took place in Osaka, and a little humanoid robot called Robovie-PC was the big champion. It crossed the finish line on Saturday, after a grueling 54 hours, 57 minutes and 50.26 seconds -- more than two days running non-stop on the track. Only 1.73 seconds later, another contestant, Robovie-PC Lite, completed the race. The robot naming isn't a coincidence: The two robots were the submissions of Vstone robotics company, which organized the event with the city of Osaka.
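Those numbers check out with a quick back-of-the-envelope computation of the winning pace:

```python
# Sanity-check the winning pace: 42.195 km in 54 h 57 min 50.26 s.
distance_km = 42.195
elapsed_h = 54 + 57 / 60 + 50.26 / 3600            # total elapsed hours
avg_speed = distance_km / elapsed_h                 # km/h
print(round(elapsed_h / 24, 2), "days,", round(avg_speed, 2), "km/h")
# prints: 2.29 days, 0.77 km/h
```

That's a bit over two days on the track at walking-a-tortoise pace, roughly the 0.7 km/h figure quoted below.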

It was an exciting ending. Watch:

What makes a winning robot? Team Vstone used line-tracking to navigate the track, taking advantage of the rule allowing autonomous navigation. The other four teams, including two student teams from Osaka Institute of Technology, patiently controlled their bots using game controller-like remotes. A dedicated human presence was also necessary to support the "runners": When batteries ran low, teams rushed in to swap them for fresh ones. Periodically, teams also needed to spray overheating motors with cans of cool, compressed air. Falling, however, was not a problem -- all robots had to be designed with an automatic "getting up" feature.
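Line tracking of the kind Team Vstone exploited is classically done with a proportional controller over a bar of reflectance sensors. Here's a hypothetical minimal step function (not the team's actual code):

```python
def line_follow_step(sensors, k=0.5):
    """One step of a proportional line follower (illustrative sketch).

    sensors: list of reflectance readings across the robot's footpath,
             higher = more line under that sensor.
    Returns a steering command in [-1, 1]: negative = steer left.
    """
    n = len(sensors)
    total = sum(sensors)
    if total == 0:
        return 0.0                       # line lost: hold course
    # Weighted centroid of the line across the sensor bar.
    center = (n - 1) / 2.0
    centroid = sum(i * s for i, s in enumerate(sensors)) / total
    error = (centroid - center) / center  # normalized to [-1, 1]
    return max(-1.0, min(1.0, k * error))
```

Call it every control tick and feed the result to the legs' turning gait; the gain `k` trades responsiveness against weaving.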

At an average speed of 0.7 km/h, the robots were about as exciting as watching a tortoise cross the Sahara. However, these endurance races highlight the requirements for long-running, autonomous robots. Robots that don't have their own dedicated pit crew need autonomous navigation, automatic recharging, and low-maintenance actuators. The bipedal aspect was also important; stairs and raised sidewalks are constant reminders that our world is designed for two-legged humans.

The Japanese government is aiming for robots to take care of their nation's aging population. But if we want robots to take care of us, instead of the other way around, we'll first need to see a robot marathon where no human intervention is required. So what we saw here were the first steps -- literally.

Image: Vstone Co.

Angelica Lim is a graduate student at the Okuno and Ogata Speech Media Processing Group at Kyoto University, Japan.


Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
 
Contributor
Jason Falconer
Canada
Contributor
Angelica Lim
Tokyo, Japan
 
