
Robots Preparing to Defeat Humans in Soccer

Can a team of soccer-playing robots beat the human World Cup champions by 2050?

That's the ultimate goal of RoboCup, an international tournament where teams of soccer robots compete in various categories, from small wheeled boxes to adult-size humanoids.

IEEE Spectrum's Harry Goldstein traveled to Singapore to attend RoboCup 2010 -- and check out how the man vs. machine future of soccer is playing out.

Special thanks to Aamir Ahmad and Prof. Pedro U. Lima from the Institute for Systems and Robotics, Instituto Superior Técnico, in Lisboa, Portugal; Prof. Mike Wong Hong Kee from Singapore Polytechnic; and the Advanced Robotics & Intelligent Control Centre at Singapore Polytechnic for additional footage.

Cornell's Ranger Robot Breaks New Walking Record


Ranger, a four-legged bipedal robot, set an unofficial record at Cornell last month by walking 23 kilometers (14.3 miles), untethered, in 10 hours and 40 minutes. Walking at an average pace of 2.1 km/h (1.3 miles per hour), Ranger circled the indoor track at Cornell's Barton Hall 108.5 times, taking 65,185 steps before it had to stop and recharge. Ranger walks much like a human, using gravity and momentum to help swing its legs forward, though it looks like a boom box on stilts. Its swinging gait resembles that of a person on crutches: the robot has no knees, its two outer legs are connected at the top, and its two inner legs are connected at the bottom.

Engineering students at Cornell's Biorobotics and Locomotion Laboratory stayed up all night on Tuesday, July 6, 2010, while their professor, Andy Ruina, cheered them on over Skype. Jason Cortell, a research engineer specializing in electronics and the lab's manager, steered Ranger using a remote control. He walked alongside the robot for most of its nearly 11-hour trek but was carted around when he got tired, controlling the robot all the same. "When he had to take a bathroom break, he made a run for it while Ranger was on a straightaway," says Ruina.

This is a competitive milestone for the lab, which has been (unofficially) vying for the record with Boston Dynamics' BigDog over the past two years. The original record was set by Ruina's lab in April 2008, when Ranger walked 9 km (5.6 miles) around Barton Hall. It was subsequently broken by BigDog, which walked 20.6 km (12.8 miles). "Ranger competing with BigDog is like Bambi meets Godzilla," says Ruina. "While DARPA funds Boston Dynamics with tens of millions of dollars a year, we've probably received a total of $1 million in funding over many years." Most of the lab's funding comes from the NSF's Robust Intelligence program.

"I don't anticipate bi-pedal robots being necessarily important in the world of engineering," says Ruina. "What fascinates me is the scientific aspect of bi-pedal robots. It's an indirect way to understand human beings. By studying the legs, joints, and length ratios we appreciate the beauty of nature's design."

But the overall aim of the project isn't to reverse engineer a human being -- it is a study of energy efficiency, and the goal is to figure out how to build a robot that moves as efficiently as a human. "Human beings are robust and energy stingy," explains Ruina. "We are trying to get a robot to be as reliable as a human being. If Ranger walks 14 miles, he uses 3 cents of electricity, which is more than twice as much as a human of equal weight would have used for the same distance." The findings could inform biological research on rehabilitation, prosthetics, and athletic performance.
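That 3-cents-for-14-miles figure is the kind of number legged-robot researchers boil down to a dimensionless "cost of transport": energy used divided by weight times distance. Here's a minimal sketch of the calculation in Python; the robot mass and electricity price below are illustrative assumptions, not figures from the Cornell lab:

```python
# Cost of transport: a standard figure of merit for comparing walking
# machines with humans. Lower is better. The mass and electricity-price
# numbers here are assumptions for illustration only.

G = 9.81  # gravitational acceleration, m/s^2

def cost_of_transport(energy_joules: float, mass_kg: float, distance_m: float) -> float:
    """Dimensionless energy used per unit weight per unit distance."""
    return energy_joules / (mass_kg * G * distance_m)

# Assumed inputs: 3 cents of electricity at a notional $0.10/kWh
# is 0.3 kWh; 1 kWh = 3.6e6 J. Robot mass of 10 kg is also assumed.
energy = 0.3 * 3.6e6      # joules
mass = 10.0               # kg
distance = 23_000.0       # meters walked

print(f"cost of transport: {cost_of_transport(energy, mass, distance):.2f}")
```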

What's up next for Ranger? The lab aims to have Ranger walk 30 to 80 km (about 20 to 50 miles), while continuing to cut back on energy consumption. Ruina also wants to see Ranger on an outdoor track with solar cells on top of its head. "Ranger would stop when it gets tired," he explains. "Then wait for the sun to charge him back up so he could go, go, go!"

Here's a video report from IDG and more photos:

 Images: Biorobotics and Locomotion Laboratory/Cornell University

Engineers Turn Robot Arm into Formula 1 Simulator


Sometimes it's hard to believe that researchers are contributing to science instead of just having a blast.

As Paolo Robuffo Giordano and colleagues at the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany, would have it, scientific research means riding the business end of a giant industrial robot arm while playing video games. But hey -- they produced some serious research with it, which was presented at ICRA 2010.

The CyberMotion Simulator is basically a full motion simulator adapted to a racing car game. Players (or subjects, as the researchers prefer to call them) sit in a cabin on a robot arm some 2 meters off the ground and drive a Ferrari F2007 around a projected track using a force-feedback steering wheel and pedals. The aim is to make the experience as realistic as possible without having to buy a real F2007, and to test the simulator with an environment that demands sudden, massive accelerations.

The robot arm is a Robocoaster, a modified six-axis Kuka KR 500 that can lift up to 500 kg. It's usually found in amusement parks and normally does not allow users any control. Robuffo Giordano and his collaborators want to use it to study how we perceive motion; their paper, though, deals with the mechanics of the simulator.

"A motion simulation system is a fundamental tool to understand how humans experience the sensation of motion," he says. "By running suitable experiments, one can gain better insights into the cognitive processes of the human brain."

Most motion simulators use actuated six-axis hexapods, or Stewart platforms, to recreate motion; the CAE 5000, a typical flight simulator, moves on six cylindrical jacks. But lab director Heinrich Bülthoff wanted to use a robotic manipulator arm to study multisensory motion perception, and F1 racing seemed like a challenging way to test the notion.

"The main challenges were related to the adaption or extension of existing motion control algorithms," says Robuffo Giordano, who worked on the F1 arm as a control and robotics engineer. "Our system offers a much larger motion envelope [than Stewart platforms], allowing subjects to be freely displaced in six degrees of freedom in space and even be placed upside-down."

The video above shows what it's like to turn the Kuka KR 500 into an F1 car. You can hear the loud whine of the arm as the driver takes the curves in a simulation of the famed Monza track. The 3D visuals of the course were created by Robuffo Giordano's colleagues Joachim Tesch and Martin Breidt. Not bad for a DIY arcade game.

The team was able to reduce the delay in the robot's reaction to just 40 milliseconds and is satisfied with the results. The researchers believe the CyberMotion Simulator can be adapted to recreate the experience of being in a plane, a helicopter, a ship, and other vehicles. Another possibility is telepresence applications.

For now, the researchers are working on expanding the motion envelope of the simulator by adding movement to the cabin itself, as well as other improvements.

Meanwhile, I'd love to plug in Gran Turismo 5 and go for a spin, robot-style.

Updated on 9 August 2010 to clarify goal of the project and correct delay in robot's reaction.

Read also:

Autonomous Car Learns To Powerslide Into Parking Spot
Mon, May 10, 2010

Blog Post: Stanford's Junior uses two control schemes to stop right on target

CyberWalk: Giant Omni-Directional Treadmill To Explore Virtual Worlds
Wed, April 28, 2010

Blog Post: Built by European researchers, it's the world's largest VR platform

Tactile Gaming Vest Punches and Slices
Fri, March 26, 2010

Blog Post: Now you, too, can feel the pew-pew-pew

Robots With Knives: A Study of Soft-Tissue Injury in Robotics
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

Telenoid R1: Hiroshi Ishiguro's Newest and Strangest Android


Image: Osaka University and ATR Intelligent Robotics and Communication Laboratories

Japanese roboticist Hiroshi Ishiguro has just unveiled a new teleoperated android: a strange robotic creature called the Telenoid R1.

Ishiguro, a professor at Osaka University, is famous for creating humanlike androids designed to "transmit the presence" of people to a distant place. His previous remote controlled androids include a robot replica of himself that he named Geminoid HI-1 and a smiling female android called the Geminoid F.

But the new Telenoid R1 robot is quite different. The previous androids had lifelike appearances, with every detail trying to reproduce the features of a real person. The Telenoid has a minimalist design. The size of a small child, it has a soft torso with a bald head, a doll-like face, and stumps in place of limbs. It looks like an overgrown fetus.

Ishiguro and his collaborators say the idea was to create a teleoperated robot that could appear male or female, old or young, and that could be easily transported. The new design pushes the envelope of human-robot interaction, and Ishiguro is certainly not afraid of exploring the depths of the uncanny valley.

The researchers, who demonstrated the robot today at a press conference in Osaka, hope it will be used as a new communication device, with applications in remote work, remote education, and elderly care. The goal of the project, a collaboration between Osaka University and Japan's Advanced Telecommunications Research Institute International, known as ATR, is to investigate the essential elements for representing and transmitting humanlike presence.

Here's how the system works: An operator sits at a computer with a webcam and special teleoperation software developed by ATR. The computer captures voice and tracks the operator's face and head movements. The voice and some movements are transmitted to the Telenoid. The operator can also push buttons to activate other behaviors.
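In code, the operator's side boils down to a capture-track-transmit loop. Here's a hypothetical sketch in Python -- the function names and message format are invented stand-ins for illustration, not ATR's actual teleoperation software:

```python
# Hypothetical teleoperation loop: track the operator's head from a
# webcam frame and stream pose targets to the android at a fixed rate.
import random
import time

def track_head_pose(frame):
    """Stand-in for real face tracking: fake small yaw/pitch/roll angles (deg)."""
    return [random.uniform(-5.0, 5.0) for _ in range(3)]

def send_to_robot(pose):
    """Stand-in for the network link to the android."""
    print("head target (deg):", [round(a, 1) for a in pose])

def teleop_loop(steps=5, rate_hz=30):
    period = 1.0 / rate_hz
    for _ in range(steps):
        frame = None                      # a real system would grab a webcam frame here
        send_to_robot(track_head_pose(frame))
        time.sleep(period)

teleop_loop()
```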

Even its creators admit the Telenoid R1, which will be demonstrated at this year's Ars Electronica festival in Linz, Austria, is a bit, uh, eerie:

The unique appearance may be eerie when we first see it. However, once we communicate with others by using the Telenoid, we can adapt to it. If a friend speaks from the Telenoid, we can imagine the friend's face on the Telenoid's face. If we embrace it, we have the feeling that we embrace the friend.

The Telenoid R1 uses DC motors as actuators -- just nine in its whole body. Ishiguro's previous androids use pneumatic actuators; the Geminoid HI-1 has 50 actuators, and the Geminoid F has 12. The Telenoid's smaller and simpler body helped reduce development and production costs. A research version of the robot will cost about US $35,000, and a commercial version about $8,000. Both will be available later this year, distributed by Eager Co. of Japan.

UPDATED: Added price and availability.

Videos and more images:

All images courtesy of Osaka University and ATR Intelligent Robotics and Communication Laboratories

Read also:

Geminoid F: New Smiling Female Android
Sat, April 03, 2010

Who's Afraid of the Uncanny Valley?
Fri, April 02, 2010

Hiroshi Ishiguro: The Man Who Made a Copy of Himself
April 2010

The Amazing Androids of Hiroshi Ishiguro
Tue, November 17, 2009

Little Soccer Robots Dribble, Kick, Score

The Darmstadt Dribblers have some of the most impressive humanoid robots in the RoboCup tournament. For the second year in a row, the team from the Technische Universität Darmstadt, Germany, took the title in the kid-size humanoid league (for robots 30-60 centimeters tall). How did they do it?

IEEE Spectrum's Harry Goldstein went to RoboCup 2010 in Singapore to find out. Watch the video below to see these amazing little robots in action, along with an interview with Dribblers team member Dorian Scholz. Then visit the team's YouTube channel for more videos, including this year's kid-size final.

Robot Head Flobi Designed To Dodge Uncanny Valley


Does this blushing fake face freak you out? German engineers at Bielefeld University hope it's more cute than creepy. It's a robotic head called Flobi, designed to express emotions while overcoming the creep factor associated with the Uncanny Valley -- the unsettling effect of robots that look almost, but not quite, human.

There have been a number of robotic heads developed over the years in attempts to replicate human interaction and emotional expression. Flobi, described by Ingo Lütkebohle and colleagues in a paper presented at the ICRA 2010 conference, is notable for its magnetic mouth actuator system and a modular construction that lets an operator perform a sex change of sorts: in only a few minutes, Flobi's plastic face and hair features can be swapped for male or female.

Flobi, about the size of a human head, was designed to be effective at both communication and sensing. It has 18 actuators and is equipped with microphones, gyroscopes, and high-res cameras. As seen in the photos and video below, it can express a variety of emotions, such as anger and surprise, by moving its eyes, eyebrows, and mouth. It also has four cheek LEDs to mimic blushing.
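One common way to drive a face like this is to store each expression as a set of actuator targets and interpolate between them. Here's a hypothetical Python sketch of that idea; the joint names and values are invented for illustration, since the post doesn't describe Flobi's actual control interface:

```python
# Hypothetical expression table: each emotion maps a few face
# actuators (normalized -1..1) plus the cheek LEDs to target values.

EXPRESSIONS = {
    "neutral":     {"brow_l":  0.0, "brow_r":  0.0, "mouth":  0.0, "cheek_led": 0.0},
    "surprise":    {"brow_l":  0.9, "brow_r":  0.9, "mouth":  0.6, "cheek_led": 0.0},
    "anger":       {"brow_l": -0.8, "brow_r": -0.8, "mouth": -0.4, "cheek_led": 0.0},
    "embarrassed": {"brow_l":  0.2, "brow_r":  0.2, "mouth": -0.1, "cheek_led": 1.0},
}

def set_expression(name: str):
    """Command each actuator toward its stored target for this emotion."""
    for joint, value in EXPRESSIONS[name].items():
        # A real driver would send the motor/LED command here.
        print(f"  {joint:10s} -> {value:+.1f}")

for emotion in ("surprise", "anger", "embarrassed"):
    print(emotion)
    set_expression(emotion)
```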

The most obvious design features: the hardware is almost completely concealed, there are no holes in the face, and its appearance is cartoonish, not unlike Philips' human-robot interaction platform, the iCat.

"The comic nature is far enough from realistic not to trigger unwanted reactions, but close enough that we can take advantage of familiarity with human faces," Lutkebohle and colleagues write. "One advantage we have already explored is the use of 'babyface' attributes."

These features were deliberate choices, meant to keep people interacting with Flobi from noticing the degree to which it falls short of human. Robots with silicone-rubber skin, such as CB2, a mechatronic baby built at Osaka University, often fall into that gap and can seem creepy.

"While I wouldn't say that plastic is inherently superior to other materials, it is instantly recognizable as a human artifact," says Lutkebohle. "Furthermore, rigid plastics match current actuation capabilities in contrast to latex, for instance, which is hard to actuate without triggering the Uncanny Valley effect."

So does Flobi overcome the Uncanny Valley? That's highly subjective, but at least one roboticist unaffiliated with the project thinks it does.

"There is value in robotics making aesthetics and design a first class citizen," says Andrea Thomaz, an assistant professor of interactive computing at Georgia Institute of Technology, who wrote about Flobi on her blog. "How the robot looks will directly impact people's perceptions of it, and in turn will impact any studies of interaction done with the platform. The Flobi project demonstrates a couple of interesting ideas with respect to robot design: magnetic features to limit awkward holes on the face, and swappable features to easily change the gender/style."

Will plastic, and not rubber, become the dominant medium of robot design? We'll see whether Flobi inspires other robot builders to climb out of the Uncanny Valley.

Images and video: Bielefeld University

Invasion of the Robot Babies

Every week I grab New York Magazine and flip to the last page to see its despicably funny Approval Matrix. I like it so much, in fact, that I decided to shamelessly rip it off -- robot style.

Did you notice there's been a proliferation of robot infants, robot toddlers, and robot children in the past few years? It seems that roboticists enjoy becoming parents to bionic babies.

They build them for a good reason: These bots help researchers not only learn more about robotics but also investigate human cognition, language acquisition, and motor development. Some Japanese researchers even say robot babies could help introduce young people to the wonders of parenthood and boost birth rates. Yes, robot babies help make real babies!

So check out our reality matrix below, where we rated each robot according to its similarity to humans and its technical capabilities. What's the coolest? Cutest? Creepiest? And did we forget any? Let us know.

Click on the image to see a larger -- readable! -- version.

Clockwise from top left: Aldebaran Robotics; Thomas Bregardis/AFP/Getty Images; RobotCub; Kokoro; Erico Guizzo; Koichi Kamoshida/Getty Images; University of Tsukuba; Tony Gutierrez/AP Photo; Sankei/Getty Images; Georgia Institute of Technology; Yoshikazu Tsuno/AFP/Getty Images; University of Bonn; Erico Guizzo

READ ALSO:

Lessons From a Mechanical Child
Wed, November 11, 2009

Blog Post: A child humanoid robot called iCub is helping Swiss scientists study cognition, learning, and mobility

Humanoid Mimics Person in Real Time
Tue, April 27, 2010

Blog Post: An operator wearing a sensor suit can control this robot's arm and leg movements

Geminoid F, the Smiling Female Android
Sat, April 03, 2010

Blog Post: Geminoid F, a robot copy of a woman in her twenties, is designed with natural facial expressions

Who's Afraid of the Uncanny Valley?
Fri, April 02, 2010

Blog Post: To design the androids of the future, we shouldn't fear the uncanny valley

A Robot That Fetches You a Beer, If You Ask Nicely

It's late on a Friday afternoon at Gamma Two, a robotics development company in Denver, Colo., and that can only mean one thing: It's time for BeerBot.

"Wilma?" Jim Gunderson tells the robot next to him, a cabinet-shaped machine that seems straight out of a 1970s science fiction movie.

"What do you want me to do?" the robot responds.

"Deliver."

"What do you want delivered?"

"Beer."

After driving itself to the kitchen and muttering along the way, Wilma the robot delivers the drink to its owner.

Gamma Two is run by husband-and-wife team Jim and Louise Gunderson, whose pride and joy these days are two autonomous mobile robots named Wilma and Basil.

The Gundersons designed the robots as personal servants that they now plan to commercialize. The machines can respond to voice commands and perform tasks such as delivering drinks and ... well, that's pretty much all they can do now.

But the Gamma Two couple has ambitious plans for the bots. Jim Gunderson, whose title is "cognitive systems architect," says many people would benefit from robots that could work as a nurse's aide or a discreet butler.

People with physical limitations, for example, could use a robot to keep track of their schedules, carry the laundry or groceries, or to follow them around with pills, glasses, phone, or a TV remote control. The robots could also check whether their owners took their pills and if not, send a text message to a caregiver, or even call for help.

The robots [see photo below] have multiple microcontrollers and a computer running Linux. They use voice recognition to respond to their owners' commands, which according to the Gundersons enables "a very smooth interaction, much like giving commands to a very intelligent, well trained dog."


The Gundersons even programmed the robots with personalities and tempers, so they respond better if you're nice to them, saying "please" and "thank you."

To sense the environment, the robots use arrays of sonars. The sonars send data to the robot's "cybernetic brain," as the couple calls it, which processes the data to identify objects such as chairs, tables, corners, and most important, people. The robots can also recognize places like a kitchen or living room, because each has a specific sonar signature.
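The place-recognition idea can be sketched very simply: store a typical sonar sweep per room and match fresh readings against the stored templates. Here's an illustrative Python version -- the signatures and the nearest-neighbor matching are invented for this example, and the Gundersons' "cybernetic brain" is surely more sophisticated:

```python
# Toy "sonar signature" room classifier: compare a fresh sweep from an
# 8-sonar ring against stored per-room templates and pick the closest.
import math

SIGNATURES = {
    # room: typical range readings in meters (invented values)
    "kitchen":     [1.2, 0.8, 2.5, 3.0, 1.1, 0.9, 2.2, 2.8],
    "living room": [3.5, 3.2, 4.0, 4.2, 3.8, 3.6, 4.1, 3.9],
    "hallway":     [0.6, 0.5, 4.5, 4.8, 0.6, 0.5, 4.6, 4.7],
}

def classify_room(sweep):
    """Return the room whose stored signature is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGNATURES, key=lambda room: dist(sweep, SIGNATURES[room]))

print(classify_room([1.1, 0.9, 2.4, 3.1, 1.0, 1.0, 2.1, 2.9]))  # -> kitchen
```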

Equipped with two motors, the robots can move at up to 9.6 kilometers per hour [6 miles per hour], though they drive much slower when they sense objects around them. Encoders keep track of their position.

Gamma Two plans to build the robots as customers order them. Each will cost from US $12,000 to $20,000 and take six to eight weeks to assemble. You can also rent them for parties and events.

Jim and Louise, who've been married for 27 years, form a two-headed entity with a mean IQ and deep knowledge of things as varied as tiger salamanders and extreme programming. They hold several patents and have presented numerous papers at IEEE conferences.

"A lot of our ideas come out of animated discussions between my husband and I," says Louise, the president and CEO. "We are a team up to and including the fact that we team code together."

The two met in San Francisco in the '70s while Louise was studying chemistry at UC Berkeley. They later moved to Denver where Louise received her master's in environmental science. Twenty years later, they traveled to the University of Virginia to pursue doctorates in computer science (Jim) and systems engineering (Louise). In 1997 they formed Gunderson and Gunderson, which later became Gamma Two.

The Gundersons believe there's a major obstacle preventing robots from becoming practical in daily applications. The problem, they claim, is a chasm between a robot's ability to sense the world and its ability to reason about it.

Humans and animals can perceive things, create mental models, perform manipulations and computations using those models, and then plan and execute actions. A gap between sensing and reasoning is preventing robots from doing the same. The two wrote a book detailing their ideas, "Robots, Reasoning, and Reification," published by Springer in 2008.

Louise, who also studied biology, applies her insight about living systems to her study of robotics. The robots she and Jim design have hardware and software components structured as in a living creature, with a "robotic cerebellum" to handle movement and a "reification engine" to classify objects, for example.

Gamma Two is located in the Arts District of Denver and on the first Friday of every month the Gundersons open up their lab as if it were a gallery. For over a year they've been loading trays of canapes onto Wilma and Basil, which mingle among guests serving crab puffs, cheese, and crackers.

If you are in Denver this August, be sure to stop by the Gamma Two labs. To celebrate the life of Andy Warhol, Jim and Louise will paint their robots' panels with Warhol-like designs. Be the first of your friends to say you've tried robotically served hors d'oeuvres.

And don't forget to say "thank you."

Photos and video: Gamma Two Robotics

SADbot: A DIY Drawing Machine That Loves Light


Outside of New York City’s Eyebeam studio, an artist's hub dedicated to the convergence of art and technology, two women pause to see a pen doodling across a canvas behind a window. When they touch little circles on the glass, the pen changes direction.

“What’s this?” they ask. Then they read the description. This is a SADbot.

SADbot, or Seasonally Affected Drawing Robot, is a solar-powered, interactive drawing machine created by Eyebeam artists Dustyn Roberts and Ben Leduc-Mills. The contraption was on display this month at Eyebeam's Window Gallery in Chelsea.

SADbot eyebeam"People are only happy when it's sunny," says Roberts. "Just like our robot."

When the sky is dark, SADbot stops doodling and "goes to sleep." But when the sun is out, SADbot lets people interact with it and doodles across a large canvas.

SADbot uses an Arduino microcontroller, four photocell sensors, a battery, and two stepper motors that control two cables attached to a pen. The electronics are powered by solar panels on the building's roof. But light not only powers the installation -- it also affects SADbot's behavior.

The interaction happens when a person stands in front of SADbot and covers one of its photocell sensors; SADbot registers the change and alters its drawing direction. By covering the sensors in a deliberate sequence, a person can make his or her own drawings.

But after checking the gallery's window where SADbot was to be installed, Roberts and Leduc-Mills noticed a problem. The window doesn't get much sunlight -- which would make SADbot, well, sad.

No problem. The artists built a rooftop mirror array to direct sunlight to a fixed mirror hanging off the ledge, which reflects light down to the gallery window.

If none of the photocells are covered, SADbot draws according to the programmed algorithm -- in the current case, small movements in random directions.
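That two-mode behavior -- follow a covered sensor, otherwise doodle randomly -- fits in a few lines of code. Here's a minimal Python simulation of the logic as described in the post; the sensor-to-direction mapping and threshold are invented, and the real machine runs this sort of loop on an Arduino driving two steppers:

```python
# Toy simulation of SADbot's drawing logic: covered photocells steer
# the pen; with no interaction, the bot falls back to random doodles.
import random

DIRECTIONS = {0: (0, 1), 1: (0, -1), 2: (-1, 0), 3: (1, 0)}  # sensor -> (dx, dy)

def next_move(photocell_readings, threshold=100):
    """Pick a pen move from four photocell readings (0-1023, low = covered)."""
    covered = [i for i, v in enumerate(photocell_readings) if v < threshold]
    if covered:
        return DIRECTIONS[covered[0]]                 # follow the covered sensor
    return random.choice(list(DIRECTIONS.values()))   # idle doodling

x, y = 0, 0
for reading in ([900, 80, 870, 910], [880, 860, 850, 900]):
    dx, dy = next_move(reading)
    x, y = x + dx, y + dy
    print(f"pen -> ({x}, {y})")
```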

"At the moment its aesthetic is very small, random movements, or doodles," says Leduc-Mills. Since the project has been up, they've been filling up one canvas with doodles per day, which tells them that SADbot has received a lot of interaction.

Leduc-Mills wanted to create an interactive project that people could influence from the sidewalk so he took his ideas to Roberts, a mechanical engineer, and SADbot was born.

They met at NYU's ITP program, where Roberts teaches a class called Mechanisms and Things That Move. She will include SADbot in her book in progress called Making Things Move.

To build SADbot, the duo raised over US $1,000 in funding on Kickstarter.com, an innovative project-funding site, which paid for all of the bot's components. Depending on the size of the donation, backers of the SADbot project received SADbot drawings, mini SADbot DIY kits, and fully built miniSADbots.

SADbot uses open source platforms like Arduino, Processing, and EasyDriver motor boards, so it's easy for you to build your own SADbot!

Images and video: SADbot project, Dustynrobots/Flickr, Courtneybeam/Flickr.

More images:


MIT Robot Lamp Turns Desk Into Interactive Surface

What if your desk lamp could not only shine light but also project online content onto your workspace? LuminAR is an augmented reality project from MIT's Media Lab that combines robotics and gestural interfaces in an everyday household item.

Developed by Natan Linder and Pattie Maes of the Fluid Interfaces Group, the device consists of two parts: a bulb and a lamp. The LuminAR Bulb can be screwed into a standard incandescent light fixture and contains a pico projector, a camera, and a compact computer with wireless access to the Net. The lamp fixture, meanwhile, is a rotating base with a multi-jointed robot arm that moves to different positions by following user gestures.

The bulb's camera tracks hand positions while the projector streams online content to different areas of the desktop. The two turn a desk into an interactive surface. The robot can also be taught to remember preferred areas to project content or digital tools such as an email application or a virtual keyboard, as seen in the video below.
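The "remember preferred areas" behavior could be as simple as storing named desktop regions and snapping content to the one nearest the user's hand. Here's a hypothetical Python sketch of that idea -- the zone names, coordinates, and matching rule are all invented for illustration, not taken from the MIT project:

```python
# Toy "remembered projection zones": given a tracked hand position on
# the desk, pick the nearest stored zone to project content into.
import math

ZONES = {
    "email":    (10.0, 30.0),   # desk coordinates in cm (invented)
    "keyboard": (25.0, 5.0),
    "scratch":  (45.0, 20.0),
}

def nearest_zone(hand_xy):
    """Return the name of the stored zone closest to the hand."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return min(ZONES, key=lambda name: dist(hand_xy, ZONES[name]))

print(nearest_zone((22.0, 8.0)))   # -> keyboard
```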

The project is similar to SixthSense by Pranav Mistry, also of the Fluid Interfaces Group, and other gestural interfaces that combine hand tracking with content projection. The difference is the form factor: the LuminAR Bulb could have wider appeal because it can be used with any ordinary desk lamp, though it would then lack the arm's robotic functions.

Still, it's an innovative way to free computing from the mouse-and-keyboard box and embed it in the environment. I wonder whether the projector is powerful enough to work well on a brightly lit desktop, and whether the robotic arm might misinterpret an involuntary gesture like sneezing and do something undesirable. Or it might hand you a tissue.

Image and video: MIT Media Lab

