Automaton

Next Weekend: RoboGames

If you're within 20,000 kilometers* of San Mateo, you owe it to yourself, your parents, your kids, and everyone else you know to come to RoboGames. I mean, let's face it, if you're reading this blog you have at least a passing interest in robots, and if you have at least a passing interest in robots, how could you not have a fantastic time at what is officially the world's largest robot competition? And besides, RoboGames this year will be hosted by MythBusters extraordinaire Grant Imahara, who knows a thing or two about robots himself.

This year, you can expect to see 600 competitors and their robots participating in nearly 70 different events, from bot hockey to MechWars to autonomous firefighting to heavyweight combat between 220-pound juggernauts. And there will be a lot of heavyweight combat... Organizers are expecting approximately 3.4 tons of robots in that one single event, although they probably won't all be in the arena at once. Sad.

Also, uh, there will be some people giving talks. People like me. And you wouldn't want to miss your chance to see a robot blogger's embarrassed mumbling live, would you?

RoboGames runs April 15-17 in San Mateo, CA. And guess what? Through April 13, Spectrum readers can get 20% off the ticket price by going here and using coupon code "Spectrum."

[ RoboGames 2011 ]

*If you happen to be located approximately 1,700 km off the southeast coast of Madagascar, we'll cut you some slack this year. Otherwise, you're totally close enough to make it.

JPL Animation Shows Off New Mars Rover's Harrowing Travel Plan

This video shows how the Mars Science Laboratory rover (aka "Curiosity") plans to get from here to the surface of Mars. Since MSL is too large for airbags and Mars doesn't have enough atmosphere for a parachute to do the whole job, the only option is a rocket-assisted landing. The "sky-crane" system in the video above has never been used on a mission before, and I can't even imagine how agonizing the wait will be to find out whether everything went as planned when touchdown happens in August of 2012.

Boing Boing recently had the chance to send a photographer to JPL to check out the more or less completed rover before it's sent off to Florida next month to prepare for its November launch. Here are a couple of my favorite pics:

Check out that beastly robotic arm and the friendly looking head. So cute!

That, uh, fetchingly ample rear end contains a radioisotope thermoelectric generator capable of producing power for a minimum of 14 years, which means MSL should still be wandering around by the time humans make it to Mars to personally congratulate the robot on doing such a bang-up job.
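For a rough sense of why that 14-year figure is conservative, here's a back-of-the-envelope sketch. The numbers are mine, not JPL's: I'm assuming the commonly cited figures of roughly 110 watts of electrical output at landing and plutonium-238's half-life of about 87.7 years.

```python
# Back-of-the-envelope estimate of how an RTG's output falls off over time.
# Assumed figures (not from the article): roughly 110 W of electrical power
# at landing, fueled by plutonium-238 with a half-life of about 87.7 years.
# Real output also degrades from thermocouple aging, so this is optimistic.

HALF_LIFE_YEARS = 87.7   # Pu-238 half-life (approximate)
P0_WATTS = 110.0         # electrical output at landing (approximate)

def rtg_power(years_after_landing: float) -> float:
    """Electrical power remaining after the given number of years,
    considering only radioactive decay of the fuel."""
    return P0_WATTS * 0.5 ** (years_after_landing / HALF_LIFE_YEARS)

for year in (0, 7, 14):
    print(f"Year {year:>2}: ~{rtg_power(year):.0f} W")
# Year  0: ~110 W
# Year  7: ~104 W
# Year 14: ~98 W
```

Thermocouple wear eats into that faster than the decay alone suggests, but the fuel itself barely notices 14 years.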

Swing by Boing Boing for the rest of the set, taken by photographer Joseph Linaschke.

[ Mars Science Laboratory ]

More Video Craziness With da Vinci Surgical Robots

Ever wondered just how surgeons (and grad students) train on da Vinci surgical robots? Apparently, here's how it works:

It's worth mentioning, I think, that had a human not been in the loop here, the robot could almost certainly have gotten that wishbone out much, much faster. In fact, I personally challenge robots everywhere to perform the fastest flawless game of Operation ever and post it on YouTube. Aaaaand, GO!

Travis over at Hizook found a couple more da Vinci robot vids, too:

If you're wondering what the point of these videos is, well, besides being funny, the da Vinci systems (and robot-assisted surgeries in general) are gaining popularity mostly just because they're cool. Such surgeries aren't always better for patients; although the incisions are significantly smaller, robot-assisted surgeries can take up to twice as long as conventional surgeries. There's also the several-thousand-dollar premium that patients (or their insurance companies) pay. Still, it's hard to beat the appeal of being operated on by a robot, apparently:

But now, patient after patient was walking away. They did not want [conventional] surgery. They wanted surgery by a robot, controlled by a physician not necessarily even in the operating room, face buried in a console, working the robot’s arms with remote controls.

“Patients interview you,” said Dr. Cadeddu, a urologist at the University of Texas Southwestern Medical Center at Dallas. “They say: ‘Do you use the robot? O.K., well, thank you.’ ” And they leave.

But anyway, the point is that surgical robots are now sexy. They bring in business. And after you've just spent a couple million on your brand new surgical robot, more business is definitely what you're looking for, so putting up YouTube videos showcasing your new medical marvel is definitely a good idea.

Interview: iRobot's Colin Angle on Robotics Industry, Remote Presence Robots


At the InnoRobo conference in Lyon, France, last month, I got a chance to speak with Colin Angle, CEO of iRobot, in a very candid interview about his view of the robotics industry, his vision for AVA, a new robot platform his company is developing, and how he sees things shaping up in the coming years.

Cool over function

In keeping with a presentation Colin gave earlier in the day, he started off our conversation with a discussion of how hundreds of millions of dollars have been spent on making cool demos, but relatively little on solving high-value business needs.

To illustrate his point, he mentioned the incredible effort that has gone into the development of humanoid robots. He calls this an exercise in “cool over utility.” As he explained it, building a system around bipedal legs that can actually walk and balance has been a costly adventure. Even the most exciting systems often have a team of scientists walking behind them, a mean time to failure of about 45 minutes, and limited performance -- all at a cost of millions of dollars.

Compare that to the iRobot Warrior, which Colin feels is the first practical human-sized robot ever designed. It can handle drops of up to 6 meters [20 feet], carry payloads of over 90 kilograms [200 pounds], and navigate rough terrain to go where human-sized systems should go -- in other words, the Warrior shows you don't need a bipedal system to solve a high-value mobility problem.

Thoughts on remote presence

So, in keeping with my focus on remote presence systems, I steered the conversation to remote presence and how he saw iRobot's AVA prototype [photo right] potentially delivering it. Colin quite nicely broke down the problem and explained how AVA is an attempt to solve it.

First and foremost, he wants to deliver an experience better than being there yourself -- minus the travel time. He wants to mimic "presence" in such a way that the experience for you (the pilot) is rich, deep, and intuitive. And in keeping with many people in this space, he does not feel that remote-controlled webcams or Cisco's telepresence solutions are solving this.

To achieve ubiquitous remote presence, a remote-controlled webcam is not effective, because the pilot has only a limited ability to truly understand the environment. While a person could learn the environment over time (e.g., where the offices are, where the conference rooms are), wouldn't it be better to have the remote presence system know the entire layout, let you request a destination, and simply take you there? Cisco telepresence solutions are not effective in other cases simply due to the very nature of the systems themselves: limited in freedom, tied down to a single location, and very limited in their ability to represent you outside of the magic screen.

Colin's vision is the ability to have a surrogate "you" -- one that could, in any location, be present and do the things you would normally do: go to the room you want to visit, carry on a conversation outside a room, be aware of who is around and where they are spatially, and go to them with minimal effort.

How AVA fits in

Colin was amused that people thought AVA was iRobot's entry into robotic telepresence -- he sees AVA as a "generic platform" supporting all of the robotic functions necessary for enabling remote presence, the kind of work iRobot is known for (“Do what iRobot is great at”). For instance, as we discussed the functional components of AVA, he pointed out the various features the platform supports (a rough sketch of how a few of these might combine in software follows the list):

  • Downward facing IR for cliff detection

  • Braking systems to ensure the system is not going to fall

  • Small physical footprint (on the order of a human's) for a tight turning radius and good stability

  • Bumpers and upward-facing sonar to detect objects that could potentially strike the head of the device

  • Two PrimeSense sensors to enable a better understanding of the world through 3D mapping both of the navigation environment (downward facing) and the environment in front (on the camera assembly)

  • A LIDAR component that he wants to reinvent to bring the cost down (most expensive piece of the system)

  • Control surfaces (bumper pads on the neck) that let participants move the system without physically pushing it, making the robot easier to manage

  • Telescoping neck (via lead-screw) to ensure a lower center of gravity for movement/motion while affording a variable height for engagement with participants either standing or seated

  • Positioning control for the neck/head component

  • Support for adding manipulators through a rail mechanism on the back of AVA's neck
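None of AVA's software is public, of course, so purely to make the list above concrete, here's a minimal sketch of how the cliff-detecting IR, bumpers, and head sonar might feed a single safety check. Every name and threshold below is invented; this is not iRobot's AWARE2 code.

```python
# Hypothetical mobility-safety loop combining the sensors listed above:
# downward IR for cliffs, bumper pads, and sonar near the head, with braking
# whenever anything looks unsafe. Names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    cliff_ir: list[float]      # downward-facing IR ranges to the floor, meters
    bumper_hit: bool           # any bumper pad pressed
    head_sonar: float          # range to the nearest object near the head, meters

CLIFF_DROP_M = 0.10            # treat anything deeper than 10 cm as a cliff
HEAD_CLEARANCE_M = 0.30        # slow down if something is within 30 cm of the head

def safe_velocity(snapshot: SensorSnapshot, requested_v: float) -> float:
    """Return the forward velocity the base is allowed to execute."""
    if snapshot.bumper_hit:
        return 0.0                               # brake: we touched something
    if any(r > CLIFF_DROP_M for r in snapshot.cliff_ir):
        return 0.0                               # brake: the floor dropped away
    if snapshot.head_sonar < HEAD_CLEARANCE_M:
        return min(requested_v, 0.1)             # creep past head-level obstacles
    return requested_v

print(safe_velocity(SensorSnapshot([0.02, 0.02], False, 1.5), 0.5))  # -> 0.5
print(safe_velocity(SensorSnapshot([0.02, 0.25], False, 1.5), 0.5))  # -> 0.0
```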

A platform for application development

Colin said that iRobot's primary focus is on the "robotic functions" for a "generic platform" -- to help others overcome the liability issues. iRobot has done a lot of work -- through its previous designs and its own operating system (AWARE2) -- to make its platforms as safe and reliable as possible. Rather than trying to make a specific platform for remote presence, Colin said that it is iRobot's intent to build the platform and let developers and designers create a solid system on top of it.

I got somewhat confused here -- it sounded like he was suggesting that iRobot would not compete in application development and would not build systems for specific purposes, like remote presence. When I pressed, he clarified that iRobot would not get in the way of something like Pad-to-Internet-to-Pad communications (e.g., FaceTime, Qik, Skype), but in terms of building a navigation interface (e.g., a web front end for piloting the system) for the pilot to interact with AVA, iRobot might offer a solution. Like Apple, iRobot's solutions for various applications could sit alongside any third-party solutions -- enabling those developers to build better interfaces and applications that would also talk to AWARE2 and control the AVA platform. Here's how he put it:

Yes, it is our intention to develop apps for AVA alongside other developers, as we need to, as you say, “prime the pump”. As we look at the way things are likely to play out, iRobot is committed to being best in the world at autonomy/navigation software, platforms, manipulation, and the integration of 3rd party hardware – while we aspire to be one of many application developers.

But for remote presence, the idea of a tablet with a camera and a large screen (like the Motorola Xoom or the iPad 2) connecting to the AWARE2 API would easily support the creation of a remote presence system and allow developers to rapidly iterate. And with an extendable head and telescoping neck, placing the pilot's face at a natural height would be easy, allowing true remote presence to become a real possibility.
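To give a flavor of how thin such a front end could be, here's a purely hypothetical sketch: the tablet's video chat app handles the face-to-face part, while a small script asks the platform to drive somewhere by name. The endpoint, port, and JSON fields are all made up, since iRobot's AWARE2 interfaces are not public.

```python
# Hypothetical "go there" front end for a remote presence robot. The platform
# is assumed to own mapping and obstacle avoidance; the pilot just names a
# destination. The URL and payload format below are invented for illustration.

import json
import urllib.request

ROBOT_URL = "http://ava.local:8080"   # hypothetical address of the platform

def go_to(place_name: str) -> None:
    """Ask the platform to navigate to a named location on its own map."""
    payload = json.dumps({"command": "navigate", "destination": place_name})
    req = urllib.request.Request(
        f"{ROBOT_URL}/api/navigation",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print("Robot replied:", resp.read().decode("utf-8"))

# go_to("conference room B")   # would hand the whole driving problem to the robot
```

The point of the sketch is the division of labor: autonomy lives on the platform, and the application layer stays small enough for third-party developers to iterate quickly.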

Other juicy bits

From other conversations, I learned that a number of AVA prototypes are already out in the field, in the midst of development for various problems. I can see how Colin's vision allows for an augmented reality view for the pilot: click-and-respond actions within the view of the remote presence system (e.g., open doors, tag people in a meeting, set vision points to track where people are and respond to them quickly by turning the head). How this comes about will be an interesting exercise in the coming years.

This article appeared originally at Pilot Presence.

Sanford Dickert is a technologist and product manager focusing on the intersection of engineering, collaboration, and team dynamics. He's held numerous senior positions in engineering, product development, and digital marketing. He writes about remote presence systems at Pilot Presence.

Robots Play Soccer, Make Cereal at RoboCup German Open

The RoboCup German Open 2011 wrapped up last weekend, and we've got a couple video highlights to share from the event.

This first clip is from the RoboCup@Home competition, which aims to develop service and assistive robot technology that will eventually make its way into your home. Here, Dynamaid and Cosero, two robots from Team NimbRo at the University of Bonn, team up to autonomously make breakfast (of a sort):

RoboCup is perhaps best known for soccer, and the Darmstadt Dribblers (we've been big fans for years) took first place in the KidSize soccer competition, defending their 2010 title. The 3v3 fully autonomous matches feature thrills, spills, violence, dives, and unprecedented speed and skill... Those robots are as good as or better than most humans I know at aiming for the corners. In the first half of the match, stick around until the very end to see some tricky ball-handling skills:

And in the second half, watch one of the bots switch from its left foot to its right foot and score, and make sure to hang on until minute nine to witness the first ever successful goalkeeper save and throw in a regulation robot soccer match:

Remember, the goal of RoboCup is to field a team of humanoid robots capable of defeating a world-class team of humans at full-field soccer by 2050. Obviously, we're not there yet, but the magnitude of improvement that we've seen over just the last two or three years has me convinced that the 2050 target is, if anything, pessimistic.

[ RoboCup German Open ]

[ Team NimbRo ]

[ Darmstadt Dribblers ]

Willow Garage's TurtleBot Brings Mobile 3D Mapping and ROS to Your Budget

Just a year or two ago, if you'd wanted to buy yourself a mobile robot base with an on-board computer and 3D vision system, you'd probably have been looking at mid-four to five figures. But today is the future, baby, and Willow Garage is introducing TurtleBot, an eminently hackable pre-configured platform designed to give mobility to a Kinect sensor on the cheap.

TurtleBot consists of an iRobot Create base (sensors included), a 3000 mAh battery pack, a gyro, a Kinect sensor, an Asus 1215N laptop with a dual-core processor to run everything, and a mounting structure for you to get creative with. TurtleBot runs ROS, of course, and will come with everything preconfigured so that the robot can make maps, navigate, and follow you around straight out of the box.
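If you've never touched ROS, the barrier to entry really is that low. Here's a hedged example of the sort of first program you'd write: a minimal rospy node that drives the base in a slow circle. I'm assuming the conventional TurtleBot cmd_vel velocity topic; check your own launch files if your setup names it differently.

```python
#!/usr/bin/env python
# Minimal ROS (rospy) node that drives a TurtleBot-style base in a slow
# circle by publishing velocity commands. The cmd_vel topic name follows
# common TurtleBot convention and may differ on your installation.

import rospy
from geometry_msgs.msg import Twist

def drive_in_circle():
    rospy.init_node("turtlebot_circle_demo")
    pub = rospy.Publisher("cmd_vel", Twist)
    rate = rospy.Rate(10)                # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.2                   # 0.2 m/s forward
    cmd.angular.z = 0.5                  # 0.5 rad/s turn
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    try:
        drive_in_circle()
    except rospy.ROSInterruptException:
        pass
```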

I know I've beaten this to death with respect to Willow Garage and ROS before, but remember that the whole point (or much of the point) of this kind of open source hardware and software is to keep hardworking roboticists like you from having to start from scratch every time you want to invent something. Like, why waste your time and money designing and constructing a mobile robot with a 3D sensor and then waste more time teaching it to navigate, when it's all already been done a thousand times before? Where's the progress, man? See, now you have time to move on to more interesting things, like getting your robot to do cool stuff, which is the whole point of robots in the first place.

So, what can you do with TurtleBot? Well, it can bring you food, explore your house on its own, bring you food, build 3D pictures, bring you food, take panoramas, bring you drinks, bring you food, and more. Check it out:

If this platform looks vaguely familiar, that's because it is... Bilibot is the same basic idea: a cheap, effective platform for developing applications for Kinect using ROS. Both of these platforms were developed in parallel, though, and they're both after the same thing (more accessibility), so don't worry, there's nothing shady going on here. Willow Garage expects the two projects to collaborate on hardware and software while still maintaining their individuality.

TurtleBot will be available for pre-order later this week. The core kit is $500, which includes:

  • USB Communications Cable
  • TurtleBot Power and Sensor Board
  • TurtleBot Hardware
  • Microsoft Kinect
  • TurtleBot to Kinect Power Cable
  • USB Stick TurtleBot Installer

The complete TurtleBot (which is what's in the pictures and video) is $1200, and adds the following to the core kit:

  • iRobot Create Robot
  • 3000 mAh Ni-MH Battery
  • Fast Charger
  • Asus EeePc 1215N

The reason that they're doing it this way is to make it as cheap as possible for you to put this kit together yourself, if (say) you have your own laptop already, or even your own iRobot Create. For reference, an iRobot Create with a battery is about $200 from iRobot, and an Asus 1215N is about $500.
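If you want to check the math, here's the quick arithmetic using the approximate prices above (the charger isn't priced separately, so the do-it-yourself total is a slight underestimate):

```python
# Rough comparison of buying the complete kit versus assembling it yourself,
# using the approximate prices quoted above.

core_kit = 500     # TurtleBot core kit
complete = 1200    # core kit + Create, battery, charger, and laptop
create   = 200     # iRobot Create with battery, bought direct from iRobot
laptop   = 500     # Asus 1215N, typical street price

print(complete - core_kit)   # 700: what the complete bundle adds over the core kit
print(create + laptop)       # 700: roughly the same if you buy those parts yourself
print(core_kit + create)     # 700: your total if you already own a suitable laptop
```

In other words, the savings only show up if you already have some of the hardware lying around, which is exactly the scenario the core kit is aimed at.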

Turtle costume sold separately.

[ TurtleBot ]

Geminoid Robots and Human Originals Get Together


The Geminoid family has gathered together for the first time.

The ultrarealistic androids, each a copy of a real person, met on March 30 at Japan's ATR laboratory, near Kyoto.

Attending were Geminoid F, Geminoid HI-1, and Geminoid DK, as well as their respective originals: a twentysomething woman (whose identity remains a secret), Prof. Hiroshi Ishiguro of Osaka University, and Prof. Henrik Scharfe of Aalborg University, in Denmark [photo above].

The Geminoid robots, conceived by Prof. Ishiguro and a team at ATR, are manufactured by Japanese firm Kokoro. The robots work as a person's telepresence avatar: A computer captures the person's voice, facial expressions, and upper-body movements and transmits this data to the android.
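ATR and Kokoro haven't published the teleoperation software, but conceptually the pipeline is simple: capture the operator, reduce that to a handful of actuator targets, and stream them to the android while the voice goes over its own channel. Here's an entirely hypothetical sketch of that loop; the field names, address, and update rate are all invented for illustration.

```python
# Hypothetical sketch of a Geminoid-style teleoperation loop: track the
# operator's face and head pose, pack a few target values into a message,
# and stream it to the android at a fixed rate. Not ATR or Kokoro code.

import json
import socket
import time

ANDROID_ADDR = ("127.0.0.1", 9000)   # stand-in; the android's real address goes here

def capture_operator_state() -> dict:
    """Stand-in for the face/posture tracker on the operator's side."""
    return {"mouth_open": 0.3, "eyebrow_raise": 0.1, "head_yaw_deg": -5.0}

def run_teleoperation(duration_s: float = 1.0, rate_hz: float = 20.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    deadline = time.time() + duration_s
    while time.time() < deadline:
        frame = capture_operator_state()
        sock.sendto(json.dumps(frame).encode("utf-8"), ANDROID_ADDR)
        time.sleep(1.0 / rate_hz)    # audio would stream on a separate channel

run_teleoperation()
```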

Anyone can teleoperate the androids, but the experience is certainly unique for those individuals who served as templates.

"We wanted to get together and share our experience of having robot copies," Scharfe told me. "The three of us has a lot of fun doing this."

Watch what happened:

But the meeting was also an opportunity to conduct experiments. With the three robots sitting around a table, the human originals teleoperated their own copies and tried to have a conversation. Then they took turns operating each other's Geminoids.

"Returning to your own Geminoid felt like coming home," Scharfe said.

The researchers also tried other configurations, for example by having the human originals sit at the table with their androids while other people teleoperated the robots.

According to Prof. Scharfe, whose Geminoid cost some US $200,000 and will be shipped to Denmark soon, some situations felt more natural than others, but generally he could accept the different conditions as "real conversations."

He will now take time to interpret the material from these experiments and hopes to publish his findings at some point.

As for the next Geminoid reunion -- have the researchers scheduled it yet?

"It's very costly to ship [the androids] around," Scharfe says. "So it might not happen again!"

More photos:


Images and video: Geminoid DK

[ Geminoid DK ] via [ CNET ]

Robot Videos: Festo's SmartBird, Social Robots, and Autonomous Cars

There's no better way to start off the week than with a trio of fascinating robot videos, each of which is easily educational enough that you should be able to convince yourself (and anyone else) that watching them is definitely not procrastinating. 

This first video is a follow-up to Festo's SmartBird robotic seagull that we posted about last month. Creating a heavier-than-air, fully functional robotic bird is no small feat, and this 17-minute video takes you through the development process, including lots of juicy details and behind-the-scenes test footage:


Cynthia Breazeal gave a seminar at CMU's Robotics Institute on "The Social Side of Personal Robotics." As you may have noticed, robots tend to be pretty lousy at interacting socially with humans, largely because robots have a hard time understanding what's going on inside our heads. I can totally relate to this because I have a hard time understanding what's going on inside other people's heads too, and if it's difficult for me, it's practically impossible for a robot.

Dr. Breazeal talks about new capabilities that her lab is developing to allow robots to employ a higher degree of insight (if you want to call it that) into how humans think, to enable robots to interact with us more naturally and more successfully. For example, the seminar includes video of experiments with Leonardo, where the robot demonstrates how it can understand not just what a human wants, but also what a human believes, which allows the robot to be much more... Well, I'm not sure what else to say but "insightful." Other experiments show how Leonardo can successfully pick up on unknown rules based on behavioral feedback, which is a skill that could hypothetically be extended to abstract social situations.
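To make the "understanding beliefs" idea concrete, here's a toy sketch of the classic false-belief setup: the robot keeps a separate world model for each person and updates it only with events that person actually witnessed. This is a simplification for illustration, not the actual architecture used with Leonardo.

```python
# Toy belief tracking: the robot's own world model versus each person's
# belief about the world, updated only by what that person has seen.

class BeliefTracker:
    def __init__(self):
        self.true_state = {}            # where things actually are
        self.beliefs = {}               # per-person view of the world

    def add_person(self, name: str):
        self.beliefs[name] = {}

    def move_object(self, obj: str, place: str, witnesses: list[str]):
        self.true_state[obj] = place
        for person in witnesses:        # only witnesses update their belief
            self.beliefs[person][obj] = place

    def where_does_think(self, person: str, obj: str) -> str:
        return self.beliefs[person].get(obj, "unknown")

world = BeliefTracker()
world.add_person("Sally")
world.move_object("cookie", "box A", witnesses=["Sally"])
world.move_object("cookie", "box B", witnesses=[])      # Sally doesn't see this

print(world.true_state["cookie"])                 # box B
print(world.where_does_think("Sally", "cookie"))  # box A, a false belief
```

A robot that can represent the second answer separately from the first can anticipate what a person will do, rather than just what the world looks like.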

This talk is just over an hour long, but it's definitely worth watching in its entirety:


Lastly, we've got a (rather brief) TED Talk from Sebastian Thrun, who's been developing autonomous cars at Stanford and, more recently, Google. I never get tired of hearing his vision for a future where we all ride around in safe and efficient robotic vehicles, but it's somewhat ironic that no matter how much safer autonomous cars are than human drivers, it's the risk of accidents that's keeping them out of the hands of consumers. We have a ways to go both socially and legally before sharing the road with robots is going to be acceptable, but there are ways to ease us into it that may help make the transition both smoother and quicker.

Why We Should Build Humanlike Robots

Robokind Zeno, a small walking humanoid with an expressive face created by Hanson Robotics.

People often ask me why I build humanlike robots. Why make robots that look and act like people? Why can't robots be more like ... appliances?

In fact, some people argue that it's pointless for robotics researchers to build robots in our image; existing humanoids fall short of what science-fiction authors have dreamed up, so we should just give up. Others even say we'll never have androids around us, because when you try to make robots look more human, you end up making them look grotesque.

I disagree. I believe robotics researchers should aspire as grandly and broadly as possible. Robots can be useful in many shapes and forms, and the field is young—with so much room left for innovation and diversification in design. Let a thousand robot flowers bloom.

On the tree of robotic life, humanlike robots play a particularly valuable role. It makes sense. Humans are brilliant, beautiful, compassionate, loveable, and capable of love, so why shouldn’t we aspire to make robots humanlike in these ways? Don’t we want robots to have such marvelous capabilities as love, compassion, and genius?

Certainly robots don’t have these capacities yet, but only by striving towards such goals do we stand a chance of achieving them. In designing human-inspired robotics, we hold our machines to the highest standards we know—humanlike robots being the apex of bio-inspired engineering.

In the process, humanoid robots make for good science. They push the boundaries of biology, cognitive science, and engineering, generating a mountain of scientific publications in fields related to humanoid robotics, including computational neuroscience, A.I., speech recognition, compliant grasping and manipulation, cognitive robotics, navigation, perception, and the integration of all these technologies into complete humanoids. This integrative approach mirrors recent progress in systems biology, and in this way humanoid robotics can be considered a kind of meta-biology. Humanoid robots cross-pollinate among the sciences, and they are a subject of scientific inquiry in themselves.

Some of Hanson Robotics' creations [from left]: Alice, Zeno, and Albert Hubo.

In addition, humanlike robots prove genuinely useful in real applications. Numerous studies, including those with the humanoids Nao, Bandit, Kaspar, and RoboKind Zeno, show that autistic children respond favorably to such robots, which promises new treatments and social-training tools. Or consider a humanoid robot like NASA's Robonaut (just to name one): its capabilities for use in space and in factory automation promise safer, more efficient work environments for people. And then there is the simple wonder and psychological power of humanoid robots. Human-inspired depictions have brought joy and insight throughout history, from the sculptures of Michelangelo to great works of literature to the film animation of Disney, Miyazaki, and others, and there is no reason robots can't inspire us similarly. Humanlike robots already bring us wonder and joy. Why can't robots communicate just as much wisdom, knowledge, and ardor as other figurative arts do? And beyond the known uses for humanlike robots, new uses will certainly emerge, expand, and surprise us as the capabilities of robots evolve.

It is true that humanlike robots are nowhere near human-level in their abilities today. Yes, humanlike robots fail. They fall, they lose the thread of a conversation, they misunderstand us, and they disappoint as much as they exhilarate us. At times these failures frustrate the public and robotics researchers alike. But we can't give up. Humanoid robots are still in their infancy. Though they falter, their abilities continue to grow and improve. Babies can't walk, talk, or do much of anything as well as adults can, but that doesn't mean babies deserve our contempt. Let's not give up on our robotic children. They need nurturing. And as a researcher in humanoid robotics, I can attest that it's a pleasure to raise these robots. They are a lot of fun to develop.

Looking forward, we can find an additional moral imperative in building robots in our image. Simply put: if we do not humanize our intelligent machines, they may eventually be dangerous. To be safe when they “awaken” (by which I mean gain creative, free, adaptive general intelligence), machines must attain deep understanding of and compassion toward people. They must appreciate our values, be our friends, and express their feelings in ways that we can understand. Only if they have humanlike character can there be cooperation and peace with such machines. It is not too early to prepare for this eventuality. By the time machines become truly smart, it will be too late to ask them to suddenly adopt our values. Now is the time to start raising robots to be kind, loving, and giving members of our human family.

So I can see no legitimate reason not to make humanlike robots, and many good reasons to do so. Humanlike robots result in good science and practical applications; they push robots to a higher standard, and they may eventually prevent a war with our intelligent machines. What's not to love about all of that?

David Hanson, Ph.D. [photo right] is the founder and CTO of Hanson Robotics, in Richardson, Texas, a maker of humanlike robots and AI software. His most recent creation is Robokind, a small walking humanoid with an expressive face designed for research.

Da Vinci Surgical Bot Folds, Throws Tiny Paper Airplane


Everybody already thinks that robot surgery is way cool, but I suppose there's no harm in taking a few minutes to show off the precision that tiny little robot grippers are capable of. On the other end of these steely claws is an even steelier-eyed surgeon with a questionable amount of aeronautical experience, and in between the two is a da Vinci surgical system. This particular robot hails from Swedish Hospital in, you guessed it, Seattle.

The da Vinci system, if you recall, provides surgeons with an interface that allows them to control little robotic hands with their own (much larger) hands, enabling much finer control in a much tighter space. For patients, this means smaller incisions that heal faster, and for surgeons, it means no more going elbow deep into someone else's guts.
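Two ingredients do most of the work in that console: motion scaling (a large hand movement becomes a small instrument movement) and smoothing to damp hand tremor. Here's an illustrative sketch of both; the 3:1 scale factor and the filter constant are made up for the example, not Intuitive Surgical's numbers.

```python
# Illustrative master/slave mapping: scale the surgeon's hand motion down and
# low-pass filter it to remove tremor. Constants are invented for the example.

SCALE = 1.0 / 3.0        # 3:1 motion scaling: 3 cm of hand motion -> 1 cm of tool motion
SMOOTHING = 0.2          # simple exponential low-pass filter to damp tremor

def scaled_tool_positions(hand_positions_mm):
    """Map a stream of master (hand) positions to slave (tool) positions."""
    filtered = hand_positions_mm[0]
    for raw in hand_positions_mm:
        filtered = SMOOTHING * raw + (1 - SMOOTHING) * filtered
        yield filtered * SCALE

hand_path = [0.0, 10.0, 9.5, 10.5, 30.0]     # millimeters, with a little jitter
print([round(p, 2) for p in scaled_tool_positions(hand_path)])
```

The smoothing also explains why the instrument motion in these videos looks almost unnervingly steady compared to a bare human hand.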

I do feel obligated to point out that depending on your definition of robot, the da Vinci system may not qualify as one, in that it doesn't have much of an autonomous component: all of those motions are controlled directly by the surgeon using a master/slave system. However, robots with actual autonomous surgical capabilities aren't that far off, and now that we've seen demos of robots autonomously sucking your blood out and autonomously taking biopsies on simulated turkey prostates, it's just a matter of time before you start having to choose your surgeon based on whether it's running Windows or Linux.

[ Intuitive Surgical ] via [ Nerdist ]
