With all of the new competition in the consumer robotics field, it’s about time for iRobot to show that they’re still capable of creating new and exciting things. AVA, their technology demonstrator, definitely fits into the new and exciting category.
AVA is short for ‘Avatar,’ although iRobot was careful not to call it a telepresence robot so as not to restrict perceptions of what it's capable of. AVA is capable of fully autonomous navigation, relying on a Kinect-style depth-sensing camera, laser rangefinders, inertial movement sensors, ultrasonic sensors, and (as a last resort) bump sensors. We got a run-down a few days ago at CES; check it out:
All of the sensor data crunching is taken care of by a heavyweight on-board computer, but the brains of the operation is really whatever AVA happens to be wearing for a head, in this case, a tablet PC. This makes it easy to develop applications to control the robot, a concept not unlike the iRobot Create: the robot-building part is done for you, leaving you to focus on getting said robot to do cool stuff.
There are also a bunch of interesting ways to interact with AVA. You’ve got the tablet of course, if you want to do things the hard way. A second Kinect camera on the bot can detect people and recognize gestures, and an array of microphones can detect and interpret voice commands. Finally, AVA’s round ‘collar’ piece has touch sensors all the way around, offering an intuitive way to steer AVA around.
While iRobot wouldn’t speculate on what’s coming next for AVA (disappointing), telepresence is an obvious first application. AVA also has a bunch of expansion ports that you can attach stuff to, which obviously makes me think manipulators. Personally, I’m hoping that now that AVA is out in the open, iRobot will keep us updated with some of the new ideas that they’re playing around with.
Robots made a big appearance at this year’s Consumer Electronics Show in Las Vegas. There were home robots, robotic pets, humanoids, telepresence systems, and even a little robot to massage people’s backs. Check out the highlights:
• iRobot brought two new home robots to CES: a more powerful Roomba and a smaller Scooba washer [see photo above]. According to the company, the updated Roomba 700 series is 20 percent better at sucking up fine dirt particles and new power management software provides 50 percent longer battery life than previous Roomba generations. The new vacuum units start at US $450. The new Scooba 230, priced at $300, is 16 centimeters in diameter and 9 cm high, ideal to get into small areas such as that dreaded space around the toilet. According to the company, Scooba differs from a mop because it only uses clean solution to wash the floors, not dirty water. The robot has an active reservoir that keeps the cleaning solution and dirty water separate and it can clean 14 square meters of linoleum, tile, or sealed hardwood floors in a single session.
• iRobot was also showing off a telepresence robot prototype called AVA, which looks like an iPad on wheels. It seems that after its aborted ConnectR project -- a telepresence robot based on the Roomba platform -- iRobot is trying to catch up in the telepresence arena. The AVA prototype was quite bulky and didn't move much, but the interesting thing is that iRobot wants to allow developers to create apps to make the robot do useful things. [UPDATE: Okay, iRobot is not calling its prototype a telepresence robot, although AVA is short for avatar. BotJunkie has the details.] Watch iRobot CEO Colin Angle explaining the idea behind AVA:
• Paro, the therapeutic robot seal, was drawing lots of visitors who wanted to caress the furry creature, but another therapeutic robot was also getting a lot of attention -- and it was the robot that was caressing people. The WheeMe, created by Israeli company DreamBots, uses tilt sensors to balance on a person's back, moving slowly as its four sprocket-like rubber wheels press gently on the skin. As we wrote before, the company admits that the robot can't give you a deep tissue massage, because it's very light (240 grams, or 8.5 ounces), but it claims the device can provide "a delightful sense of bodily pleasure." It will retail for $69.
• The Fujitsu Emotion Bear is a robotic teddy bear with a camera in its nose, motors stuffed in its body, and advanced AI. The bear has 13 touch sensors and runs image recognition software to recognize people. Like Paro the robot seal, it's designed to interact with children, elderly, and infirm people, though one can imagine it could become a robot toy like the dinosaur robot Pleo or Sony's Aibo dog robot. It can move its head and paws, track people's faces, laugh, cry, and sneeze. The bear is a concept product and Fujitsu hasn't announced any plans to sell it.
• Developed by Orbotix of Boulder, Colo., Sphero is a robotic ball that you can control with an iPhone or iPad via Bluetooth. Slide your finger on a circular control to move the ball and you can play office golf or challenge a friend for a game of sumo ball. The Sphero balls change color but don't have cameras or other sensors. Some people may argue this is just a remote controlled toy, not a robot, but Orbotix hopes that by providing an easy to use open API, app developers can add new capabilities to the ball bot. No details on price and availability, except that it should cost less than $100 and hit the market later this year.
• Murata Boy, developed by Murata Manufacturing Co., is a little humanoid robot that rides a bicycle. It made an appearance at CES along with a new companion: Murata Girl, which rides a unicycle and blushes and nods her head. Both robots can balance in place or even ride along a narrow beam. Show demonstrators controlled them by waving specially designed wands.
• The Vgo robot, created by Vgo Communications in Nashua, N.H., allows remote workers to not only see, hear, and talk but also move around and collaborate more effectively with colleagues. Unveiled last June, the telepresence robot sells for $5,000 plus a service contract -- an attractive price compared to competitors such as the Anybots QB, which costs $15,000. The Vgo robot is rather short (1.2 meters, or 4 feet, tall), and one wonders how it feels to embody it. The company says that executives at Palantir Health and Orbitz have been using the robot to improve collaboration and reduce travel across multiple offices.
• Finally, my favorite robot demo was when Japanese company Cyberdyne allowed tech journalist Evan Ackerman to try out its robot suit HAL. It's not every day you get a chance to step into a robotic exoskeleton that can sense when you want to move your legs and move them for you! Designed to assist the elderly and disabled to regain more mobility, the HAL suit is available to hospitals and clinics in Japan and rents for about $1,500 per month. Ackerman became the first person in the United States to try the legs -- and he liked them.
Last year was an incredible time for robotics, and to recap the best robot moments of 2010 we decided to compile a list of our favorite videos. Check out below our selection -- going from No. 20 to No. 1 -- and let us know what you think.
No. 20 Let's start off with the musculoskeletal humanoid Kojiro, built at the JSK Robotics Lab in Tokyo. With a body that mimics the way our skeletons and muscles work, it's surely one of the coolest -- and strangest -- robots of 2010.
No. 17 Among last year's robotics milestones is the emergence of commercial telepresence robots. And Silicon Valley startup Anybots was probably first to hit the market with its skinny alien-looking QB robot, which made the future seem a little bit closer by allowing people to roam around embodied as robotic avatars.
No. 14 It's always fun to see Honda's Asimo doing its thing. Last year, the astronaut-looking humanoid made an appearance at Ars Electronica in Linz, Austria, where it performed some old tricks as well as some new ones. You funny, Asimo.
No. 13 Speaking of Asimo, Iran seems to be a fan. Engineers at Tehran University built an adult-size humanoid called Surena. The robot can walk, stand on one foot, and even perform a little dance. It also loves to be on TV.
No. 10 The Honda U3-X personal mobility device is not exactly a robot, but this amazing unicycle does use balancing technology from Asimo -- and it definitely comes from the future.
No. 9 It's hard to believe that researchers were able to make a swarm of bacteria build a tiny pyramid. We have news for you: now they want to use this type of bacteria to power microscopic robots -- inside your body!
No. 6 Quadrotors have been gaining popularity, and last year several groups demonstrated some impressive results. One group in particular, the GRASP Lab at the University of Pennsylvania, stood out for its acrobatics, with its machines flying and looping through obstacles and even landing on vertical surfaces.
No. 5 Is this the most amazing adult-size humanoid ever built? Possibly. AIST's HRP-4 is sleek, athletic, and graceful, and few, if any, robots can move like it does. As one observer put it, it "will make you bow in deference."
No. 2 In another popular story of 2010, U.C. Berkeley researchers programmed a PR2, an advanced robot developed by Willow Garage, to fold towels. Video of the robot neatly folding towel after towel was seen by tens of thousands of people soon after it was released. Which proves that people really hate folding towels. The PR2 was also responsible for several other cool videos in 2010 -- it could have its own top 20 videos list! -- like this one of the bot playing pool.
No. 1 Many of the robots above are extremely sophisticated and expensive systems and they are capable of performing formidable things. But sometimes simplicity and beauty win. Below is our choice for the No. 1 video of 2010. It shows a robot that balances on a ball. Beautiful.
What do you think of our list? Do you disagree with any of our choices? Think we forgot something? Let us know.
At yesterday's demo, Ackerman got a taste of the future by becoming a man-machine hybrid. Though he tried just the robot legs (the company also makes a full suit that includes powered arms), he said the experience was "incredible."
To use the suit, Cyberdyne employee Takatoshi Kuno first attached sensors to Ackerman's legs. The sensors monitor the electrical activity of nerves to control the suit's DC motors.
The suit works on intent: the user needs only to "think" of moving his or her legs -- the suit does the rest. That's because the brain sends signals to the muscles of the legs, and the sensors detect them.
"Once I figured out how to stop trying to walk in the suit and just let the suit walk for me, the experience was almost transparent," Ackerman said.
The suit includes a pouch with a computer, Wi-Fi card, and battery, and it sends data about its operation to a remote PC. Cyberdyne's Kuno said he set the suit on "level 1," because Ackerman's legs had normal strength; for people with weaker muscles, the suit could go to level 4.
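The intent-based control described above can be sketched as a simple loop: read a bioelectric signal from the skin sensors and, once it crosses a threshold, drive the joint motors with an assist torque scaled by the signal strength and the selected assist level. This is a minimal illustration of the idea, not Cyberdyne's actual algorithm; all names, thresholds, and gain values below are assumptions.

```python
# Illustrative sketch of threshold-based intent control, in the spirit
# of the HAL suit's "level 1" through "level 4" assist settings.
# The gain schedule and numbers are invented for demonstration.

ASSIST_GAIN = {1: 0.25, 2: 0.5, 3: 0.75, 4: 1.0}  # level 4 = most help


def assist_torque(emg_signal, level, threshold=0.1, max_torque=30.0):
    """Map a normalized muscle-activity reading (0..1) to a motor torque (N*m)."""
    if emg_signal < threshold:
        return 0.0  # no movement intent detected: motor stays idle
    gain = ASSIST_GAIN[level]
    # Torque scales with how far the signal exceeds the threshold.
    excess = (emg_signal - threshold) / (1.0 - threshold)
    return min(max_torque, gain * max_torque * excess)


# The same muscle signal gets four times the help at level 4 as at level 1,
# matching the idea that weaker users get stronger assistance.
print(assist_torque(0.55, 1))  # -> 3.75
print(assist_torque(0.55, 4))  # -> 15.0
```

The key property is that the suit does nothing until the wearer "thinks" about moving: below the threshold, the motors are idle, which is presumably why the experience felt transparent once Ackerman stopped fighting it.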
Ackerman walked around the room and also climbed stairs to go up and down the stage. At first he appeared to struggle to move his legs, but after just a few minutes he was feeling comfortable in his new robot body.
"I didn't try to kick anything to pieces Iron Man style," Ackerman said, "but going up stairs was definitely all the suit doing the work and not me."
The manufacturing industry in many countries, facing labor shortages and pressed to become ever more efficient, can certainly use a little help. Or how about a Little Helper?
Mads Hvilshøj, Simon Bøgh, and their team at Aalborg University in Denmark have been working on an industrial robot, which they named Little Helper, designed for handling parts and moving them around on a factory floor. The robot consists of a manipulator arm mounted on a mobile platform.
The Danish researchers equipped Little Helper's mobile platform with an array of on-board sensors (laser range, ultrasonic, and motor encoders), which help with navigation and safety. The manipulator system consists of an Adept six-degrees-of-freedom industrial arm, plus a tool changer and various tools. The robot also relies on a vision system, which consists of a camera with adjustable lens and light system. The current prototype, built entirely from commercial off-the-shelf components, can run continuously for eight hours, and is capable of automatically recharging itself when needed.
To program Little Helper for operation, users have to load its computer with digital layouts of the work areas and let the robot scan the environment with its sensors. With some additional programming using a graphical user interface and a touch-screen, the robot can start to navigate autonomously, pick parts, transport them, and even perform assembly tasks. The robot's different systems are decoupled, so when Little Helper is driving around, only the mobile platform is active; when the manipulator is in use, the mobile platform remains stationary. This approach helps to ensure that the bot operates safely.
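The decoupling described above can be sketched as a small mutual-exclusion model: the mobile platform and the manipulator each refuse to start while the other is active. This is an illustrative toy, not Aalborg's actual controller; the class, method, and task names are assumptions.

```python
# Toy model of Little Helper's "one subsystem at a time" safety design:
# driving and manipulating are mutually exclusive modes, so the arm
# never swings while the platform is rolling, and vice versa.
# All names here are invented for illustration.

class LittleHelperModel:
    def __init__(self):
        self.mode = "idle"  # one of: "idle", "driving", "manipulating"

    def drive_to(self, station):
        if self.mode == "manipulating":
            raise RuntimeError("cannot drive while the arm is active")
        self.mode = "driving"
        # ... navigate using laser range, ultrasonic, and encoder data ...
        self.mode = "idle"
        return f"arrived at {station}"

    def pick_part(self, part):
        if self.mode == "driving":
            raise RuntimeError("cannot use the arm while the platform is moving")
        self.mode = "manipulating"
        # ... locate the part with the vision system, then grasp it ...
        self.mode = "idle"
        return f"picked {part}"


bot = LittleHelperModel()
print(bot.drive_to("assembly station"))
print(bot.pick_part("pump housing"))
```

Keeping the base stationary whenever the arm moves simplifies both the safety case and the control problem: the arm always works from a fixed frame, so its motions can be planned as if it were a conventional stationary industrial robot.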
But before we can see Little Helpers toiling on factory floors, the researchers will have to overcome several challenges. For one, they have to figure out if their robot can adapt to a wide variety of environments and perform tasks in a cost-effective way. The robot must be able to deal with errors and unpredictable situations and always operate in a safe way. The researchers also have to improve the programming interface, so setup is easy and not too expensive -- a crucial factor in getting this type of robot, which can cost tens of thousands of euros, out of the lab and into real factories. The Danish group plans to address these and other issues by testing their robot in a real facility at Grundfos, one of the world's largest pump manufacturers.
What caught my interest in this project is that it appears to be at the intersection of two distinct robotics areas: the well established industrial robotics area and the emerging service robotics area. This confluence is very promising. Also, Little Helper is a good example of integrating existing technologies into a novel robotics product, which, as Hvilshøj puts it, "is essential in order to gain acceptance and simplify implementation in real-world industrial environments." Does anyone need a Little Helper?
Watch a demo of the machine at work:
Image and video: Mads Hvilshøj/Aalborg University
Samuel Bouchard is a co-founder of Robotiq in Quebec City.
Last month I posted a video of Bruno Maisonnier, founder and CEO of Aldebaran Robotics, showing off the newly enhanced Nao humanoid robot. Then several people asked me to see the full sequence of Nao doing its "Star Wars" act, with hilarious impressions of Darth Vader and R2-D2. Here it is:
Exercise is much less work if you can pawn the hard stuff off on a teleoperated robot. The system in this video is kinda like the physical master/slave system that we saw last year, combined with Willow Garage’s PR2 Kinect demo. While I’m sure this technology has at least a few practical uses, I’m personally hoping that all those humanoid robot competitions will start requiring Kinect teleoperation. Just imagine how much more entertaining it would be to watch robot combat and wildly gesticulating humans at the same time, kinda like this. And you know what, that sounds cool enough that maybe it should be made into a movie or something…
Along with the new Scooba 230, iRobot has today unveiled a redesigned version of the Roomba, the 700 series. There are three different models: the 760, 770, and 780, and similar to other Roomba series, they mostly seem to differ from each other in frills. Here are the core upgrades from the 500 series:
-New design is smaller and sleeker.
-Battery life is 50% longer than previous generations (although it’s not clear whether the comparison includes the ‘premium’ Roombas that already had increased battery life).
-I’ll quote this from the PR: “Persistent Pass Cleaning Pattern – when Roomba senses excessive dirt and debris, it uses a brush-like, back and forth motion to focus its cleaning effort in the dirty area it has detected.” Interesting; we’ll have to see it in operation.
The 770 and 780 include a few extras not present in the 760:
-Also quoted from the PR: “Debris Detector uses an optical sensor to detect larger, soft particles on the floor like popcorn, lint or paper chads, so Roomba can respond by focusing its cleaning pattern to ensure deeper, concentrated cleaning in that area.” The 760 doesn’t do this, so we’ll have to find out how exactly this differs from the regular ‘dirt detect’ feature that the 500 series Roombas have, and whether that feature is present in the 760.
-They both light up an indicator light when their dust bins are full.
-The 780 has a fancy capacitive touch sensor interface. No more buttons!
The Roomba 760 starts at $449; the 770 and 780 will certainly be more expensive, possibly in $50 increments, but we’ll find out shortly… We’ll be getting our first look and hands-on at CES starting Tuesday, and we’ve just scheduled a personal demo and interview on Friday, so stay tuned.
To get into the holiday mood, what better than watching some crazy robot videos! Make a holiday video featuring any robot, real or not, and put it on YouTube. Send us a link and we'll feature it on our dedicated playlist and on our website!
For now two videos are up, but keep your eyes open: I know of at least a couple of other submissions in preparation!
"Charles, I think this is the beginning of a beautiful friendship," Peter Robinson says to the passenger sitting in the car next to him.
The passenger is a robot head that Robinson, a professor of computer technology at Cambridge University in England, is using to explore the role of emotions in human-machine interaction.
Can computers understand emotions? Can computers express emotions? Can they feel emotions?
These are the questions that Robinson and his team at Cambridge's Computer Laboratory want to answer.
When people talk to each other, they express their feelings through facial expressions, tone of voice, and body postures. The interesting thing is that humans do the same even when they are interacting with machines.
So could we build better computers, robots, and other machines if they could understand and respond to these hidden signals?
Robinson's team believes the answer is yes. They are developing systems to analyze faces, gestures, and speech and infer emotions. They hope these systems could improve human-machine interactions in real situations.
Charles is a robotic head modeled on Charles Babbage. (Am I the only one who didn't notice the similarity? And is Charles a Hanson Robotics creation?) It's one of the research tools Robinson uses in his experiments, which include riding a car simulator with the robot as a GPS assistant.
"The way that Charles and I can communicate," Robinson says in a short movie called "The Emotional Computer" [watch below], "shows us the future of how people will interact with machines."
Do you agree? Would you replace your car GPS with Charles the robot head?
Image and video: "The Emotional Computer"/Cambridge University