Automaton

Aggressive Quadrotor Maneuvers Are Totally Nuts


Every once in a while, we get to see a video of a robot doing something that makes us think "OMG WTF THAT’S WICKED CRAZY IMPOSSIBLE!!!" And then, we remember that crazy stuff is entirely possible, because we’re talking about robots, and we have to stop thinking about what is and is not possible in terms of human capabilities.

This is one of those videos:

I don’t have much more info for you than what’s in the video, unfortunately, but it does look like these maneuvers (while obviously autonomous) are currently restricted to an area with a whoooole bunch of sensors that can tell the robot where it is with an accuracy (and frequency) that’s probably pretty impressive.

If you remember, we’ve seen both autonomous acrobatics and autonomous landing on slopes by UAVs, but nothing like this… The precision of these maneuvers is just totally completely nuts.

[ GRASP Lab ] via [ DIY Drones ]

Willow Garage's PR2 Robots Graduate

It's graduation season, and yesterday Willow Garage, a start-up dedicated to accelerating the development of personal robots, sent its first graduation class of PR2s off into the world. These 11 robots are heading out to universities and labs in Germany, Japan, Belgium, and the United States, where they will help researchers figure out how robots can assist the elderly and the autistic, navigate buildings and open doors, and help people do house chores, to name just a few of the many projects in the works. At the graduation party in Menlo Park, Calif., some of the researchers told IEEE Spectrum about their plans for these robots. And then it was time to celebrate.

And here's Willow Garage showing off the PR2 at a pre-party press conference. The video was recorded by Spectrum's Erico Guizzo, who was embodied as a Texai, a telepresence robot also created by Willow:

How I Became a Texai Robot and Went Partying

I couldn't make it to Willow Garage's PR2 robot launch party last night, so I went as a robot.

While some 400 people dragged their physical bodies to the event in Menlo Park, Calif., I sat in my living room in Brooklyn, N.Y., and uploaded myself into a robot surrogate.

Using this telepresence robot, called Texai, I was able to move around, see and talk to Willowites and guests, and sip WD-40 cocktails. Just kiddin'. No drinks for robots.

Two Willow engineers built the first Texai prototype just for fun, using spare parts they found in the office. The robot proved so useful it became an official project at Willow, which has built 25 of them.

This was the second time this month I've embodied a robot, which convinces me humanlike androids and robotic surrogacy will become a reality sooner than we think.

But the star of the night, and the reason for the party, was another robot, Willow's PR2. Or more precisely, the 11 PR2s that Willow is giving away to institutions all over the world to speed up research in personal robotics.

The PR2 is a mobile robot with advanced vision and manipulation capabilities. Each costs several hundred thousand dollars. But what makes the robot stand out is its software: the Robot Operating System, or ROS, a powerful, open source robotics platform that Willow is building.


Eric Berger (left) and Keenan Wyrobek of Willow Garage show off the PR2 robot. I attended the event via a telepresence robot (that's my face on the bottom left corner).

My colleague Tekla Perry, who was also at the event (in her physical body), interviewed several PR2 recipients and will be posting videos.

For me the most interesting part was being a Texai for a night.

The robot's head is a standard flat-screen monitor, fixed atop a long metal pole. The Texai uses a wheel system similar to the PR2's. And it also runs ROS, which handles the motor controllers and teleoperation functions.

People I talked to via the robot really wanted to know how the driving works. The Texai uses Skype to establish a two-way video link, and a Web page shows a simple, intuitive control interface [below].


You just use the mouse to hold and drag a little red ball and the robot moves. You can also make the head camera point in different directions, or switch to an auxiliary camera that shows the robot's wheels, to help while navigating through furniture and feet.
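Under the hood, that drag-the-ball interface boils down to turning a mouse displacement into velocity commands for the base. Here's a minimal sketch of the idea (my own illustration, not Willow's code; the speed limits, scaling, and sign conventions are all assumptions):

```python
# Hypothetical sketch: map a mouse drag (in pixels from the ball's rest
# position) to (linear, angular) velocity for a differential-drive base.
# Constants and conventions are assumptions for illustration only.

MAX_LINEAR = 0.6   # assumed top speed, m/s
MAX_ANGULAR = 1.0  # assumed top turn rate, rad/s
DEADZONE_PX = 5    # ignore tiny drags so the robot doesn't creep

def drag_to_command(dx_px, dy_px, radius_px=100):
    """Dragging up drives forward, dragging right turns right; the farther
    you drag, the faster the robot goes, up to the clamps above."""
    if abs(dx_px) < DEADZONE_PX and abs(dy_px) < DEADZONE_PX:
        return 0.0, 0.0
    forward = max(-1.0, min(1.0, -dy_px / radius_px))  # screen y grows downward
    turn = max(-1.0, min(1.0, dx_px / radius_px))
    return forward * MAX_LINEAR, -turn * MAX_ANGULAR

# Drag the ball 60 px up and 30 px right: drive forward while curving right.
print(drag_to_command(30, -60))
```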

Learning how to drive is easy. But safety first! Willow makes new Texai users watch a video showing all the things you should not do with the robot -- drive down a stairway, let children ride on it, stick a screwdriver into its body.

At first my driving was in grandma-mode. But after a few minutes I felt comfortable driving faster and more fearlessly. You can move in any direction, slowly or rapidly, as well as rotate on your vertical axis.

The robot has a plastic bumper, so it won't damage walls, furniture, or a person's leg, for that matter. I did manage, though, to get myself stuck against a wall.

"May I help you?" said Sanford Dickert, my driving instructor and escort at Willow [see photo at the beginning of this post]. Yes, please! He nudged me -- well, the robot -- and off I went.

I headed out to the EE lab to talk to Dallas Goecker, a Willow electrical engineer living in Indiana who, along with Curt Meyers, built the first Texai prototype. Goecker, as he does every workday, was present as a Texai.

So there we were: Two Texai talking to each other screen to screen.


Dallas Goecker, Texai co-creator, and I (inset, bottom left) meet face to face -- or screen to screen.

Goecker told me that being a robot became so natural for him that he sometimes can't recall whether he did something -- a discussion with a coworker, say -- as a person or as a robot.

So what is Willow going to do with their 25 Texai? They're not selling them. So far they're doing some field tests at undisclosed sites and collecting feedback.

For a company focused on open source projects, Willow is surprisingly secretive about the Texai. My guess is the robot has commercial possibilities that they want to explore. Especially when you have a Google CEO showing it off at parties.

Or maybe they'll just give the robots away for free.

After the press conference, the party was to continue at a tent outside the building. I was told there was just one robot for reporters. John Markoff was in it and he was not getting out. Damn you New York Times!

Oh, well. I hung out inside with other guests and my Texai brothers. It was fun. The highlight was meeting people I'd spoken with via e-mail or on Twitter but had never met in person: BotJunkie's Evan Ackerman, GetRobot's Noriko Kageki, and Hizook's Travis Deyle -- some of the world's top robotics bloggers!

I even met some celebrities. As a huge MythBusters fan, it was great to chat with Tory Belleci, who for some reason wouldn't stop laughing at me, or the robot, or both [below].

My robotic existence wasn't perfect. More than once people turned their backs to me, unintentionally, apparently taking me for a piece of high-tech furniture. Other people were clearly uncomfortable with a talking monitor and pretended I wasn't there.

Sometimes I couldn't hear people and vice versa. Twice I lost connection and had to log on again. And once my video feed froze and my face, I was later told, became a Francis Bacon portrait.

But overall it was a great experience. In the future, I see no reason why people wouldn't rely heavily on telepresence robots to attend meetings, interact with coworkers, and -- why not -- go partying.


Robots: The Nao Humanoid

Aldebaran's Nao robot

In its latest episode, Robots, the podcast for news and views on robotics, takes a closer look at French robotics company Aldebaran and its humanoid Nao. Aldebaran's Vice President of Engineering, Luc Degaudenzi, and his colleague Cédric Vaudel, Aldebaran's Sales Manager for North America, discuss the Nao's success in the RoboCup Standard Platform League, share details on the robot, and outline how they see the future of the market for humanoids. Read on or tune in!

Read also:

The Robots Podcast: 50 Years of Robotics (Part 1)
Fri, April 23, 2010

Blog Post: The Robots podcast is celebrating its 50th episode.

Robots: 50 Years of Robotics (Part 2)
Sat, May 15, 2010

Blog Post: The Robots podcast celebrates 50 episodes.

Aldebaran Robotics seeking beta-testers for its Nao humanoid robot
Thu, June 04, 2009

Blog Post: The French robotics company is inviting robot enthusiasts in France and the U.K. to try Nao -- for a fee

Microsoft Shifts Robotics Strategy, Makes Robotics Studio Available Free

Updated May 20, 4:23 p.m.: Added National Instruments comments; 5:49 p.m.: Added Willow Garage comments; May 21, 11:21 a.m.: Added details on competing robotics software platforms; 1:50 p.m. Added Herman Bruyninckx comments.

Microsoft's new and now free release of its Robotics Developer Studio includes new 3-D simulation environments like this multi-level house.

Over the past year or so, Microsoft's robotics group has been working quietly, very quietly. That's because, among other things, they were busy planning a significant strategy shift.

Microsoft is upping the ante on its robotics ambitions by announcing today that its Robotics Developer Studio, or RDS, a big package of programming and simulation tools, is now available to anyone for free.

Previously, RDS had multiple releases: one free but with limited features, a full commercial version that users could purchase, and an academic version distributed only to partners.

By releasing a single version with full capabilities and at no cost, Microsoft wants to expand its RDS user base, hoping to amass a legion of hobbyists, researchers, entrepreneurs, and other robot enthusiasts who will come up with the next big things in consumer robotics.

I spoke about the new plan with Stathis Papaefstathiou, who leads the robotics group and is responsible for Microsoft’s robotics strategy and business model.

"We decided to take out all of the barriers that today our users might have in order to help them build these new [robotic] technologies," he told me.

Papaefstathiou (pronounced papa-ef-sta-THI-u) says that price is a big limitation for mass-produced robots. "That means that in the consumer space it's not about sophisticated hardware, it's about the software stack."

He says RDS has been downloaded half a million times since it launched in 2007. The company estimates it has about 60,000 active users.

Those are respectable numbers but they didn't help Microsoft fulfill its goal of kick-starting a vast and lucrative robotics development ecosystem -- like MS-DOS and Windows did for the PC.

So over the past two years the robotics group, which is part of an elite software division called the Startup Business Group, led by Amit Mital, who reports to Craig Mundie, set about devising a plan to expand Microsoft's stake in robotics.

Not everyone is convinced the new plan makes sense.

"This is all just a non-event," says Herman Bruyninckx, a robotics professor at K.U. Leuven in Belgium and coordinator of EURON, the European Robotics Research Network.

Bruyninckx, an advocate of free and open source software who started OROCOS, or Open Robot Control Software, a framework for robot control, says that making RDS free is not a change in strategy and nobody he knows in the robotics community is "talking about RDS, let alone using or planning to use it."

Papaefstathiou says that in addition to creating a single RDS release, Microsoft is also making the source code of selected program samples and other modules available online, hoping to improve collaboration among users. In particular, the group wants to entice the growing community of hobbyists, do-it-yourselfers, and weekend robot builders.

He also says there will be closer collaboration with other projects at Microsoft. He mentions Project Natal, a motion tracking user interface that Microsoft is creating for the Xbox 360. He says Natal's ability to track gestures could "be available also in solutions where human-robot interaction becomes important."

Will the new strategy work?

A factory model is now part of Microsoft Robotics Developer Studio's simulation environments.

RDS is not a robot operating system -- it's a comprehensive set of development tools, samples, and tutorials. It includes a visual programming interface, a popular 3-D simulator, and also Microsoft's CCR and DSS runtime toolkit.

But despite its broad range of tools, RDS works best with the specific robot platforms it supports, including iRobot's Create, LEGO Mindstorms, CoroWare, Parallax, and others.

These are great robot platforms but by no means the only ones. In fact, many budding roboticists today are using Arduinos and programming ATmega microcontrollers to build innovative robots. Why would they need RDS?

Microsoft has plenty of competition as well. Other robotics software platforms include Urbi by French firm Gostai, ERSP by Evolution Robotics, and the Player/Stage Project.

One platform that is rapidly gaining adoption and has shown impressive results is the Robot Operating System, or ROS, a broad set of open source tools by Silicon Valley robotics firm Willow Garage.

Other users, including a growing number of high school students participating in the popular FIRST robotics competition, use National Instruments' LabVIEW tools and controllers to program their robots.

Papaefstathiou acknowledges that there are alternative software packages that can do some of the things -- visual programming and simulation, for example -- that RDS does, but he insists that "there's no single competitor for the overall toolset that we have."

As for Willow Garage, Papaefstathiou says they're "targeting different platforms and different capabilities," adding that some of the robots they're using are half-million-dollar systems.

"People are doing a great job in developing robotics technology there, but this is not something that goes into scale," he says. "And we here in Microsoft we are about scale."

Not surprisingly, Willow Garage disagrees.

"We designed ROS to be flexible and open, because researchers and application developers alike need to be able to inspect, improve, and extend the system," says Brian Gerkey, Willow Garage's director of open source development. "As a result, ROS is now used on a wide variety of robots, from inexpensive iRobot Creates to sophisticated humanoids and even autonomous cars. It's only through open source that we can reach this level of adoption and community involvement."

National Instruments, for its part, welcomes Microsoft's move.

"I'm glad to see that National Instruments, Microsoft, Willow Garage and other major players are aligned on a critical missing element to the robotics industry crossing the chasm and really taking off," says Shelley Gretlein, senior group manager of NI's real-time and embedded software. "The key is in the development software. Lowering the software barriers will make it easy to get into robotics."

Microsoft established the robotics group in 2007 under the leadership of Tandy Trower, a software veteran who'd headed some of Microsoft's largest and most successful businesses, eventually becoming a minister-without-portfolio reporting directly to Bill Gates.

Trower and Gates believed the consumer market was the right place for the next big innovation in robotics, finding parallels with the beginnings of the PC industry, a view Gates described in a now-famous Scientific American article, "A Robot in Every Home."

But things changed late last year when Trower left Microsoft to start a healthcare robotics company. The company chose Papaefstathiou, an unashamed Trekkie -- "Data is very inspirational" -- with a background in high-performance computing, as the robotics group's new leader. It's up to him now to turn Gates' a-robot-in-every-home vision into reality.

I do see potential for a big expansion of RDS. But my impression is that it will be strongest among schools and universities. Now any engineering school in, say, Brazil, Russia, India or China, could use it and have students programming robots, or at least simulating them.

The question is, Will promising, cool robotics products for the consumer market emerge from a larger RDS community? I asked Papaefstathiou what kinds of commercial robots he envisions would be around.

He wouldn't give me specific examples, preferring to say it was up to "the community to think broader about the scenarios."

"Consumer robotics is a new product category and building [applications] there requires leveraging the capabilities and inspiration of a broader community," he says. "This is exactly what we want to do.”


Microsoft's robotics squad. Left to right: Stathis Papaefstathiou (general manager), Russ Sanchez (creative director), Branch Hendrix (business development), Stewart MacLeod (development), Hunter Hudson (quality), Mukunda Murthy (program management), George Chrysanthakopoulos (distinguished engineer), and Chris Tham (engineering).

Images: Microsoft Robotics Group

Read also:

The Conference Room That Re-Arranges Itself
Wed, May 19, 2010

Blog Post: Just pick how you want it set up and the tables move themselves into position

Autonomous Car Learns To Powerslide Into Parking Spot
Mon, May 10, 2010

Blog Post: Stanford's "Junior" combines two control schemes to stop right on target

Robots With Knives: A Study of Soft-Tissue Injury in Robotics
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

Willow Garage Giving Away 11 PR2 Robots Worth Over $4 Million
Tue, May 04, 2010

Blog Post: The robotics company has announced the 11 institutions in the U.S., Europe, and Japan that will receive its advanced PR2 robot to develop new applications

The Conference Room That Re-Arranges Itself

You can add a new entry to the long list of problems that can be solved by robots: arranging tables in a conference room. On my personal workplace hassle scale, I'm not sure that moving conference room furniture ranks much above "occasional nuisance." But Yukiko Sawada and Takashi Tsubouchi at the University of Tsukuba, Japan, evidently find shoving tables to be an unappealing task for humans. So they built a room that could re-arrange itself.

In this case, the tables are the robots. Select the arrangement you want from a graphical interface, and the tables will move to their new locations. The movement is monitored by an overhead camera with a fish-eye lens, and the software uses a trial-and-error approach to determine the best sequence of motion. But it's best to see the room in action for yourself. Check out the video the researchers presented at ICRA earlier this month.
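To give a flavor of what that trial-and-error sequencing might look like, here's a toy sketch (my own illustration of the general idea, not the authors' actual algorithm; the clearance threshold and the forced-move fallback are assumptions): tables move one at a time, and any move whose straight-line path would pass too close to a table that hasn't moved yet gets deferred.

```python
# Toy sketch of ordering table moves so that each move's path stays clear of
# tables still waiting to move. Illustrative only; not the Tsukuba system.

def path_blocked(start, goal, obstacles, clearance=0.8):
    """Crude check: is any obstacle within `clearance` meters (assumed) of
    the straight segment from start to goal?"""
    (x0, y0), (x1, y1) = start, goal
    dx, dy = x1 - x0, y1 - y0
    seg_len2 = dx * dx + dy * dy or 1e-9
    for ox, oy in obstacles:
        # Closest point on the segment to the obstacle.
        t = max(0.0, min(1.0, ((ox - x0) * dx + (oy - y0) * dy) / seg_len2))
        px, py = x0 + t * dx, y0 + t * dy
        if (ox - px) ** 2 + (oy - py) ** 2 < clearance ** 2:
            return True
    return False

def plan_sequence(positions, goals):
    """Pick a move order: prefer moves with clear paths; if everything is
    mutually blocked, force one move and keep going."""
    pending = list(range(len(positions)))
    order = []
    while pending:
        movable = [i for i in pending
                   if not path_blocked(positions[i], goals[i],
                                       [positions[j] for j in pending if j != i])]
        i = movable[0] if movable else pending[0]
        order.append(i)
        positions[i] = goals[i]
        pending.remove(i)
    return order

# Table 0 wants to move to where table 1 is sitting, so table 1 goes first.
print(plan_sequence([(0, 0), (2, 0)], [(2, 0), (4, 0)]))  # -> [1, 0]
```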

In the paper, the authors explained the rationale for the project:

In these days, at conference rooms or event sites, people arrange tables to desired positions suitable for the event. If this work could be performed autonomously, it would cut down the man power and time needed. Furthermore, if it is linked to the Internet reservation system of the conference room, it would be able to arrange the tables to an arbitrary configuration by the desired time.

I'm not sure the cost and complexity of such a system could ever be low enough to be practical, but there's definitely something fun about watching the tables reconfigure themselves. And if you already have autonomous furniture, why not go all the way and add a reconfigurable wall?

Review: Neato Robotics XV-11

The Neato Robotics XV-11 robot vacuum made its first appearance in December of last year, and we got a brief hands-on with it the following January at CES. Like the iRobot Roomba, the XV-11 is an autonomous robotic vacuum. Unlike the Roomba, the XV-11 maps the room it’s cleaning and follows an efficient pattern to minimize cleaning time. Neato says that the XV-11 is smart, fast, and powerful, and they lent us a unit for a day to test out… How’d it fare? We’ll show you, with lots of pics and a video, after the jump.

-Design

The first thing you’ll notice about the Neato XV-11 is that it has a square front, which sort of makes it look like it’s going backwards most of the time until you get used to it. The square front helps the robot clean more effectively along walls and in corners. The front also has a wrap-around bumper that actuates if it encounters an obstacle, and small sensors on the sides of the bumper help it avoid obstacles while turning. Around the back (which is the round bit, remember) are the exhaust vent for the vacuum, the charging contacts, a plug-in charging port, and a potentially exciting little USB port.

Underneath, the XV-11 is pretty straightforward. It has a single spinning brush with rubber flaps, and that’s it. There’s a little squeegee blade behind the brush, and the vacuum itself is back inside the brush compartment. Edge sensors around the front keep the robot from falling down stairs.

On top is a recessed latch for the dust bin, the other side of which is a recessed handle that you can use to pick the robot up. The giant orange button wakes the robot up and starts it cleaning, while you can set the rest of the options on the little LCD screen (more on that later).

The last interesting bit is, of course, the dome on top of the robot that houses the laser scanner (or distance sensor, if you prefer). There’s a laser emitter and a receiver, and they spin around inside to make a map of the room that the XV-11 is cleaning. The laser has a power of 2.1 mW at a wavelength of 785 nm, which is in the near-infrared, so you won’t see it. It’s also designated as Class 1, which means that it is safe under all conditions of normal use (for humans and pets).

The design of the charging dock is pretty clever. It’s a little bulky, but it’s flattish, so it’s minimally intrusive set against a wall. Since the XV-11 doesn’t have to drive up onto anything (it just presses against the contacts), it’s harder for it to accidentally shove the dock around. Also, part of the reason that it’s bulky is that it opens up to reveal hiding places for the power adapter and extra cord. Need more cord length? Pull some out. Need less? Stuff it back in. Very handy.

-Features

It’s a little bit difficult to discuss a list of features on a robot with a selling point of “push one button and it does everything you need it to do on its own.” But if that’s not enough for you, there are ways to avoid having to actually bend down and push that button.

The XV-11 features on-board scheduling, which lets you set different times on different days for the robot to wake itself up, clean your floor, and then go back to its dock. I set it up without reading the manual, which is exactly how easy every interface should be. The LCD also provides status and support information, and lets you pause and resume cleaning and direct the robot to go back to its dock.

-Cleaning Technique

Cleaning technique is what makes the XV-11 so interesting. When the robot starts to clean a room, it’ll move out into the room a bit and then spin up its laser rangefinder and start to map. It looks for walls, doors, obstacles, and tries to identify areas where it needs more information. When it has some idea of how your room is laid out, it decides what route to take and begins to vacuum, continuing to map as it goes, which allows it to adapt to changes that happen while it’s cleaning (new objects on the floor, moving furniture, stuff like that).

While cleaning my living room, the XV-11 began by going around the outside of the room to where my sofas and coffee table are. It spent a bunch of time getting into all the nooks and crannies around the sofas and under the table legs, and then finished cleaning around the perimeter of the room. Finally, it covered the open space in a series of straight back and forth lines, shut off its vacuum, and made a beeline back for its dock, job done. Total time elapsed: just over 12 minutes.
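That perimeter-then-straight-lines behavior is essentially boustrophedon coverage. Here's a tiny sketch of the open-space part of the idea, assuming the robot has already turned its laser scans into a grid of free and occupied cells (a generic illustration, not Neato's proprietary planner):

```python
# Minimal boustrophedon (back-and-forth) coverage sketch over an occupancy
# grid. Returns the order in which free cells get visited; steering around
# obstacles between cells is left to a local planner. Illustrative only.

def boustrophedon_path(grid):
    """grid[y][x] is True for free floor, False for walls/furniture."""
    path = []
    for y, row in enumerate(grid):
        xs = [x for x, free in enumerate(row) if free]
        if y % 2 == 1:            # alternate sweep direction each row
            xs = xs[::-1]
        path.extend((x, y) for x in xs)
    return path

room = [
    [True, True, True, True],
    [True, False, False, True],   # a coffee table blocks the middle here
    [True, True, True, True],
]
print(boustrophedon_path(room))
```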

While cleaning, the XV-11 appears remarkably intelligent. It moves with purpose and with a recognizable pattern. Despite its shape (which prevents it from making zero-radius turns in tight spots), the amount of information that it gets from its laser sensor, bumper, and side sensors gives it very good spatial awareness, and it didn’t get stuck once. In some cases, it took the XV-11 a little bit longer to move around complex areas like forests of chair and table legs because of its square front, but it knows what shape it is and has no trouble getting around things. It’s also low enough to fit under most furniture, and it’s pretty determined… If it thinks it can squeeze underneath something, it’ll try as hard as it can to do so.

The XV-11 has no trouble moving from room to room. When the laser sensor maps an area, it pays special attention to anything that looks like a doorway, and remembers that it needs to go through there later after cleaning the room that it’s currently in. If the robot is cleaning multiple rooms and gets low on battery, it will remember its location and the progress it made, go back to its dock and charge, and then return to where it was before and finish up.
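That save-your-place-and-resume behavior is, conceptually, a small state machine. Here's a hedged sketch of the logic (my own guess for illustration; the battery thresholds, room list, and command names are assumptions, not Neato's firmware):

```python
# Sketch of pause-charge-resume cleaning logic. Illustrative assumptions only.

class CleaningSession:
    def __init__(self, rooms):
        self.rooms_left = list(rooms)   # rooms still to be cleaned
        self.resume_pose = None         # where to pick up after charging

    def step(self, battery_pct, pose):
        if self.resume_pose is not None:        # we broke off to recharge
            if battery_pct < 95:
                return "CHARGE"
            target, self.resume_pose = self.resume_pose, None
            return f"DRIVE_TO {target} AND RESUME"
        if battery_pct < 15:                    # low battery mid-clean
            self.resume_pose = pose             # remember our progress
            return "RETURN_TO_DOCK"
        if not self.rooms_left:
            return "RETURN_TO_DOCK"             # job done
        return f"CLEAN {self.rooms_left[0]}"

s = CleaningSession(["living room", "hallway"])
print(s.step(80, (1.0, 2.0)))   # CLEAN living room
print(s.step(12, (3.5, 0.8)))   # RETURN_TO_DOCK (pose saved)
print(s.step(97, (0.0, 0.0)))   # DRIVE_TO (3.5, 0.8) AND RESUME
```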

If there are areas that you don’t want the XV-11 going into, you can keep it out by laying down a magnetic strip that the robot will vacuum up to, but not across. Fifteen feet of strip is included with the robot, and you can cut it up to suit your needs.

-Cleaning Effectiveness

We’d seen the XV-11 in operation before, but only on square pieces of carpet that don’t accurately reflect what most people’s homes look like. My living room, on the other hand, features hardwood, deep carpet, shallow carpet, tables, chairs, cords on the floor, and the occasional cat. In other words, it’s a pretty typical living room, full of complications and potential hazards.

The XV-11 had no trouble with any of these things… In fact, it managed not to get stuck at all, which bodes well for its overall autonomous intelligence and robustness. The robot took just over twelve minutes to clean my living room, and it clocked nearly exactly the same time each time it cleaned. This is interesting, because it implies that the robot is calculating an efficient way to clean the room, and then recalculating a similar efficient pattern each time it cleans. Here’s what the pattern looks like:

It’s pretty easy to see what Neato is talking about here when they say that their robot cleans efficiently and in straight lines, because that’s exactly what it does. Most of the floor, the robot covers exactly once. In more complex areas it spends more time, but that’s less about inefficiency and more about simply maneuvering. In fact, the robot shuts off its vacuum before it returns to its charging base, because it knows that the floor has been entirely covered and there’s no point wasting energy keeping the vacuum running on the way home.

Since the XV-11 covers most areas of the floor only once, it’s important that it cleans effectively. And for the most part, it very much does. Compared to an upright vacuum, the robot did just as well or better on hardwood, and comparably on carpet. There were only two circumstances in which the XV-11 didn’t clean especially well, both illustrated in this picture:

Issue one is the dirt around the coffee table leg. While cleaning, the robot was able to consistently get pretty close to the leg itself, but its square front didn’t help it clean more effectively there. The problem might be that no matter how close the robot gets to the table leg, there’s a limit to how effectively it can clean there, since the brush and vacuum don’t extend the full width of the robot. The same is true (in general) for cleaning along walls and in corners… Due to the design of the robot, there are some areas where the vacuum just can’t reach no matter how close the robot gets. However, it’s worth keeping in mind that this is the same for upright vacuums as well, which is why they come with hose attachments and stuff.

The other issue is that the XV-11 isn’t that great at getting cat hair out of carpet. It gets most of it, but for particularly tenacious hair, the rubber brush isn’t as effective as a bristle brush might be.

As far as cleaning along walls and getting into corners goes, the XV-11 is shaped better for these tasks, but it doesn’t always make it into a corner in its most effective orientation. For example, if the robot is following a wall on its right side and encounters a recessed doorway, it will turn into the doorway and follow closely along the door, but it misses the corner to its right where it first turned in. I’m being pretty picky about this, but it’s worth mentioning.

-Maintenance

While the XV-11 might have some minor issues picking up pet hair, the upside is that after three runs around my living room (shared by three cats), here’s what the XV-11’s brush looked like:

And it’s not like it wasn’t picking stuff up, either:

To clean out the dust bin, you just lift it out of the top of the robot. To actually get the dust out, you have to remove the air filter (it snaps in and out), but this is actually kinda nice, since the dust bin stays closed until you get it to a place where you can dump it. Since brush maintenance seems to be minimal, emptying the dust bin is basically all you need to worry about on a regular basis.

If you do need to clean out the brush area, it’s easy. The bottom panel releases with two clips, and you can pull the brush right out. It’s belt driven, which is kinda cool, and reinstalling it is as simple as sliding it back under the belt again.

And that’s pretty much it. Conceivably, you’ll need to replace the air filter and possibly the brush or squeegee, and eventually, the batteries will wear out. All of these bits are available on Neato’s website for prices that are sort of reasonable, as long as you don’t need to do it too often.

-Conclusion

Overall, the Neato XV-11 cleans fast and efficiently. It has some minor issues with a few specific aspects of how it cleans, but I feel like there is a great deal of potential in this robot. This is not to say that the XV-11 isn’t already impressive… It’s more that there are lots of ways the robot could potentially be tweaked to make it even better at what it’s already good at, especially considering the amount of information (and level of detail) that its sensors collect.

For example, if the XV-11 can recognize a closed door, maybe its algorithm could be slightly modified to make an extra pass across the door from the opposite direction to be sure to get both corners. What I’m really hoping is that at some point, Neato will allow the users of its robots to plug into that USB port to take advantage of the XV-11’s impressive suite of sensors and modify its behavior themselves. Have some issue with the way your robot cleans? No problem, tweak it yourself, or download another user’s software over the internet.

On the hardware side, putting some bristles on the brush might make the XV-11 better at getting pet hair out of carpet, but might also make the brush more of a chore to clean out. Still, if Neato offered it as an option, then pet owners could decide whether or not they’d like to have less pet hair on their carpet and more tangles in their robot’s brush.

I wouldn’t worry too much about these quibbles, however… I believe that, just as iRobot has, Neato will listen to its users and make upgrades and improvements based on real-world feedback, and there’s no reason not to get one of the first-generation XV-11s if you feel that its cleaning technique is right for you.

The Neato XV-11 is on pre-order for $399, to be available “this summer.” This makes it $50 more expensive than the Roomba 560 that we looked at yesterday, and $50 cheaper than the Roomba 570, which is (for all practical purposes) the highest-end Roomba model. Tomorrow, we’ll compare them more directly, but the point is that the XV-11 is quite comparable to Roombas with similar capabilities in terms of price, meaning that if you’re considering a robot vacuum, the XV-11 should definitely be on your list.

[ Neato Robotics XV-11 ]

Anybots QB Telepresence Robot Lets You Be At the Office ... Without Being There


Meet QB. This skinny alien-looking robot may soon replace you at work.

But don’t worry. It doesn’t want your job. QB is a robotic stand-in for workers. You control it remotely as a videoconference system on wheels. Embodied as a QB, you can attend meetings, drop by a coworker’s office, even confab at the water cooler.

You can control your robotic self from anywhere using a computer connected to the Net. It’s a bit like the recent Bruce Willis movie Surrogates. Except QB is less, uh, muscular.

Anybots, a robotics start-up in Mountain View, Calif., is officially unveiling the telepresence robot today. QB will be available in the fall for US $15,000.

"We wanted to create a technology that allows remote workers to collaborate more fully -- and feel part of the team," founder and CEO Trevor Blackwell told me when we spoke a few weeks ago.

What they created is a sophisticated mobile robot. Its base houses a compact computer, two Wi-Fi interfaces, a LIDAR-based collision-detection system, powerful motors, and a lithium-ion battery pack that lasts 8 hours, or enough for a full day of work.

The head has a 5-megapixel video camera pointing forward, a lower-resolution camera pointing down at an angle to help with driving, three microphones and high-quality speakers, and -- my favorite feature -- a laser pointer that shoots green light from one of its eyes.

The 16-kilogram [35-pound] robot rolls on two wheels using a custom self-balancing system, an approach that Blackwell says is more power-efficient, lets the robot drive over bumps, and has proved quite stable. QB can rotate around its vertical axis, turn easily, and drive at 5.6 kilometers per hour [3.5 mph].
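For the curious, keeping a two-wheeled base like this upright usually comes down to a fast feedback loop on the body's tilt: lean forward a little, and the wheels drive forward to tuck the base back under the center of mass. Here's a minimal sketch of that kind of loop (a generic PD balance controller with made-up gains and limits, not Anybots' actual control code):

```python
# Generic PD balance sketch for a two-wheeled inverted-pendulum base.
# Gains, limits, and sign conventions are illustrative assumptions.

KP, KD = 35.0, 4.0   # assumed proportional / derivative gains
DT = 0.01            # assumed 100 Hz control loop

def balance_step(tilt_rad, tilt_rate_rad_s, lean_setpoint=0.0):
    """Command wheel torque toward the direction of the lean so the base
    rolls back under the center of mass (positive torque = forward)."""
    error = tilt_rad - lean_setpoint
    torque = KP * error + KD * tilt_rate_rad_s
    return max(-5.0, min(5.0, torque))   # clamp to assumed motor limit, N*m

# A small forward lean produces a forward torque that catches the robot.
print(balance_step(tilt_rad=0.02, tilt_rate_rad_s=0.1))
```

To drive forward, a controller like this can hold a small nonzero lean setpoint rather than commanding wheel speed directly.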

Anybots says "robocommuting" could not only improve collaboration but also save companies time and money. Employees can work from home or other locations and cut down on commuting and travel.

But the question I -- and I guess many other people -- might ask is, Why do you need a robot if you have pretty decent videoconference systems? Cisco Systems, the leader in this area, even uses the term "telepresence" for its products. (Jack Bauer is a major "customer," by the way.)

"Videoconference is confined to structured environments like conference rooms," says Bob Christopher, Anybots' COO. "We want people to talk and interact in non-structured environments, anywhere."

"With QB," he adds, "you can continue talking to your colleagues after you left the conference room."

To use QB you don’t need to add any extra hardware to the office -- all it needs is a Wi-Fi network. The robot connects to it like any computer and sends and receives video and commands over the Net.

Controlling the robot requires only a Firefox browser and a plug-in from Anybots. You log in and instantly start seeing and hearing what the robot is seeing and hearing.

It’s not Star Trek teleportation, but "incarnating" a robotic body is quite an experience.

I had a chance to try it and will report on my tests in an upcoming feature article in IEEE Spectrum and here on this blog. In the meantime, let us know: Is robotic telepresence the future of work?

QB Specs:
8 hours of battery life
5 megapixel video camera
Supports Wi-Fi 802.11g
3.5 mph normal cruise speed
Price: US $15,000
Availability: Fall 2010

More photos:


Laser shoots from the right eye.


Docked on the recharging station.

Retracted neck, ready to travel.

Photos and video: Anybots

Review: iRobot Roomba 560

The iRobot Roomba is not a new product. Since 2002, it has been (more or less) the only robotic vacuum available to consumers in the US. iRobot has been continually improving the Roomba, however, and the Roomba 560 is one of the latest and most sophisticated models. Now that there’s some new competition on the horizon, it’s a good time to take an updated look at the Roomba and what makes it a reliable and effective autonomous vacuum. We’ll have a review of the aforementioned competition (the Neato XV-11) up tomorrow so that you can compare the two, but for today, we have a review of the Roomba 560.

If you’re not familiar with the Roomba, here’s the deal: it’s a robotic vacuum cleaner that can clean your floors all by itself. All you have to do is tell it to start cleaning, and it’ll go clean, avoiding obstacles and getting around furniture and ultimately returning to its home base to recharge itself. There’s a lot more to it than that, of course… Lots more, after the jump.

The particular Roomba that we’re reviewing is a 560. The 560 is a fifth generation Roomba, which is a significant upgrade from the earlier 400 series. It’s generally about the same size and shape, with a 13″ diameter and a weight of about 8 pounds. It’s also quite stylish, with a slick black and silver color scheme that doesn’t make you want to hide it in a closet like a conventional vacuum. This is good, because having your Roomba in a closet pretty much defeats the entire purpose of a robotic vacuum, especially one with on-board scheduling like the 560 has. But I’m getting ahead of myself.

The Roomba is able to clean autonomously thanks to its suite of sensors. Proximity sensors on the front of the robot work with a physical bump sensor to help the robot avoid walls and maneuver around obstacles and furniture. Drop sensors underneath keep the robot from going over stairs or ledges. On top, the Roomba has an infrared sensor that allows it to find its dock and use virtual walls (more on those later).

All of these sensors provide a limited amount of information about the Roomba’s environment and path, but they don’t directly tell the Roomba where it is in a given room. Instead, the robot relies on an algorithm to tell it where to go next, and cleans in a variable pattern that ends up covering most areas of a room between three and five times. While this pattern looks totally random, it’s not, and the patterns are actually derived from MIT research on the efficient coverage behaviors of foraging insects.
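To make that concrete, here's a toy sketch of what map-free, behavior-based coverage can look like: a handful of reactive behaviors and simple rules for switching between them (my own illustration of the general approach, with made-up behaviors and thresholds; this is not iRobot's actual algorithm):

```python
# Toy behavior-based coverage sketch: no map, just reactive behavior switching.
# Behaviors and thresholds are illustrative assumptions, not iRobot's code.

import random

def choose_behavior(bumped, wall_seen_right, dirt_level):
    """Pick the next motion primitive from simple sensor events."""
    if dirt_level > 0.8:
        return "SPIRAL"                     # linger on an unusually dirty patch
    if bumped:
        # Back off and head out in a new, partly random direction.
        return f"TURN {random.uniform(90, 270):.0f} DEGREES THEN DRIVE"
    if wall_seen_right:
        return "WALL_FOLLOW"                # hug the wall to clean edges
    return "DRIVE_STRAIGHT"

for sensors in [(False, False, 0.1), (True, False, 0.1), (False, True, 0.1)]:
    print(sensors, "->", choose_behavior(*sensors))
```

Run long enough, this kind of randomized switching ends up passing over most of the floor several times, which matches the three-to-five-pass coverage described above.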

To do the actual cleaning, the Roomba combines a vacuum system with two counter-rotating brushes. The brushes help pick up all the big things (with the bristle brush and rubber beater brush working together like a broom and dustpan), while the vacuum itself takes care of smaller particles and dust. All of this stuff ends up in a removable bin at the rear of the Roomba, which incorporates a replaceable air filter. The entire vacuum module (the green piece in the above picture) is flexible and can move up and down, which helps the Roomba adapt to different floors and lengths of carpet. Since the Roomba is fetchingly round, it does have some issues getting into corners. On the right side near the front is a spinning brush that’s designed to mitigate that problem to some extent by sweeping dirt and stuff out of corners and back under the Roomba.

While the Roomba is entirely capable of cleaning by itself, it does take a little bit of work to “pre-clean” your floor for it to be most effective. The robot can’t lift things, of course, so if you have a bunch of stuff all over the floor, you’re not going to get the best cleaning. The Roomba will nudge things a bit, but it tends to get caught on stuff like loose clothing and may become stuck. If that happens, the robot will try to free itself, and if it can’t it’ll stop and beep at you to come free it. Not a big deal, but it does keep it from autonomously completing its cleaning, so it’s best to keep things tidy. Supposedly, the 500 series of Roombas are good at not getting themselves entangled in cords and rug fringes and other stringy things. This has not been my experience. I guess it’s partially my fault for having ten thousand power cords strewn all over the place, but the Roomba likes to grab them and then run away, unplugging things as it does. It also manages to (occasionally) rip out pieces of fringe from a rug. My guess is that the anti-tangle system works if the Roomba considers itself tangled, but it has a tendency to tug pretty hard before it reaches that point.

Overall, I’ve been very impressed with how well the Roomba cleans. I bought my parents a 530, which managed to fill its dirt bin in one run around our living room the day after the carpet had been professionally steam cleaned. The bristle brushes do a great job of picking up stuff like pet hair, and for beating dirt out of carpet. The Roomba is least effective in corners and around objects, where it can’t always get its brushes close enough to the edges of things. I ran the 560 every other day or so for about a week, and in each case, the floor (which is half carpet, half hardwood) was noticeably cleaner when it was finished, and the Roomba’s dirt bin was nearly full. It’s not a substitute for a conventional vacuum, not completely, but it does a pretty darn good job for day-to-day cleaning.

The 500 series Roombas also include a ‘Dirt Detect’ feature, which gives them the ability to sense where there’s a particularly dirty spot and then spend more time there (in a tight spiral). Incidentally, if there’s just one spot you want cleaned, you can set the Roomba down manually and have it ‘Spot Clean’ just that bit instead of the whole room.

To clean an average-sized room takes the Roomba 560 about 45 minutes. This seems like an awfully long time, especially if you watch it at work, which you totally will, because it’s adorable. It can also be frustrating at times, since you start to wonder why it’s covered that particular spot 37 times but still hasn’t managed to catch the rogue dust bunny over by the couch. Remember, while the Roomba may look like it’s just bumbling around randomly, it’s actually following an algorithm designed to cover all areas of a room multiple times. If there are places you don’t want it to go, you can set up little round towers that project a ‘virtual wall’ of infrared light that the Roomba won’t cross. The 560 is able to clean up to four rooms before it needs to head back to its home base to recharge, which it does all by itself when it considers itself finished or when it’s low on battery power.

While the Roomba certainly cleans effectively, it tends to make quite a mess of itself while doing so, which calls for weekly maintenance (or more often, depending on how frequently you run it). I’m not talking about just emptying the dustbin… Dust and hair get trapped in and around the brushes, and even inside the brush bearings themselves, necessitating partial disassembly of the cleaning compartment. It’s very easy to do, but it’s still a chore, and the mess is often extraordinarily tangled and dirty, requiring patience (and scissors) to clean out.

Really, it’s surprising how well the Roomba is able to clean on its own. iRobot has gotten the design to the point where with a little bit of forethought, you can just leave the robot cleaning and come back a few hours later and it’ll be back on its base charging. The 560 gives you some additional options (if you trust it) to schedule cleaning for when you’re not around. This is all done on the robot itself using a few buttons and an LCD… You can set different cleaning times for each day, and the robot will wake itself up, clean your room(s), and then go back to its base. You still have to remember to empty the bin and clean it and stuff, but daily vacuuming doesn’t get much simpler than that.

Although we reviewed the Roomba 560, iRobot makes a variety of different models with slightly different capabilities (and different costs). The base 500 model is the 510, for $280, but you don’t want that one ’cause it doesn’t come with a charging dock. As you go up through the different (and increasingly expensive) models, you gain battery life, on-board scheduling, some accessories, and (eventually) the ability to use Lighthouses, which are special Virtual Wall units that help the Roomba navigate around multiple rooms. The 560 that we reviewed here costs $350 and can’t use Lighthouses, which is funny, since my other Roomba (a 535 model that cost $250 and appears to be discontinued) can. Anyway, when you’re looking at buying a Roomba, it’s important to put some thought into how you’re going to use it. You definitely want a 500 series with a self-charging dock, but as far as other features go, consider how many rooms you’d like it to clean, whether you’d like it to move from room to room on its own, and whether you’re going to start the Roomba cleaning yourself or you’d like it to start by itself (when you’re not home, for example). It’s important to remember, though, that the fundamental cleaning technology is basically the same. You can pay a bit more for some extra features, but the robot isn’t going to navigate any differently or pick up any extra dirt.

The 560 is a fifth generation Roomba. As such, it benefits from a half decade worth of improvements that iRobot has implemented based on customer feedback and testing. It’s a practical and polished robotic vacuum that works in your home and can actually make your life easier… Or at least, make your floors cleaner. You can pick one up at iRobot.com, but I might recommend that you buy it from a retail store like Best Buy so that you can try it out and take it back if it’s not for you. You won’t take it back, though… Once you let it run around your house a little bit, you’ll be sold. It’s awesome.

For more on how the Roomba works, check out our interview with Nancy Dussault Smith, Vice President of Marketing Communications at iRobot.

[ iRobot Roomba 560 ]

Kokoro's I-Fairy Robot Conducts Wedding in Japan


Photo: Mr. Moriyama/Node

The groom is a robotics researcher. The bride works at a robotics firm. Robots brought them together. So when it came time to plan their wedding, the choice only seemed natural: A robot would conduct the ceremony.

The wedding took place today in Tokyo, according to this AP report. The groom was Tomohiro Shibata, a professor of robotics at the Nara Institute of Science and Technology in central Japan; the bride was Satoko Inoue, who works at famed robotics firm Kokoro.

Leading the ceremony was a little humanoid robot called I-Fairy, with a high-pitched voice and flashing eyes. Kokoro, which unveiled the robot earlier this year, designed the I-Fairy as a robot receptionist and entertainer. It sells for 6.3 million yen (US $68,000).

The robot has a humanoid body in a sitting posture and, as the company puts it, its appearance was "based on the image of a lovely fairy." It can talk, gesture with its arms, and detect the presence of a person, according to this story in the Japanese blog Node.

Kokoro says this was the first time a robot celebrated a wedding.

At one point the robot told the groom: "Please lift the bride's veil."

Then the couple kissed.

Watch:

Thanks, Dr. Kumagai!

Read also:

Thomas and Janet: first kissing humanoid robots
Mon, August 24, 2009

Blog Post: Developed by the National Taiwan University of Science and Technology, the theatrical robots performed the first robot kiss during a performance of Phantom of the Opera.

Geminoid F: Hiroshi Ishiguro Unveils New Smiling Female Android
Sat, April 03, 2010

Blog Post: Geminoid F, a copy of a woman in her 20s with long dark hair, exhibits facial expressions more naturally than previous androids

Hiroshi Ishiguro: The Man Who Made a Copy of Himself
April 2010

Article: A Japanese roboticist is building androids to understand humans--starting with himself


