Automaton

Quadrotors Demonstrate Mad Cooperative Juggling Skills

Back in December, I posted a little teaser preview of a talented quadrotor juggling a ball at ETH Zurich's entirely awesome Flying Machine Arena. That quadrotor has been practicing, and has even enlisted a friend. Hey look, now robots can amuse themselves!

Besides the quadrotors, what makes this all possible is an extremely sophisticated motion capture system, so it's unlikely that you'll see these skills (or other tricks like them, for that matter) outside of a tightly controlled environment.

For the record, these are easily the most impressive juggling robots in recent memory, which includes one or two or three or four other bots. Now seriously, put a net up there and let's have ourselves some robot volleyball already.

[ ETH - IDSC ]

Robots Are the Next Revolution, So Why Isn't Anyone Acting Like It?

willow garage pr2 robot
This robot can fetch you a beer. But it will cost you $400,000.

Back in 2006, when Bill Gates was making his tear-filled transition from the PC industry into a tear-filled career as a philanthropist, he penned an editorial on robotics that became a rallying cry for… no one. In the piece, titled "A Robot in Every Home," Gates highlighted the obvious parallels between the pre-Microsoft PC industry and the pre-anybody personal robotics industry: industrial use, research work, and a fringe garage hobby. That was the state of the computer industry before Bill Gates and Steve Jobs, and that’s more or less the state of the robotics industry now, five years after Bill’s editorial.

Of course, Bill hasn’t been around to make the dream come true; he’s been busy saving Africa and our public school system and the souls of fellow billionaires. He did leave behind a multi-billion dollar software company, however, that is perfectly poised to make "A Robot in Every Home" a piece of fact instead of fiction. Since then, Microsoft’s one major (intentional) contribution to the industry has been the sporadically updated Microsoft Robotics Developer Studio. It’s a good tool for prototyping and simulating simple robotics, but it isn’t moving anything forward. In fact, it treats the robotics industry exactly like computer industry stalwarts treated the burgeoning PC industry: as a hobby. What Microsoft hasn’t been doing over these past years is building a robot operating system, or making an even greater gamble on actual robots themselves.

Oddly enough, Microsoft’s largest contribution to robotics so far was largely inadvertent. The Kinect sensor for the Xbox 360 was launched in November of 2010, and was a surprising success with consumers. While normal people snapped up the mysterious sensor by the millions, brought it into their living rooms, and realized how very out of shape they were, pale hobbyists ("hackers," as they’re known these days) quickly sequestered themselves in their garages (circa 2010/2011: poorly heated loft apartments) and taught the Kinect sensor new tricks. The piece of hardware that was originally intended to be a locked-down add-on for the 360 became a multipurpose 3D sensor extraordinaire. Microsoft actually issued a mild, out-of-touch (and never repeated) threat to the hackers, but the "damage" was done, and hundreds of burgeoning roboticists had a supremely powerful tool in their hands -- and incidentally generated millions of dollars worth of free PR for Kinect with YouTube videos of their exploits.

In 1974, when Intel released the 8080 microprocessor, it wasn’t trying to invent the PC; it was just trying to improve upon its existing, limited 8-bit 8008 chip. It was up to the likes of MITS (the Altair 8800) and Microsoft (Altair Basic) to make good use of it. Clones and successors quickly followed, and Intel has obviously kept up over the years. Perhaps Microsoft would be happy to accidentally spark a robotics revolution with the Kinect sensor, but wouldn’t it prefer to be at the center of it? Besides, Microsoft doesn’t actually build the 3D sensor at the heart of the Kinect; those honors go to a company called PrimeSense, which is offering the same tech to anyone for a similarly low price.

Someone is going to figure this out. Willow Garage, fueled by some mysterious and apparently inexhaustible venture capital, is taking the open source angle with its ROS (Robot Operating System). The project already has a good amount of traction among bearded hackers and ambitious university robotics programs, since it allows altruistic types to build upon the innovation of others instead of continually "reinventing the wheel" (as Willow Garage puts it) and building their own robot operating system and hardware support from the ground up. Still, while ROS has made great strides and is home to some very exciting innovation -- along with its fair share of Kinect hacks, of course -- it’s nothing a consumer would find useful or even approachable. What the personal computing revolution did was take tools that were already commonplace in the enterprise and hand them to regular cro-mags who wanted to "balance a checkbook" with a spreadsheet application or "word process" without a typewriter ribbon. Microsoft put those tools in the hands of hobbyists, then Apple put them in the hands of regular people, and then Microsoft put them in the hands of everybody.

What we need a Microsoft or a Google or an Apple to do -- or if they won’t do it, Enterprising Upstart X -- is build an operating system that runs on standardized or commodity hardware, with built-in capabilities for doing things that are actually useful for a home user. Buzzing you in when you get locked out, signing for a package, taking that frozen chicken out of the freezer while you’re at work, feeding your pet, and of course the veritable classic of robo-problems: getting you a beer. As simple as these things sound, they’re actually incredibly complex given the current state of general robotics. That’s why EUX is a longshot, but there’s still room for some barefaced ingenuity. The dawn of the PC was marked by incredible efficiency of code and hardware, techniques that made Bill Gates and Steve Wozniak famous. Currently, the retail robot closest to being able to manage all these tasks is Willow Garage’s PR2, which costs $400,000, harbors two dual-processor Xeon servers (16 cores total), and is still slow as molasses.

Imagine a robot that you could buy at Best Buy for somewhere between $2k and $4k, unbox and configure in half an hour, and then just take for granted as an extremely reliable, whine-free household member for the next few years (or, if you bought it from Apple, exactly 12 months before the upgrade lust sets in). It would change everything. Of course, it sounds preposterous given the current state of this barely-there industry, but it’s going to be a reality within the next decade. Who will get us there first?

Image: Willow Garage

Paul J. Miller, a New York-based technology writer, is a former editor at Engadget. This post appeared originally at pauljmiller.com.

iMobot Brings Robot Modules to Modular Robots

We love the concept behind modular robots: they're simple, cheap, easy to use, and capable of doing anything you want them to do, as long as you're willing to let them reconfigure. They're also easy to fix, and in many cases, capable of fixing themselves. So for example, if you've got a modular humanoid that you decide to kick in the face, it can put itself back together, as long as it's got enough modules attached to each other to enable movement. But single modules, left on their own, are more or less helpless.

iMobot is a project from UC Davis that takes all those cool possibilities embodied in modular robotics and adds a couple of extra degrees of freedom that give each individual module significant capabilities as well. The basic central hinged design is familiar from projects like ckBot, but iMobot adds rotating plates at the end of each module, which can turn the module into a single axle of sorts, capable of driving itself around. The modules can also crawl, roll, and "undulate" to get from place to place:

Besides movement, these additional degrees of freedom allow the modules themselves to perform tasks, like operating as little individual camera turrets. And of course, by sticking a bunch of the modules together, you can create much more sophisticated robots with enhanced capabilities: 

The creators of these modules, Graham Ryland and Professor Harry Cheng, have taken the promising step of starting their own company, Barobo, to produce these little guys in bulk and make them available to research institutions and anybody else who wants to mess around with modular robots. Barobo already has a sizable NSF grant to kick things off, and they hope to have a product ready to go by the end of the year.

[ iMobot ] via [ Physorg ]

Japanese Robot Surveys Damaged Gymnasium Too Dangerous for Rescue Workers

Editor's Note: This is part of our ongoing news coverage of Japan's earthquake and nuclear emergency.

japan earthquake tsunami search and rescue robot

Japanese researchers have sent a robot into a damaged gymnasium where a partially collapsed ceiling makes it dangerous for rescue workers.

The team used a remote-controlled ground robot to enter the building in Hachinohe, Aomori Prefecture, in the northeastern portion of Japan's Honshu island, and assess the damage.

The roboticists, led by Fumitoshi Matsuno, a professor at Kyoto University and vice president of the International Rescue System Institute, used their KOHGA3 robot, a tank-like machine equipped with cameras and sensors, to carry out the mission.

"Part of the ceiling fell down," Prof. Matsuno told me. "That's why we used the robot." Emergency workers feared that aftershocks could send the rest of the ceiling crashing down.

Several robotics teams have been on standby throughout Japan, ready to assist in rescue and recovery operations after the earthquake and tsunami that struck the country early this month. Robots could also help at the troubled Fukushima Dai-1 nuclear power plant.

At the Hachinohe gymnasium, Prof. Matsuno's group set up the operator station -- a laptop computer with a video game-style controller attached -- at a safe location near the entrance. From there, they deployed their robot. Watch:

The KOHGA3 has powerful motors and four sets of tracks that allow it to traverse rubble, climb steps, and go over inclines up to 45 degrees. The robot is 86 centimeters long, 53 cm tall, and weighs in at 40 kilograms. Its maximum speed is 1.8 meters per second.

The robot carries three CCD cameras, a thermal camera, a laser scanner, an LED light, an attitude sensor, and a gas sensor. Its 4-degree-of-freedom robotic arm is nearly 1 meter long and equipped with a CCD camera, a carbon dioxide sensor, a thermal sensor, and an LED light.

Upon reaching the area above which the ceiling had collapsed, the robot directed one of its CCD cameras upward, using its zoom capabilities to get a good look at the damage. The robot also pointed its camera at the debris on the ground, so workers could determine whether structural parts of the roof had collapsed.

japan earthquake tsunami search and rescue robot

japan earthquake tsunami search and rescue robot

japan earthquake tsunami search and rescue robot

Then it was time to explore other parts of the gymnasium. The roboticists drove the robot up to a room whose door was half open. Before entering, they used the robotic arm to peek inside. "Using a camera that is mounted at the tip of the arm, we obtained information on what's inside the room," Prof. Matsuno said.

The Kyoto University team included Dr. Noritaka Sato, Dr. Kazuyuki Kon, and Hiroki Igarashi. Prof. Masatshi Daikoku and Dr. Ryusuke Fujisawa from the Hachinohe Institute of Technology collaborated on the mission. The researchers are members of the IEEE Robotics and Automation Society.

The researchers also inspected the stage of the gymnasium, again using the robot's CCD cameras and the one mounted on the robotic arm. With all the inspection tasks completed, they drove the robot back to the entrance.

The group departed Hachinohe and headed out to Kuji, Iwate Prefecture, hoping to perform more inspections there. Their first stop was the National Kuji Storage Base, one of Japan's main oil stockpiles, with three storage tanks and total capacity of 10.5 million barrels. The facility, located on the seashore, was completely destroyed [photos below], and there wasn't much the robots could do to help.

japan earthquake tsunami search and rescue robots

japan earthquake tsunami search and rescue robots

Next they moved on to a nearby shipyard. There were still buildings standing that rescue workers needed to inspect. The roboticists offered their assistance, but the officials in charge told them that a private company owned the buildings and they'd have to get permission to use the robots.

japan earthquake tsunami search and rescue robots

In another attempt to deploy their robot, the roboticists drove to Noda village, located about 15 kilometers south of Kuji. The earthquake and tsunami wiped out the coastal strip of Noda, leaving almost every building completely destroyed [photos below].

japan earthquake tsunami search and rescue robots

japan earthquake tsunami search and rescue robots

On a rooftop overlooking the devastated landscape, the roboticists discussed potential targets for their robot with the rescue workers in charge. But the same impediment came up: The buildings were private property, and the roboticists would need permission from the owners to get in, a process that could take a long time.

japan earthquake tsunami search and rescue robots

After several days on the road looking for opportunities to assist with their robot, the Kyoto University team began to make its way home. The researchers were happy to have helped, but also overwhelmed by the extent of the destruction they saw. Their contribution, Prof. Matsuno said, is only a "very small result."

Images: Fumitoshi Matsuno/Kyoto University


Festo Launches SmartBird Robotic Seagull

Festo has a fairly fascinating, frankly fantastical, and frequently full-on fabulous history with the robotic systems that they develop in partnership with universities and research groups as part of their Bionic Learning Network. In the past, we've seen flying penguins and jellyfish, as well as bio-inspired manipulators like this one.

Today, Festo has unveiled their 2011 Bionic Learning Network projects, the most awesome of which is definitely SmartBird. Watch it fly:

And here's what's going on inside:

Unlike many of Festo's flying robots, SmartBird doesn't appear to rely on lifting gas at all. It weighs less than half a kilo, and is capable of autonomous take-off, flight, and landing using just its two meter-long wings. SmartBird is modeled very closely on the herring gull, and controls itself the same way birds do, by twisting its body, wings, and tail. For example, if you look closely in the video, you can see SmartBird turning its head to steer.

I love how Festo isn't just inspired by biological systems, but actually strives to exactly duplicate their functionality, often with remarkable results. We saw this philosophy in action last year, too, with their elephant trunk gripper, which now has a new home on a mobile robotic base:

This system is called Robotino XT, and it's easy to use, fast, precise, and most importantly human-safe thanks to the pliable nature of its arm and gripper, which is actually printed using a 3D printer.

Hit up the links below for lots more information; Festo's website has links to PDF brochures with just about all the detail you could possibly want.

[ Festo SmartBird ] and [ Festo Robotino XT ]

Review: iRobot Scooba 230


The iRobot Scooba 230 fits an entire floor washing robot inside an adorable little cylindrical package.

We're totally stoked about iRobot's new Scooba 230 floor cleaning robot, largely because it's something entirely new from iRobot, a company that we've gently chided in the past for making only incremental and cosmetic improvements to their consumer products over the last few years.

The Scooba 230, which becomes available for purchase this week, manages to fit an entire floor washing robot inside an adorable little cylindrical package. But is a robot this small able to clean bathrooms and kitchens well enough to give you a break from your chores? We got ourselves a review unit, and we'll tell you, right now.

In the box, or at least the box we got, you'll find the robot itself, two virtual walls (the kind with on/off switches that take those gigantic D batteries that nobody ever uses except for oversized flashlights), a set of three spare bottom plates, a base plate, some packets of cleaning solution, the battery, a charger (two prong), and a handy quick-start guide.

If you have a Roomba, or a regular Scooba, the first thing you'll notice about the Scooba 230 is that it's small. Incredibly small. I have relatively big hands, I'm told, but this robot fits comfortably in one of them. The controls, all of two buttons, are mounted on the top: there's one button that says "power" and one button that says "clean." There's even an adorable carrying handle.

The top of the Scooba 230 is at about the same height as a Roomba or Scooba, and I assume that the infrared virtual wall sensors are spaced around the top rim somewhere underneath that black strip. The front of the robot, all 180 degrees worth, is a bump sensor, and there's a wall-following sensor looking forwards too.

Around the back, you'll find the battery slot, which is kinda neat: the battery itself is long and flat and extends essentially the entire length of the body of the robot. If you're curious, it's a 7.2 V, 1300 mAh, nickel metal hydride. It's press-fit, without any catches or clips or anything, and there's an o-ring on the outside end to keep it dry. Right next to the battery is a little rubber flap that keeps the charging port covered.

The Scooba 230 fills up with clean water from the front and empties dirty water out the back (more on that later). Little plastic doors (handily labeled "FILL" and "EMPTY") flip out to allow water to be added or removed. The doors are designed so that you have to press them tightly into the robot to make sure they seal properly, but there's no clicking noise or anything to let you know you've got it tight enough. It's a little too easy to not quite press the door hard enough, which will end up letting little dribbles of water leak out.

Poking around the Scooba 230 as much as I can without going at it with a hacksaw reveals some neat design features. Part of how iRobot was able to make the robot so small was to use an active reservoir system, which you can see when you look in the fill ports:

Inside, there's a large water compartment containing one big flexible plastic bladder that's attached to the "FILL" port. When you fill the robot with clean water, the bladder expands until you've got about 1.65 cups of water in it, and it entirely fills the interior of the robot. As the robot does its thing, it sprays clean water out and sucks dirty water in, and the dirty water starts filling up the inside of the robot's sealed water compartment, but outside the clean water bladder itself, which is busy getting smaller as the water gets used up.

So eventually, you end up with lots of dirty water inside the water compartment, and an empty clean water bladder that's squished flat and not taking up any room. The clever bit is that the volume of water inside the robot never really changes; clean water in one place just gets turned into dirty water in another place, and utilization of the limited amount of space inside the bot is always close to 100 percent.
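If you like to think in numbers, here's a little toy model of that reservoir trick (my own back-of-the-envelope sketch, nothing from iRobot; the 1.65-cup fill comes from above, while the milliliter conversion, per-pass spray volume, and recovery rate are made up for illustration):

```python
# Toy model of the Scooba 230's single-compartment reservoir:
# a flexible clean-water bladder shrinks as the shared compartment
# fills with recovered dirty water, so the total on board stays bounded.
CUP_ML = 236.6                 # 1 US cup in milliliters (conversion assumed)
CAPACITY_ML = 1.65 * CUP_ML    # ~390 ml, the fill volume mentioned above

clean_ml = CAPACITY_ML  # water in the expandable bladder
dirty_ml = 0.0          # water recovered into the surrounding compartment

def cleaning_pass(spray_ml=20.0, recovery_rate=0.95):
    """Spray some clean water, then squeegee most of it back up as dirty water.
    spray_ml and recovery_rate are invented numbers, purely for illustration."""
    global clean_ml, dirty_ml
    spray = min(spray_ml, clean_ml)
    clean_ml -= spray
    dirty_ml += spray * recovery_rate   # the rest is left drying on the floor

for i in range(1, 20):
    if clean_ml <= 0:
        break
    cleaning_pass()
    total = clean_ml + dirty_ml
    print(f"pass {i:2d}: clean {clean_ml:5.1f} ml, dirty {dirty_ml:5.1f} ml, "
          f"in-robot total {total:5.1f} ml")
```

However you tweak the made-up numbers, the in-robot total never exceeds the initial fill, which is the whole point: one sealed compartment can serve as both the clean tank and the dirty tank.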

The bottom plate itself is detachable in five seconds with no tools, making it easy to take it off to clean it or to put a new one on if the little scrubby bristles wear out, which happens in about six months of normal use. Near the back is the squeegee, which has a bunch of tiny (millimeter-sized) holes punched into it that lead to two ports up inside the body of the robot. This, I assume, is the system used to suck up dirty water back into the Scooba, and as far as I can tell, these holes set the limit of what the robot can physically remove from the floor. Let me reiterate that: anything larger than these teeny tiny little holes will not be "cleaned up" by the robot, it'll just get shoved around.

Underneath the bottom plate, you can see a peristaltic pump that's used to squirt the clean water out of the robot. This makes a lot of sense, largely because pumps like this have basically one single moving part and no valves or seals or anything else to wear out, which definitely bodes well for the reliability of the robot itself.

The slots at the front of the robot appear to be the edge sensors, not the water jets. In fact, I had a heckuva time trying to figure out where the robot spits out the clean water, until I realized that a lot of it actually comes out the back, not the front. You can see minuscule nozzles here, that align with slightly less minuscule holes in the bottom plate:

Why does it work this way? Well, you have to remember that the Scooba (and every other home robot that iRobot makes, pretty much) is designed to work most effectively in multiple passes. So in this case, my guess is that since the water comes out the back (or mostly out the back, at least), pass one is with dry bristles, which are probably more effective at loosening up dirt. Then, water is left on the dirt as the robot passes over it to let it soak a bit. Finally, after a few passes, the robot stops squirting water out the back and transitions to just squeegeeing it up, and you're left with a clean, dry(ish) floor.

Now, on to what you really care about, which is how it works in practice. My bathroom supports a total of three people. And three cats. And a few rats. And also one rather large snake. Admittedly, not all of us are generally trying to use the bathroom at the same time (or, at all), but I mention them anyway to attempt to give you a flavor of the variety (and quantity) of, uh, maintenance that our bathroom generally requires.

The first thing to do when using the Scooba 230 is to fill it with warm, but not hot, water. I tried to be careful, but it's hard to avoid slopping water all over the robot as you do this. Fortunately, iRobot figured that this would happen, and the bot can get wet (to a reasonable extent) without harming it. It's important to leave the back ("EMPTY") port open even as you fill the bot with clean water; this lets the bladder expand fully. After putting the water in, you can optionally add cleaning solution before sealing it up.

Starting the cleaning cycle involves all of two buttons, and since the robot only has two buttons, you're not likely to have trouble figuring out which ones to push. The only decision you have to make is whether you want the robot to clean for 20 minutes for smallish areas (60 square feet or so), or 40 minutes for largeish areas up to 150 square feet. It defaults to a 40 minute clean, but if you hold the "CLEAN" button for a couple extra seconds, it makes a sound and switches over to 20 minutes. And then, you just let the Scooba 230 do its job, simple as that.

I have to say, it's a pretty cute little robot to watch at work. It's brisk. Determined. Feisty, even. It clearly wants to do a bang-up job, and it's going to put in as much effort as its round little body is capable of to get your floors clean. While operating, the 230 is certainly not silent, but it's not what I'd call loud, either. You can have a conversation while it's running, and if you lock it in your bathroom, you'll probably only hear the occasional "thunk" as it runs into a wall.

It's possible to pick the robot up mid-cycle, and it will stop cleaning. However, it gets unhappy when you do this and complains loudly, flashes a red light, and drips all over the place. This brings up an unfortunate reality of a wet-surface cleaning robot, which is that unlike a Roomba, you can't really just decide that it's done and shut it off. I mean, you can, but if you interrupt the Scooba in the middle of its cleaning cycle, it's going to leave a wet and sloppy mess all over your floor.

After a full cleaning cycle, the Scooba 230 will sing at you and light up a little green check mark to let you know that it's done. You can then lift it up, carry it over to the sink, and dump the dirty water out of the "EMPTY" door. It's much less drippy when you pick it up at the end of its cycle, probably because it's had its water jets turned off for a little bit before it actually stops cleaning.

A cursory inspection revealed some gunk caught up in the bristles on the bottom plate, which I rinsed off. A more careful inspection revealed that a few of the tiny little vacuum holes that the robot uses to suck up dirty water were clogged by more gunk. This is a little bit troubling, since it implies that after maybe five or ten runs, all of the vacuum holes would be clogged up and the Scooba would cease to clean. Luckily, iRobot has anticipated such an event, and the rubber squeegee bit can be partially removed to get at the holes from both sides and clean them out. You can also remove and clean the wheels, and iRobot recommends that you rinse out both water reservoirs.

The robot got pretty wet during this cleaning process (which took me maybe 3 minutes), and some water got up underneath its front bumper and stuff, but no electrical shorts or fires seemed to result, so that's good. Personally, I found it hard to get used to cleaning a robot under running water, but with the Scooba 230, that seems to be the way to go.

When the bot is all emptied out and cleaned up, you just set it on its little baseplate where it can drip dry without making a mess, plug in the charger, and you're done.

So, great, but how's the floor? The short answer is, it's clean. The long answer is, it's clean but still pretty wet. I was honestly expecting the robot to do a slightly better job of getting the water up. I wouldn't say that it leaves puddles or anything, but you'll need to let the floor air-dry for a few minutes at least. To give you a better sense of how much water it leaves behind, here's a pic of a glass tabletop after the robot has cleaned it:

In general, however, I was quite impressed by how clean the bathroom got. The robot was successful at removing not just surface dirt, but also sticky patches from soap and things that would generally require a bit of scrubbing from a human. It can easily and effectively take over for routine bathroom floor maintenance, and there's nothing stopping it from being equally effective on other hard surfaces in your home, like kitchens. Really, it does a good job.

Now, as impressive as the Scooba 230 is, there are some points that you should be aware of if you're thinking about buying one. First off, one thing that quickly became apparent when using the Scooba 230 is that, as we suspected based on the design of the bottom plate, while the robot is totally happy to clean the surface of your floor, it's really not any good at picking up stuff that couldn't be called "dirt." It does have a vacuum in it, but that vacuum is designed to suck up water, not debris, and is physically incapable of ingesting any particles larger than about a millimeter. You may need to sweep or vacuum your floor before you unleash the Scooba on it.

Another thing to be aware of is that the Scooba 230 can't clean corners. That same roundness that allows it to make zero-radius turns also prevents it from getting itself into square corners. It's great at getting close up along walls, but there are always going to be little triangles inside any right angles in your bathroom where the robot simply can't reach.

I don't feel like this is a huge issue, though, because iRobot has always said, quite correctly, that their robots are maintenance tools. Neither a Roomba nor a full-size Scooba can completely take over for you wielding a vacuum or a mop. What the robots can do is make things significantly cleaner most of the time, and make it so that the cleaning that you have to do is easier and less frequent. Yes, you're still going to have to clean your bathroom floor to get those little corners that your Scooba misses. But when you don't have time to do that, the robot will keep most of your floor much, much cleaner.

The last thing to be aware of about the Scooba is that it's not designed to be completely autonomous, and has somewhat less autonomy than a Roomba does. With a Roomba, you can tell it to clean a room, and then just leave, and the robot will do its thing and then go back home to charge, and it can do this several times completely unsupervised. The Scooba, by contrast, requires you to fill it with clean water, seal it up, tell it to clean, go get it after it's done cleaning, dump out the dirty water, and then plug it into its charger every time you want it to do its job. There's no "fire and forget" capability. This, incidentally, is why the Scooba doesn't have a scheduling function: the assumption is that you're going to need to be there at the beginning and the end of the cycle.

Really, though, it's all relative. The fact is, the Scooba does the cleaning for you, which is otherwise the sucky part. Yes, it requires you to put a minimal amount of effort into setup and cleanup, but while it's scrubbing your floors, you can go do something else. Someday, I'm sure, robots will be able to integrate themselves much better into our homes, and iRobot might even be working on it. But until that happens, the Scooba 230 requires minimal and intuitive maintenance that isn't nearly as bothersome as it sounds, especially relative to its effectiveness.

The Scooba 230 kit (which includes the virtual walls and spare base plates) costs $299.99, or $300 to anyone who's not a marketing executive. It's on sale at iRobot.com as of right now, so if you like the look of it, go get one! And if you've got any questions, this baby is mine for the next week, so ask away.

[ iRobot Scooba 230 ]

FirstLook: iRobot's New Throwable Baby Surveillance Bot

iRobot 110 FirstLook robot

iRobot has just introduced the 110 FirstLook, a very small and lightweight robot designed to be used for scouting and surveillance when you don't have access to its big brother, the Packbot. FirstLook is 25 centimeters (10 inches) long, 23 cm (9 in) wide, and only 10 cm (4 in) high. It weighs less than 2.3 kilograms (5 pounds). Onboard, it has four separate cameras, one on each side, allowing the operator to see in every direction at once, with IR illuminators for night vision.

FirstLook is designed to be as rugged and reliable as iRobot's other battlefield robots. It's throwable, and can survive a 4.5 meter (15 foot) drop onto concrete and complete submergence in water. Using a pair of rotating flippers, it can climb curbs and stairs, and flip itself over if it ends up upside-down. Top speed is 5.6 kilometers per hour (3.5 mph), and FirstLook can scoot around for up to six hours on a charge, or spend 10 hours broadcasting live video from a stationary position.

If the FirstLook robot looks somewhat familiar, that's because it is: We saw a very similar robot (or at the very least a similar form factor) as part of an early LANdroid prototype program, which was still active as of September of 2010. That program was intended to create a swarm of super cheap (less than US $100) urban robots that can work together to form an adaptable and self-healing wireless network. Now, I'm not saying that FirstLook is related to the LANdroid, per se; it may just be that iRobot has developed a simple, rugged, and reliable form factor that can be adapted for several different purposes.

However, FirstLook also does seem to have some very LANdroid-y capabilities. From iRobot's fact sheet:

Mesh Networking Capabilities -
Digital mesh networking allows multiple FirstLook robots to relay messages over greater distances, increasing Line of Sight and Non-Line of Sight capabilities. The robot offers multiple public and military radio band configurations.

Interesting, very interesting. It sort of sounds like FirstLook may in fact be able to be used as a network extender like the LANdroids, albeit likely without the autonomous and self-healing capabilities, and definitely without the $100 price.
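To see why a few relay nodes matter, here's a rough sketch of multi-hop message routing between robots with limited radio range (a generic mesh illustration of my own, not iRobot's networking stack; the node names, positions, and 300-meter radio range are all invented):

```python
# Generic multi-hop relay sketch: a message hops between robots that are
# within radio range of each other, reaching an endpoint no single link could.
import math

RADIO_RANGE_M = 300.0   # assumed per-hop radio range

# (name, x, y) positions in meters -- purely illustrative
nodes = [("operator", 0, 0), ("firstlook-1", 250, 40),
         ("firstlook-2", 480, 60), ("firstlook-3", 720, 30)]

def in_range(a, b):
    return math.dist(a[1:], b[1:]) <= RADIO_RANGE_M

def route(src, dst):
    """Breadth-first search over the radio connectivity graph."""
    frontier, seen = [[src]], {src[0]}
    while frontier:
        path = frontier.pop(0)
        if path[-1][0] == dst[0]:
            return [n[0] for n in path]
        for n in nodes:
            if n[0] not in seen and in_range(path[-1], n):
                seen.add(n[0])
                frontier.append(path + [n])
    return None

print(route(nodes[0], nodes[-1]))
# Each hop is within range even though the endpoints are over 700 m apart.
```

The point is simply that each hop only has to reach its neighbor, so a handful of FirstLooks scattered down a hallway or around a building could, in principle, carry video much farther than any single radio link.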

Another cool feature: iRobot developed a fancy operator control unit (OCU) for the FirstLook. It's a wrist-mounted touchscreen device that looks like something straight out of a James Bond movie [see photo below]. From the specs:

Wrist-Mounted, Touchscreen OCU -
FirstLook uses a wrist-mounted, touchscreen Operator Control Unit (OCU). The battery-powered OCU includes a built-in radio.

Oh, and there's one other little interesting factoid from iRobot's fact sheet on the FirstLook, when they're talking about payloads:

Payload Expansion -
Facilitates integration of specialized cameras, thermal imagers, chem-bio-radiation sensors and destructive payloads weighing up to a half pound.

Destructive payloads, you say? Would that be like dropping little mines, or like driving underneath a tank and committing suicide in homage to one of the first battlefield robots ever? Either way, my imagination is already running wild with that one.

While we don't yet know how much FirstLook is going to cost to deploy, to keep it competitive with other small surveillance robots it's going to need to end up somewhere in the high four figures to low five figures. We'll keep you updated as we find out more, but in the meantime, check out a bunch of extra pics and some video of FirstLook in action:

More images:

iRobot 110 FirstLook robot

iRobot 110 FirstLook robot

iRobot 110 FirstLook robot

iRobot 110 FirstLook robot

Images and video: iRobot

Lockheed Martin's Spybot Knows How Not to Be Seen

There are some basic rules that both humans and robots should be aware of when it comes to not being seen, and Monty Python only scratched (or rather, blew up) the surface. Lockheed Martin's Advanced Technology Laboratory has been developing a robot designed to operate around humans without being detected, and not just by being small and quiet: it listens for humans, guesses where they might be looking, and then finds itself a nice dark hiding place when it needs to.

Lockheed's robot is equipped with a 3D laser scanner that allows it to build detailed maps of its surroundings. It also has an array of acoustic sensors, which allow it to localize footsteps and voices. It can then combine the locations of humans with its 3D map to guess what areas the humans might be able to see, and then does its best to stay hidden. Keeping to the shadows, the robot always maintains an escape route, and if it senses a human approaching, it will look for the deepest darkest corner it can find and then hold its virtual breath until the danger has passed.
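To make the idea concrete, here's a simplified sketch of that perception-aware hiding logic (my own toy illustration, not Lockheed's actual software; the map, the sampled line-of-sight check, and the "escape route" heuristic are all assumptions): build a map, estimate where the humans are, mark every cell they can see, and then pick a nearby unseen cell that still has unseen neighbors to slip away through.

```python
# Simplified "stay hidden" sketch: grid map + estimated human positions ->
# cells visible to humans -> pick a hiding cell that keeps an escape route.
import math

MAP = ["##########",
       "#....#...#",
       "#....#.H.#",
       "#....#...#",
       "#....#...#",
       "#......R.#",
       "##########"]   # '#' wall, '.' free, 'R' robot, 'H' human (assumed layout)

grid = [list(row) for row in MAP]
H, W = len(grid), len(grid[0])
robot = next((r, c) for r in range(H) for c in range(W) if grid[r][c] == "R")
humans = [(r, c) for r in range(H) for c in range(W) if grid[r][c] == "H"]

def clear_line(a, b):
    """Sample points along the segment a->b; blocked if any sample hits a wall."""
    steps = int(math.dist(a, b) * 4) + 1
    for i in range(1, steps):
        t = i / steps
        r = round(a[0] + (b[0] - a[0]) * t)
        c = round(a[1] + (b[1] - a[1]) * t)
        if grid[r][c] == "#":
            return False
    return True

free = [(r, c) for r in range(H) for c in range(W) if grid[r][c] != "#"]
visible = {cell for cell in free if any(clear_line(h, cell) for h in humans)}
hidden = [cell for cell in free if cell not in visible]

def escape_routes(cell):
    """Count adjacent hidden cells -- a crude stand-in for 'has a way out'."""
    r, c = cell
    return sum((r + dr, c + dc) in hidden
               for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)])

# Prefer hiding spots that are close to the robot but keep escape options.
best = min(hidden, key=lambda cell: (math.dist(cell, robot), -escape_routes(cell)))
print("Robot at", robot, "-> hide at", best)
```

The real system obviously works with 3D laser maps and acoustic localization rather than a hand-drawn grid, but the shape of the decision is the same: model what the humans can perceive, then stay out of it.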

This is certainly not the first deceptive robot we've seen. Given the opportunity, a robot swarm at EPFL independently evolved the capacity for deception alarmingly quickly in a competition for virtual food. And researchers from Georgia Tech taught a robot to use deliberately deceptive tactics to fool other robots and humans. The Georgia Tech research, especially, seems like it's destined for applications like surveillance, as it endows a robot with a method of analyzing a situation to determine whether deception would be effective, based on what it knows about the robot (or person) trying to find it. If it decides that deception would help it achieve its goals, the robot then leaves tracks in one direction before moving off in a different one.
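Boiled down (and this is only a toy expected-value sketch inspired by that description, not the Georgia Tech team's actual model; the probabilities and payoffs are invented placeholders), the decision looks something like: fake a trail only when there's probably a seeker and the trail is likely to buy more than it costs.

```python
# Toy decision rule for "should I leave false tracks?"
def should_deceive(p_seeker_present, p_seeker_follows_tracks,
                   value_of_escape, cost_of_laying_tracks):
    """Deceive only if the expected benefit of a false trail beats its cost."""
    expected_gain = p_seeker_present * p_seeker_follows_tracks * value_of_escape
    return expected_gain > cost_of_laying_tracks

# Example: a seeker is probably present and tends to trust tracks,
# so laying a false trail toward a decoy hiding spot is worth the detour.
if should_deceive(p_seeker_present=0.8, p_seeker_follows_tracks=0.7,
                  value_of_escape=10.0, cost_of_laying_tracks=2.0):
    print("lay tracks toward the decoy spot, then slip away the other way")
else:
    print("just hide; faking a trail isn't worth it")
```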

The key to avoiding detection by humans is to understand how you're perceived by humans. As Monty Python so astutely pointed out, for example, even the most perfect hiding place won't do you any good if it's the only possible place that you can be. By building models of both physical environments and perceptual environments, or how humans sense and react to things, robots will be much better at not just spying, but also understanding and reacting to us in less adversarial environments.

[ Lockheed Martin ] via [ New Scientist ]

'Blinky' Short Film Now Online, Not Safe For Your Sanity

“Soon every home will have a robot helper. Don’t worry, your kids are perfectly safe.”

You know when you see a tag line like that on a movie poster that your kids are going to be far, far from perfectly safe. Blinky, a short film from a company called Bad Robot Productions that we've been looking forward to since 2009, doesn't disappoint.

Directed by Ruairi Robinson (who's currently working on Akira), Blinky is very well done, although it definitely messes with your head. Personally, while I'm glad I watched it, I kinda don't think I'd want to watch it again. If you're up for it, Blinky is probably rated PG-13, so use your best judgment as to what that implies. You've been warned!

Via [ io9 ]

Can Japan Send In Robots To Fix Troubled Nuclear Reactors?

japan fukushima radiation monitoring robot

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.

When it comes to robots, Japan is a superpower, with some of the world's most advanced robotic systems and the highest levels of industrial automation. So it makes sense to ask: Why can't Japan use robots to fix the damaged reactors at the Fukushima Dai-1 nuclear power plant?

Many people have wondered about this possibility, and there's been a lot of speculation and confusion. One news report even slammed Japan for lacking nuclear-disaster robots.

I'd be the first to shout, "Send in the robots!" if it were clear that robots could help in this case. But things aren't that simple. To understand what robots can and cannot do at Fukushima, I spoke to several experts. Here's what they say.

Can Japan send robots into the reactors to repair them?

It'd be a difficult mission. To understand why, let's first take a quick look at the alarming situation at the Fukushima plant. One of the biggest problems is that the reactors and spent-fuel pools have lost -- and may be continuing to lose -- cooling water. To make things worse, the earthquake and tsunami, and subsequent fires and explosions, may have damaged the reactor vessels, spent-fuel pools, and cooling and control systems, as well as the buildings that house them.

So if you wanted to send in robots, the first challenge is getting around inside the buildings. "The problem of mobility includes not only rough terrain but also gaps and obstacles," says Satoshi Tadokoro, an IEEE Fellow and professor of robotics at Tohoku University, in Sendai. "The path might have obstacles that a human could remove but most robots can't."

Dennis Hong, a roboticist at Virginia Tech, says researchers are constantly developing new ways of traversing difficult terrain -- using wheels, legs, tracks, wheel-leg hybrids, and other approaches -- but still, "a site like these reactors, where debris is scattered with tangled steel beams and collapsed structures, is a very, very challenging environment."

But what about robots designed for difficult terrain, like search-and-rescue robots and those bomb disposal robots used in Iraq and Afghanistan?

There are many robots capable of negotiating rough terrain, steep inclines, and even stairs. Indeed, as we've reported earlier, Japan might use these robots in rescue and recovery operations. But there exist countless other obstacles -- as simple as a closed door, for example -- that could be hard for most mobile robots to overcome, says Henrik Christensen, a professor of robotics at Georgia Tech, in Atlanta. 

What's more, he says, the robots would have to be remote controlled by human operators, and communication is another challenge. Relying on wireless transmissions is tricky because the reactors have thick concrete walls and lots of metal around. An alternative would be using a tether, but the trade-off is you lose range and mobility. "Even with a fiber-optic tether it is very hard to have a range longer than 2 kilometers, so they would have to deploy people to be close by to operate the vehicle," says Prof. Christensen.

What if the path inside the reactor is more or less clear for a robot -- what other challenges exist?

The biggest one is radiation, which can damage microchips and sensors, and also corrupt data (bits) in semiconductors [read "Radiation Hardening 101: How To Protect Nuclear Reactor Electronics" to understand why radiation damages electronics]. So if you want your robot to last long enough for a complex mission, it would need not only radiation-hardened electronics but also lots of heavy shielding.

The result is that if you try to build a robot that can overcome all the challenges described above (mobility, communication, radiation), you'll end up with a machine that is big and slow, as Dr. Robin Murphy, director of the Center for Robot-Assisted Search and Rescue (CRASAR) at Texas A&M University, in College Station, explains:

So in some sense you need a dinosaur robot -- big, beefy, slow, and stupid (as in few processors) -- and even then it’s just a matter of time before enough radiation fries something important… You don’t know how long you’ve got. 

In the end, even if the robots can survive the radiation and reach the right places, they'd have to be capable of performing complex tasks like opening and closing valves, activating pumps, or handling hoses to deliver the cooling water.

The problem is that there are no commercial or research robots designed to carry out a mission like that. Any attempt involving robots would require a lot of improvisation, and this being a nuclear crisis, and this being Japan, authorities will probably be very conservative in their actions.

What about an agile humanoid robot that can walk on rubble, operate heavy machinery, and endure fires and radiation, Terminator-style?

You're watching too much TV. Even Japan, which has built the world's most advanced humanoid robots, doesn't have anything remotely close to that. Humanoid robots, despite their recent advances, are still research projects. They can walk, run, climb stairs, dance, and perform dexterous manipulations. But they can't fix nuclear reactors.

But there must be something robots can do at Fukushima?

There's plenty robots can do -- and are already doing. Perhaps the most important job at the moment is monitoring radiation. Dangerous levels of radiation prevent emergency personnel from accessing the buildings, so we need robots that act as our eyes in and around them. Only by gauging the damage can authorities devise effective plans to control the situation.

Prof. Tadokoro says there's already at least one robot on site equipped with cameras and sensors to measure gamma and neutron radiation [see photo above]. (The authorities are also measuring radiation with non-robotics methods, of course, on the ground and using airplanes and helicopters in Fukushima and elsewhere.)

Developed by the Japan Atomic Energy Research Institute after a nuclear accident at a fuel processing facility in Tokai in 1999, the tank-like robot is 1.5 meters tall and weighs in at 600 kilograms. The robot moves at about 40 meters per minute and can operate at a distance of 1.1 km from its controller. Researchers designed this robot for several missions, including opening doors, turning valves, and drilling holes in pipes. These capabilities could be useful inside the Fukushima reactors, but it all depends on whether the robot would be able to navigate inside treacherous spaces.

Tadokoro adds that if it becomes necessary to spray more water on the reactors from the outside, and if using manned trucks is too dangerous for a human crew, Japan has developed several firefighting robots that could shoot water on the buildings. The only problem is that these robots were not designed to withstand radiation, so they'd have to be fitted with shielding. He says it's not clear whether firefighting robots are present at Fukushima at this time.

Japan has sent out a request for more robots to the international community. The Japanese authorities apparently plan to use robots for gaining visual access to areas near the reactors and for removing rubble and other clean-up operations. iRobot has sent PackBot and Warrior ground robots at Japan's request. France has apparently offered robots, too.

What about flying robots to peek inside the buildings?

Both Georgia Tech's Christensen and Virginia Tech's Hong suggest using unmanned aerial vehicles, or UAVs, to generate imagery. "I am very surprised they have not used this option to provide better live footage from the site," Christensen says. "UAVs could be used to generate information from close range without risking lives."

The U.S. military has reportedly sent a Global Hawk drone to peek at the reactors from above, and there's talk of sending unmanned helicopters as well. But again, the Japanese authorities will probably be conservative in their choices, preferring not to fly a UAV that could crash and make things worse.

Robots fixed the BP oil leak in the Gulf. Why can't they do the same here? Does the nuclear industry use robots anyway?

The nuclear industry does use robots, and newer plants have higher levels of automation, but you won't see robots running around doing chores. Robots are typically used in reprocessing plants, where spent fuel is recycled. These robots are not really autonomous machines; they are teleoperated robotic arms used to handle highly radioactive materials.

Dr. Gerd Hirzinger, director of the Institute of Robotics and Mechatronics, part of DLR, the German Aerospace Center in Wessling, says that in the 1960s, Germany did a lot of work on teleoperated manipulators for the nuclear power industry, but when plans for a central German reprocessing plant were suddenly killed in 1989 (the government decided to do reprocessing at a French plant), robot development stopped and roboticists shifted their focus to other areas. "But I agree that we should have a mature and highly reliable teleoperation technology for all nuclear plants," he says.

In deepwater oil exploration, the tools used to assemble the riser pipes, wellheads, and other equipment are designed for the robotic hands of remotely operated vehicles, or ROVs, not for human hands. These underwater robots, in other words, act as telepresence systems for human operators. This approach never became part of the nuclear industry, though some argue it should. AI pioneer Marvin Minsky called for this type of technology more than 30 years ago:

Three Mile Island really needed telepresence. I am appalled by the nuclear industry's inability to deal with the unexpected. We all saw the absurd inflexibility of present day technology in handling the damage and making repairs to that reactor. [...] The big problem today is that nuclear plants are not designed for telepresence. Why? The technology is still too primitive. Furthermore, the plants aren't even designed to accommodate the installation of advanced telepresence when it becomes available. A vicious circle!

But people have used robots in other nuclear emergencies, no?

Yes. Carnegie Mellon roboticist William "Red" Whittaker developed ground robots that have been to the nuclear disaster sites at Three Mile Island, in the United States, and Chernobyl, in Ukraine. The robots helped by capturing images of the sites and monitoring radiation, but they couldn't do much more than that.  

Why did Japan have to ask foreign companies, like U.S. firm iRobot, to send robots rather than use some of their own?

Due to post-World War II regulations, Japanese robot makers can't export military robots. For this reason, Japanese robots haven't been tested in real conditions as extensively as U.S. robots like iRobot's PackBot and Foster-Miller's Talon, both of which have been used in Iraq and Afghanistan.

What's more, Japan's wireless regulation is very strict, limiting the power output of transmissions, even during emergencies, compared to what is allowed in the United States.

Will the nuclear industry invest in disaster robots now?

I hope so, but there's reason for skepticism. The nuclear industry never embraced robots the way the auto industry or the oil and gas industry did, because it didn't make economic sense. Auto makers use robots because they help make cars cheaper; the oil industry uses ROVs because that's the only way it can get to deepwater reserves. The nuclear industry never had the incentive to adopt robots on the same scale.

Photo: Asahi Shimbun


