Since their earliest days, robots have been hopefully imagined as charming domestic servants, cheerfully serving beer or cocktails to their human masters [see photo, "Then..."]. Twenty-three years ago, Heathkit Co. released the first real attempt to deliver on that dream: its Hero line of home robots. Not surprisingly, their 8-bit processors, minimal onboard memory, and limited ability to sense their surroundings meant that humanity still found itself getting up and going to the fridge whenever it wanted a cold one.
But in the past few years, a new genus of home robots is making the dream seem less fanciful. Entertainment bots like the endearing Sony AIBO pet dog, floor-cleaning automatons like iRobot Corp.'s Roomba, and general-purpose home robots like the US $1200 PC-Bot, from White Box Robotics Inc., often have enough processing power to execute interesting programs, such as voice recognition or vision-based navigation. They also have enough payload capacity to carry useful attachments such as personal sound systems and gripper arms, rather than just a cup of your favorite beverage.
This summer, I tested the beer-fetching prowess of the ER1 Personal Robot System, a minimalist home robot from Evolution Robotics Inc. of Pasadena, Calif. It's a three-wheeled, 60-centimeter-high collection of aluminum struts that cradles the robot's brain--a laptop computer running Microsoft Windows [see photo, "...And Now"].
For $299, Evolution supplies a kit containing a controller, a battery pack, software, motorized drive wheels, parts for a frame, a tiny digital video camera, and other miscellaneous components. Buyers supply the laptop (about $300 used on eBay if you don't have a spare in your closet) and assembly skills. For $249 more I added a gripper arm, just in case no one was standing by the refrigerator to balance a beer on the robot's frame when it came calling.
AS I REACHED for the 20th of the 80-plus 6-millimeter screws that would eventually hold the ER1's frame together, I began to understand what makes robotics so challenging. A tedious, repetitive task such as installing screws sounds tailor-made for a robot. However, although an instruction such as "(dotimes (i 80) (install-setscrew i))" might be a simple piece of code to write for an assembly robot, in the physical world the task requires the fine manipulations of fingertips and an Allen wrench.
Once the frame was built, I nestled my refurbished ThinkPad laptop into it, loaded the robot-control software, and hooked up the USB cables that let the PC interact with the rest of the system. The software lets you program a series of stimulus-response behaviors--for example, playing a tune when something blue comes into the robot's field of view--capture images for the object-recognition and motion-detection software, and drive the robot and its gripper by clicking and dragging with a mouse. If you have a wireless network, you can run the software on another PC--which may be easier than frantically following your robot to select on-screen buttons with a mouse or trackball while it careens across the floor.
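The stimulus-response idea is simple enough to sketch in a few lines. The following Python is purely illustrative--the ER1's actual software is configured through a graphical interface, and the robot object and color detection here are stand-ins, not Evolution Robotics' real API:

```python
# Hypothetical sketch of the stimulus-response pattern: each behavior
# pairs a trigger (a test on the robot's current percept) with an action.

def make_behavior(trigger, action):
    """Return a behavior: run action() whenever trigger(percept) is true."""
    def behavior(percept):
        if trigger(percept):
            return action()
        return None
    return behavior

# Example: play a tune when something blue enters the field of view.
play_tune = lambda: "playing tune"
sees_blue = lambda percept: percept.get("dominant_color") == "blue"

blue_alert = make_behavior(sees_blue, play_tune)

print(blue_alert({"dominant_color": "blue"}))   # -> playing tune
print(blue_alert({"dominant_color": "red"}))    # -> None
```

The appeal of the pattern is that behaviors compose: you can hand the robot a whole list of trigger-action pairs and let it check each one every time a new camera frame arrives.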
ER1 users have developed other programs that you can find on the Web and download for a fee, including applications for mapping spaces and advanced navigation.
It took me a day to finish fitting the frame and its joints, tighten the setscrews, and bolt on the rest of the hardware (or attach certain components with the included industrial-strength Velcro). Meanwhile, I dreamed about how nice it was going to be when I could finally snap my fingers and order the ER1 to bring me that cold beer. I could easily see how just a few basic commands, including "turn toward object," "drive toward object," and "close gripper," could make that happen.
Like putting in the screws, the thought may have been simple, but the execution wasn't. Programming the ER1 renewed my appreciation for the complexity of the real world. A wheelchair-bound ex-skateboarder in a body cast with only a thumb and forefinger exposed has more degrees of freedom than the gripper-equipped ER1. The only thing that made my grand plan even remotely plausible was the 5 cm of height difference between the bottom of the fridge compartment and the floor. That clearance ensured that whatever the gripper pulled out of the refrigerator would remain comfortably above the ground during the rest of its trip.
SNAGS SOON BEGAN TO CROP UP. First I learned why a warning on the ER1's power pack tells you not to operate the robot with the charger plugged in: I irreparably damaged the 12-volt battery. Fortunately, I also discovered that a 24-volt battery pack I had around the house could be sliced into two 12-volt halves, one of which I could use to power the robot. Then I ran into the fail-safe that puts a message on the screen telling you that the ER1 won't move at all while the laptop's power cord is plugged in.
Once I unplugged the cord and got the machine moving, I found that specifying behaviors for it is a little like writing in an assembly language where every instruction can fail in unexpected ways. "Move forward, stop when an object enters the gripper jaws, close the jaws" soon became "Drive slowly enough that the gripper jaws don't knock your beer can over when they touch it, but fast enough that they don't close on mostly empty space." In the morning, I successfully instructed the machine to recognize a big red square of construction paper and turn toward it. But by late afternoon, that paper might as well have been green for all the notice the ER1 gave it, thanks to the sunlight now streaming in through the windows.
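Written out defensively, even that one behavior turns into a loop where every step needs an escape hatch. This is a hypothetical Python sketch of the idea, not the ER1's actual control code; the sensor and motor calls are stand-ins for whatever a real robot's API provides:

```python
# Creep toward the target in small increments, checking the jaw sensor
# each time. A timeout guards against the failure mode where the can
# gets knocked over (or missed entirely) and the sensor never trips.

def approach_and_grip(object_in_jaws, drive_step, close_jaws,
                      max_steps=200):
    """Return True on a successful grip, False if the target was
    never detected within max_steps increments."""
    for _ in range(max_steps):
        if object_in_jaws():
            close_jaws()
            return True
        drive_step()          # one slow increment, so the jaws don't topple the can
    return False              # give up rather than close on empty space

# Simulated run: the "can" sits 5 increments ahead of the robot.
position = [0]
can_at = 5
ok = approach_and_grip(
    object_in_jaws=lambda: position[0] >= can_at,
    drive_step=lambda: position.__setitem__(0, position[0] + 1),
    close_jaws=lambda: None,
)
print(ok)  # -> True
```

The speed trade-off in the text lives in that `drive_step` increment: make it too large and the jaws arrive with enough momentum to knock the can over; make it too small and sensor noise can trip the grip early.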
I had a bit more luck with the object-recognition subsystem, as beer bottles and soft-drink cans captured under glare-free lighting against a featureless background have distinctive enough profiles that the vision software can pick them out and estimate their distance.
Now it was just a matter of rebuilding the robot to solve a pesky parallax problem. With the camera mounted on a mast well above the laptop and gripper, by the time the ER1 gets close enough to grapple with something it has spied in the distance, its goal has slipped below the robot's field of view. If you're going to tell a machine to stop when a target fills some fraction of its sensor view, you'd better make darn sure that the target will in fact be visible at the critical moment. (I am reminded of an early presentation on autonomous vehicles where a researcher remarked that the voice recognition system had been programmed with "Oh, sh--!" as a synonym for "stop all actions.") Rebuilding the frame with the camera mast in a lower position solved that problem.
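The parallax trap can be checked with a little trigonometry before you ever bolt the camera on. This sketch uses made-up numbers, not the ER1's actual camera specs: it asks whether a target on the floor a given distance ahead still falls inside a downward-tilted camera's vertical field of view.

```python
import math

def floor_target_visible(camera_height_m, tilt_down_deg, vfov_deg, distance_m):
    """True if a point on the floor `distance_m` ahead is inside the
    vertical field of view of a camera mounted `camera_height_m` up
    and tilted `tilt_down_deg` below horizontal."""
    # Angle below horizontal from the camera to the floor point.
    angle_to_target = math.degrees(math.atan2(camera_height_m, distance_m))
    lower_edge = tilt_down_deg + vfov_deg / 2   # steepest angle still in view
    upper_edge = tilt_down_deg - vfov_deg / 2   # shallowest angle still in view
    return upper_edge <= angle_to_target <= lower_edge

# A camera on a tall mast loses a floor-level can at close range...
print(floor_target_visible(1.0, 30, 40, 0.3))   # -> False (target below the view)
# ...while a lower mount keeps it in sight at gripping distance.
print(floor_target_visible(0.3, 30, 40, 0.3))   # -> True
```

Run with numbers in this ballpark, the check confirms the fix in the text: lowering the camera mast keeps the target in frame right up to the moment the gripper closes.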
Programming a robot is thirsty work (especially when you have to spend most of your time prone on the floor to reach the computer), but by now I'd already begun to abandon the robot-butler notion. Even should my code be perfected, I reasoned, it wouldn't be all that convenient unless I built robot-accessible ramps leading anywhere I'd want a drink delivered.
But all is not lost. Perhaps when my 8-month-old son learns to crawl, I'll have the ER1 follow him, at a safe distance, picking up discarded toys.
About the Author
Contributing Editor PAUL WALLICH writes about oddball gadgets and technology policy from Montpelier, Vt. None of his household appliances are connected to the Internet.
To Probe Further
For more information on the ER1, see https://www.evolution.com/er1.