At RoboGames last weekend, we got a sneak peek at that mysterious new robot that's been churning through the rumor mill for the last month or so. We can't tell you everything about it, not yet, but we can tell you SOME things... Enough things to subtly suggest that this robot could have a significant place in your home in the near future.
First, an exclusive teaser video:
Now, you should understand that this robot isn't designed to be a fancy, futuristic new platform. Really, it's fairly simple: it's got a beefy computer in the base, a big battery that lets it run for eight hours, an 8" touchscreen, and a bunch of USB expansion ports. And of course, it's open source, so you can write your own apps for it.
What makes this robot different is that it's 1.5 meters [5 feet] tall, which is tall enough for you to interact with it naturally, and that, relative to other robots this size, it's affordable for regular people without corporate funding or grant money -- think mid-range laptop in price. So just imagine all of those handy things that your computer can do for you right now, and then imagine how many other handy things it could do if it could move around and interact with its environment like this robot obviously can.
For the moment, you'll just have to keep using your imagination as to exactly what this robot will be capable of, but we'll have more details (all the details) for you on May 11th.
Today is Earth Day, and one of my coworkers was telling me about all the little things we can do to help preserve the beautiful place we all live in. That got me thinking, naturally, about things that robots could do to help preserve the planet. Let's not be disingenuous: robots, like all technologies, are not a panacea. More automation could mean lower carbon emissions and less waste, but it could also mean the opposite -- it all depends on how we use it. Below I'm listing five robotic technologies that could potentially help to make the planet greener. If you have more robots to add to the list, or if you disagree that robots are Earth-friendly creations, leave a comment below.
1. Recycling robots. Waste is a huge problem all over the world, and many people do their share by separating plastic, paper, glass, and other trash, which is then collected, resorted, and (hopefully) recycled. To me it looks like a hugely inefficient process screaming for more automation. The only project I've heard of in this area is an Italian mobile robot called Dustbot [photo below], which picks up trash at people's homes and brings it to a recycling facility. It's cute, but just a prototype. If we don't want to end up in a landfill of a planet as depicted in WALL-E, we need much better recycling bots.
2. Telepresence robots. Air travel is responsible for a sizable fraction of the world's carbon emissions. It's also costly. That's why many corporations have reduced business trips and embraced videoconference meetings. Now there's another option: telepresence robots. The idea is simple: You embody a robot, controlled over the Net, that acts as your proxy at a remote location. And you can choose from many different types of bodies. You can be the skinny Anybots QB [video below], or the large-headed Willow Garage Texai, or, if you have US $200,000 to spare, you can even get an android copy of yourself.
3. Harvesting robots. Agriculture has become highly industrialized and wasteful, with bad results for the environment and for us, who literally eat the fruits of this process. Could more automation improve this scenario? I don't know. I want to believe that robots could replace some of today's wasteful practices with more efficient ones that would save energy and fuel, cut down on fertilizers and pesticides, and as a result make crops more sustainable. (How harvesting robots would impact labor is another issue that only adds complexity to this problem.) Companies trying to bring robots into the field include Vision Robotics, which is developing an autonomous grape-vine pruner [video below], and Harvest Automation, which has created a small mobile robot that picks up and moves potted plants in nurseries.
4. Personal mobility vehicles. The greenest mode of transportation is -- you guessed it -- walking. But we can't walk everywhere, of course. At the same time, using a car for short trips is very wasteful. That's why we need a way of going places that doesn't involve using our legs or our gas-guzzling automobiles. Enter the personal mobility vehicle -- a small machine designed to take a single person for short rides. The Segway was the first in this category, but unfortunately many cities banned it from the streets. We still think, though, that these vehicles will play an important role in reducing our dependence on cars (laws and pedestrians can't get in the way of the future!). Our favorite prototype? Take a look at Honda's futuristic unicycle called U3-X [video below].
5. Autonomous cars. As cool as machines like the U3-X above might be, commuting on a unicycle might not work for everyone. But if we're going to continue using cars, can we at least make them smarter? We're not just talking about driverless cars, vans, and even buses that researchers have recently demonstrated. Sure, these autonomous vehicles could in principle help us drive a bit more efficiently by finding the best routes and optimizing acceleration and braking of the vehicles. But we also need smarter cars that interact with each other and the road, so everyone moves along smoothly and safely. One example is the European project SARTRE [video below], which is studying whether autonomous convoys of vehicles improve safety and save fuel.
Seriously, how could you walk past this adorable little robot and not give it everything you have in your pockets? This is DONA, an "Urban Donation Motivating Robot," which wanders around public spaces and proceeds to look cute until people give it money. 'Cause, you know, robots have to make ends meet too. And from the looks of it, it totally works:
Good thing DONA doesn't take credit cards over the internet, or I'd be flat broke right now.
Here's a bunch of nifty robot vids that I haven't had a chance to post about yet, so I thought I'd toss 'em all up for you in bite-size format. Enjoy!
Like Taurus, Survivor Buddy (from Texas A&M and Stanford) is designed to augment existing robotic platforms to give them additional capabilities. Specifically, it's a little moving screen that can help make people who are trapped in uncomfortable places feel a little better without having to rely on the steely inhuman gaze of a rescue robot:
- LEGO-Sorting LEGO Delta Robots
While I'm a proponent of the big tub o' mixed up LEGO bricks, I recognize that sometimes organization can be important. And what better way to organize LEGO bricks than with a LEGO brick organizing robot made out of LEGO bricks?
PhillieBot was created by students at UPenn's GRASP Lab in a month and a half of spare-time work. It's specifically designed to lob a game-opening pitch at around 60 kph "to ensure safety," but such speeds are, well, kinda tame. I'd say, if you're gonna make a pitching robot, you might as well just give it a baseball cannon and impress the crowd, especially if the Phillie Phanatic is attempting to catch the ball.
A robotic aerial vehicle hovering at the crippled Fukushima Dai-ichi nuclear plant has captured close-up video and photos that reveal the extent of the destruction in greater detail than previously seen.
Tokyo Electric Power Co. (TEPCO), the plant's operator, is using a T-Hawk [photo below], a remote operated flying machine created by U.S. firm Honeywell, to get a closer view of the severely damaged reactors.
The T-Hawk, known as a micro air vehicle, or MAV, uses a ducted-fan propulsion system that allows it to hover in place like a helicopter and fly into tight spaces where other aircraft can't go.
Last Friday, TEPCO workers, with assistance from Honeywell employees trained to pilot the T-Hawk, used the vehicle to survey the reactor buildings of Units 1, 3, and 4. TEPCO released the images the next day.
"What these images show is that the magnitude of the hydrogen explosions was incredible," said Stewart B. Minahan, executive director of operations for the Nuclear Energy Institute, an industry group.
Minahan, who has 35 years of experience with boiling water reactors, says the images are "very clear," and although they don't provide any new insight into the disaster, they show details that he hadn't seen before.
One of the elements most clearly visible is a round yellow structure sitting on the operating floor of Unit 4. The structure, Minahan says, is the drywell dome, the top part of the reactor's containment structure. When a boiling water reactor is in refueling mode -- as was the case with Unit 4 -- workers use a crane to remove the dome and place it over concrete blocks on the floor. It's also possible to see fuel-handling machines, used to move fuel from the reactor into spent fuel pools.
But he says no parts of the reactors themselves are visible. "In my opinion, it will be hard to see them," he says. "These buildings had multiple floors, which collapsed because of the explosions."
TEPCO has used manned helicopters, high-altitude drones, and ground robots to obtain images of the facility. But the T-Hawk, because pilots can hold it in place and use its camera to zoom in on features, is giving TEPCO a better look at the damage in and around the buildings.
Developed as part of a DARPA project, the machine is currently used in Iraq and Afghanistan for surveillance, route planning, and other missions. It weighs in at 7.7 kilograms (17 pounds), and pilots can control it manually or set up autonomous flight paths from up to 9.6 kilometers (6 miles) away and for up to 40 minutes at a time.
Honeywell, based in Morris Township, N.J., said in a release that three of its employees have flown five missions so far, capturing hours of video and dozens of photos. There are two T-Hawk units flying in Fukushima and two as back-up. Honeywell said that, in addition to cameras, they are carrying radiation sensors, though TEPCO hasn't yet released any data from them.
Below, some of the images (all taken on April 15) made public:
Unit 3, roof of reactor building
Unit 4, upper side of spent fuel pool, reactor building
Unit 4, seashore side of reactor building
Unit 4, operating floor of reactor building
Below, videos taken on April 15 [Editor's note: TEPCO says the videos show Units 1, 3, and 4, but they appear to show only Unit 4; the videos have no audio]:
A few days ago, Tokyo Electric Power Co. (TEPCO), the operator of the Fukushima nuclear power plant, sent iRobot PackBots into three reactor buildings at the complex. Now TEPCO has released multiple videos showing two PackBots navigating inside the dark, highly radioactive buildings.
It's quite a sight to watch the robots negotiating steps, rolling over debris, and pointing their cameras to sensors and other equipment inside the badly damaged buildings. In the first video below, you can see one of the robots using its manipulator arm to close a heavy door. The last video shows what appear to be sensor readings that reveal low oxygen levels and high radioactivity.
- All the mounting hardware, wiring, and electronics needed to put it all together
- A fairly beastly computer with a 3.1 GHz Intel i3 processor and 4 gigs of RAM
- Ubuntu and ROS pre-installed
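Since the kit ships with Ubuntu and ROS pre-installed, a first "app" often boils down to basic base-motion math. As a hedged illustration (plain Python rather than actual ROS API calls, with made-up wheel dimensions rather than the real hardware's), here's the standard differential-drive conversion from a commanded body velocity to individual wheel speeds, the kind of calculation a drive controller for a two-wheeled base performs constantly:

```python
# Hypothetical geometry -- illustrative values, not the actual base's specs.
WHEEL_RADIUS_M = 0.033   # drive wheel radius, meters
TRACK_WIDTH_M = 0.26     # distance between the two drive wheels, meters

def diff_drive_wheel_speeds(linear_mps, angular_rps):
    """Convert a body velocity command (m/s forward, rad/s turn) into
    left/right wheel angular speeds (rad/s) for a differential-drive base."""
    v_left = linear_mps - angular_rps * TRACK_WIDTH_M / 2.0
    v_right = linear_mps + angular_rps * TRACK_WIDTH_M / 2.0
    return v_left / WHEEL_RADIUS_M, v_right / WHEEL_RADIUS_M

# Driving straight at 0.2 m/s: both wheels spin at the same rate.
left, right = diff_drive_wheel_speeds(0.2, 0.0)
print(left, right)
```

In ROS, the input to this calculation would typically arrive as a `geometry_msgs/Twist` message on a velocity-command topic; the math itself is the same.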
The biggest news is, as you may have noticed from the picture, the addition of a functional, powered arm (!). It has a one-foot reach, and thanks to the inclusion of some actual geared motors (not servos), it can lift three pounds 17" into the air. Just imagine the possibilities...
Well, okay, so you'll have to imagine some possibilities besides grapes, but three pounds is an awful lot for such a little bot.
Now, you might think that TurtleBot and Bilibot are poised to duke it out in the affordable ROS platform arena, and while I for one would pay to see that actually happen, that's not the way it's going down. It's important to keep in mind that this isn't really a competition between the two robots, since ultimately, the goal is to make ROS and a physical, hackable ROS platform easily available to anyone who wants one to mess with. You can think of TurtleBot and Bilibot as different flavors of the same concept, and Garratt and Willow Garage have even been collaborating on some of the common software.
If you just can't wait another second, you can order an armed Bilibot right now for $1,200, which includes your choice of five colors plus custom engraving. This first batch is being more or less hand-built, so some bulk discounts will hopefully be appearing in the near future that might help bring that price down a bit. Either way, I'd say it's a pretty sweet deal, and we're all looking forward to seeing what's possible when clever people start doing clever things with this robot.
This headless, two-armed robot may be tomorrow's factory worker.
Its name is FRIDA, and it's a creation of ABB, the Swiss power and automation giant, which introduced it early this month at the Hannover trade show, Europe's largest industrial fair.
Designed for assembly applications, FRIDA is capable of using its human-like arms to grasp and manipulate electronic components and other small parts. The machine is a concept robot that ABB created to show off its vision for a new kind of industrial robot.
Traditional industrial robots are big, expensive, and hard to integrate into existing manufacturing processes. They're also difficult to reprogram when production changes become necessary and they can't safely share spaces with human workers. This barrier to entry has kept small and medium companies "robot-less" -- at a time when robots, more than ever, could boost productivity and ameliorate labor shortages.
With FRIDA, ABB is the latest among several companies building a new generation of industrial robots that are lighter, safer, more affordable, and easier to deploy and reconfigure. Call it industrial robotics 2.0.
FRIDA, which stands for Friendly Robot for Industrial Dual-arm Assembly (you knew an acronym was coming, didn't you?), is lightweight and compact -- a person can carry it using a handle that comes out at the top. To make it even safer, its motors have limited drive power and soft pads cover its body. The robot has 7-axis arms, each with a servo gripper for small-part handling. Inside the torso is a control system based on ABB's IRC5 industrial controller.
So what can FRIDA do? One scenario ABB envisions is using it to bring more automation to the fast-paced, and mostly human-powered, assembly lines found in the electronics industry.
Other companies seeking to explore this new market for assembly robots include Motoman, a division of Japan's Yaskawa, which offers a dual-arm robot called Motoman SDA10D, and the German automation firm pi4_robotics, which early this year unveiled the Workerbot, also a dual-arm robot. Another player, it seems, might be Rodney Brooks' secretive start-up, Heartland Robotics.
Like FRIDA, the SDA10D and Workerbot are designed for assembly applications, but they are larger than the ABB robot and thus require more safety precautions (for example, fencing or sensor safeguarding) when operating near people.
ABB, like many other robotics companies, insists that its robot isn't designed to replace human workers, but rather to work alongside them. It says FRIDA fits into spaces used by people and could be "easily interchanged with a human coworker when the production order is changed or a new layout is required."
The company has built several prototypes and is using them in pilot applications, but it has not provided details about these tests. It hasn't discussed availability or cost either, emphasizing that FRIDA is a concept. This means that there are two big unknowns about the robot. The first is whether it will be really affordable for companies that would benefit from this kind of robot. The second is whether it will be easy to program the robot to perform assembly tasks. I guess we'll find out when more details about the robot become available.
What do you think? Is this the beginning of a new era in industrial robotics?
CMU's Ben Stephens is lucky enough to have a Sarcos humanoid to play with, and play with it he has, using a motion capture system to teach the robot to dance, if you want to call what the robot's doing dancing. There's some serious researchy stuff going on also, though: while dancing, the robot manages to not fall over, dynamically keeping its balance while coming as close as possible to replicating the captured human dance movements.
In unstructured environments (like any environment where humans are allowed to run around), balance is a big issue for robots, since they never know when they may accidentally get shoved by a wayward human. And it's important that the robot be able to deal with being shoved, partially for the sake of the complicated and expensive robot, but also for any small children and/or pets who may find themselves underneath a robot with inadequate balancing skills. To this end, Ben has been teaching the Sarcos robot to deal with a push in the same way that humans do: by taking a step forward to keep its balance:
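The step-to-recover behavior described above is often formalized in the push-recovery literature with the "capture point": model the robot as a linear inverted pendulum, and the spot on the ground where the foot must land to bring the robot to rest is a simple function of the center-of-mass position, velocity, and height. A rough sketch of that formula (the numbers are illustrative, not taken from the Sarcos controller):

```python
import math

GRAVITY = 9.81  # m/s^2

def capture_point(com_pos, com_vel, com_height):
    """Linear-inverted-pendulum capture point: where to place the foot
    so the center of mass comes to rest over the new support point."""
    omega = math.sqrt(GRAVITY / com_height)  # pendulum natural frequency
    return com_pos + com_vel / omega

# A shove gives the center of mass 0.5 m/s of forward velocity at 1 m height:
step_target = capture_point(0.0, 0.5, 1.0)
print(round(step_target, 2))  # about 0.16 m ahead of the current CoM
```

The intuition matches the video: a harder shove (larger velocity) or a taller robot (slower pendulum) pushes the required foot placement farther forward.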
If you've ever watched humanoid hobby robots do just about anything, you've probably noticed that they're fairly horrible at keeping their balance. Let's hope that research like this eventually trickles down to the consumer level, if for no other reason than to make humanoid soccer and humanoid kung-fu competitions a little more interesting.
UPDATE 4/20: Watch videos of the PackBots inside the reactors.
The Associated Press is reporting that two PackBot ground robots from iRobot have entered Unit 1 and Unit 3 of the crippled Fukushima nuclear power plant and performed readings of temperature, oxygen levels, and radioactivity.
The data from the robots, the first measurements inside the reactors in more than a month since a massive earthquake and tsunami damaged the plant, revealed high levels of radioactivity -- too high for humans to access the facilities.
The remote-controlled robots entered the two reactors over the weekend. Details of the mission -- such as what areas of the reactors the robots inspected and from where they were operated -- are still scarce, but Tokyo Electric Power Co. (TEPCO), the plant's operator, said that the robots opened and closed "double doors and conducted surveys of the situation" inside the buildings.
From the AP report:
The robots being used inside the plant are made by Bedford, Massachusetts, company iRobot. Traveling on miniature tank-like treads, the devices opened closed doors and explored the insides of the reactor buildings, coming back with radioactivity readings of up to 49 millisieverts per hour inside Unit 1 and up to 57 millisieverts per hour inside Unit 3.
The legal limit for nuclear workers was more than doubled since the crisis began to 250 millisieverts. The U.S. Environmental Protection Agency recommends an evacuation after an incident releases 10 millisieverts of radiation, and workers in the U.S. nuclear industry are allowed an upper limit of 50 millisieverts per year. Doctors say radiation sickness sets in at 1,000 millisieverts and includes nausea and vomiting.
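To put those dose rates in perspective, they translate directly into how long a human worker could stay inside before hitting a given limit; the arithmetic is just the limit divided by the rate, using the figures reported above:

```python
# Dose limit and measured dose rates from the report above.
EMERGENCY_LIMIT_MSV = 250.0  # Japan's raised per-worker emergency limit

for unit, rate_msv_per_hr in [("Unit 1", 49.0), ("Unit 3", 57.0)]:
    hours = EMERGENCY_LIMIT_MSV / rate_msv_per_hr
    print(f"{unit}: {hours:.1f} hours to reach the full emergency limit")
```

At roughly five hours to exhaust an entire emergency allowance, it's clear why TEPCO is sending in robots first.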
iRobot, which had sent two PackBot 510 robots and two Warrior 710 robots to Japan last month, was one of several organizations providing robotic help to the Japanese authorities. Known as a robotics superpower, Japan has relied on robots from other countries because its own machines haven't been as extensively tested as robots like the PackBots, widely used in Iraq and Afghanistan.
TEPCO officials said that the radiation data from the robots don't change their plans for shutting down the plant by the end of this year. And though more robots will be used, a TEPCO official, Takeshi Makigami, said that robots are limited in what they can do and eventually "people must enter the buildings."
Indeed, the Fukushima disaster is a major test for what robots can and cannot do, and some observers have criticized the fact that better, more advanced robots aren't available to deal with this type of problem. Robots might be capable of navigating inside the reactors and assessing their environments, as the PackBots did, but they probably won't be able to perform major repairs and clean-up work -- just the kind of job we'd expect robots to do.
UPDATE 4/18: The AP has filed another report with more details, but some of the new details are confusing. Though the AP reported previously that the robots made measurements inside Unit 3, the new report quotes TEPCO officials saying that the robot "was impeded by broken chunks of ceiling and walls blown off during hydrogen blasts." So where does the Unit 3 radioactivity data come from?
Then there's this:
TEPCO spokesman Shogo Fukuda said the company has only now begun using the robots because it took several weeks for crews to learn how to operate the complex devices.
It's puzzling that in the midst of the worst nuclear disaster since Chernobyl, TEPCO decided to spend so much time training their workers to operate the PackBots. I'd think that there are qualified PackBot operators for hire, from iRobot or another company, which means the robots could have gone inside weeks ago. Isn't time a crucial issue in this disaster?
But there's more:
So far, just one of the two provided PackBots has been used, said Minoru Ogoda, an official with Japan's Nuclear and Industrial Safety Agency, which is monitoring TEPCO's remediation efforts.
Hmm. Look at the photos (especially this one) and you can clearly see that there are two robots there, and they look very similar, so how is it possible that just one of two PackBots has been used? Either this official is confused or there's something lost in translation.
TEPCO spokesman Shogo Fukuda said the company hadn't anticipated using robots in the power plant until they were offered by iRobot.
Very strange that a company the size of TEPCO didn't think of using robots following the disaster and that, rather than seeking help with robots, it had to wait until robots were offered to it. Again, either this spokesman is misinformed or this report is inaccurate. Otherwise this would mean a huge lack of judgment on TEPCO's part for not seeking robots faster when this disaster hit.
What do you think?
Packbot working inside Unit 3 (photos taken 17 April 2011)
UPDATE 4/18: The robots have entered Unit 2 as well, and TEPCO has released new photos of the robot and what it found inside the buildings.