Pretend you're a robot. (I do this all the time, it's great!). Okay, are you pretending? Awesome! Now, take these colored blocks and BUILD ME A TURTLE, ROBOT!
If you're panicking right now, that's understandable. A general purpose robot would probably have no idea what a turtle was, much less how to build one out of blocks. There are ways that you could teach the robot about turtles and blocks, but you, being a human, are hopelessly flawed and would only be able to teach it your conception of what a turtle should look like and how to use blocks to make one. What the robot really needs is to be able to examine a bunch of different examples of a bunch of different turtles, and then use machine learning to choose the best, most reliable, and most efficient one to build. And rather than have you try and do that all on your own, researchers at the University of Washington are paying strangers to do it as part of a crowdsourced effort.
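To make the idea concrete, here's a minimal sketch of the selection step: given a pile of crowdsourced candidate designs, score each one on reliability and efficiency and keep the winner. Everything here is hypothetical — the design names, the fields, and the scoring weights are invented for illustration, not taken from the UW system.

```python
# Hypothetical sketch of the crowdsourcing idea: collect many candidate
# block "turtles" from different people, then pick the one that scores
# best on reliability and efficiency. All names and weights are invented.

candidates = [
    {"name": "design_a", "build_success_rate": 0.90, "blocks_used": 12},
    {"name": "design_b", "build_success_rate": 0.75, "blocks_used": 8},
    {"name": "design_c", "build_success_rate": 0.95, "blocks_used": 20},
]

def score(design: dict) -> float:
    # Favor designs that assemble reliably while using fewer blocks.
    return design["build_success_rate"] - 0.01 * design["blocks_used"]

best = max(candidates, key=score)
print(best["name"])
```

A real system would learn the scoring function from many more examples rather than hand-tuning weights, but the shape of the problem — many human-provided designs in, one robot-buildable design out — is the same.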
By the time we put together our post on the DARPA Robotics Challenge Finals, it was too late to post Video Friday in time for, you know, Friday. And we feel bad about that, we really do. But we feel especially bad now, because Automaton reader Mike wrote in to let us know how sad he was that we missed (another) Video Friday.
We can't turn back the clock, but we're going to do what we can to make up for it: Mike, this Video Monday is for you.
Yesterday afternoon, DARPA held a briefing to discuss the forthcoming DARPA Robotics Challenge Finals. It's been about six months since the DRC Trials were held in Miami, so we've been expecting an update, and DARPA certainly delivered.
Program manager Gill Pratt spent over an hour explaining what we have to look forward to in Southern California (yes, the Finals will be held in California!) next June (yes, the Finals are not happening this year, as DARPA decided to give teams some extra time) and we've got all the highlights for you.
It's possible, even probable, that if you're reading this article on IEEE Spectrum, you either know how to program a robot or could figure it out if you really put your mind to it. But for the rest of us (indeed for most people), programming is not necessarily a skill we have at our fingertips. And even if you're comfortable with writing code in general, writing code that gets a very complicated and expensive robot to do exactly what you want it to do is (to put it mildly) not easy.
The way robots are supposed to work (if we believe every science fiction show ever, which we do) is that they can listen to you yell at them, understand what you're on about, and then follow the instructions that they've been given just as well as a human can. "As well as a human can" means understanding abstract concepts and making inferences when necessary, which is something that robots, as a rule, are absolutely terrible at.
Robots like to have detailed instructions about everything: if you want a scoop of ice cream, they need to know what ice cream is, where it is, how to open it, what to scoop it out with, how to grip the scoop, how to perform the scooping action, how to verify that the scoop was successful, how to get the ice cream from the scoop into a—oh wait, we forgot about the bowl, the robot has to have all the bowl stuff figured out in advance.
And there's the problem: "get me a scoop of ice cream" is actually an incredibly complicated chain of actions that need to be executed in just the right way, and no human has the patience to spell it all out like a robot would want.
Cornell is trying to fix this problem by teaching robots to interpret natural language instructions, even casual ones, so that a PR2 can bring you some fancy ice cream.
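The decomposition problem above can be sketched in a few lines of code. This is not the Cornell system — it's an illustrative toy, with a hand-written plan library and hypothetical action names — but it shows why a single casual sentence hides a long chain of primitives, including the bowl step that humans forget to mention.

```python
# Illustrative sketch (not the Cornell system): expanding a high-level
# command into the primitive actions a robot actually needs.
# All action names here are hypothetical.

PRIMITIVES = {
    "scoop of ice cream": [
        "locate(freezer)", "open(freezer)", "locate(ice_cream)",
        "grasp(ice_cream)", "place(ice_cream, counter)",
        "locate(scoop)", "grasp(scoop)", "open(ice_cream)",
        "locate(bowl)",            # the step humans forget to mention
        "scoop(ice_cream, scoop)", "deposit(scoop, bowl)",
        "verify(bowl_contains_ice_cream)",
    ],
}

def expand(command: str) -> list:
    """Return the primitive action sequence for a high-level command."""
    key = next((k for k in PRIMITIVES if k in command.lower()), None)
    if key is None:
        raise ValueError("no plan for: " + command)
    return PRIMITIVES[key]

plan = expand("Get me a scoop of ice cream, please")
print(len(plan))  # a dozen steps for one casual request
```

A lookup table like this only works for commands someone already spelled out in advance; the hard part, and the point of learning from natural language, is generating plans like this for requests nobody anticipated.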
I like drones. Drones are fun. But as with many robots, at some point you have to answer the question of, "Okay, that's cool, but what does it do?" We're not entirely convinced that drone delivery is going to be a thing, but one application that has actually managed to turn into a potentially viable product is the capability to follow someone around with a camera. In the space of about a week, three separate systems have shown up that promise to be able to act as autonomous aerial camerabots.
One less crazy idea is to just have drones perch: that is, to spend as much time as possible not flying by finding somewhere near where they need to be that they can land and sit. And wouldn't it be great if drones could recharge themselves by perching on powerlines and harnessing the magnetic fields that they emit?
We'd better hope that there will never be a time when robots can do absolutely everything without any help from humans, because that's the time when our entire species is likely to become redundant. Until then, the technique of human exploitation is a valuable skill for robots to learn, because it's a great way of completing objectives with a minimum of hardware or software. hitchBOT is a robot that'll attempt to exploit the kindness of humans by simply asking people for rides to carry it across Canada.
Earlier this month, Japanese telecom giant SoftBank surprised everyone by unveiling an interactive personal robot called Pepper, which will go on sale in Japan next year. Now we're learning that's not the only robot SoftBank had in the works. One of its subsidiaries, Asratec, announced last week that it has built a prototype bipedal humanoid called the ASRA C1 and has also developed a new operating system for robots, V-Sido.
Origami, the art of folding pieces of paper to create shapes, is an appealing concept for robotics because you can transform two-dimensional materials into three-dimensional structures that are inherently flexible, or, as a roboticist would say, "deformable." What's more, structures that fold and unfold enable all kinds of interesting functionality that would otherwise only be possible with systems that are much more complex.
The approach can be particularly useful in designing wheels for robots, and earlier this month at the IEEE International Conference on Robotics and Automation (ICRA) two research groups presented origami-inspired wheel systems that allow mobile robots to be nimbler and stronger.
We're back from ICRA in Hong Kong, just in time to return you to your regularly scheduled Video Fridays. Just because ICRA is done, though, doesn't mean that we're taking a break: in just a few weeks, this year's Robotics: Science and Systems Conference (RSS) will be held at UC Berkeley, and IROS 2014 is only a few months away, taking place in Chicago in September. As usual, there will be all sorts of other robot stuff going on in the near future, some of which we know about but is TOP SECRET, and some of which will be a surprise to everybody. And those surprises are usually the best. No surprises today, though: we're back to normal, and it's time for robot videos.