Most robots are powered by electric motors that are big, bulky, and heavy, and that have to be replaced when they break. Animals, on the other hand, use a biological motor, the muscle, which is also driven by electrical impulses but is far more efficient and, given a chance, can repair itself. We're just starting to be able to manipulate biological structures like these in clever enough ways to harness their awesomeness, and engineers at the University of Illinois at Urbana-Champaign have worked them into a tiny "bio-bot" that uses muscle cells to walk.
Restrictions on small drones, model aircraft, sUAS (small unmanned aircraft systems), or whatever you want to call them are being promulgated almost too quickly to chart these days. Late last month the U.S. Federal Aviation Administration forbade modelers from engaging in first-person view piloting, and the National Park Service announced its intention of banning essentially all model aircraft from its 84 million acres—to name just a couple of prominent federal clamp-downs. In addition, 36 states have been busy formulating their own drone regulations this year, four of which have recently enacted laws, mostly focused on protecting people's privacy. But get ready for a novel variation on this theme working its way through the North Carolina General Assembly.
Pretend you're a robot. (I do this all the time; it's great!) Okay, are you pretending? Awesome! Now, take these colored blocks and BUILD ME A TURTLE, ROBOT!
If you're panicking right now, that's understandable. A general purpose robot would probably have no idea what a turtle was, much less how to build one out of blocks. There are ways that you could teach the robot about turtles and blocks, but you, being a human, are hopelessly flawed and would only be able to teach it your conception of what a turtle should look like and how to use blocks to make one. What the robot really needs is to be able to examine a bunch of different examples of a bunch of different turtles, and then use machine learning to choose the best, most reliable, and most efficient one to build. And rather than have you try to do that all on your own, researchers at the University of Washington are paying strangers to do it as part of a crowdsourced effort.
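The selection step can be pictured with a toy sketch (the data, scoring scheme, and function names here are made-up illustrations, not the actual UW pipeline): each crowdsourced design gets rated by other workers, and the robot builds whichever design has the best aggregate score.

```python
# Toy sketch: pick the "best" crowdsourced block design by average crowd
# rating. Designs and ratings are invented for illustration.

def best_design(designs):
    """Return the design with the highest mean crowd rating."""
    return max(designs, key=lambda d: sum(d["ratings"]) / len(d["ratings"]))

designs = [
    {"name": "turtle_A", "blocks": 12, "ratings": [3, 4, 4, 5]},
    {"name": "turtle_B", "blocks": 9,  "ratings": [5, 5, 4, 5]},
    {"name": "turtle_C", "blocks": 15, "ratings": [2, 3, 3, 2]},
]

print(best_design(designs)["name"])  # turtle_B
```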
By the time we put together our post on the DARPA Robotics Finals, we were too late to get Video Friday up in time for, you know, Friday. And we feel bad about that, we really do. But we feel especially bad now, because Automaton reader Mike wrote in to let us know how sad he was that we missed (another) Video Friday.
We can't turn back the clock, but we're going to do what we can to make up for it: Mike, this Video Monday is for you.
Yesterday afternoon, DARPA held a briefing to discuss the forthcoming DARPA Robotics Challenge Finals. It's been about six months since the DRC Trials were held in Miami, so we've been expecting an update, and DARPA certainly delivered.
Program manager Gill Pratt spent over an hour explaining what we have to look forward to in Southern California (yes, the Finals will be held in California!) next June (yes, the Finals are not happening this year, as DARPA decided to give teams some extra time), and we've got all the highlights for you.
It's possible, even probable, that if you're reading this article on IEEE Spectrum, you either know how to program a robot or could figure it out if you really put your mind to it. But for the rest of us (indeed for most people), programming is not necessarily a skill we have at our fingertips. And even if you're comfortable with writing code in general, writing code that gets a very complicated and expensive robot to do exactly what you want it to do is (to put it mildly) not easy.
The way robots are supposed to work (if we believe every science fiction show ever, which we do) is that they can listen to you yell at them, understand what you're on about, and then follow the instructions that they've been given just as well as a human can. "As well as a human can" means understanding abstract concepts and making inferences when necessary, which is something that robots, as a rule, are absolutely terrible at.
Robots like to have detailed instructions about everything: if you want a scoop of ice cream, they need to know what ice cream is, where it is, how to open it, what to scoop it out with, how to grip the scoop, how to perform the scooping action, how to verify that the scoop was successful, how to get the ice cream from the scoop into a—oh wait, we forgot about the bowl, the robot has to have all the bowl stuff figured out in advance.
And there's the problem: "get me a scoop of ice cream" is actually an incredibly complicated chain of actions that need to be executed in just the right way, and no human has the patience to spell it all out like a robot would want.
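That chain of actions is easiest to see written out as an explicit plan, the kind of structure a task planner would hand to a robot. Here's a minimal sketch with hypothetical action names (this is for illustration only, not Cornell's actual representation):

```python
# Toy sketch: "get me a scoop of ice cream" spelled out as the kind of
# primitive-action sequence a robot would need. Action names are hypothetical.

PLAN = [
    ("locate", "freezer"),
    ("open", "freezer"),
    ("locate", "ice_cream"),
    ("grasp", "ice_cream"),
    ("locate", "scoop"),
    ("grasp", "scoop"),
    ("locate", "bowl"),      # the step humans forget to mention
    ("scoop", "ice_cream"),
    ("deposit", "bowl"),
]

def execute(plan):
    """Pretend-execute each primitive and report what was done."""
    return [f"{verb} {target}" for verb, target in plan]

log = execute(PLAN)
print(len(log), "primitive actions for one sentence of instruction")
```

One casual sentence of human instruction expands into nine explicit steps, and a real system would decompose each of those further into perception and motion primitives.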
Cornell is trying to fix this problem by teaching robots to interpret natural language instructions, even casual ones, so that a PR2 can bring you some fancy ice cream.
I like drones. Drones are fun. But as with many robots, at some point you have to answer the question: "Okay, that's cool, but what does it do?" We're not entirely convinced that drone delivery is going to be a thing, but one application that has actually managed to turn into a potentially viable product is the capability to follow someone around with a camera. In the space of about a week, three separate systems have shown up that promise to be able to act as autonomous aerial camerabots.
One less crazy idea is to just have drones perch: that is, to spend as much time as possible not flying by finding somewhere near where they need to be that they can land and sit. And wouldn't it be great if drones could recharge themselves by perching on powerlines and harnessing the magnetic fields that they emit?
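The physics of that idea is easy to estimate: a long straight wire carrying current I produces a magnetic field B = μ₀I/(2πr) at distance r, and a coil sitting in that alternating field sees an induced peak EMF of 2πf·N·A·B. A back-of-envelope sketch with made-up numbers (these are illustrative assumptions, not figures from any actual perching drone):

```python
import math

# Back-of-envelope estimate of harvesting energy from a power line's
# magnetic field. All numbers below are illustrative assumptions.

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A

def field_from_line(current_a, distance_m):
    """Magnetic field magnitude of a long straight wire."""
    return MU0 * current_a / (2 * math.pi * distance_m)

def peak_emf(b_peak_t, freq_hz, turns, area_m2):
    """Peak EMF induced in a coil whose axis is aligned with the field."""
    return 2 * math.pi * freq_hz * turns * area_m2 * b_peak_t

# Assumptions: 500 A line, coil perched 10 cm away, 1000 turns,
# 10 cm x 10 cm coil cross-section, 60 Hz grid.
b = field_from_line(500, 0.10)          # ~1 mT
v = peak_emf(b, 60, 1000, 0.10 * 0.10)  # a few volts

print(f"B = {b*1e3:.2f} mT, peak EMF = {v:.2f} V")
```

A few volts of open-circuit EMF is in the right ballpark for trickle-charging a small battery, which is why perching close to the conductor matters: the field, and the harvested power, falls off with distance from the line.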
We'd better hope that there will never be a time when robots will be able to do absolutely everything without any help from humans, because that's the time when our entire species is likely to become redundant. Until that time comes, the technique of human exploitation is a valuable skill for robots to learn, because it's a great way of being able to complete objectives with a minimum of hardware or software. hitchBOT is a robot that'll attempt to exploit the kindness of humans by using them to transport itself across Canada by simply asking people for a ride.
Earlier this month, Japanese telecom giant SoftBank surprised everyone by unveiling an interactive personal robot called Pepper, which will go on sale in Japan next year. Now we're learning that's not the only robot SoftBank had in the works. One of its subsidiaries, Asratec, announced last week that it has built a prototype bipedal humanoid called the ASRA C1 and has also developed a new operating system for robots, V-Sido.