Naughty robots can now be tamed with this snazzy smile-detecting device from the University of Tsukuba AI Lab. Anna Gruebler and her colleagues have developed a wireless headband that captures electromyographic (EMG) signals from the side of the face, detecting when you're smiling with delight or frowning with disapproval.
Unlike cameras with smile-detection algorithms, this device can work in low light, while you're walking around, and when you're not looking into your computer's camera. Part of the charm, the researchers say, comes from the discreet headband design that beats traditional face electrodes and wires.
Last year, Gruebler proposed using the device to control avatars in Second Life hands-free, as in the explanation video below. More users would approach her avatar, she says, because it was smiling and looked friendly.
The trainer tries to teach the robot her preference: give the ball or throw it. Although the Nao starts out slow and hesitant, it speeds up as it gains experience and feedback from the trainer. The researchers also compared the device to a manual interface: while users made mistakes using a dial, they never confused smiling and frowning -- a natural, intuitive way to interact with a robot.
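The training loop described above can be sketched as a simple two-action preference learner: the robot tries an action, reads the trainer's smile (+1) or frown (-1), and nudges a score for that action. This is purely an illustrative sketch of the idea, not the actual Tsukuba system; the action names, learning rate, and exploration rate are all made up.

```python
import random

random.seed(0)  # deterministic for this demo

ACTIONS = ["give_ball", "throw_ball"]  # hypothetical action labels

def update_preference(scores, action, feedback):
    """Nudge the score for `action` halfway toward the trainer's feedback."""
    scores[action] += 0.5 * (feedback - scores[action])
    return scores

def choose_action(scores, exploration=0.1):
    """Mostly pick the preferred action, occasionally explore."""
    if random.random() < exploration:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: scores[a])

scores = {a: 0.0 for a in ACTIONS}
# Stand-in trainer: smiles whenever the robot gives the ball, frowns otherwise.
for _ in range(20):
    action = choose_action(scores)
    feedback = 1.0 if action == "give_ball" else -1.0
    scores = update_preference(scores, action, feedback)

print(scores)  # the "give_ball" score climbs toward +1
```

The point of the sketch is the interaction pattern: the smile/frown signal plays exactly the role of a reward, which is why the researchers liken it to parents encouraging a baby.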
The main idea, the researchers say, is that it's similar to how parents teach and encourage babies.
The next step is to apply the device to other real-life situations. If you could train a robot with a smile or frown, what would you have it do?
Angelica Lim is a graduate student at the Okuno and Ogata Speech Media Processing Group at Kyoto University, Japan.
Yeah, so this right here is a giant robotic spider. By "giant" I mean that those legs are 20 centimeters long each, and the body adds another 20 centimeters, so we're looking at a robot arachnid that's a terrifying two feet across (0.6 meters). For what it's worth, that's approximately twice the size of the largest real spider, the Goliath bird-eater, and the Goliath bird-eater doesn't even jump.
Oh yes, this robot jumps.
The neat thing about spiders (if you're into spiders, anyway), is that they're hydraulically operated. Instead of moving their limbs with muscles, they do it by increasing the blood pressure in whatever limb they want to extend. Hydraulically operated robots work the same way, except they have a hydraulic pump instead of a heart and hydraulic fluid instead of blood. This can be a very effective way of providing power to limbs, which is why Boston Dynamics uses a hydraulic system in AlphaDog and PETMAN.
Anyway, back to this freaky thing. Designed by a team at the Fraunhofer Institute for Manufacturing Engineering and Automation in Germany, this prototype robospider was 3D printed, meaning that more of them than I would personally be comfortable with can be manufactured quickly and cheaply. A hydraulic pump in the body provides fluid pressure to the limbs, allowing the robot to crawl forwards and backwards, and some versions are apparently powerful enough to leap off the ground, grab you by the throat, and rip your head off. Or maybe not that last bit. Maybe.
In any case, having eight legs makes the robot exceptionally nimble, which is the whole reason for utilizing this design. The body of the spiderbot also contains the control system and a variety of sensors to enable it to perform its primary mission, which is as "an exploratory tool in environments that are too hazardous for humans." Like, I dunno, environments that are full of giant spiders?
This is not the first sticky-treaded robotank, but as far as I know, it's the first one that can manage to go around corners and make that tricky transition from horizontal to vertical. The somewhat unfortunately named "Tailless Timing Belt Climbing Platform" (or TBCP-11) comes from Simon Fraser University, way up there in Canada. It weighs 240 grams and has no problem climbing up whiteboards, glass, and other slick surfaces.
The sticking power of those treads comes from the same handy little van der Waals forces that geckos use to effortlessly stick to, well, everything. Instead of tiny hairs, though, TBCP-11 uses tiny mushrooms, which provide a substantial amount of conformable surface area for adhering to walls.
Maximizing compliant surface area has been an issue for gecko-type (aka dry-adhesion) climbing robots for a long time; the material itself is spectacular, but the tough part is getting enough of the material to make contact with your climbing surface. For example, check out the picture of Stickybot III's toes in this article, and notice how little of the adhesive the robot is relying on to stick. This is one of the advantages of the TBCP-11: the continuous loops of adhesive material provide a lot of adhesion power.
While this robot does have some autonomous capability, it's still tethered for power, since batteries are heavy. It's going to take a little extra work to increase the strength of the adhesive so that the TBCP-11 can bring its power source onboard, and the SFU researchers are also trying to figure out how to get the thing to turn without the treads coming loose and causing the TBCP-11 to plummet to its doom.
But here's the problem: That's just part of the story. In fact, a small part. If we want to achieve a true robotics revolution, the reality is that the robots I mention above and others that the press likes to cover are not going to be enough. We need robots that can do everyday jobs, performing basic tasks over and over, safely and reliably. In other words, we need robots that will become so enmeshed in our lives that people stop paying attention to them: They will be ... boring.
I find that the tech press -- and people in general -- are not so inclined to get "excited" about boring robots. They should be.
To make an analogy, consider the air travel industry. Today, the press no longer celebrates successful round trips or touts the "miracle" of flying; instead, flying is an everyday routine that helps millions of ordinary people build business relationships, visit family and friends, and journey around the world. What was once considered the epitome of human dreams and desires is now a commoditized, uninteresting service that people take for granted, with the only things we care about being paying less and avoiding hassle.
I won't say I don't wish flying were a better experience. But the great thing about what happened to air travel is that it became accessible to millions of people. And that's the true commercial flight revolution. What about a true commercial robotics revolution? To get there, what we need is for robots to become as routine and uninteresting as passenger flight has become in the past century.
Robots today are like the first airplanes. They are as remarkable a technical endeavor as flight once was, and current demonstrations are as entertaining (and unproductive) as the first airplane stunts once were: They're great to watch, but true global change lies in the hands of real products that are safe, affordable, and -- that's right -- boring.
Let me shamelessly plug my own employer here. I work for Adept Technology, based in Pleasanton, Calif., the biggest U.S. industrial robotics company. Some of our robots have been featured in the tech press, because, yes, they make for cool videos. But let me introduce you to a "very boring" autonomous mobile robot platform called the Adept MT400 [photo above].
This is a small mobile vehicle designed for human environments. Basically it just roams around, commanded through push buttons, sensors, tablets, or smartphones, while avoiding bumping into people and objects. We're offering this little guy to third-party developers -- including end users, integrators, researchers, and entrepreneurs -- who are interested in developing applications for it.
Back to the air travel analogy, we're like an aircraft manufacturer looking for an airline -- a partner to build a service that can attract customers and become a promising business.
The MT400 is just one example, of course. What we need from robotics companies and roboticists everywhere are more boring robots: robots that are most appreciated when they complete a task smoothly and economically; robots that investors and companies can trust enough to build business models around.
What will the future look like when, similar to flying, robots become boring enough for companies to create services that help millions and millions of people? I don't know for sure, but I do know it won't be boring.
Agree? Disagree? Let me know.
Erin Rapacki is a product marketing manager at Adept Technology. She lives in the San Francisco Bay Area.
Photo illustration: street crossing photo: neovain via Flickr; MT400 robot photos: Adept Technology
Man, just when we were getting close to making actual robot insects, some thoughtless researchers had to go and invent a robotic insect-eating plant. Sigh. The artificial Venus flytrap in the pic (which can apparently be abbreviated "VFT") is a creation of Mohsen Shahinpoor from the University of Maine.
Like a real VFT, this artificial plant has an intelligence of sorts, in the form of ionic polymeric metal composite trigger hairs on the inside of its polymer leaves. When something (like a tasty insect) touches one of the hairs, a copper electrode triggers the leaves to snap shut in 0.3 second, and a series of teeth interlock to keep whatever the robot has caught from escaping.
For now, this robot flytrap only snacks on the old-fashioned biological sorts of flies. It also doesn't currently have the infrastructure required to turn said flies into robot food, but that's just a matter of hooking up a microbial digester like this one to turn bugs into robot fuel.
If you've ever gotten stuck in a t-shirt, this robot is for you.
For most people, putting on a t-shirt isn't a chore, but researchers at the Nara Institute of Science and Technology in Japan have identified this task as particularly difficult for elderly or disabled people with limited arm movement.
A cross-laboratory team led by Tomohiro Shibata and Takamitsu Matsubara developed a two-arm robot to slide a shirt over and onto a person's head and torso. Since a person's neck or arms may not be in the exact same position each time, a scripted movement could potentially cause distress.
Enter the team's reinforcement learning approach. Just like a child learning through experience, the robot is taught once how to clothe a human user, and then is given several attempts to put the shirt on by itself. Success is measured with a motion-capture system at the end of each trial, which lasts about 10 seconds.
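The learn-from-trials idea can be sketched as a tiny hill climber: start from a single demonstrated trajectory, perturb it over a few trials, and keep whatever scores best. This is a stand-in for the NAIST team's actual controller -- the waypoints, the "target" trajectory, and the reward function here are all invented for illustration, with the reward playing the role of the motion-capture score.

```python
import random

random.seed(1)  # deterministic for this demo

demo = [0.0, 0.2, 0.5, 0.9, 1.0]    # demonstrated waypoints (arbitrary units)
target = [0.0, 0.3, 0.6, 0.8, 1.0]  # trajectory that actually fits this user

def reward(traj):
    """Higher is better: negative squared error versus the ideal fit."""
    return -sum((a - b) ** 2 for a, b in zip(traj, target))

best, best_r = demo, reward(demo)
for trial in range(10):                 # each trial is ~10 s on the real robot
    candidate = [w + random.gauss(0, 0.05) for w in best]
    r = reward(candidate)
    if r > best_r:                      # keep improvements only
        best, best_r = candidate, r

print(best_r >= reward(demo))  # True: a trial is kept only if it scores better
```

The key property mirrored here is that the robot never needs a second demonstration: a scalar score per trial is enough to refine the motion for a particular person's posture.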
In the video, we see that after three learning trials, the robot has learned the trajectory and places the shirt on without any trouble. According to Shibata, Japanese reporters who tested the system gave it a thumbs up, saying the robot made the shirt easy to put on.
For now, the system has been tested with only a couple of different t-shirts and with several subjects including a few patients. The next step, says Shibata, is to try out the system with more subjects and patients, and with different t-shirts. "Our approach could be applied to other types of important clothing tasks such as pulling up/down pants."
PETMAN is an adult-sized humanoid robot developed by Boston Dynamics, the robotics firm best known for the BigDog quadruped.
Today, the company is unveiling footage of the robot's latest capabilities. It's stunning.
The humanoid, which will certainly be compared to the Terminator Series 800 model, can perform various movements and maintain its balance much like a real person.
Boston Dynamics is building PETMAN, short for Protection Ensemble Test Mannequin, for the U.S. Army, which plans to use the robot to test chemical suits and other protective gear used by troops. It has to be capable of moving just like a soldier -- walking, running, bending, reaching, army crawling -- to test the suit's durability in a full range of motion.
Marc Raibert, the founder and president of Boston Dynamics, tells me that the biggest challenge was to engineer the robot, which uses a hydraulic actuation system, to have the approximate size of a person. "There was a great deal of mechanical design we had to do to get everything to fit," he says.
As I said before, this is the first time I've seen a machine performing movements like that -- remarkably human, yet uncanny-valley-esque at the same time.
Development of PETMAN, led by Dr. Robert Playter, Boston Dynamics' VP of engineering, got its start with a $26.3 million Army program. Two years ago, the company, based in Waltham, Mass., first demonstrated PETMAN's legs by having them walk on a treadmill. This year, the company showed that the robot legs can run at up to 7 kilometers per hour (about 4.4 miles per hour) and announced it had completed a prototype of the body.
But until now, the extent of PETMAN's full capabilities was a mystery.
Raibert says the humanoid and its behavior are still under development. "We plan to deliver the robot to the Army next year."
According to the Army requirements, the robot has to have about the same weight and dimensions as a 50th percentile male (the size of a standard crash-test dummy): a mass of 80 kilograms (about 180 pounds) and a height of about 1.75 meters (nearly 6 feet). PETMAN also has to simulate respiration, sweating, and changes in skin temperature based on the amount of physical exertion. Boston Dynamics used motion-capture systems to study the movements of humans as they performed a variety of exercises.
The robot relies on a tether that provides hydraulic power, but its body had to share space with many sensors and other components, and cramming everything together became a big engineering puzzle. And not only did the legs have to be strong, Raibert explains, but the upper body too, so the robot could crawl and stand up.
And I know some of you are wondering: Will it have a head? "We were a bit late getting the articulated neck mechanism working," he says, "but it is coming along, and a head along with it."
I also asked Raibert if they could eventually use PETMAN or PETMAN-related technologies in other projects. In other words, are we going to see PETMAN used in applications other than the chemical suit tests?
"You bet," he says. "There are all sorts of things robots like PETMAN could be used for. Any place that has been designed for human access, mobility, or manipulation skills. Places like the Fukushima reactors could be accessed by PETMAN-like robots (or AlphaDogs), without requiring any human exposure to hazardous materials. Perhaps firefighting inside of buildings or facilities designed for human access, like on board ships designed for human crews."
This, of course, will mean another big challenge for his team: transforming the humanoid from a tethered system into a free-standing, self-contained robot. Boston Dynamics, however, has already demonstrated its ability to build tetherless machines with its BigDog project.
One question remains unanswered, though: Will BigDog become PETMAN's best friend?
Late last month, Northrop Grumman's ultra-futuristic X-47B unmanned combat air system (UCAS) performed its first test flight in "cruise mode"; that is, with its landing gear up, in its typical flight configuration:
While a robot wingman does sound cool, there are probably two things wrong with the term "wingman." One is the "man" part: there's no flesh, blood, or other specific piece of humanity and/or masculinity inside. The other is that the X-47B is nobody's wingbot. It's entirely capable of running missions on its own, either controlled remotely by a human or completely autonomously.
These missions will eventually include aircraft carrier take-offs and landings, refueling, reconnaissance, and attack missions, which will look uncannily like this:
It's pretty wild how the CG footage looks nearly identical to the real thing: we're totally living in the future right now.
It's not quite an invasion, but in recent years we've seen a small parade of quadruped robots strutting out of labs around the world. In the United States, Boston Dynamics has introduced its now-famous BigDog and, more recently, a bigger bot named AlphaDog. Early this year, we wrote about FROG, a prototype built in China, and just a few weeks ago we described the SQ1 robot, a South Korean project.
Now it's time to unveil the latest addition to this pack: HyQ is a robot developed at the Istituto Italiano di Tecnologia (IIT), in Genoa. The machine, built by a team led by Professor Darwin Caldwell, is a hydraulic quadruped (hy-q, get it?) designed to perform highly dynamic tasks such as running and jumping.
Legged locomotion remains one of the biggest challenges in robotics, and the Italian team hopes that their robot can become a platform for research and collaboration among different groups -- a kind of open source BigDog.
One of HyQ's key design features, says IIT researcher Claudio Semini, is that its legs are actively compliant: the robot can change the stiffness of each limb by rapidly adjusting the hydraulic flow to its leg actuators. This capability allows the robot to run and jump, as well as negotiate rough terrain, its actuators absorbing shocks and vibrations without damage to the body.
As you can see in the video below, HyQ, which weighs in at 70 kilograms, can walk and trot at speeds up to 6 kilometers per hour. Starting at 0:53, you can see the robot moving its legs very slowly, almost like a giant crawling insect. The robot can also rear like a horse, and even squat jump, getting all four feet off the ground. At the very end of the clip, you can see another trick: The robot can kick!
To achieve the required high joint speeds and torques, HyQ uses two hydraulic cylinders and one electric motor on each leg, built with aerospace-grade aluminum alloy and stainless steel. Hydraulics gives the legs speed and robustness; the DC motor lets the robot swing its hip joints through 120 degrees. There are position and force sensors on each joint, and an inertial measurement unit on the body.
HyQ uses a torque-control approach to move its legs. Its control system, which has a detailed model of the robot's body, uses inverse dynamics to calculate what torque it needs to apply to each joint.
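To make the inverse-dynamics idea concrete, here is a toy computed-torque example for a single 1-DOF pendulum joint rather than HyQ's full-body model: the torque needed to realize a desired acceleration is the inertia term plus the gravity term. The link parameters are made up for illustration.

```python
import math

m, l, g = 2.0, 0.4, 9.81   # link mass (kg), length (m), gravity (m/s^2)
I = m * l * l              # point-mass inertia about the joint (kg*m^2)

def inverse_dynamics(q, qdd_des):
    """Torque producing qdd_des at joint angle q: tau = I*qdd + m*g*l*sin(q)."""
    return I * qdd_des + m * g * l * math.sin(q)

# Holding the link horizontal (q = 90 degrees) with zero acceleration
# requires exactly the gravity-compensation torque m*g*l:
tau_hold = inverse_dynamics(math.pi / 2, 0.0)
print(tau_hold)  # 2.0 * 9.81 * 0.4 = 7.848 N*m
```

HyQ's controller does the same computation with a far richer model (all legs, trunk, and contact forces), but the principle is identical: given the model, solve for the torque that yields the desired motion, then command it at each joint.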
And if you're wondering where the robot's head is, Semini said that's one of the next steps. They plan to build a head equipped with a stereo camera and laser range finder for navigation and mapping. Eventually, they also want to add a manipulator arm to the body, so the robot can grasp objects or push away obstacles.
After that, another big challenge is getting rid of the tether, which will mean designing an on-board hydraulic system to power the actuators. (If you want more technical details, visit the project's page and check out the group's publications.)
As for applications, unlike BigDog and AlphaDog, the Italian quadruped wasn't designed for carrying heavy payloads. HyQ is a smaller system, which the IIT researchers say could be used for search and rescue missions in dangerous environments. You could send the robot to navigate autonomously looking for victims, for example, or teleoperate it to investigate a disaster-stricken zone.
The researchers are currently testing HyQ on a special treadmill at their lab, making the robot walk, trot, and jump while attached to the hydraulic tether and a safety harness. Soon, however, they want to let their beast run free. "We need a parking lot to walk this dog," Semini says.
Microsoft has followed up their recent release of Robotics Developer Studio 4 and the Parallax Eddie platform with this demo showing how Eddie can be programmed to be an autonomous party photographer, aka "Roborazzi." I'd tell you all about it myself, but wouldn't you rather hear it from the dude who actually put the project together? Sure you would!
Not a bad demo, really. It's a nice way to leverage what Kinect is best at (people tracking), and the camera integration makes it handy to have around for all of those wild parties that you've never invited me to. Oh well, I guess now I can at least watch it all on Flickr. :(