Several robotics companies have been working on telepresence robots lately. I say: boring.
Is that what we want robots for? A robo-body that we can inhabit to roam around the office when we're out? A toy robot to check that burglars didn't break into the house? A remote-controlled Roomba to check on our family and pets when we're traveling? (Scratch that last one: I forgot iRobot killed its ConnectR project.)
Consider Anybots, a Silicon Valley startup led by Trevor Blackwell. Last time we talked to him, he had a bold vision of using humanoids as personal servants that could clean up the table after dinner, take the garbage out, and load our dirty socks into the washing machine. The robots would be remote controlled by human operators, which in a sense makes them telepresence robots. The difference is that these telepresence robots weren't designed to just help people communicate -- they were designed to take care of household chores. Now that's an application!
Recently, though, Anybots has focused on telepresence robots to help office workers interact. Sure, Anybots is a business; they need to come up with things to sell and create revenue to invest in newer, better robots. Maybe their current office robot is a first step along the road to one day populating our homes with humanoid servants. Hey, Trevor, is that the plan?
Still, watching the video from the earlier post and then watching the video below got me a bit depressed. Which do you think represents a more exciting vision for the future of robotics?
Raffaello D'Andrea and his group at ETH Zurich are building a bunch of these amazing flying machines, which they plan to transform into an autonomous stunt flying squad. So far the vehicles can fly in circles and perform audacious flips, but the researchers want more for their repertoire: they're designing control algorithms to make a dozen or more quadrocopters guide themselves into complex, acrobatic flight formations.
The trick involves more than just the flying robots. The machines are designed to fly within a special sensor-equipped environment where they'll "teach themselves -- and each other -- how to fly." The researchers call their airspace the Flying Machine Arena. The video above shows how users will be able to control the vehicles by moving a "magic wand" -- the controller has markers and the arena's sensor system captures the gestures and sends control signals to the vehicles. From their site:
Human beings learn from experience: when we try something and fail, we try doing it a different way the next time around. And we are incredibly efficient at this process.
We are so adept, in fact, that when it comes to learning complex activities such as racing a car or playing a violin, we can easily outperform automated systems. This is why we use autopilot programs for the routine aspects of flying a plane (such as cruising, take-off and landing), but why we still need human pilots to handle unexpected events and emergencies.
We are currently developing algorithms that will narrow the learning gap between humans and machines, and enable flight systems to ‘learn’ the way humans do: through practice.
Rather than being programmed with detailed instructions, these flight systems will learn from experience. Like baby birds leaving the nest, they will be clumsy at first. Over time, however, they will become capable of sophisticated, coordinated maneuvers.
Unlike humans, these systems won’t make the same mistake twice. And, when networked, they have the added advantage of being able to learn from each other’s successes and failures. The result is an impressively steep learning curve!
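The "learn through practice" idea the ETH researchers describe is often formalized as iterative learning control: after each trial, the system uses the error it observed to adjust the input it will apply on the next attempt. The sketch below is a deliberately tiny toy version of that idea (the plant model, gains, and numbers are my own assumptions, not the Flying Machine Arena's actual algorithm):

```python
# Toy plant: output is the input scaled by an unknown gain plus a
# repeatable disturbance. The "robot" never sees these values directly;
# it only observes the result of each practice trial.
TRUE_GAIN = 0.8
DISTURBANCE = 0.5

def fly_trial(u):
    """One practice flight: apply input u, observe the outcome."""
    return TRUE_GAIN * u + DISTURBANCE

target = 2.0           # desired output (stand-in for a reference trajectory)
u = 0.0                # feedforward input, refined between trials
learning_rate = 0.5    # how aggressively to correct between trials

errors = []
for trial in range(20):
    y = fly_trial(u)
    error = target - y
    errors.append(abs(error))
    # ILC-style update: next trial's input is corrected by this trial's error
    u += learning_rate * error

print(f"first-trial error: {errors[0]:.3f}, last-trial error: {errors[-1]:.5f}")
```

Because the disturbance repeats every trial, the error shrinks geometrically between attempts -- clumsy at first, then quickly precise, which is the steep learning curve the researchers describe. Networking would amount to sharing the learned input `u` between vehicles instead of each one starting from zero.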
Last week at the Consumer Electronics Show in Las Vegas, iRobot announced a new version of the Looj, their gutter-cleaning robot. While it wasn't the totally new product announcement many people (including yours truly) were expecting and hoping for, it does reflect iRobot's commitment to improving products based on user feedback. The new Looj has an internal antenna, a better auger that keeps it from flipping over, and a battery door that no longer requires a screwdriver to remove. The new Looj won't be available for a couple of months, so the old ones are still in the iRobot store and on sale.
Perhaps more interesting was the second, quieter announcement via email: iRobot is shelving the ConnectR project, the Roomba-shaped robot that was iRobot's entry into the telepresence ring at last year's CES. iRobot has been running a pilot program to determine the real market needs for such a robot, and it seems those needs are too far from what the ConnectR currently offers. They made sure to say that they're not necessarily out of telepresence entirely -- they maintain it's a space they need to be in -- but that whatever product they create, it won't look like the ConnectR. I think this is probably a good move; while I remain skeptical of current telepresence robots anyway, I think a small, low-profile package with lots of buttons is not what we'll ultimately be looking for.
Photo: The Biorobotics Lab at Case Western Reserve University
Not what you think.
This Spectrum slideshow is about how "creatures from across the animal kingdom offer design principles to make robots more useful, engaging, and lifelike."
It includes German mechatronic jellyfish, Stanford's gecko-inspired StickyBot, the EPFL robotic salamander, Northwestern University's RoboLobster, the poop-free robotic chicks by Sega Toys, and Puppy, the 12-DOF pneumatically-actuated beast you see above, a mechanical greyhound developed at Case Western Reserve University.
PS: Read also Spectrum's April cover story, "March of the SandBots," by Daniel Goldman, of Georgia Tech, and Haldun Komsuoglu and Daniel Koditschek, of the University of Pennsylvania.
There are certain topics that inspire rashes of articles about the inevitable robot uprising. Typically they start with some mild panic -- perhaps due to a recently released movie, or a newly announced military robot -- then they interview an expert who describes how unrealistic their concerns are, then they conclude that there's nothing to worry about... yet. The opening of the new Terminator movie last week is no exception. Here's a sampling of some articles I've already run into:
As a robotics engineer, this drives me nuts. Science fiction is a real double-edged sword for the robotics industry: on one hand, it has encouraged many people to go into the field and has inspired ideas of what robots should look like or how they should act. On the other hand, a large number of the most popular sci-fi stories and movies describe the uprising of robots against their creators, and this has become a common perception of the future of robotics. Whether they're Cylons, Terminators, or NS-5s, the moral of the story is always clear: beware your creations.
It is a little-known fact that giving your robot red LED eyes embeds in it the potential for evil. Red/green LEDs will just make it morally confused. Image from kotaku.com
Not that this fear is all bad; it does force us to think hard about the implications of what we create, and several organizations are working on ethics systems that can be programmed into our machines and AIs. But realistically, despite its popular science appeal, we're so far from the kind of technology that could threaten us the way it does in movies that it's a counterproductive argument to be having -- a distraction from the amazing advances we have made and the beneficial applications we have right now.
So do robotics a favor, and if you ever interview a roboticist or write about the subject, don't bring up Robot Armageddon; focus on the incredible ways the technology can help us here and now. Go see the Terminator movie, and enjoy it, but rest easy -- my Roomba can't find its way out from under a chair, so I'm not too concerned about it searching for Sarah Connor any time soon.
Gliders use low-power variable buoyancy systems to glide up and down through the water column for months at a time and carry payloads of different kinds of sensors to collect oceanographic data. These are primarily water quality sensors -- salinity, temperature, optical quality, and so on. During the mission, the glider will occasionally surface to send and receive information via the Iridium satellite network, allowing its "drivers" to redirect it if necessary. While it is incredibly energy efficient, it's also slow and likely to drift, making it hard to hit particular distance targets during a mission. In the podcast, Shapiro mentions the importance of keeping the glider navigating through the eastbound Gulf Stream current to provide it with a little extra "oomph" and allow it to travel a greater distance over its lifetime.
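To see why riding the Gulf Stream matters so much, a little back-of-the-envelope arithmetic helps. The numbers below are my own rough assumptions for illustration (a few tenths of a meter per second through the water is typical of buoyancy gliders; the Gulf Stream core can run a meter per second or more), not figures from the Rutgers team:

```python
# Rough, assumed numbers -- not the Rutgers team's actual figures.
glider_speed = 0.35            # m/s through the water
current_speed = 1.0            # m/s, an eastbound Gulf Stream assist
seconds_per_day = 86_400

km_per_day_still = glider_speed * seconds_per_day / 1000
km_per_day_assisted = (glider_speed + current_speed) * seconds_per_day / 1000

print(f"still water:  {km_per_day_still:.0f} km/day")
print(f"with current: {km_per_day_assisted:.0f} km/day")
```

Under these assumptions the current roughly quadruples the daily ground distance -- which is why careful routing through favorable currents, rather than raw propulsion, is the key to a trans-Atlantic crossing.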
The Rutgers lab's first attempt, glider RU-17, failed off the coast of the Azores. While disappointing, it was a valuable way for them to learn about the difficult task of controlling a glider over such a great distance. Their second glider, named RU-27 (nicknamed Scarlet Knight), will take advantage of these lessons learned and hopefully succeed. Since its launch on 29 April it has already covered more than two thousand kilometers.
I am a confessed Roomba evangelist. One of the most frequent questions I get is how Roomba knows where to go -- does it build a map? Does it do rows of your carpet? Why is it spinning in circles?
Since I'm friends with a few past and current Roomba engineers, I've always known that Roomba is mostly random. It does have a few behaviors -- for example, wall-following, or the spiral pattern it uses when it starts up or when it's in "dirt detect" mode -- but mostly, it wanders, as anyone who's spent time watching their Roomba with rapt attention may have noticed.
The next question, of course, is how effective this random pattern is. We humans are very methodical about our vacuuming to make sure it covers the whole floor; how can we be sure that our wandering robots are getting the same coverage?
Well, wonder no more. An enterprising Roomba user took a long-exposure shot of his Roomba in a dark room as it did its cleaning to see exactly where it went over the full cleaning cycle.
You can see where Roomba starts -- the spiral pattern in the middle of the room -- and how it covered the remainder in a criss-cross pattern. Keep in mind that the light this photograph shows is a point on the Roomba itself; on either side of the line is about six inches of actual vacuum coverage. As you can see, the Roomba gets darn near everywhere in that room, and arguably covers the space more often than most of us would with our normal upright vacuums (if, indeed, we vacuum at all. Cough cough).
It wouldn't be impossible for Roomba to build a map or to follow a traditional "lawn mower" pattern across your floor; the technology exists, and there have been plenty of Roomba clones that do that, actually. iRobot seems to subscribe to the KISS principle of design, making the Roomba more cost-effective compared to its competitors -- but at the end of the day, still keeping my floor squeaky clean.
A robot penguin and its 3D Fin Ray® structure. Source: Festo
Following up on a previous post, Festo's latest creation deserves a closer look. To start with the obvious: Why robot penguins?
Penguins are amazingly efficient swimmers: According to tests by Festo's engineers, their body shape shows 20 to 30 percent lower flow resistance than the most hydrodynamically favorable technical bodies known. If penguins were to run on gas, their energy efficiency would allow them to swim 1,500 kilometers through icy Antarctic waters -- on just one liter (0.26 US gallons) of fuel!
Festo's AquaPenguin (video) and AirPenguin try to replicate some of this success and serve as both a model and a testbed for new, bio-inspired technologies. The bionic Fin Ray® structure, derived from the remarkable functional properties observed in the tail fins of fish, is one such example.
Festo is a firm believer in learning from nature. For example, their BionicTripod implements a 3D version of the Fin Ray® structure in a gripper to achieve an operating range that by far transcends that of the conventional tripod configuration and allows pick-and-place applications with an offset angle of up to 90 degrees (see video).
Happy Friday, everyone! There's really nothing insightful or informative about this post, other than some robot video amusement for your afternoon.
Our first video is from the group Flight of the Conchords, who have created a song that describes life after the coming robot Armageddon. Personally I'm prepared to welcome my new robot overlords, thanks to my Old Glory Robot Insurance, but it's interesting to see the situation from the robots' side. Be sure to watch through the end for the binary solo.
Our next video is the latest from our friends at Boston Dynamics. After a miserable winter, spring has finally taken root here in Boston, even giving us a couple of 90-degree days in the last week. We all have our ways to beat the heat -- and it seems BigDog prefers the beach.
And finally, Festo shows us some neat bio-inspired designs for robots and automation. The two coolest ones are penguin-related. First up are swimming robotic penguins, which are really amazing and definitely appear to move like real penguins. Then they show *flying* penguin robots, which, while impressive, are pretty weird (and I question the need for them to be "flapping" as they do). Keep watching after that for some really neat applications of the Fin Ray effect as well -- I like the giant lobby fins.
Among the attractions at RoboBusiness last week was Keepon, a little yellow puffball robot. I had the opportunity to speak with Marek Michalowski, one of Keepon's developers, about the robot's place in autism therapy and its totally sweet dance moves.
Keepon, originally developed by Hideki Kozima, is a "BeatBot." The little robot was made famous by a Wired music video showing it dancing -- and indeed, a lot of Michalowski's work has involved teaching Keepon to respond to musical beats. But believe it or not, Keepon was originally developed for autism therapy.
It's a relatively simple robot: a few motors controlling its degrees of freedom, a microphone, and two cameras in the eyes. When used for therapy, Keepon is remotely controlled by a therapist in another room using a standard laptop keyboard while the camera feed plays on the computer screen. It doesn't speak, can't manipulate objects, and never even changes facial expression, and yet it's shown a lot of promise in helping autistic children develop emotional responses.
A demonstration of Keepon for the History Channel, showing off Keepon's guts and how it can interact with kids
Michalowski said that in a typical autism therapy session, therapists are trying to teach children to respond to them emotionally. The therapist often has a lot of difficulty getting the kids to establish and maintain eye contact, establish any physical contact, or express any emotional identification. With Keepon, the therapist stays outside the room, and Keepon is their only representative. The robot is placed in the room with the kids while the therapist remotely controls it; they can have Keepon look around at different kids and control its motion in a way that suggests a physical response (for example, when I poked Keepon's side, Michalowski managed to make it look like Keepon was responding to something ticklish).
What they've found with Keepon is that the kids do actually start responding to it. They'll maintain eye contact with it, which the therapist can observe through the video feed to the control computer. Some start petting it. Some autistic kids have a tendency to repeat certain motions or actions over and over; if the therapist starts matching the beat of that motion with Keepon bouncing or dancing, these kids often notice, and start changing up their frequency to make it into a game, watching Keepon keep up with them and breaking into smiles and laughter as they watch the little robot.
The irony in all this, says Michalowski, is that people at times describe severely autistic people as "robotic" -- not expressive, not emotionally empathetic, and sometimes painfully literal -- yet it takes a robot to bring out the expressiveness and emotion in these children.
Michalowski and Kozima have started a company called BeatBots LLC to commercialize Keepon and its eventual brethren. While the primary application is of course autism and other behavioral therapies, they're not unaware that many people would love a cute little dancing robot of their own. Right now they're trying to develop a low-cost version -- the research platform currently uses precision motors and expensive high-def cameras -- that they hope will gain popularity.