Automaton

Boston Dynamics Demo Shows Robot Jumping Over Fence

Several months ago we mentioned that Boston Dynamics had received a grant to work on a new version of the "Precision Urban Hopper," a small wheeled robot designed to hop over obstacles 40 to 60 times its size. Working with Sandia National Labs, the company has built a demonstration platform for the hopping mechanism, and video of the demo has been making its way around YouTube. But why is this demonstration important?

A lot of the coverage I've seen calls this a potential "PackBot killer" -- suggesting that it may compete with iRobot's highly successful military platform, or with the similar Talon robot from QinetiQ. The platforms do share a common shape, but I don't think that's what's interesting here. This platform exists specifically to demonstrate the hopping mechanism, and it carries no other payload -- no teleoperated arm to disarm IEDs, no weapons, none of the sensor packages found on the iRobot and QinetiQ machines. What I think we'll see instead is this mechanism developed for installation on platforms like PackBot -- or, more likely, SUGV -- and on a similar version of the Talon. iRobot has long shown videos of PackBots being thrown through a window and immediately rolling around inside a building. A natural extension is a SUGV that can hop up through a second-story window, right itself, and carry out its mission.

Previously:

Boston Dynamics to Develop Two-legged Humanoid (And a New Hopping Robot in Their Spare Time)

Project Romeo: A new humanoid personal assistant robot

Robotics in Europe is historically linked to automation, which is why comparatively few projects focus on autonomous robots. Even fewer are working on humanoids. The three notable exceptions I'm familiar with are Pal Technology's REEM-B (Spain/United Arab Emirates), Justin (Germany/Italy) and Aldebaran's Nao (France).

Now a new European (or rather French) project led by Cap Digital (a coalition of companies, labs, and institutions in Paris) aims to build a new humanoid robot. Project Romeo unites more than a dozen partners, including 5 companies, 7 national laboratories, and the Foundation Voir et Entendre (Foundation for Vision and Hearing), with the goal of producing a first robot prototype by 2010 and a second, fully functional robot by 2011. The project's goals are ambitious: the robot should be able to assist elderly and visually impaired people at home using gesture and voice recognition. It should be able to manipulate everyday objects, including doors, dishes, and keys, and it should be able to help a person get up after a fall.

For now, few technical details have been released. Aldebaran, which has taken the technical lead on the project, has revealed that the robot will be a bipedal humanoid standing 1.2 to 1.5 meters (47 to 59 inches) tall. In a recent interview with GetRobo, Rodolphe Gelin, the head of cooperative projects at Aldebaran, also mentioned that the new robot will not use the Zero Moment Point (ZMP) algorithm that keeps most current humanoids stable. Instead, the project will develop a new algorithm that allows for a faster, more dynamic walk. Finally, in addition to gesture and voice recognition, the robot will also allow for musical interaction: "We think this is a new and good way for people to interact with a robot because it is still very difficult to communicate with a robot with speech," said Gelin. "Of course, we will work on the dialogue but if you rely solely on dialogue you will always be disappointed." I'm not sure I'm ready to sing a song to a robot, but I'll admit it's an interesting idea to explore.
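
For context, the ZMP is the point on the ground where the net moment of the inertial and gravity forces has no horizontal component; as long as it stays inside the feet's support polygon, the robot won't tip over. Here's a minimal sketch of the textbook cart-table computation that most ZMP walkers build on (variable names are my own, not Aldebaran's code):

```python
G = 9.81  # gravitational acceleration, m/s^2

def zmp_x(x_com, z_com, a_com):
    """Horizontal ZMP coordinate under the cart-table model:
    x_zmp = x_com - (z_com / g) * x''_com.
    The robot stays balanced while this point remains inside
    the support polygon of the feet."""
    return x_com - (z_com / G) * a_com

# Example: center of mass 0.10 m ahead of the ankle, 0.80 m high,
# decelerating at 1 m/s^2 -> the ZMP shifts forward to ~0.18 m.
print(zmp_x(0.10, 0.80, -1.0))
```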

For now, nothing is known about whether there are plans for a Project Juliette...

Robonica Offers A New Spin on Robotic Toys

Robotic toys have yet to really catch on; the excitement of an iPod-playing robot, a surveillance robot, or a dancing robot tends to wear off pretty quickly. A new company called Robonica thinks there's a better way: introducing robotic toys as the means of playing a structured game.

Long the purview of companies like WowWee, robotic toys tend to suffer from the "Christmas morning problem": they're exciting for a few minutes, but then the novelty wears off. Even new companies like Bossa Nova Robotics are set up for the same problem with their robots Prime-8 and Penbo (Prime-8 is a gorilla that farts; Penbo is a penguin that pops out a baby robopenguin. Nothing gendered about that AT ALL). While they're very fancy for toys, they aren't good at holding kids' attention.

Which is why I was intrigued when I heard that Robonica, a company based in South Africa with a commercial office in Massachusetts, was taking a different approach: releasing a set of robotic toys designed to allow the owner to compete in a structured game or competition.

Robonica was born out of frustration – frustration with the inability of the current generation of radio-controlled and robotic toys to provide any form of structured and interactive play, and frustration with the increasingly anti-social and intangible realities of video games.

On 28 September they'll be releasing Roboni-i, a two-wheeled programmable robot that can not only wander around autonomously responding to stimuli -- roughly the capability of many WowWee-type toys -- but can also become a playable virtual avatar in a structured MMO (massively multiplayer online) environment. The gaming structure can also be exported to local, in-person competitions and tournaments using the physical robots.
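
For a sense of what "responding to stimuli" means in practice, here's a hypothetical sketch of the kind of reactive sense-act loop such a programmable toy supports; the robot API (read_ir, drive) is invented for illustration and is not Robonica's actual SDK:

```python
import random
import time

class FakeRobot:
    """Stand-in for the real hardware so the sketch runs as-is."""
    def read_ir(self):
        # Pretend left/right proximity readings in [0, 1].
        return random.random(), random.random()
    def drive(self, left, right):
        print(f"wheel speeds: {left:+.1f} {right:+.1f}")

def behavior_loop(robot, steps=20):
    for _ in range(steps):
        left, right = robot.read_ir()
        if max(left, right) > 0.7:        # obstacle close: back off, then spin
            robot.drive(-0.5, -0.5)
            robot.drive(0.4, -0.4)
        else:                             # otherwise wander forward
            robot.drive(0.6, 0.6)
        time.sleep(0.05)                  # ~20 Hz control loop

behavior_loop(FakeRobot())
```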


What's fascinating to me is that Robonica is taking the best business models of video and tabletop games and combining them with robotic toys that have a lot of potential but lack a true niche. Compare this to something like Dungeons and Dragons or Magic: The Gathering, or to video games like World of Warcraft and Halo. The structured social and competitive network encourages gamers to buy the required "hardware" -- whether collectible cards or robots -- and makes it worth owning. And no one can question the success of these models.

WowWee's Robosapien and Bossa Nova's Prime-8 do have "suggested" games and built-in communication with other robots in the line; in fact, Prime-8 is not fundamentally all that different mechanically from Roboni-i. What they lack is an overarching game structure. There's a difference between saying your robot is capable of playing laser tag with other robots and marketing a robot as an avatar in a tightly integrated virtual and real-world infrastructure for playing that laser tag.

Now, Robonica is really a pioneer in this realm, so it's fair to question whether people will actually drop money on these little robots just to play a game made up by the vendor. Then again, in elementary school I was dropping embarrassing amounts of money on small circular pieces of cardboard just to play a loosely structured game, so I can see a programmable robot being a far more attractive investment. Most other robotic toys run between $50 and $100; Robonica hasn't released its pricing yet, but I'll be curious to see where it ends up. Given the programmability and the gaming accessories, I could easily see it falling closer to the $250 of a LEGO Mindstorms kit.

Attack of the Robot Bears

When designing a robot meant to interact with people in difficult environments, making it accessible and friendly to humans is a real challenge. Too humanoid, and you run the risk of entering the Uncanny Valley; too mechanical, and it's not comforting at all. Two different organizations, Vecna Robotics and Japan's RIKEN, have decided that teddy-bear-like robots are the way to go.

The BEAR -- Battlefield Extraction-Assist Robot -- has been around for a few years and has even put in some conference appearances, but I hadn't seen it move before. Vecna -- which has also just won funding for continued research on the robot -- recently released a video showing BEAR testing alongside computer animation of its intended battlefield role. The BEAR is also designed to be capable of lifting humans, an application Vecna says it would like to market to hospitals and nursing homes, where lifting patients is one of the most common and difficult tasks healthcare staff face.


By the way, Vecna has an entire YouTube channel, if you want to keep up with future BEAR videos.

The more recently announced robot is the RIBA -- Robot for Interactive Body Assistance. Developed jointly by a research lab and a company in Japan, the RIBA isn't designed for military applications like the BEAR, but it shares the healthcare application and the teddy bear face. It can lift 61 kg (135 lb), compared to the BEAR's theoretical 500 lb (227 kg).


RIBA, I think, pulls off the "teddy bear" thing a little better than the BEAR does, but I'm not feeling too warm and fuzzy about either of them. Are comforting character faces the way to go? Should we try harder to make humanoid robots friendly-looking? Or is there a better direction to go?

Robot Pillow Makes Me Uncomfortable

I’m all for emotional support robots, especially soft and fluffy ones that look like baby seals. This robot pillow, though, goes a little too far. It’s called Funktionide, and it was designed by Stefan Ulrich after a two-month research project by our favorite German bio-inspired robotics company, Festo.

The “Funktionide” is an autarkic amorph object whose intention is to provide the user with a atmosphere of presence thus counteracting the feeling of loneliness.

Firstly, if this thing crawled into bed with me, it would cure my loneliness, but probably not with a sense of what you’d call relief. Terror, maybe. And secondly, inventing words isn’t fair.

[ Funktionide ] via [ Eggshell Robotics ]

The All-In-One Guitar Playing Robot and Game

With all the excitement surrounding the release of Guitar Hero 5 and The Beatles: Rock Band, I decided now was a good time to post a video I shot last month at National Instruments Week. In the past we've covered the way Guitar Hero could help amputees train brain-computer interfaces and how to turn the controllers into real musical instruments. It's been over a year since we posted about Slashbot, a robot that could play the game.

Today's video features another guitar-playing robot. But this one is different: the musical game (in this case Frets on Fire, an open-source Guitar Hero clone), the vision acquisition system (which reads the notes off the screen), and the robotic control all run on a single processor. Check it out:

This demo was a way for Intel and National Instruments to show off their new virtualization tool, which lets engineers assign a specific task to a particular processor core. I thought it was a rather impressive way to show off the technology, but I'd be curious to hear your thoughts. Is this virtualization capability worthwhile?
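
The Intel/NI tool does its partitioning at the virtualization layer, below the operating system, but the underlying idea can be sketched with standard OS calls: pin each task (game, vision, control) to its own core so they never preempt one another. A minimal, Linux-only sketch; the task bodies are placeholders:

```python
import os
from multiprocessing import Process

def game_loop():
    pass      # placeholder: render the game

def vision_loop():
    pass      # placeholder: grab frames, locate the notes

def control_loop():
    pass      # placeholder: command the finger actuators

def run_pinned(core, task):
    """Run `task` in its own process, pinned to one CPU core."""
    def wrapper():
        os.sched_setaffinity(0, {core})   # 0 = calling process; Linux-only
        task()
    p = Process(target=wrapper)
    p.start()
    return p

if __name__ == "__main__":
    procs = [run_pinned(0, game_loop),
             run_pinned(1, vision_loop),
             run_pinned(2, control_loop)]
    for p in procs:
        p.join()
```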

Boston Startup iWalk Lands Funding for Robotic Prosthetics

It's been a good couple of weeks for Boston-area robotics startups: two young companies have recently closed on significant venture rounds. One of them, iWalk, is commercializing prosthetic limb technology developed at MIT's Media Lab under Dr. Hugh Herr, himself a double amputee.

Herr, an avid rock climber, lost both legs to frostbite after a climbing accident at age 17. He immediately began building his own prosthetic limbs to get back into climbing, and as he started his academic research career, he dedicated his efforts to developing prosthetic limbs using robotics. Among his projects is a powered prosthetic ankle and foot called the PowerFoot One.

[Image: The PowerFoot One]

I had the opportunity to hear Dr. Herr speak at an MIT robotics conference last November. At the time, I knew nothing of his background or research. Halfway into the presentation, as he was describing his research, he rolled up the cuffs of his slacks to reveal metal and electronics. Until that point I'd had no idea he was a double amputee; as he'd walked to the front of the room, his gait -- though not completely normal -- was so smooth that I'd never have guessed he was dealing with anything more than a couple of stiff joints.

Herr's work, including an assistive device for patients with impaired muscle control, has been commercialized by other companies in the past. But iWalk -- which has tested prototypes with veterans and other disabled patients -- appears to be focused entirely on the technology coming out of Herr's research, and the $20 million Series B investment is just what it needs to start serious production.

Check out this short video of a National Geographic feature on Hugh Herr.

Robotic Fish, Coming to a Pool Near You

Nature is full of great examples of highly efficient mechanical systems, and roboticists are keen to capitalize on those designs. When it comes to underwater vehicles, fish are a popular animal to emulate.

MIT's RoboTuna, developed in the early '90s, has spawned (ha!) two spinoff robotic fish projects. One we've talked about before: GhostSwimmer. This project, led by Boston Engineering, has evolved. Previous demonstration videos showed the robot propelled by a tail propeller but steered by the movement of its "fin"; the new version is actually propelled by the tail fin's motion. Check it out here:


Next up is a newer set of fish from MIT. The two types of fish mimic two different swimming mechanics: that of fish like trout versus that of fish like sharks. MIT's news release says "the new robotic fish, each less than a foot long, are powered by a single motor and are made of fewer than 10 individual components."
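
Most robotic fish, RoboTuna included, drive their bodies with a lateral traveling wave whose amplitude grows toward the tail. Here's a minimal sketch of that textbook body-wave equation; the coefficients are illustrative, not the MIT teams' actual parameters:

```python
import math

C1, C2 = 0.05, 0.08    # amplitude envelope coefficients (m), illustrative
K      = 2 * math.pi   # body wave number (rad per body length)
OMEGA  = 2 * math.pi   # tail-beat frequency (rad/s), i.e. 1 Hz

def lateral_offset(x, t):
    """Sideways deflection of the spine at normalized body position x
    (0 = nose, 1 = tail tip) and time t: a head-to-tail traveling wave
    whose amplitude grows toward the tail."""
    return (C1 * x + C2 * x * x) * math.sin(K * x - OMEGA * t)

# Sample the spine shape at t = 0.25 s, from nose to tail.
print([round(lateral_offset(i / 10, 0.25), 3) for i in range(11)])
```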


Soon I'm sure we'll have even more roboticists swimming with the fishes!


Thomas and Janet: first kissing humanoid robots

First, the rehearsal:

Then the kiss:

While at the IEEE-sponsored International Conference on Service and Interactive Robotics (SIRCon) 2009, IEEE Spectrum scored an interview with the developers of Thomas and Janet, theatrical robots that they claim are the first kissing humanoid robots.

The first kiss happened back on 27 December 2008, during a robotic performance of several scenes of Phantom of the Opera at National Taiwan University of Science and Technology (known as Taiwan Tech). Chyi-Yeu Lin, a mechanical engineering professor, directed the performance in front of a packed house of about 400. The overcrowded auditorium burst out in a resounding cheer when Christine (played by Janet) and the Phantom (played by Thomas) kissed.

Lin’s team spent three years developing the autonomous robots’ hand-eye coordination, intrinsic self-balancing mechanisms, and other technologies. He says that most of the movements in a scene are programmed into the robots ahead of time.

However, their startup and synchronization are controlled over a network by a computer that acts as a server for both robots. To make the robots’ smooches and expressions seem realistic, the team adopted several techniques, including manual molding, non-contact 3D face scanning, and 3D face morphing. Each robot’s six expressions come about via servos pulling at several points on the face and mouth.
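
The team hasn’t published how those six expressions map to servo commands, but the usual approach is a pose table of servo targets per expression, interpolated for smooth transitions. A hypothetical sketch (servo names and angles invented for illustration):

```python
# Hypothetical servo pose table: the real robots' servo layout and
# values are not public; names and angles are invented for illustration.
SMILE    = {"mouth_left": 30, "mouth_right": 30, "brow_left": 0,  "brow_right": 0}
SURPRISE = {"mouth_left": 10, "mouth_right": 10, "brow_left": 25, "brow_right": 25}

def blend(pose_a, pose_b, alpha):
    """Linearly interpolate servo angles between two expressions
    (alpha = 0 gives pose_a, alpha = 1 gives pose_b)."""
    return {name: (1 - alpha) * pose_a[name] + alpha * pose_b[name]
            for name in pose_a}

# Halfway between a smile and surprise.
print(blend(SMILE, SURPRISE, 0.5))
```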

Showing the video of the play at SIRCon, Li-Chieh Cheng, a Ph.D. student at Taiwan Tech’s Intelligent Robot Lab, said such performances bridge the distance between advanced robotics technologies and the public.

“Available service robots could be very expensive and are only used at certain places. However, tickets for theater performance are affordable for everyone,” Cheng says.

But last December’s performance wasn’t perfect. “In addition to unexpected malfunction of motors, the network controlling robots were somewhat interfered with by signals from walkie-talkie used by stage staff,” Cheng says.

Taiwan Tech has some grand plans. “We aim to form a group composed of autonomous robots, which are like well-trained versatile performers. They can not only perform different plays, sing songs, or broadcast news, but also interact with real persons appropriately,” Lin told IEEE Spectrum.

Human actors aren’t the only things in the works. “We’re designing life-size robots of panda and other animals with humanities, who can be gently hugged by children without causing danger and interact with them,” Lin says.

Robots evolve to exploit inadvertent cues

Human interaction heavily depends on inadvertent cues: A competitor's sweaty handshake before a negotiation, a girl blushing when introducing herself, or the trace of a smile crossing the face of a poker player all convey important information. Sara Mitri and colleagues at the Laboratory of Intelligent Systems (disclaimer: my former lab) at the EPFL in Switzerland have now shown that it is not just humans who can develop, detect and use inadvertent cues to their advantage (PNAS: "Evolution of Information Suppression in Communicating Robots with Conflicting Interests").

The researchers set up a group of s-bots equipped with omnidirectional cameras and light-emitting rings around their bodies in a bio-inspired foraging task. Like many animals, the robots used visual cues to forage, seeking out two food sources in the arena. Rather than pre-programming the robots' control rules, the researchers used artificial evolution to develop the robots' control systems. As expected, robots capable of efficiently navigating the arena and locating the food sources evolved within a few hundred generations.
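
For readers unfamiliar with artificial evolution, the loop looks roughly like this generic sketch: controllers are vectors of parameters, fitness comes from running each controller in the task, and each generation keeps the best performers and mutates them. (The paper's actual encoding, selection scheme, and fitness function differ in detail, and the toy objective below stands in for real arena trials.)

```python
import random

GENOME_LEN = 60    # e.g. neural-network weights for the controller
POP_SIZE   = 100
N_ELITE    = 20
SIGMA      = 0.1   # mutation strength

def fitness(genome):
    """Toy stand-in for running the controller on an s-bot in the
    arena and scoring its foraging performance."""
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(generations=100):
    pop = [[random.random() for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:N_ELITE]
        # Next generation: mutated copies of the best performers.
        pop = [[g + random.gauss(0, SIGMA) for g in random.choice(elite)]
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)

print(round(fitness(evolve()), 4))
```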

This is when things got interesting: because food was limited, the robots began to compete for resources. They evolved strategies to use the light inadvertently emitted by their peers to rapidly pinpoint food locations, in some cases even physically pushing those peers away to make room for themselves. As evolution progressed, the exploited robots were soon all but extinct, and a new generation ensued that could conceal its presence by emitting confusing patterns of light or by ceasing to emit light altogether.

I think this research highlights an interesting point: robots have applicability far beyond engineering. As a leading evolutionary biologist involved in the study put it: "Robots can be quite useful to get a better understanding of the interaction between organisms." The field is still in its infancy, but watch out for robots boosting research in biology, psychology, and medicine.

Thanks Sara!
