Automaton

Parallel Link Robots: Manipulation Too Fast for the Eye

Last November the International Robot Exhibition (IREx) took place here in Tokyo, with more than 100,000 visitors coming to see the latest robotic creations from universities, research institutes, start-ups, and the large, world-renowned industrial robot makers. The industrial robotics area was as vast as usual, and apart from the choreography of massive assembly and welding robots, I was not expecting to see anything new. I was wrong! It turns out I was quite amazed by the superfast parallel link manipulators presented there.

These manipulators, like the ABB IRB 360 shown in the video below, are able to move so fast and so accurately that it becomes difficult to follow them by eye.

 

The features that impressed me the most were the links made of carbon fiber, to reduce inertia and increase operational speed, and the link mechanisms installed to control the orientation of the end-effector.

In the video above, the ABB IRB 360 works with a high-speed vision system to pick up parts and arrange them by color into predefined patterns, rotating each part so that it ends up properly aligned.
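Conceptually, the control loop behind a demo like this is simple, even if the speed is not. Here's a minimal Python sketch of a vision-guided pick-and-place cycle; the camera and robot interfaces, belt speed, and pattern layout are all hypothetical stand-ins for illustration, not anything from ABB's actual software.

```python
# Hypothetical sketch of a vision-guided pick-and-place cycle for a delta robot.
# The Camera/Robot methods, belt speed, and pattern layout are invented here.
from dataclasses import dataclass

@dataclass
class Part:
    x: float        # conveyor position (m) at detection time
    y: float
    angle: float    # current orientation (degrees)
    color: str

BELT_SPEED = 0.5                                    # m/s, assumed conveyor speed
PATTERN_SLOTS = {"red": (0.10, 0.00), "blue": (0.10, 0.05)}  # target (x, y) per color
TARGET_ANGLE = 0.0                                  # all parts aligned the same way

def pick_and_place_cycle(camera, robot):
    """One cycle: detect parts, chase them down the belt, sort them by color."""
    for part in camera.detect_parts():              # vision returns Part instances
        if part.color not in PATTERN_SLOTS:
            continue                                # ignore unknown parts
        # Compensate for belt motion between detection and pick.
        delay = robot.time_to_reach(part.x, part.y) # seconds (hypothetical call)
        pick_x = part.x + BELT_SPEED * delay
        robot.pick(pick_x, part.y)
        # Rotate the gripper so the part lands aligned with the pattern.
        robot.rotate_tool(TARGET_ANGLE - part.angle)
        robot.place(*PATTERN_SLOTS[part.color])
```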

The task of destroying the pattern and placing the parts randomly on the conveyor belt for the robot to assemble them, however, was still done by a human (not shown in the video). Some things are better left to humans...

UPDATE: The IRB 360 is also a skilled pancake handler!

IREx: Where Are the Humanoid Robots?

Once hailed as the future of robotics, humanoid robots were conspicuously absent from the International Robot Exhibition (IREx) in Tokyo last November. Only one booth still presented them as "the future," but without any practical uses. Apparently, some roboticists regard applications as part of "future work."


Humanoid robots for drilling holes and holding cases of nuts and bolts

Another booth, in the university area, had a humanoid secretary to greet visitors. I was really scared, and only managed to take its picture after assuring myself that it was not a moving corpse, but merely a machine with practically no intelligence. With such a morbid "receptionist," I completely failed to pay attention to the other research on display at that booth!


Not your usual receptionist

One fresh take on humanoid robots was the cardboard robot, which doesn't try to perform any task that is better done by humans. Instead, it is a cleverly built structure of layered cardboard and servo motors that functions as a mobile mannequin. Low cost is definitely one of its main features, and the fact that it is meant only to display clothes in shop windows allows it to be very lightweight.


Robotic mannequin

Finally, the humanoid that left the strongest impression was perhaps one that was supposed to play table tennis. It was modeled after an athletic human body, with well-defined muscles and even sunglasses!


Table tennis humanoid robot: motionless

Alas, in the four days of the exhibition, I only saw the robot move in the videos the company was showing -- the robot swung quite violently to hit the ball with the paddle, which made it oscillate wildly, probably ruining any chance of the onboard cameras' image processing catching the next ball.

The fact that I was exhibiting another robot just across from this booth gave me a vantage point to keep an eye on this bot. But on the stage it remained motionless, indifferent to the efforts of its exhibitors, who tried to get it to work. It was only on the last day that the robot finally moved, although in an unexpected way: it fell on the wooden stage with a loud thud, in front of many visitors, plunging like a big piece of ham.

During those four days, I could think of a few robotic mechanisms that might be able to play table tennis more reliably, without the need to look like a human being. But maybe I'm just being too hard on these human-shaped bots, and someday one of them will prove me wrong ... in a table tennis match. What do you think?

Photos: Paulo Debenest

4 Ways To Climb a Wall, Robot Edition (Video)

 

Spectrum recently profiled Israeli roboticist Amir Shapiro, who develops bio-inspired robots at Ben-Gurion University of the Negev, in Beersheba, Israel. I asked Shapiro if he had any videos of his wall-climbing bots. I'm glad I asked. He's just posted the clip above showing four of his creations:

1. A magnetic robot capable of climbing on metal surfaces;
2. A snail-inspired robot that secretes hot-melt glue to stick to walls;
3. A robot with 3M sticky tape on its wheels that can climb smooth surfaces like a whiteboard or glass;
4. A four-legged wall bot that uses claws made of fishing hooks to climb rough surfaces like a cat or rodent.

For more about Shapiro's work, visit his sites: http://www.bgu.ac.il/~ashapiro and http://bgurobots.pbworks.com/

National Instruments Introduces LabVIEW Package for Robotics Design

National Instruments LabVIEW Robotics starter kit

A long time ago in a galaxy far away, two friends and I -- all EE undergrads -- set out to transform an RC car into a line-following mobile robot. We wanted to control it remotely from a PC, where we could implement PID and various other controllers. It took us long nights in the lab writing C code, soldering digital potentiometers to the remote control, and tweaking the finicky analog video camera. In the end -- after major repairs following an incident in which the robot hit a lab bench and exploded into pieces -- the little bot worked, making three nerds very happy.

As I look back, over 10 years later, it was a lot of work for a simple bot, and I can only imagine what it takes to build much more complex robotic systems. Wouldn't it be nice if there existed a robotics development platform with a graphical interface and support for standard sensors and actuators? You could simply connect a set of blocks representing the robot's parts on a screen, concoct a control strategy, put the hardware together, and click "Run."
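For the curious, the control strategy itself was never the hard part. Here's roughly what our line-following loop boiled down to, sketched in Python rather than our original C, with read_line_offset() and set_steering() as hypothetical stand-ins for the camera and the hacked remote control.

```python
import time

# PID gains -- placeholders; the real values came from trial and error on the robot.
KP, KI, KD = 1.2, 0.05, 0.3
DT = 0.02  # control period in seconds (50 Hz)

def line_follower(read_line_offset, set_steering, run_time=30.0):
    """Steer to keep the line centered; offset is the line's lateral error
    (in meters) from the camera's centerline, positive to the right."""
    integral, prev_error = 0.0, 0.0
    t_end = time.time() + run_time
    while time.time() < t_end:
        error = read_line_offset()               # hypothetical camera measurement
        integral += error * DT
        derivative = (error - prev_error) / DT
        set_steering(KP * error + KI * integral + KD * derivative)
        prev_error = error
        time.sleep(DT)
```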

On Monday, National Instruments announced one such platform. It's called LabVIEW Robotics. In addition to LabVIEW, the popular data-acquisition application, the package includes a bunch of tools specific to robotics. It can import code in various formats (C, C++, MATLAB, VHDL), offers a library of drivers for a wide variety of sensors and actuators, and has modules for deploying designs on real-time and embedded hardware. NI says engineers could use the package to both design and run their robotic systems.

Meghan Kerry, an academic product marketing engineer at NI, tells me that developing a relatively complex robot is becoming just too complicated, involving lots of software development, modeling and simulation, hardware integration, and so forth. "A roboticist needs to be an expert in all of those areas or manage a team of experts," she says. NI, she adds, wants to make design simpler and faster, so "a roboticist doesn’t have to spend time with things like developing drivers and can focus on the robot's algorithms and intelligence."

Now, I haven't tested LabVIEW Robotics, so I don't have a detailed, first-hand review to report. The product is aimed at industry and academic users; a full license costs US $15,000, or you could get a $2,000 starter kit [photo above] that includes evaluation software, RIO hardware, sensors, motors, and other parts. What I find most interesting is that the new package is an open platform, so users can easily share designs and code. If lots of people begin to do that, a huge resource for robotics projects could emerge. Bet you could even find a recipe for a line-following robot that can be built in 15 minutes.

Photo: National Instruments

UPDATE: Here's a video showing some robots NI engineers have been testing:

 

Robot Babies Are Always A Mistake

diego-san baby robot

I know the following about these images: they come from the November 2009 edition of Kokoro News (which is in Japanese). The guy in the picture is Dr. Javier Movellan, and the robot next to him is “Diego-San.” They’re from the Machine Perception Laboratory at UCSD. Since I can’t read Japanese, I don’t know what this robot is for or why on Earth it has a gigantic baby head. I also don’t know why these pictures were included in the article:

diego-san baby robot

Look, we’ve been over this before… You don’t. Make. Robot. Babies. Humans are hardwired to respond in a particular way to other humans in general, but more specifically when it comes to babies, and we can instantly tell when something’s wrong and it’s like a punch to the gut. Like, it’s not just mildly creepy, it’s seriously #@$*%^ up.

diego-san baby robot

I’m quite sure that Diego-San is an incredible robot doing incredible research, and hopefully we’ll get more details on that, but seriously now, whoever put that head on there needs to get out of the lab a little more.

If anyone cares to translate the article and let us all know what’s going on with this thing, there’s a PDF at the read link below.

[ Kokoro News (PDF) ]
[ UCSD Machine Perception Laboratory ]

Neato Robotics Challenges iRobot (Update: CES Video)

Update (Jan 7, 2010): Added hands-on video from CES.

Out of nowhere last week, a very quiet Silicon Valley startup called Neato Robotics announced the official release of the XV-11, a robotic vacuum cleaner. Priced at $399 and officially on sale in February, the XV-11 is clearly positioned to bite off a chunk of the iRobot Roomba's market share.

I'm kind of kicking myself right now for not having paid more attention to these guys before. While researching Willow Garage a year or two ago, I came across Neato because of some overlap in the original technical contributors. However, their website, though it said they were working on home robots, gave no indication that they were on a serious commercial path any time in the near future. Now, with a press release, a fleshed-out leadership team, and a brand-new website, Neato has come out with guns blazing. Color me pleasantly surprised.

Their leadership team includes a CEO who previously led mobile navigation development at Agilent, among other technologists and engineers with backgrounds that do include robotics. Some of the original team I had come across -- though it's unclear how many still remain -- had come out of the Stanford AI lab, which also spawned Willow Garage.

 



The oddly shaped XV-11 supposedly does better with corners than "traditional round robots", by which I wonder if they mean the Roomba

 

The biggest difference between the XV-11 and the Roomba is that the XV-11 actually maps the room it's cleaning using SLAM (simultaneous localization and mapping). The Roomba, which uses something of a random-walk algorithm for coverage, is very different (and this is often the biggest cause of confusion among users who wonder why the Roomba keeps going over one spot but ignores others). To achieve accurate SLAM, the XV-11 has an onboard laser rangefinder to build a map of its surroundings. This, by the way, is a big deal for hobbyists looking for cheap navigation solutions.
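To make the distinction concrete, here's a toy Python sketch of the mapping half of that idea: folding laser rangefinder scans into an occupancy grid, assuming the robot's pose is already known (real SLAM estimates the pose and the map together). It's purely illustrative and has nothing to do with Neato's actual implementation.

```python
import math
import numpy as np

GRID_SIZE = 200                           # 200 x 200 cells
CELL = 0.05                               # 5 cm per cell -> a 10 m x 10 m map
grid = np.zeros((GRID_SIZE, GRID_SIZE))   # occupancy evidence, 0 = unknown

def to_cell(x, y):
    """World coordinates (meters, origin at map center) -> grid indices."""
    return int(x / CELL) + GRID_SIZE // 2, int(y / CELL) + GRID_SIZE // 2

def integrate_scan(pose, ranges, angles, max_range=5.0):
    """Mark the cells hit by each laser beam as occupied. A real SLAM system
    would also estimate `pose` (x, y, heading) from the scans themselves."""
    rx, ry, rtheta = pose
    for r, a in zip(ranges, angles):
        if r >= max_range:
            continue                            # no return: nothing to mark
        hx = rx + r * math.cos(rtheta + a)      # beam endpoint in world frame
        hy = ry + r * math.sin(rtheta + a)
        i, j = to_cell(hx, hy)
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            grid[i, j] += 1.0                   # more evidence of a wall here
```

Once a robot has a map like this, it can plan systematic, row-by-row coverage and know which areas it has already cleaned, which is exactly what a random walk can't do.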

 

How will it end up comparing? That remains to be seen; there are lots of questions about battery life, the quality of the vacuum and sweeping system, and usability that still have to be answered. Hopefully we'll be able to give it a shot when it becomes available early next year. In the meantime, I'm excited to see how the first American-designed competitor to the Roomba pans out.

NRL's UAV Sets 26-Hour Flight Endurance Record

The Naval Research Laboratory of Washington, D.C., reports that its Ion Tiger, a hydrogen-powered fuel cell UAV, has recently completed a 26-hour endurance test flight.

The Ion Tiger, as Automaton has reported in the past, is being used as a technology demonstration platform for their latest PEM fuel cell technology. It weighs approximately 37 pounds and carried a 5-pound payload during the test.

"NRL has now demonstrated that PEM fuel cell technology can meet or surpass the performance of traditional power systems, providing reliable, quiet operation and extremely high efficiency. Next steps will focus on increasing the power of the fuel cell to 1.5 kW, or 2 HP, to enable tactical flights and extending flight times to 3 days while powering tactical payloads."

Follow the link to the NRL press release for more information.

Surgeons and Robots Scrub Up (Audio Slide Show)

At Johns Hopkins University, doctors and engineers collaborate to create the next generation of robots for the operating room.

This slide show is part of our special report "Robots for Real."


Festo's Cyberkite

I've long been a fan of Festo's Bionic Learning Network with their impressive list of projects ranging from flying penguins to bionic grippers. Now the German industrial control and automation company has just released a video of an autonomous kite, intended to showcase Festo's mechatronic actuators and display the Festo logo in the skies. The project was conducted in cooperation with German company aeroix based in Berlin, which already helped Festo develop its insulated hot air balloon (video).

The design of the bionic wing is based on Festo's Stingray project. It combines a large internal volume for an aerostatic lifting gas with a good lift-to-drag ratio and high rigidity, which allows the wing to maintain its position even in the absence of wind.

The entire system is engineered to cope autonomously and intelligently with strong and turbulent wind conditions. For example, the servomotors used to control the kite periodically switch to generator mode to recover energy from the steering motion, and any excess energy from the compliant guy ropes is also redirected to the batteries.
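Festo hasn't published details of that control logic, but the basic decision is easy to sketch. Below is a hypothetical Python fragment of a winch servo switching between motoring and regenerating, based on whether the kite is back-driving the motor; the servo and battery interfaces are invented for illustration and are not Festo's.

```python
# Toy sketch of the drive/regenerate decision for a kite-steering winch servo.
# When the kite pays out line under tension against the commanded direction,
# the motor is back-driven and can act as a generator.

def update_winch(servo, battery, commanded_velocity):
    actual_velocity = servo.measured_velocity()   # line speed, m/s (hypothetical)
    torque = servo.measured_torque()              # N*m, positive = winding in
    if torque * actual_velocity < 0:
        # Mechanical power is flowing from the kite into the motor:
        # switch to generator mode and bank the recovered energy.
        battery.charge(servo.regenerate())
    else:
        servo.drive(commanded_velocity)           # normal motoring mode
```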

For more detailed information on the kite, including another great video and a photo slideshow, have a look at the aeroix website (in German). If you have not already done so, make sure to also check out Festo's robot penguins or listen to a recent Robots interview with Markus Fischer, the head of the Bionic Learning Network (full disclosure: I'm part of the Robots Podcast team).

Cyberhand Controlled via Electrodes Directly Implanted into Arm Nerves

European researchers have successfully implanted tiny electrodes directly into the motor and sensory nerves of an amputee's arm stump, allowing him to move a robotic hand and feel sensations from it. While overall this project looks less advanced than others, such as the Luke Arm developed by Dean Kamen's DEKA, the direct implantation of electrodes seems more straightforward than other current approaches, such as surgically rerouting an amputee's residual arm nerves to the pectoral muscles and then generating control signals via electrodes that detect pectoral muscle contractions. The researchers also hope that this novel method will allow for faster and, ultimately, more complex control and sensing of artificial limbs for partial amputees.

In this first trial, a single amputee chosen from 30 volunteers underwent tests with the implanted electrodes for one month before having them removed -- longer-term implants are still a major challenge. However, according to the researchers, the patient mastered the robotic hand within a few days, and by the end of the trial the hand obeyed the commands it received from the man's brain in 95 percent of cases. The researchers are now working on significantly increasing the amount of time the hair-thin electrodes can stay in the body.

For more information, have a look at the Cyberhand website (which has unfortunately been offline for the past few days), a video in English, or some more comprehensive videos in Italian and German.
