Automaton

U.S. Senator Calls Robot Projects Wasteful. Robots Call Senator Wasteful


Tom Coburn, a senator from Oklahoma, and PR2, a robot from California.

A U.S. senator has cited three robotics projects as examples of "wasteful" research that lack useful applications and shouldn't have received government funding.

In a recent report, Senator Tom Coburn of Oklahoma takes aim at the National Science Foundation, the premier source of funding for science and engineering in the United States, raising questions about the agency's management and priorities. In one section of the report, Coburn criticizes the NSF for squandering "millions of dollars on wasteful projects," including three that involve robots.

"A dollar lost to mismanagement, fraud, inefficiency, or a dumb project is a dollar that could have advanced scientific discovery," the report says.

Coburn didn't give the roboticists a chance to respond, so I reached out to the three groups—from the University of California, Berkeley; University of California, Davis; and Rowan University, in Glassboro, N.J.—to hear their side.

Of course, they aren't exactly thrilled to see their work "featured" in the report. One scientist quipped that Coburn has just sparked a robot uprising. Picture hordes of bots descending on Washington, D.C., to show the senator who's wasteful by using him as cookie dough.

The researchers say they welcome scrutiny and agree that there are many improvements the NSF could make. But they argue that the Coburn report evaluated their projects superficially and out of context.

One of the projects deemed questionable involves a PR2 robot, made by the Silicon Valley firm Willow Garage and one of the most capable robotic systems in existence. Berkeley researchers taught the PR2 how to fold towels, a demonstration that captured people's imagination.

But apparently Coburn wasn't impressed. His report notes that the robot cost $1.5 million and complains that it "took nearly 25 minutes to fold each towel." [UPDATE: The report references the wrong NSF grant; this is the correct one, a $1.2 million award. And the Berkeley researchers got the robot for free.]

Here's the "exclusive" unveiling of the report on ABC's "Good Morning America."

Berkeley computer science professor Pieter Abbeel, one of the researchers behind the project, told me that the towel folding experiment was just a small part of a much broader effort aimed at creating robots that can handle the complexities of real environments. Here's what he wrote in a rebuttal:

"[I]n order to expand the use of robots beyond manufacturing the machines must be far more sophisticated in terms of their ability to deal with complexity. That's what our work is all about. Towel folding is just a first, small step towards a new generation of robotic devices that could, for example, significantly increase the independence of elderly and sick people, protect our soldiers during combat, and a host of other applications that would revolutionize our day-to-day lives."

Coburn also discussed the report with Neil Cavuto from Fox News. After seeing footage of the PR2 folding towels, Cavuto says: "I guess many folks would like that. But how's the robot doing? Did it indeed fold clothes?" The senator admits he doesn't know details about the project. "It just caught my eye," he says.

I asked Coburn's office for more information on how they selected the projects they thought shouldn't have been funded. Did researchers or policy experts with relevant scientific backgrounds help Coburn prepare the report? Who are his co-authors?

Coburn "is the author of the report," John Hart, the senator's spokesman, told me in an e-mail. He added that the senator, who is a physician, "does have a scientific background," in addition to a business, accounting, and public policy background. "This is a multi-dimensional discussion."

I also asked whether Coburn and his staff contacted the researchers prior to the publication of the report to ask for more information or offer them a chance to address the criticism.

"Yes," Hart said. "Scientists and researchers who are privileged to receive federal funds should welcome and expect questions about their work." He added: "There are no sacred cows that should avoid examination and, if necessary, dissection."

But all the researchers I contacted told me they never heard from Coburn's staff. They said they were puzzled that the report relies so much on press reports rather than material with more scientific content—an approach they found a bit, well, unscientific. One researcher asked if Coburn would judge whether a patient is sick just by looking at the person's face.

In another project criticized in the report, a UC Davis group is studying how people interact with and control their bicycles. The researchers also want to build a robotic bike.

Mont Hubbard, a professor of mechanical engineering, is working with other faculty and students to develop dynamic models that can accurately describe how people ride bikes. The goal of the project, which received a $300,000 NSF award, is to understand the design parameters that could lead to bikes that are safer and easier to control by different groups of people and for different tasks.

The UC Davis group's instrumented bicycle.

The researchers are using a bike equipped with sensors [photo above] and also building a robotic bicycle to identify the parameters that their models need to take into account. As it turns out, Hubbard says, we know very little about how a bike's design affects safety, performance, and our ability to control it. In particular, we need to learn more about how the dynamics of the bike and rider affect each other.
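To get a feel for what those dynamic models look like, here's a minimal sketch (ours, not the UC Davis group's) of the standard linearized bicycle analysis: lean and steer obey M q'' + v C1 q' + (g K0 + v² K2) q = 0, and scanning the eigenvalues across forward speeds reveals where the bike balances itself. The coefficient values below are illustrative placeholders, not parameters from this study.

```python
import numpy as np

# Linearized bicycle dynamics in the standard (Whipple-model) form:
#   M q'' + v*C1 q' + (g*K0 + v^2*K2) q = 0,   q = [lean, steer]
# The coefficients are illustrative placeholders in the right general
# range; they are NOT the UC Davis group's parameters.
M  = np.array([[80.8,    2.32], [ 2.32, 0.30]])   # mass/inertia
C1 = np.array([[ 0.0,   33.87], [-0.85, 1.69]])   # speed-proportional damping
K0 = np.array([[-80.95, -2.60], [-2.60, -0.80]])  # gravity stiffness
K2 = np.array([[ 0.0,   76.60], [ 0.0,  2.65]])   # speed-squared stiffness
g = 9.81

def eigenvalues(v):
    """Eigenvalues of the lean/steer dynamics at forward speed v (m/s)."""
    K = g * K0 + v**2 * K2
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-Minv @ K,        -Minv @ (v * C1)]])
    return np.linalg.eigvals(A)

# The bike balances itself wherever every eigenvalue has negative real part.
for v in np.arange(0.0, 10.5, 0.5):
    stable = bool(np.all(eigenvalues(v).real < 0))
    print(f"v = {v:4.1f} m/s   self-stable: {stable}")
```

Sweep the speed and a self-stable band pops out; change the geometry (the matrices) and that band moves or vanishes, which is exactly the kind of design question the project is after.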

"There's plenty to be discovered," Hubbard says. "Just because Senator Coburn knows how to ride a bicycle, it doesn't mean that's the end of it."

He adds that increasing bicycle usage would have "health benefits, transportation benefits, environmental benefits." Surveys show that although Americans don't bike much, many more would if they felt bikes were safer, he said.

The third project criticized in the report was a "robot rodeo," a three-day event that took place at a conference for computer science educators in Dallas, Texas, last year. The organizers, Jennifer Kay, a computer scientist at Rowan University, and Tom Lauwers, a robotics entrepreneur, say the goal of the event was to "introduce robot programming to the nearly 1200 educators attending the conference, and to raise awareness amongst participants of how robots could be used in their classrooms."

They say that despite evidence that robots can be used as educational tools to excite and motivate students, only a tiny fraction of educators have ever programmed a robot or tried them in their classrooms. They told me that the event—which involved months of planning and dozens of volunteers—received only $6,283 from the NSF, a number that the Coburn report doesn't mention. (Just for reference, that's one-fifth of what the Senate Hair Care Revolving Fund spent last year.)

And yes, Kay and Lauwers say, the event was designed to be fun:

"Perhaps the Robot Hoedown and Rodeo was singled out because it has an intentionally eye-catching name, and because on the surface it appears 'fun.' Indeed in his report Senator Coburn states, 'Videos of the event posted to YouTube suggest the effort was a source of enjoyment for observers.' It is precisely this 'fun' which our program aims to associate with Computer Science education, so that our current students will choose to become the future researchers that make the kinds of transformative discoveries that improve our society and our economy."

Coburn acknowledges that NSF grants have supported many scientific breakthroughs, but he insists that the agency could save between $1 billion and $3 billion by eliminating inefficiencies and duplication.

Among other things, he calls for the NSF to defund its social and behavioral sciences division and sharpen its focus on "truly transformative sciences with practical uses outside of academic circles and clear benefits to mankind and the world." (Full disclosure: IEEE Spectrum has collaborated with the Directorate for Engineering of the National Science Foundation to coproduce "Robots for Real," an award-winning special report with clear benefits to mankind and the world.)

But picking "winners" is a challenge even for experienced NSF program managers and the scientists who help the agency review its grant applications.

"In many cases, it can be difficult to identify, in advance, what kinds of research proposals might lead to transformative results," says Dana Topousis, an NSF spokesperson. "For instance, when NSF funded a graduate research fellow in the early 1990s to study digital libraries, we couldn't predict that that graduate student would co-found Google."

So who knows? The next Google may very well be a robotics company founded by a pair of NSF-funded researchers. Then again, there's only one way to find out.

READ ALSO:

Robots Are the Next Revolution
Mon, March 28, 2011

Blog Post: So why isn't anyone acting like it?

PR2 Does The Impossible, Folds Towels
Wed, March 31, 2010

Blog Post: The robot may not be the fastest at folding towels, but the fact that it can do it entirely autonomously is nuts

Top 20 Robot Videos of 2010
Tue, January 11, 2011

Blog Post: Quadrotors performing acrobatics, ultrarealistic humanoids dancing, dexterous robots folding towels, and more

Who Wants a Free Robot?
Tue, May 04, 2010

Blog Post: Willow Garage is giving away PR2 robots to 11 institutions in the U.S., Europe, and Japan

Batcopter Does High-Tech Robotic Bat Harassment for Science

When bats leave their caves at night to go eat bugs, they can swarm in the millions while somehow managing to not crash into each other, which is a pretty clever trick. Kenn Sebesta, a researcher at Boston University, is wondering just how exactly they pull this off, and there's nothing better than good old fashioned experimentin' with robots to see how the bats do what they do.

This is Batcopter 2.0 (aka "Quady"), a home-built quadrotor made from carbon-fiber arrow shafts, twine, glue, zip ties, bamboo, foam, and netting to make sure that any bats not doing their jobs wouldn't get decapitated by a stray prop. A GoPro camera was stuck on the front and the whole thing was piloted from the ground with an array of three high-speed infrared cameras watching the glowing hot robot-on-bat nighttime aerial action:

To control the Batcopter, Sebesta says he and his colleagues used OpenPilot, an open source autopilot platform for small UAVs, which "allowed us to get so far so fast and was the real hero."

The UAV did end up having an unfortunate accident shortly thereafter, but not before collecting terabytes of high quality video of the bats interacting with movements of the UAV. The Batcopter team is planning to analyze this footage to try and see if there are any fundamental laws of flying that the bats follow to keep from colliding with other bats and wayward robots. If there are, it could lead to better autonomous flight controllers for UAVs, as well as ultrasonic squeaks of relief from bats everywhere as scientists find something else to do with their time.
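For a sense of what such a "law" might look like: swarm researchers often test local-interaction rules of the classic boids variety, where each animal steers away from close neighbors. A toy version (ours, purely illustrative, not the BU team's model) looks like this:

```python
import numpy as np

# A toy local-interaction rule of the sort swarm researchers test against
# tracked trajectories: each bat steers away from neighbors inside a
# sensing radius. Purely illustrative; not the Boston University model.
def separation_velocity(positions, i, radius=2.0, gain=1.5):
    """Repulsive steering velocity for bat i, away from close flockmates."""
    steer = np.zeros(3)
    for j, p in enumerate(positions):
        if j == i:
            continue
        offset = positions[i] - p
        dist = np.linalg.norm(offset)
        if 0 < dist < radius:
            steer += offset / dist**2   # push harder when closer
    return gain * steer

rng = np.random.default_rng(0)
bats = rng.uniform(-3, 3, size=(20, 3))      # 20 bats in a 6-meter cube
print(separation_velocity(bats, 0))          # steering command for bat 0
```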

UPDATE: No animals were harmed in the making of this robot! Professor John Baillieul, who directs Boston University's Laboratory for Intelligent Mechatronic Systems, writes us to say the researchers involved in the project, which includes several biologists, are very careful to design and use technology that is animal-friendly and meets all of the acceptable standards of animal care and use in the laboratory and field. "We do hope to use robotic air vehicles to observe bats and other flying animals in ways that have not been done up to now," Baillieul says, "but I can't emphasize too strongly that we have not harmed and are not seeking to harm or harass animals in any way, including making them fearful."

[ Boston University ] and [ OpenPilot.org ] via [ Slashdot ]

Thanks, Kenn!

Using Robots to Train the Surgeons of Tomorrow

This article is the first of a series that will explore recent advances in surgical and medical robotics and their potential impact on society. More articles, videos, and slideshows will appear throughout the year.

Da Vinci surgical system. Photo: Kelleher Guerin

How can the skill of a surgeon be measured? A patient's body has no buzzer that alerts the surgeon when mistakes occur during an operation. There is no Yelp-like website that ranks a surgeon based on user reviews. It is surprising that people can spend less time selecting a surgeon for an operation than they might selecting a restaurant for dinner or a mechanic to fix their car.

According to a study from the U.S. Agency for Healthcare Research and Quality, surgical complications, including postoperative infections, foreign objects left in wounds, surgical wounds reopening, and postoperative bleeding, resulted in 2.4 million extra days of hospitalization, $9.3 billion in excess charges, and 32,000 mostly surgery-related deaths in the United States in 2000.

To what extent training is responsible for those errors is unknown. Some argue that most surgeons never achieve true expertise. One thing is certain, though: Residents need better, more effective training. It isn't sufficient to have residents merely go through the motions; they must be able to practice deliberately. The problem is that residents already work inhumanely long hours (recent regulations limit their training to 80-hour work weeks, but they typically work more than that) and they must learn a growing number of surgical techniques and technologies, which means new generations of surgeons have less and less time for hands-on practice.

In the past few years, several research groups, including our team at Johns Hopkins University, have been working to analyze and automate the training process using modern robotic surgical tools. Our goals are to create an objective, standardized method of surgical training as well as to reduce the time and cost of having an experienced surgeon in the training loop.

Surgical skill can be broken down into theoretical skill (consisting of factual and decision-making knowledge) and practical skill (the ability to carry out manual tasks such as dissection and suturing). Theoretical skill is often taught in a classroom and is thought to be accurately tested with written examinations like the Medical College Admission Test (MCAT) and the United States Medical Licensing Examination (USMLE). Practical skill, on the other hand, is much more difficult to judge.

Practical skills, such as driving a car, swinging a golf club, or throwing a football, are most effectively taught "in the field" through demonstration. In 1889, Sir William Halsted at Johns Hopkins University revolutionized surgical training by developing an apprentice-style technique still used in most surgical residency programs today. According to this method, a resident would "see one, do one, teach one," implying that after minimal exposure and a single completion of a procedure, a resident will have mastered the skill and will be capable of teaching the next novice. (Residents practice certain procedures more than once, but the principle is still that one time is really all the exposure they'd need before going out in the field and performing on their own.) Although many talented surgeons are trained this way, the method is time consuming, and evaluating a student's performance is a subjective task that varies depending on the student/teacher pair. The method also involves a lot of yelling.

With the advent of technologies such as robotic surgical systems and medical simulators, researchers now have the tools to analyze surgical motion and evaluate surgical skill. Our group is studying human-machine interaction for surgical training and assistance in multiple contexts with increasing levels of complexity. The first level involves a system that understands what the human and environment are doing. The next level of interaction is for machines to provide assistance to a human operator through augmentation. The last level is to have a robot perform a task autonomously. We'll describe the state of research in each of these areas.

Understanding the surgical environment

Language of surgery. Photo: Carol Reiley

There is an active effort to develop new approaches to surgical training and evaluation. Using techniques from speech recognition, our group is developing mathematical models for motion recognition and skill assessment. These models may be the key to standardizing surgical training by decomposing complex surgical tasks like suturing, blunt dissection, and cutting into elementary "chunks" of motion, and thus decoding the "language of surgery."

These motions can be compared to phonemes, the elementary units of speech. Sequences of subtasks can be constructed like words to form sentences (analogous to various surgical tasks), which can then be used to form paragraphs (analogous to surgical operations). And, just as in speech, a recognition program might call attention to poor "pronunciation" or improper "syntax" in surgical execution, and can try to understand the intent of the surgeon from recorded motion and video data. (This research typically focuses on telepresence surgery as performed using the da Vinci system from Intuitive Surgical.) Using our skill evaluation system, trainees can have their trials evaluated offline or see their trial synchronized with a prerecorded expert trial to shorten the learning curve.
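For the curious, the speech-recognition machinery being borrowed here boils down to hidden Markov models and Viterbi decoding: given a stream of quantized motions, find the most likely sequence of gesture "phonemes" that produced it. A toy sketch, with made-up gestures and probabilities (the real models are trained on recorded da Vinci motion data):

```python
import numpy as np

# Toy HMM over surgical motion "phonemes" (gestures). All labels and
# probabilities here are invented for illustration.
gestures = ["reach", "grasp_needle", "insert", "pull_through"]
start = np.array([0.7, 0.1, 0.1, 0.1])
trans = np.array([[0.2, 0.6, 0.1, 0.1],
                  [0.1, 0.2, 0.6, 0.1],
                  [0.1, 0.1, 0.2, 0.6],
                  [0.5, 0.2, 0.2, 0.1]])
# Emission probabilities over 3 quantized motion symbols per gesture.
emit = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.7, 0.2],
                 [0.2, 0.1, 0.7],
                 [0.4, 0.4, 0.2]])

def viterbi(obs):
    """Most likely gesture sequence for a quantized motion stream."""
    n, T = len(gestures), len(obs)
    logp = np.full((T, n), -np.inf)
    back = np.zeros((T, n), dtype=int)
    logp[0] = np.log(start) + np.log(emit[:, obs[0]])
    for t in range(1, T):
        scores = logp[t - 1][:, None] + np.log(trans)  # [from, to]
        back[t] = scores.argmax(axis=0)                # best predecessor
        logp[t] = scores.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(logp[-1].argmax())]
    for t in range(T - 1, 0, -1):                      # backtrack
        path.append(back[t, path[-1]])
    return [gestures[s] for s in reversed(path)]

print(viterbi([0, 1, 2, 2, 0]))
```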

Augmenting the surgical environment

kidney stone image overlay
Kidney stone image overlay. Photo: Balazs Vagvolgyi

Super-surgeon performance can be achieved if human intelligence can be combined with robot accuracy and precision. Computer-integrated surgery, using equipment such as a robotic system with a video display, can enhance human senses by providing additional information. For example, the visualization can overlay a reconstructed CT scan of a tumor on the operating site, or the robot can use force feedback to prevent a surgeon’s hand from puncturing a beating heart.

Studies have shown that superimposing graphics, sounds, and forces over the real-world environment in real-time can assist with training.
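At its simplest, the graphics side of that augmentation is alpha blending: compositing a registered preoperative image onto the live camera frame. A bare-bones sketch, in which registration (aligning the CT to the camera view, the genuinely hard part) is assumed already done:

```python
import numpy as np

# Bare-bones augmented-reality overlay: alpha-blend a registered CT slice
# onto the live endoscope frame. Registration is assumed done; the arrays
# below are stand-ins for real imagery.
def overlay(frame, ct_slice, mask, alpha=0.4):
    """frame, ct_slice: HxWx3 uint8 images; mask: HxW bool (tumor pixels)."""
    out = frame.astype(np.float32)
    blend = alpha * ct_slice.astype(np.float32) + (1 - alpha) * out
    out[mask] = blend[mask]          # blend only where the CT has content
    return out.astype(np.uint8)

frame = np.zeros((480, 640, 3), np.uint8)       # stand-in camera frame
ct = np.full((480, 640, 3), 255, np.uint8)      # stand-in CT rendering
mask = np.zeros((480, 640), bool)
mask[200:280, 300:380] = True                   # "tumor" region
result = overlay(frame, ct, mask)
```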

Robots with intelligent sensors can address humans' physiological limitations such as poor vision or hand tremor. Even the best surgeons can use intelligent assistance to improve performance. Force-sensing "smart" surgical instruments will allow for safer and more effective surgeries. For example, they can be used to measure the local tissue oxygen saturation on the working surfaces of surgical retractors and graspers so that tissue doesn't become permanently damaged.

JHU Steady-Hand Eye Robot. Photo: Marcin Balicki

The JHU Steady-Hand Eye Robot is a robot for retinal microsurgery in which the surgeon and the robotic manipulator share control of the instrument. This reduces hand tremor and allows for precise and steady motion. Shaky-handed surgeons, there's hope for you yet!
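The shared-control idea can be sketched as admittance control: the robot measures the force the surgeon applies to the tool and moves at a velocity proportional to a low-pass-filtered version of it, so deliberate motions pass through while tremor-band wobble is attenuated. A toy one-dimensional version with made-up gains (not the JHU controller):

```python
import numpy as np

# Toy 1-D admittance controller in the steady-hand spirit: commanded tool
# velocity tracks a low-pass-filtered version of the force the surgeon
# applies to the handle. Gains, rates, and cutoff are made-up numbers.
dt = 0.001                     # 1 kHz control loop
gain = 0.02                    # admittance: m/s of tool motion per newton
fc = 2.0                       # filter cutoff, Hz; hand tremor sits ~8-12 Hz
alpha = dt / (dt + 1.0 / (2 * np.pi * fc))   # one-pole low-pass coefficient

t = np.arange(0.0, 2.0, dt)
# Simulated handle force: a slow deliberate push plus 10 Hz tremor.
force = 1.0 * np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)

velocity, f = [], 0.0
for fk in force:
    f += alpha * (fk - f)          # low-pass filter the measured force
    velocity.append(gain * f)      # admittance law: v = gain * f_filtered

atten = 1.0 / np.sqrt(1 + (10.0 / fc) ** 2)  # first-order rolloff at 10 Hz
print(f"10 Hz tremor passes through at ~{atten:.0%} of its amplitude")
```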

The robot surgeons of the future

Researchers are now moving towards understanding how humans and machines can work together as a team to collaboratively finish a surgical task. Training models can be used to automate portions of a tedious task or to predict surgeons’ intent to automate an instrument change. Automation might also allow a surgeon to utilize more than two arms of the system at the same time: although the da Vinci surgical system has four arms (three to hold tools and one for the camera), the third arm generally sits idle, since humans can only control two arms at any given moment.

University of Washington Raven. Photo: BioRobotics Lab

The University of Washington's Raven System is an impressive mobile surgical robot used for telesurgery. In the next few months, seven schools are receiving this system as part of a multi-institutional grant: Johns Hopkins University, UC Santa Cruz, University of Washington, UC Berkeley, Harvard, University of Nebraska, and UCLA. A few orders are already in for the next iteration, from schools in Florida, Toronto, and Minnesota. This standardized research surgical platform will lead to exciting new research in telesurgery and surgical training over the next few years.

Raven is a mobile laparoscopic surgical system. Because Raven is modular, it is more portable than the massive surgical robots used in hospitals and can be disassembled and reassembled by a small team. And while most commercial surgical robots weigh nearly half a ton, Raven weighs only 23 kilograms (about 50 pounds). This makes it well suited for hazardous environments.

Telesurgery experiments with the Raven generally involve a surgeon at a safe location operating a robot in the field; for example, underwater in a submarine pod or in the desert under scorching temperatures and gusting winds. Control commands and sensor feedback are transferred over a wireless connection.  Research questions include how time delays affect performance, how multiple surgeons can operate robots together to complete a surgery, and how surgeons can train on the platform most effectively.

The surgical environment in the operating room is unlike any other because of the constantly moving objects, because no two procedures are identical, and because of the sterilization/FDA approval issues. The state of surgical robotics is still a long way from one-button autonomous surgery, but the future of surgical training might be undergoing a major “facelift.”

About the authors:

Carol E. Reiley is currently finishing her doctoral research in surgical robotics at Johns Hopkins University and running TinkerBelle Labs, which focuses on creating low-cost, do-it-yourself projects. Reiley, who was the student chair of the IEEE Robotics and Automation Society for 2008-2010, earned her bachelor's degree in computer engineering at Santa Clara University and her master's in computer science at Johns Hopkins.

Gregory D. Hager, an IEEE Fellow, is a professor in the computer science department at Johns Hopkins University, where his research interests include computer vision, robotics, medical devices, and human-machine systems. He directs the Computational Interaction and Robotics Lab and is the deputy director of the NSF Engineering Research Center for Computer-Integrated Surgical Systems and Technology (CISST).

Robots Make Bavarian Breakfast Together

TUM-Rosie preparing breakfast.

Once upon a time, a charming American robot called James met a striking German bot by the name of Rosie. They liked each other, so they moved in together. Now they spend their days taking long walks in the lab and doing other things that robots do.

James is a PR2 robot, built by U.S. robotics firm Willow Garage, and it traveled to Germany as part of the PR2 Beta Program, an effort to popularize personal robots. At the Technical University Munich (TUM), James was introduced to Rosie, a dual-arm robot with a curvy figure and four eyes [photo above].

Their courtship was at first a bit mechanical, but they soon found many things in common: Both run ROS (Robot Operating System), use Hokuyo laser scanners and Kinect 3D sensors, and have omnidirectional mobile bases.
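Sharing a middleware is what makes this kind of robot matchmaking practical: code written for one ROS robot can, in principle, talk to the other over the same topics. For the uninitiated, a ROS node can be as small as this minimal rospy publisher (the topic name is our invention):

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# Minimal ROS node of the kind both robots run. The topic name below is
# invented for this example, not from the TUM demo.
def talker():
    pub = rospy.Publisher('/breakfast/status', String, queue_size=10)
    rospy.init_node('sausage_reporter')
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data='sausages: still boiling'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```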

On a recent spring morning, James and Rosie were seen together cooking the traditional Weisswurst Frühstück, a Bavarian sausage breakfast.

It was a demonstration prepared by researchers at CoTeSys (Cognition for Technical Systems), a Munich-based high-tech cluster. This is how the researchers summarize the experiment:

TUM-Rosie is collecting the sausages, putting them into the pot with boiling water, waiting for them to be cooked and, finally, finding and getting them out of the pot into the serving bowl. [The PR2 robot] TUM-James is meanwhile slicing the French baguette using a regular electric bread slicer and in the end serving the sausages and the bread to the class of highly regarded roboticists. [...]

TUM-James makes use of recent advances in the field of real-time RGB-D sensing using a Kinect sensor for the detection of the bread slicer and the baguette. In the serving task it uses PR2's haptic capabilities in order to grasp and manipulate the plate.

TUM-Rosie is also using Kinect and perception algorithms from the COP [cognitive perception] module in order to calibrate the skimmer and use it as a new tool center point of the arm. Furthermore, it learns the 3D models for the pot and the bowls in order to be able to localize them at any arbitrary pose on the table. Lastly, it uses the torque sensors to resolve depth measurement inaccuracies through contact detection with the objects, and blob segmentation in order to localize sausages inside the pot.
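Depth-based blob segmentation of the sort Rosie uses to spot sausages is a standard trick: threshold the depth image, label the connected regions, and keep the sausage-sized ones. A generic sketch (all thresholds invented; this is not the CoTeSys COP pipeline):

```python
import numpy as np
from scipy import ndimage

# Generic depth-image blob segmentation: threshold out everything deeper
# than the liquid surface, label connected blobs, keep sausage-sized ones.
# Thresholds and sizes are invented for illustration.
def find_sausage_blobs(depth, surface_depth=0.65, min_px=150, max_px=2000):
    """depth: HxW array of range values in meters; returns blob centroids."""
    candidates = (depth > 0) & (depth < surface_depth)  # above the water line
    labels, n = ndimage.label(candidates)
    blobs = []
    for k in range(1, n + 1):
        mask = labels == k
        if min_px <= mask.sum() <= max_px:              # plausible sausage
            cy, cx = ndimage.center_of_mass(mask)
            blobs.append((cx, cy))                      # pixel centroid
    return blobs

depth = np.full((240, 320), 0.80)       # stand-in Kinect depth image
depth[100:112, 60:120] = 0.60           # one sausage-shaped blob
print(find_sausage_blobs(depth))
```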

The couple has a promising life ahead of them, and we look forward to hearing about their future adventures and, hopefully, seeing some baby robots too.

PS: This is not the first romantic meal the robots have had together. Last year, the pair prepared a somewhat more mainstream breakfast: pancakes. Guten Appetit!

Thanks, Dejan!

READ ALSO:

Thomas and Janet: First Kissing Humanoid Robots
Mon, August 24, 2009

Blog Post: The theatrical robots performed the first robot kiss during a performance of Phantom of the Opera

Kokoro's I-Fairy Robot Conducts Wedding in Japan
Sun, May 16, 2010

Blog Post: The groom is a robotics researcher. The bride works at a robotics firm. Now a robot has married them

PR2 Robot Learning To Bake Cookies, Humanity Surrenders to Yumminess
Fri, June 10, 2011

Blog Post: Everyone lucky enough to own a PR2 will soon be able to push a single button and get a fresh batch of cookies

Robots Are the Next Revolution, So Why Isn't Anyone Acting Like It?
Mon, March 28, 2011

Blog Post: What we need a Microsoft or a Google or an Apple to do is build a robot OS that runs on standardized hardware

PR2 Robot Learning To Bake Cookies, Humanity Surrenders to Yumminess

This is PR2. PR2 plays pool. PR2 brings you beer. And now, or very soon anyway, PR2 will bake you cookies. Warm, gooey, chocolate chip cookies. Seriously, is this not the greatest robot in the world or what?

This video comes from graduate student Mario Bollini, who's a member of Daniela Rus' Distributed Robotics Lab at MIT CSAIL. It's not in the video, but as you can see from the picture, PR2 (or "bakebot" for the purposes of this demo) is also able to cream butter and sugar, and we already know that it can break (or not break) eggs. It does make a bit of a mess, which is the reason for the surgical smock, but a separate group is programming the robot to wipe down the table afterwards. Incidentally, I love how when PR2 finishes adding an ingredient to its mixing bowl, it just drops the container on the floor. Now that's my kind of clean-up.

Bollini hopes to have PR2 making cookies from start to finish within the shockingly short time of a month. Or actually, it'll be just making one single giant cookie at a time, but you know what, I'm totally okay with that. 

[ MIT ]

Japanese Ministry of Self-Defense Spends $1000 on Flying Robot Soccer Ball

One day, the Japanese Ministry of Self-Defense decided to wander into Akihabara, a major electronics shopping center in Tokyo. In what I'm told is a relatively typical Akihabara experience, a year and a half and about a thousand dollars later they came out with this crazy spherical flying robot about the size and shape of a soccer ball.

According to the video, this is the world's first truly spherical flying robot (this may or may not be true). It can buzz around at up to 60 kilometers per hour [about 40 mph] or hover stably in narrow spaces like hallways. But its neatest trick is to land by just smacking into the ground and rolling to a stop to absorb the impact. It's also ideal for operating indoors, since keeping all of the flying and steering components inside the robot lets it happily bounce off walls, doors, windows, light fixtures, and startled people.

The robot relies on one propeller for thrust and eight separate wings for control, and while it doesn't currently carry a payload, it's designed to mount a camera or other sensors. Next up is to instill this thing with some autonomy, and at only $1000 a pop, they're cheap enough that someone who's not with the Japanese Ministry of Self-Defense should venture into Akihabara and bring us all back a sweet little robot soccer ball kit.
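Mapping three desired body torques onto eight control surfaces is a classic control-allocation problem, commonly solved with a pseudo-inverse. The vehicle's actual mixing isn't public, so the effectiveness matrix below is entirely our invention:

```python
import numpy as np

# Control allocation: map desired body torques [roll, pitch, yaw] onto
# eight control-surface deflections. The effectiveness matrix B below is
# invented for illustration; the real vehicle's mixing isn't public.
angles = np.deg2rad(np.arange(8) * 45.0)     # surfaces spaced around the hull
B = np.vstack([np.cos(angles),               # each surface's roll authority
               np.sin(angles),               # pitch authority
               np.full(8, 0.25)])            # all contribute equally to yaw

B_pinv = np.linalg.pinv(B)                   # minimum-norm allocation

def allocate(torque_cmd):
    """Least-squares surface deflections for a desired torque vector."""
    return B_pinv @ np.asarray(torque_cmd)

print(np.round(allocate([0.1, 0.0, 0.02]), 4))  # small roll + yaw command
```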

[ TV Tokyo ]

Thanks, Paulo, for helping us with the Japanese!

Killer Robots on YouTube, Watch It While It Lasts

Don't tell anyone, but this looks to be a full-length copy of Killer Robots that's made an appearance on YouTube. We were off giving a talk (and watching other events) and weren't able to brave the mobs of delirious robot fans around the RoboGames heavyweight combat arena, but the Science Channel brought in a squad of cameramen led by Grant Imahara (from Mythbusters) to tape the whole thing.

If you're in too much of a hurry to watch it all, you should probably see a doctor and/or get your priorities straight, but the last two matches (starting at about 36:00) are some of the best that I've seen in the last three years of RoboGames and Combots. Now hurry up and watch it already, 'cause there's no telling how long it's going to last online and who knows when it'll be on TV again.

[ Killer Robots ]

[ RoboGames ]

Robots Make Shuffleboard a Slightly More Interesting Sport to Watch

I'm not entirely sure what shuffleboard is. So really, I'm not at all qualified to compare this robotic version of the sport to the real thing. But it's nifty that a bunch of students at Oregon State University got a chance to build these robots as part of their coursework, proving that robots can be for learning and fun and evil, all at the same time! Not that I'm insinuating anything about shuffleboard, but I digress. Here's video of a match:

Not bad for eight weeks and 200 hours of work, right? Now someone just needs to invent robotic curling. There's an action-packed sport that's somehow different from and significantly better than shuffleboard. Oh wait, apparently someone did:

I know nothing about this, besides that I found it on YouTube after searching for "robotic curling," but it does sort of look like it might possibly be autonomous, which would be pretty cool. There's video of another match here. If you know anything about it (it's something to do with an "SMU championship"), speak up in the comments!

Photo: Jesse Skoubo/Corvallis Gazette-Times

Via [ Corvallis Gazette-Times ]

Running Hexapod Gets Fancy New Tunable Legs

You may not realize it, but you've got a lot of springiness going on in your legs. You may also not realize that you change that springiness depending on whether you're running or walking, what surface you're on, and whether or not you're carrying stuff. Our bodies (like those of most animals) are able to dynamically adapt our legs and gaits to make us more efficient under changing conditions. Dynamic adaptation is something that robots are notoriously bad at, but EduBot, a son or cousin or something of the venerable RHex, has been experimenting with six new "tunable" legs that allow it to adjust its gait on the fly.

EduBot's legs are made out of carbon fiber, and by changing the location of a slider along the leg, the overall stiffness of each leg can be adjusted independently. Of course, once the stiffness of the legs changes, EduBot has to adapt its gait to match, which it does all by itself by analyzing its own speed, efficiency, and stability. A bunch of different experiments were performed to help the robot learn what leg stiffnesses and gaits produced the most desirable movement patterns on different surfaces and while carrying different loads, and generally the robot was able to figure out what worked best within 70 tries' worth of experimentally fiddling with its own programming. I say "generally," because sometimes it took longer, and because watching the robot fail to use the correct gait is pretty funny:

Overall, these experiments have shown that EduBot runs fastest and most efficiently with stiffer legs, but that things can change on softer surfaces (say, grass, or a shaggy carpet) or with payloads, indicating that adaptive and dynamic leg compliance really would be a useful thing to have on a robot, despite the added complexity. Next up will be teaching the robot to adjust its legs on the fly, and it'll be interesting to see how this technology might benefit other robots (or even humans) with similar limbs.
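That trial-and-error tuning amounts to black-box optimization over a few gait and stiffness parameters. Here's a hedged hill-climbing sketch of the idea, with a stand-in objective rather than EduBot's real scoring:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for one hardware trial: run the robot at a given leg stiffness
# and stride frequency, score the run (say, speed minus an energy penalty).
# This toy objective is invented; it is not EduBot's real scoring function.
def trial(stiffness, freq):
    return -(stiffness - 6.0) ** 2 - 0.5 * (freq - 3.0) ** 2 \
           + rng.normal(0.0, 0.1)           # measurement noise

# Simple hill climbing: perturb the parameters, keep what scores better.
params, best = np.array([3.0, 1.5]), -np.inf
for k in range(70):                         # roughly 70 trials, as reported
    candidate = params + rng.normal(0.0, 0.4, size=2)
    score = trial(*candidate)
    if score > best:
        params, best = candidate, score

print(f"best stiffness ~{params[0]:.2f}, stride frequency ~{params[1]:.2f} Hz")
```

On hardware, each call to trial() is one physical run, which is why converging in about 70 of them is worth bragging about.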

EduBot's new legs were presented in an ICRA paper entitled "Experimental Investigations into the Role of Passive Variable Compliant Legs for Dynamic Robotic Locomotion," by Kevin C. Galloway, Jonathan E. Clark, Mark Yim, and Daniel E. Koditschek, from Harvard University, Florida A&M, and the University of Pennsylvania.

[ EduBot ]

Brilliant Little Jumping Robot Only Needs One Motor

Jumping offers a way for very small robots to get over very large obstacles using a minimal amount of energy. It's tricky, though, because while the first jump might be pretty easy, subsequent jumps depend on the ability of the robot to right itself, aim, and go again. That's essentially three separate subsystems, but since you're only ever using one at a time, the risk is that your robot ends up being three times as bulky as is strictly necessary. And in small robots, efficiency is everything.

EPFL's locust-inspired jumping robot solves one of these problems with a weighted roll cage that helps the bot passively return to an upright position whenever it lands. A second motor then allows the robot to rotate within the cage to change its jumping direction. This works quite well, but it adds bulk plus another motor to the whole system.

Jianguo Zhao and a team from Michigan State University have created a jumping robot that somehow manages to do everything that it needs to do with just one single motor. It can change its orientation, right itself, and then jump (really freakin' high) with one motor and some clever mechanical engineering. Check it out:

The actual jumping mechanism was directly inspired by the legs of a frog, but it's really the rest of the robot that's so cool. Everything is driven by one tiny pager motor, and here's how it works:

  • To jump, the pager motor engages a gear which pulls the robot's body down towards its legs, slowly charging four torsional springs. The gearing and springs help keep the power requirements low without sacrificing jumping energy. When the springs are fully charged up, the gear trips a little lever, and the legs are released. Boing!

  • After re-entry, the robot inevitably finds itself lying prone. By driving the pager motor backwards, the same gear that charges the springs instead spins against the ground without engaging anything, allowing the body of the robot to rotate to a new position.

  • To get up, as the robot's body is pulled down towards its legs, little arms deploy outwards, driven by that same downward motion. These arms push the robot up into a standing position, and keep it there until liftoff.

I really love how simple and clever this all is. It's efficient, too: the robot is 8 centimeters tall and only weighs 20 grams, including the motor and a 50 mAh battery, but it can make approximately 285 jumps without needing to be recharged.
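Those figures invite a quick back-of-envelope check (battery voltage and drivetrain efficiency are our assumptions, not reported numbers):

```python
# Back-of-envelope energy budget for the jumper. Battery voltage and
# efficiency are assumptions, not figures from the paper.
mass = 0.020            # kg (reported)
capacity_mAh = 50.0     # reported battery capacity
jumps = 285             # reported jumps per charge
volts = 3.7             # assumed single LiPo cell
g = 9.81

battery_J = capacity_mAh / 1000 * 3600 * volts   # ~666 J total
per_jump_J = battery_J / jumps                   # ~2.3 J per wind-up + jump
mech_J = per_jump_J / 3                          # assume 1/3 reaches springs
height = mech_J / (mass * g)                     # ideal ceiling, no drag
print(f"{per_jump_J:.2f} J electrical per jump; "
      f"~{height:.1f} m ideal jump height at 33% efficiency")
```

That works out to a couple of joules per jump cycle, an upper bound of a few meters of altitude, and a reminder that "really freakin' high" is entirely consistent with a 50 mAh battery.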

The designers think that it should be possible to make the robot jump even higher and farther, and of course at some point they're going to want to stick some sensors on there or something to move it from just being awesome to being awesome and useful at the same time.

This robot was presented at ICRA in a paper entitled "Development of a Controllable and Continuous Jumping Robot" by Jianguo Zhao, Ning Xi, Bingtuan Gao, Matt W. Mutka, and Li Xiao, all from Michigan State University.

[ MSU ]
