Automaton

EPOS Robotic Facility Simulates Satellite Repair Mission


Space robotics may appear to be a purely scientific endeavor -- brave little rovers exploring planets in search of life -- but it turns out there's a multi-million dollar market in space just waiting for the right kind of robot. This market is satellite servicing.

Geostationary communication satellites fire small thrusters to stay in orbit. When they run out of propellant (typically hydrazine), or when a battery or gyroscope fails, these expensive satellites often have to be abandoned, becoming just another piece of space junk, even though their mechanical systems and electronics still work fine.

Space agencies and companies around the world are developing robotic servicing systems (the United States demonstrated one such system in its Orbital Express mission), but putting these systems in space, docking them to satellites, and performing repairs remains a big challenge.

To address the problem, DLR (Germany's NASA equivalent) launched the European Proximity Operations Simulator, or EPOS, initiative. EPOS is a robotic facility designed to simulate on-orbit rendezvous, docking, and repair maneuvers. It lets engineers run computer simulations of a docking system with real hardware in the loop.

For this project, DLR partnered with Robo-Technology, a small industrial robotics integrator that designed, built, and programmed the EPOS hardware. The main components are two robots and a 25-meter linear track, which defines the working range. Both robots offer 6 degrees of freedom, so the two satellite mock-ups can be positioned not only relative to each other but also relative to instrumentation in the lab.

The simulation is as realistic as possible: there is a sun simulator, the satellite mock-ups "believe" they are in zero gravity, and the 250-Hz control bandwidth enables 1:1 simulation of contact dynamics during capture and docking. EPOS can even simulate the capture of a tumbling satellite, with actuators remote-controlled as if commands were being relayed from the ground to space.
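The hardware-in-the-loop principle at work here -- measure the real contact force at the docking interface, feed it into a simulated model of the free-floating satellite, and command the robot to the resulting pose -- can be sketched as a fixed-rate loop. The point-mass dynamics and all names below are illustrative assumptions, not DLR's actual software:

```python
DT = 1.0 / 250.0  # one control cycle at the facility's 250-Hz bandwidth

class PointMassSatellite:
    """Toy 1-D stand-in for the simulated free-floating satellite dynamics."""
    def __init__(self, mass_kg=500.0):
        self.mass = mass_kg
        self.pos = 0.0   # meters
        self.vel = 0.0   # meters per second

    def step(self, contact_force_n, dt=DT):
        # Semi-implicit Euler integration of F = m * a.
        self.vel += (contact_force_n / self.mass) * dt
        self.pos += self.vel * dt
        return self.pos

def hil_cycle(sim, read_force_sensor, command_robot_pose):
    """One hardware-in-the-loop cycle: measured force in, commanded pose out."""
    force = read_force_sensor()   # force measured at the real docking interface
    pose = sim.step(force)        # how a free-floating satellite would respond
    command_robot_pose(pose)      # the robot reproduces that motion in the lab
    return pose
```

With a constant 500-N push on a 500-kg mock-up, the commanded velocity builds up at 1 m/s per second, just as it would for a free-floating body in orbit.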

DLR is now using the system to evaluate approach scenarios and test docking cameras using 1:1 satellite mock-ups. And just as on Earth with more common objects, things would be so much simpler for the roboticist if the satellite's design had taken the robot's limitations into account. But then, where would the fun be?

The video below explains the EPOS initiative (even if you don't understand German, it's fun to watch the robots in action).


Photo and video: DLR

Samuel Bouchard is a co-founder of Robotiq in Quebec City.

Evolution of Adaptive Behaviour in Robots by Means of Darwinian Selection


An essay by my two former PhD supervisors summarizing more than a decade of research in Evolutionary Robotics has created quite a buzz in the blogosphere:

Ever since Cicero’s De Natura Deorum ii.34., humans have been intrigued by the origin and mechanisms underlying complexity in nature. Darwin suggested that adaptation and complexity could evolve by natural selection acting successively on numerous small, heritable modifications. But is this enough? Here, we describe selected studies of experimental evolution with robots to illustrate how the process of natural selection can lead to the evolution of complex traits such as adaptive behaviours. Just a few hundred generations of selection are sufficient to allow robots to evolve collision-free movement, homing, sophisticated predator versus prey strategies, coadaptation of brains and bodies, cooperation, and even altruism. In all cases this occurred via selection in robots controlled by a simple neural network, which mutated randomly. ...


Once one looks beyond the blogosphere claptrap caused by any mention of words like "evolution" or "predator" in the context of robotics, there are some interesting insights from this body of work. Foremost, the experiments outlined in the essay offer an intriguing real-world illustration of evolution at work. Another is that they offer a powerful way to study biological phenomena such as the evolution of group behaviors like communication or altruism in a highly controlled, real-world system. Yet another is that embodiment does matter -- not only can the use of robots result in stronger testing of hypotheses and in higher predictive power than purely computational models; in some cases it is the best way to gain insights into the complex behavior resulting from a robot's -- or animal's -- reaction to the environmental changes caused by its own actions.

However, for all the work's merits, I think that its limitations may be even more revealing. In spite of the apparent behavioral complexity of the robots, all behaviors were achieved with extremely simple brains consisting of only a handful of neurons. This is surprising, given that even the brains of animals as simple as the nematode worm C. elegans exceed this by an order of magnitude, and animals like the fruit fly Drosophila melanogaster by a factor of 10,000. Clearly, we are far from evolving intelligent robots. Furthermore, the performance of the robots invariably stagnated after a few hundred generations of evolution across all experiments. For now it is unclear whether this is a direct consequence of using extremely simple brains, or is tied to deeper reasons such as an inadequate genetic encoding. Either way, we are far from (re-?)creating open-ended evolution in the lab.
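A toy version of the recipe the essay describes -- a population of simple neural controllers, random mutation, and selection on task performance -- might look like this. The arena, sensor model, and all parameters are made up for illustration; this is not the authors' code:

```python
import math
import random

random.seed(0)  # reproducible runs

def controller(w, sensors):
    """Tiny neural 'brain': 2 proximity sensors -> 2 wheel speeds (6 weights)."""
    left  = math.tanh(w[0] * sensors[0] + w[1] * sensors[1] + w[2])
    right = math.tanh(w[3] * sensors[0] + w[4] * sensors[1] + w[5])
    return left, right

def fitness(w):
    """Reward forward motion and penalize wall collisions in a toy 10x10 arena."""
    x, y, heading, score = 5.0, 5.0, 0.0, 0.0
    for _ in range(200):
        proximity = 1.0 - min(x, y, 10 - x, 10 - y) / 5.0  # 0 center, 1 at wall
        ml, mr = controller(w, (proximity, proximity))
        heading += (mr - ml) * 0.3
        speed = (ml + mr) / 2.0
        x += speed * math.cos(heading) * 0.1
        y += speed * math.sin(heading) * 0.1
        if not (0.0 < x < 10.0 and 0.0 < y < 10.0):
            return score - 10.0            # collision ends the trial
        score += speed * 0.1               # reward forward progress
    return score

def evolve(generations=100, pop_size=30, sigma=0.3):
    """Truncation selection with elitism plus Gaussian weight mutation."""
    pop = [[random.gauss(0.0, 1.0) for _ in range(6)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 5]        # the fittest fifth survives unchanged
        pop = elite + [[g + random.gauss(0.0, sigma) for g in random.choice(elite)]
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)
```

Because the elite genomes are carried over unchanged each generation, the best fitness in the population can never decrease -- which also makes the stagnation plateaus the essay reports easy to observe in even a toy setup like this.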

If you find this interesting, I suggest you have a look at the essay and accompanying videos.

 

NASA and GM Develop Dexterous Humanoid Robonaut2


NASA and General Motors have unveiled a humanoid robot called Robonaut2, or R2, that they say will be able to "assist astronauts during hazardous space missions and help GM build safer cars and plants."

The robot was designed with dexterous hands capable of using the same tools that humans do. NASA and GM boast that the robot "surpasses previous dexterous humanoid robots in strength," being able to lift a 9-kilogram (20-pound) weight with its arms extended, but details are sketchy.

The R2 is based on the original Robonaut, created by NASA and Darpa a decade ago [see photo, right]. It was a fairly advanced android for its time, but it never traveled to space.

UPDATE: Popular Mechanics has more technical details and specs:

The biggest upgrades from the original Robonaut are R2's thumbs, which now have four degrees of freedom (as opposed to three), and its overall speed, which has improved by a factor of four. One result of all of this engineering is the kind of breakthrough only a roboticist would swoon over: R2 can use both hands to work with a piece of flexible material. If that sounds simple, consider the amount of sensory data, cognitive processing and physical dexterity needed to manipulate something that flows and bends in your fingers. In the series of baby steps that comprises robotics, R2 is leaping.

Still, the two existing R2 prototypes are still essentially legless—GM has no need for a bipedal robot awkwardly swaying through its plants, and NASA plans to fit the robot with at least as many mobility platforms as its predecessor. R2's lower half is intended to be modular, and so is its redesigned head, which could fit a variety of sensor suites, depending on the mission or environment. Of course, until the agency's budget is sorted out, [Robonaut2 project manager Myron Diftler] can't confirm what those missions will be, or when the robot could be deployed. Which means the robot, or some version of it, is more likely to show up in a GM plant before leaving the planet.

See the new Robonaut2 in action in the video below:
 

Photos and videos: NASA

RoMeLa's Chemically Actuated ChIMERA Robot Moves Like an Amoeba

Apparently Professor Dennis Hong at the Robotics & Mechanisms Laboratory (RoMeLa) at Virginia Tech is exploring robotic locomotion not only with strange multi-legged robots but also with robots with no legs at all.

When we wrote about iRobot's blob 'bot, I should have known that others were working on similar chemical actuation projects. It turns out Professor Hong and his team are developing an amoeba-inspired robot called ChIMERA (Chemically Induced Motion Everting Robotic Amoeba), which can slide using a technique known as whole-skin locomotion.

Though technical details are still under wraps, Travis Deyle at Hizook did a great job in summarizing what is known about ChIMERA and related projects.

To see the "amoebot" in action, watch the video below. It's a talk Professor Hong gave at TEDxNASA -- ChIMERA stars at 07:27.
 

 

Will Robots Pick Your Grapes One Day?

Robots have revolutionized the factory. What about the field?

Over the past century, agriculture has seen an explosion in productivity, thanks to things like mechanization, synthetic fertilizers, selective breeding, and, of course, pesticides -- lots of them.

But it remains to be seen what role robots will play in working the fields. Automation was possible in factories because tasks were repetitive and the environment well-defined. A robot arm welding a car chassis does the exact same job over and over. When it comes to crops, though, everything changes: the environment is unstructured and tasks -- like picking a fruit -- have to be constantly readjusted.

It's a huge challenge, but some companies are up to the task. Take Vision Robotics, for example. It is using advanced vision and localization techniques to develop systems like its autonomous grape-vine pruner.


We've written about them before; now they've added the impressive (and bucolic) video above, a demonstration the company gave to the grape and wine industry. The company, based in San Diego, Calif., developed a vision system that uses stereoscopic cameras to create a virtual 3D image of the grape vines. Articulated cutting arms then do the trimming at an exact angle and location.
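Stereoscopic vision recovers depth by triangulation: for a pair of rectified cameras, a point's depth follows from the disparity (the horizontal pixel offset between the two views) via depth = focal length × baseline / disparity. A minimal sketch, with illustrative numbers:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth of a scene point from its disparity between two rectified views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_length_px * baseline_m / disparity_px
```

For example, a 50-pixel disparity seen by cameras with a 1000-pixel focal length and a 10-cm baseline places the point 2 meters away; halve the disparity and the point is twice as far.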

From what I understand, their goal is to have a tractor equipped with the articulated robotic arms. Mobility is a priority, and the machines must be able to access most areas of the plant being pruned. The tractor might be driven by a person, but everything else would be controlled by an on-board computer.

Another promising application is fruit picking. Again, a robot would distinguish between fruit and leaves using vision. A camera mounted on the robotic arm detects colors and compares them to reference data in its memory. A match means the fruit gets picked.
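The color-matching step can be sketched as a per-pixel threshold test against a stored reference color. The reference color and tolerance below are made-up values for illustration, not Vision Robotics' actual parameters:

```python
# Hypothetical reference color and tolerance -- illustrative values only.
REFERENCE_RGB = (200, 40, 40)   # a "ripe fruit" red
TOLERANCE = 60.0                # max Euclidean distance in RGB space

def is_fruit_pixel(pixel, reference=REFERENCE_RGB, tol=TOLERANCE):
    """True if the pixel's RGB value is within `tol` of the reference color."""
    dist = sum((p - r) ** 2 for p, r in zip(pixel, reference)) ** 0.5
    return dist <= tol

def find_fruit(image):
    """Return (row, col) coordinates of pixels that match the reference color."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, pixel in enumerate(row)
            if is_fruit_pixel(pixel)]
```

A real system would of course add lighting compensation and cluster the matched pixels into whole fruits before planning a grasp, but the core decision is this kind of distance test.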

Over the next few decades, we can expect robots to work tirelessly in our fields, just as they do in our factories.

Boston Dynamics Wins Darpa Contract To Develop LS3 Robot Mule (It's a Bigger BigDog)


A bigger BigDog is coming.

Boston Dynamics, developer of BigDog and PETMAN, announced today that it has won a Darpa contract to develop a new robot mule to help soldiers on foot carry gear in the field.

The robot, called Legged Squad Support System, or LS3, will be able to navigate rough terrain, carrying 180 kilograms (~400 pounds) of load and enough fuel for missions covering 32 kilometers (~20 miles) and lasting 24 hours.

Boston Dynamics says LS3 won't need a driver, because it will automatically follow a human leader using computer vision or travel to designated locations using sensors and GPS.

Breeding, er, building the robot will take 30 months and cost US $32 million. The first LS3 prototype is expected to debut in 2012.

"If LS3 can offload 50 lbs [23 kg] from the back of each soldier in a squad, it will reduce warfighter injuries and fatigue and increase the combat effectiveness of our troops," Marc Raibert, president of Boston Dynamics and principal investigator for the program, said in a statement.

The company, based in Waltham, Mass., is teaming up with Bell Helicopter, Carnegie Mellon, and NASA's Jet Propulsion Laboratory, among others, to develop LS3.

The LS3 follows in the footsteps of BigDog, and Raibert expects the new robot to make "a major leap forward." We can't wait for the videos.

Illustration: Boston Dynamics

When People Meet Robots: Tell Us Your Stories of People-Machine Interactions

I'm calling for case studies, stories, anecdotes of the interaction between intelligent robots and people in public spaces and working places for a feature page in next quarter's IEEE Robotics and Automation magazine.

Here's why: How many people do you know who treat their PCs like a pet, or fear their laptop will attack them in the night? Now give that laptop its own set of wheels, set a doll on top, and suddenly the story changes: The perceptive area of our brains flashes neon: "Human!"

When computer users encounter a problem with their system, they blame the software provider or the malevolent hacker who sent them a virus. They attribute any intent to the far side of the keyboard, not inside the box.

However, the fact that a laptop can be used to actuate motors and drive around a building on its own may change the perception from a machine controlled by human beings to a machine that is a being itself.

Will that perception evolve over time now that we have commercial robot operating systems like Motivity and hobby systems like SPARK and Mindstorms that let children and computer-literate adults program interactive and intelligently navigating robot applications? Or will motion and a face continue to cause people to treat robots like human beings?

I'm looking for stories about people interacting with real robots, in the workplace, in public or in the classroom, that show how neophytes feel when they first meet robots in the course of their normal daily activities, and, if possible, how those perceptions or interactions change over time.

Please send your contributions to jdietsch [at] mobilerobots.com

Jeanne Dietsch is co-founder and CEO of MobileRobots, based in Amherst, N.H.

3D Printers Go Mainstream With Hewlett Packard

The above video demonstrates the workings of a professional 3D printer. Think of all the millions of things you could do with such a wonderful device! But with prices starting at $20k -- usually $30k after adding the required 'extras' -- you'll quickly forget about purchasing your own altogether.

You may also have heard of 'the 3D fab that fabricates itself.' But when you realize the amount of technical know-how required for it to 'fabricate itself,' and the lack of precision the machine offers, once again the dream dies.

So why are professional 3D printers so expensive? Anyone who has been seriously in the market to purchase one can tell you it's the market strategy of the big players. First, they don't actually publish prices. They don't. You have to contact them by phone, interview in person with a salesman, and just before you sign the contract they mention an additional ~$10k of equipment you need that doesn't come included. It's like a car salesman who, at the end of reaching an agreement, asks, "Would you like wheels with your car?"

For example, there is the $1k heater to melt the support wax away -- which is actually not much different from a simple $30 toaster oven. Or the $1k trolley with wheels that you could otherwise build by hand in 20 minutes for $20. And I kid you not, the salesman even said to me, "It's a special price just for you."

Their strategy is to see how much you are willing to pay for it.

Perhaps they understand the market better than I do, but if they mass-manufactured the printers, dramatically dropping costs, and sold the units at a price the masses could afford, they could potentially make significantly higher profits on significantly higher sales.

Fortunately for us DIY dreamers, in a major turn of marketing strategy, it appears Stratasys has recently come to the same conclusion.

Stratasys, a leading 3D printer manufacturer, has signed a definitive agreement with HP for Stratasys to manufacture an HP-branded 3D printer. With the reduced costs associated with mass manufacturing, and a potentially large new user base, both companies are set to take 3D printing mainstream.

So how much will their new 3D printer set you back? What precision can we expect? How much will a refill cartridge cost? Nothing concrete has been announced yet.

Stratasys' most affordable unit begins at $15k, so given my own experience in mass manufacturing of electronics, I'd say we are looking at around $5k per unit with 'acceptable' precision. Units could potentially be sold below cost, perhaps as little as $3k, with the companies calculating they can make their main profit from 'specially patented' refill cartridges.

Perhaps that's still too expensive for a personal at-home printer, but it's definitely affordable for small and mid-sized companies. And it's only a matter of time until more affordable printers from jealous competitors join in.


Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
 
Contributor
Jason Falconer
Canada
Contributor
Angelica Lim
Tokyo, Japan
 
