Astrobee Will Find Astronauts’ Lost Socks

It'll be up to robots to keep space stations clean and functional while humans are away

Rendering: NASA Ames

At some point in the not-too-distant future, NASA wants to put a permanent space station called Gateway in orbit around the moon. Unlike the International Space Station, with its permanent astronaut residents, Gateway will mostly be a transit point: a staging area for astronauts heading to the lunar surface or, eventually, to Mars. NASA expects that Gateway will spend a lot of time empty (it may be crewed for as little as six weeks a year), but it'll need to be ready to welcome astronauts whenever they arrive, offering a safe, warm, and air-filled space. So who's going to keep an otherwise empty space station shipshape and Bristol fashion? Robots, that's who!


In a recent test aboard the ISS, Bumble, one of the station's free-flying Astrobee robots, adeptly navigated the station to find the location designated as a "vent" used for cabin air circulation, and used computer vision to automatically detect the foreign object blocking it: an "astronaut sock," represented by a printed image of a sock. Bumble then called for help to clear the blockage. For its next test, Bumble completed a survey of Bay 6 of the space station's Japanese Experiment Module, building a high-resolution, multi-sensor 3D map. Along the way, Bumble bumped into (and untangled itself from) stray cables and coped with simulated space-to-ground communication interruptions, but it ultimately persevered and completed its mission objectives, with a little timely help from ground operators.
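The sock-spotting step boils down to a simple question: does the vent look different from when it was known to be clear? As a hypothetical illustration (this is not NASA's actual ISAAC vision pipeline; the function name and threshold are invented), a minimal blockage check can compare the current camera frame of the vent against a stored reference image of the unobstructed vent:

```python
import numpy as np

def vent_blocked(reference: np.ndarray, current: np.ndarray,
                 threshold: float = 0.15) -> bool:
    """Flag a blockage when the current vent image differs too much
    from a reference image of the unobstructed vent.

    Both images are grayscale arrays scaled to [0, 1]; the threshold
    is the mean absolute pixel difference that counts as "something
    new is sitting on the vent".
    """
    diff = np.abs(current.astype(float) - reference.astype(float))
    return float(diff.mean()) > threshold

# Clear vent: the current frame matches the reference.
ref = np.zeros((64, 64))
assert not vent_blocked(ref, ref.copy())

# A "sock" (bright patch) now covers part of the vent.
blocked = ref.copy()
blocked[16:48, 16:48] = 1.0   # foreign object
assert vent_blocked(ref, blocked)
```

A real system would of course need to handle changing lighting and camera pose, which is exactly why Astrobee builds registered multi-sensor maps rather than comparing raw frames.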

The goal of NASA's Integrated System for Autonomous and Adaptive Caretaking project, or ISAAC, is for the autonomous systems already on board space stations (like the life support system, power system, that sort of thing) to work with mobile autonomous or semi-autonomous robots to manage any situations that require a physical intervention. NASA uses a micrometeoroid strike as an example—the only way of dealing with that is to find where the hole is, grab a patch, and slap it on, and that means a robotic system capable of moving around and manipulating objects.
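As a toy sketch of this caretaking idea (the thresholds, names, and interface here are invented for illustration, not ISAAC's real architecture), the station's existing telemetry is what triggers the robotic response: a micrometeoroid puncture shows up first as falling cabin pressure, and the caretaker's job is to decide when passive monitoring should escalate to dispatching a robot:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    cabin_pressure_kpa: float   # nominal sea-level pressure is ~101.3 kPa

def needs_intervention(t: Telemetry, nominal: float = 101.3,
                       tolerance: float = 2.0) -> bool:
    """A hull breach shows up first as cabin pressure drifting
    outside its nominal band."""
    return abs(t.cabin_pressure_kpa - nominal) > tolerance

def caretaker_step(t: Telemetry) -> str:
    """One decision step of a caretaking loop: keep monitoring,
    or dispatch a robot to localize and patch the leak."""
    if needs_intervention(t):
        return "dispatch_robot"   # survey the hull, find the hole, apply a patch
    return "monitor"

assert caretaker_step(Telemetry(cabin_pressure_kpa=101.2)) == "monitor"
assert caretaker_step(Telemetry(cabin_pressure_kpa=95.0)) == "dispatch_robot"
```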

Even with its cute little arm, Astrobee isn't really intended to do all that much manipulation, and it'll almost certainly need help keeping an otherwise empty space station up to snuff. It seems likely that some of this help will come from Robonaut 2, which we're hoping will return to the ISS sometime this year to resume the testing it was engaged in until it ran into some, uh, issues. But once it's back on station with its legs attached, R2 can start wandering around and doing (very slowly and carefully) many of the same tasks that astronauts do. That's always been one of the goals of R2, although on Gateway it'll be doing chores instead of astronauts, rather than alongside them as on the ISS. And there are more robots that could potentially join the ISAAC project as well, like GITAI's robotic arm, which will undergo testing on the ISS after it launches on the next SpaceX flight.

The ISAAC team is now engaged in its second phase of testing aboard the station, which focuses on managing multiple robots as they transport cargo between an uncrewed space station and an uncrewed visiting cargo spacecraft. In addition to testing ISAAC with these new variables, the team is adding an improved operator interface to simplify managing the vehicle-robot systems. In the third and final phase of testing, the team will throw even harder fault scenarios at ISAAC, such as mock cabin air leaks or fires, and develop robust techniques for handling the anomalies that crop up while responding to these simulated crises.
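A toy version of that phase-two coordination problem (Bumble and Honey are real Astrobee names, but the function and data layout here are invented for illustration) is pairing each piece of cargo with whichever still-free robot is closest to it:

```python
def assign_cargo(robot_pos: dict[str, float],
                 item_pos: dict[str, float]) -> dict[str, str]:
    """Greedily pair each cargo item with the nearest still-free robot.

    Positions are 1-D coordinates (meters) along the station's long
    axis; a real planner would reason in 3-D and about battery, load
    capacity, and traffic deconfliction.
    """
    free = dict(robot_pos)
    plan: dict[str, str] = {}
    for item, pos in sorted(item_pos.items()):
        if not free:
            break   # more cargo than robots; leftovers wait for the next pass
        robot = min(free, key=lambda r: abs(free[r] - pos))
        plan[item] = robot
        del free[robot]
    return plan

plan = assign_cargo({"bumble": 0.0, "honey": 10.0},
                    {"bag_a": 2.0, "bag_b": 9.0})
assert plan == {"bag_a": "bumble", "bag_b": "honey"}
```

Greedy nearest-robot assignment isn't optimal in general, but it's the kind of baseline a multi-robot scheduler starts from.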

While the ISS and Gateway will both be close enough to Earth to allow humans to step in whenever necessary, whether through supervised autonomy or full teleoperation, longer term ISAAC is pushing toward the kind of increased robotic autonomy we're going to need for Mars exploration. Ideally, by the time humans get to Mars, we'll have a base all set up and ready to go in advance, and robots are likely to be the ones doing the literal (and metaphorical) heavy lifting.
