It’s been a little over two years since we were first introduced to Astrobee, an autonomous robotic cube designed to fly around the International Space Station. Tomorrow, a pair of Astrobee robots (named Honey and Bumble) will launch to the ISS aboard a Cygnus cargo flight. There’s already a nice comfy dock waiting for them in the Japanese Experiment Module (JEM), and the plan is to put them to work as soon as possible. After a bit of astronaut-assisted setup, the robots will buzz around autonomously, doing experiments and taking video, even operating without direct human supervision on occasion.
NASA has big plans for these little robots, and before they head off to space, we checked in with folks from the Intelligent Robotics Group at NASA’s Ames Research Center, in Moffett Field, Calif., to learn about what we have to look forward to.
Here’s a video from NASA to get you caught up:
Each Astrobee robot is a cube about 30 centimeters on a side. The propulsion system, which you can read more about in our earlier article, is based around a pair of impellers that pressurize air inside of the robot, which can then be vented through a series of 12 different nozzles spaced around the robot’s body. By opening and closing different nozzles in different combinations, the robot can rotate or translate in any direction, without external moving parts or the need for canisters of pressurized gas.
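The nozzle-mixing idea is easy to sketch: each open nozzle contributes a small thrust opposite its exhaust jet, and the sum of those thrusts (and their moments about the center of mass) determines whether the robot translates, rotates, or both. Here is a minimal illustration in Python; the nozzle layout, positions, and thrust values are invented for the example (and only four of the 12 nozzles are shown), so none of it reflects Astrobee's actual geometry.

```python
import numpy as np

# Hypothetical geometry: each nozzle has a body-frame position (meters) and
# an exhaust direction; the reaction thrust on the robot opposes the jet.
# Only four X-axis nozzles are modeled here, purely for illustration.
NOZZLES = {
    "+X1": (np.array([ 0.15,  0.05, 0.0]), np.array([ 1.0, 0.0, 0.0])),
    "+X2": (np.array([ 0.15, -0.05, 0.0]), np.array([ 1.0, 0.0, 0.0])),
    "-X1": (np.array([-0.15,  0.05, 0.0]), np.array([-1.0, 0.0, 0.0])),
    "-X2": (np.array([-0.15, -0.05, 0.0]), np.array([-1.0, 0.0, 0.0])),
}

def net_wrench(open_nozzles, thrust=0.1):
    """Sum force and torque (about the center of mass) over open nozzles."""
    force = np.zeros(3)
    torque = np.zeros(3)
    for name in open_nozzles:
        pos, exhaust = NOZZLES[name]
        f = -thrust * exhaust          # reaction thrust opposes the jet
        force += f
        torque += np.cross(pos, f)    # moment about the center of mass
    return force, torque

# A symmetric pair of nozzles venting -X pushes the robot along +X with no
# net torque; an antisymmetric pair cancels the force and yields pure spin.
print(net_wrench(["-X1", "-X2"]))   # translation only
print(net_wrench(["-X1", "+X2"]))   # rotation only
```

Picking which of the 12 valves to open for a commanded motion then becomes a small allocation problem over combinations like these.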
Astrobee also comes equipped with the onboard sensing and computing necessary for fully autonomous operation. Its flight software is based on ROS and is upgradable on-orbit. The robot can carry a variety of modular payloads, and will be equipped with a little arm that it can use to grab onto handrails so that it can take video of astronauts without its motors running. The arms will be installed a little bit later, since they ended up requiring a little extra troubleshooting, but that should take only a month or two. Eventually (and really not too long from now as these things go), Astrobee will be able to perform all kinds of useful tasks—both things that astronauts are spending their time on now, as well as things that only a robot can do.
The robot that we saw at NASA Ames back in 2017 was Astrobee prototype 4D. The flight-ready design is mostly the same; the most noticeable difference is that the crushable blue foam pieces around the propulsion modules have been replaced with smaller foam corner bumpers covered in black Nomex fabric. The exterior of the robot is colorful and stylized, featuring prominent arrows on the sides to show which way the robot is facing, and each Astrobee is a different color so that you can tell them apart: Honey is blue and Bumble is yellow, and there’s a third Astrobee, Queen, which is green and will join the other two on orbit later this year. LED arrays around the impellers can be used for turn signals or other kinds of human-robot interaction.
Aren’t they adorable?
For more on Astrobee, we spoke with Trey Smith, a member of the Intelligent Robotics Group at NASA Ames.
IEEE Spectrum: What do we have to look forward to when Astrobee gets to the ISS?
Trey Smith: For the rest of this fiscal year, we’re going to go through a commissioning period. It starts with on-orbit hardware functional checkouts, then we start doing very basic kinds of motions, and then there’s an activity where we have the crew hold on to the Astrobee and fly it around so that it can pick up a bunch of image data to build a map from. And then we start flying around the module on our own. As we get later in that commissioning process, we’ll be using less and less crew time. We have a really high tempo of activities that we’ve sketched out for that commissioning period. I think it remains to be seen whether we can actually get the crew time to support all of those activities, but we’re hoping to start a week after we arrive.
How autonomous is Astrobee? How much astronaut supervision will it need?
We’ve gone to great lengths to try to minimize the burden that Astrobee imposes on crew. It can autonomously undock, redock, and perch. When we’re talking about guest science, if there’s a hardware payload, the crew will need to swap the payload in. Depending on the way that the guest scientist has structured the experiment, they might require crew to observe it. But we hope that in general, experiments can be done fully autonomously.
And then can we really fly around without the crew babysitting the robot? Well, it remains to be seen. Our intent is to always structure our activities to have somebody on console [on the ground]. However, there are frequent loss-of-signal periods with the space station, and Astrobee will be allowed to continue operating during those periods. If it runs into some kind of problem that requires operator intervention, it’ll just hang tight and station keep until an operator can help it.
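The behavior Smith describes can be sketched as a tiny mode machine: a fault always drops the robot into station-keeping, a loss or reacquisition of signal by itself changes nothing, and only an operator can resume a faulted plan. The state and event names below are assumptions for illustration, not Astrobee's actual flight-software interface.

```python
def next_state(state, event):
    """Return the next mode given the current mode and an event."""
    if event == "fault":
        return "STATION_KEEP"  # hang tight and hold position
    if state == "STATION_KEEP":
        # Only an operator can clear a fault; signal changes alone do nothing.
        return "EXECUTING" if event == "operator_resume" else "STATION_KEEP"
    # Loss or reacquisition of signal doesn't interrupt a healthy plan.
    return state

# A loss-of-signal window, a fault during it, then operator recovery:
state = "EXECUTING"
for event in ["loss_of_signal", "fault", "signal_acquired", "operator_resume"]:
    state = next_state(state, event)
```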
We have high hopes that it will be robust enough—it’s quite possible that there will be occasional events where Astrobee gets stuck in some way, and we’ll have to call a crew member to put it back on its docking station. I would be shocked if that didn’t happen sometime during the service life of the robot. But we certainly are striving to make that as infrequent an event as possible.
How will Astrobee communicate with astronauts when it’s operating autonomously?
Something that wouldn’t have been there when you saw Astrobee last is the eyes on the touch screen. The intent of the eyes is to give you a sense that the robot is aware. They’re cartoonish, and show somewhat naturalistic motions of looking around and blinking occasionally. We’ve talked about ways that we can use the touch screen also to reinforce what the signal lights show. So if we’re trying to indicate that the robot is going to turn to the right, then the touch screen could also show moving arrows or something like that.
[Human-robot interaction] is still an area that we’re working on, in the sense that we’ve been really focused on the core functionality of the robot—things like making sure we can robustly move and not bump into things. And some of the finer points of using proper turn signals and things like that have been delayed until later on in the project. We have some designs, but probably during the earliest on-orbit operations, we wouldn’t expect to see very much of those things being used. Of course, part of the point was always that we were enabling guest scientists to do their own human-robot interaction studies.
Are there any plans to use Astrobee’s perching arm for other tasks, like manipulation?
Astrobee’s primary function is to be a platform for guest science. So, we’re happy if researchers want to do any kind of activity with the arm. That said, the requirements we developed the arm to meet were to make sure that the robot was capable of perching on handrails so that it can dwell for extended periods and take video of crew activities with the propulsion spun down, which both saves power and makes it less annoying.
But if you’re talking about future activities—everybody at NASA is really excited about work on the Gateway space station, which would be in near-lunar space. We don’t have definite plans for what would happen on the Gateway yet, but there’s a general recognition that intravehicular robots are important for space stations.
In terms of grand visions, I’ve been talking about Gateway, but if you’re wondering about how Astrobee will be used over the next year or two, we actually have a queue of guest scientists built up. Some of them are on the integrated payload list and are manifested on a flight to launch to the ISS, and they’re already doing things like integrated testing with Astrobee.
Since Astrobee is going to be spending most of its time doing science, it’s worth taking a look at the first few custom science payloads that’ll be installed into the modular payload bay of one of the ISS Astrobees. We have details on two of them: one from Stanford, and one from Astrobotic and Bosch USA.
Astrobee’s perching arm will be great for grabbing onto handrails, but what if you want to grab onto other stuff? Gecko-inspired grippers can stick to just about anything, which makes them potentially ideal for smooth surface perching or object manipulation, especially if you’re trying to deal with objects that are moving or spinning.
The ISS can be a noisy place, but all of that noise contains information that a clever robot could find useful. For example, maybe the robot notices that a motor sounds slightly different from one day to the next, suggesting that it could need some maintenance. The SoundSee payload “uses a custom array of microphones and machine learning to analyze information contained in emitted noises…and determine whether a machine, or even a single component of a machine, needs to be repaired or replaced.” We hear they want to start by analyzing the exercise bike that the astronauts use to keep fit.
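As a rough illustration of the idea (and not Bosch's actual SoundSee pipeline), one simple approach is to summarize a machine's sound as a normalized band-energy fingerprint and flag days when that fingerprint drifts away from a healthy baseline:

```python
import numpy as np

def spectral_fingerprint(signal, n_bands=8):
    """Normalized band-energy summary of an audio clip (illustrative only)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(power, n_bands)
    energy = np.array([b.sum() for b in bands])
    return energy / energy.sum()

def drift(baseline, today):
    """How far today's fingerprint has moved from the healthy baseline."""
    return float(np.linalg.norm(baseline - today))

# Simulated microphone data: a healthy 120 Hz motor hum, then the same hum
# plus a new 900 Hz whine, as a worn component might produce.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
healthy = np.sin(2 * np.pi * 120 * t) + 0.05 * rng.standard_normal(t.size)
worn = healthy + 0.5 * np.sin(2 * np.pi * 900 * t)

baseline = spectral_fingerprint(healthy)
print(drift(baseline, spectral_fingerprint(worn)))  # noticeably above zero
```

The simulated whine shifts energy into a higher frequency band, which moves the fingerprint away from the baseline; a real system would learn what counts as "too much" drift from many healthy recordings.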
The tentative schedule right now is for the initial Astrobee checkout to happen sometime around 30 April. If the robots are good to go, the next step is calibration and mapping, which will probably have to happen every time the robots learn to navigate around a new module. The Astrobee team hopes that, eventually, the robots will have free rein of about six modules, which (sadly) won’t include the Cupola.
The third Astrobee, Queen, is set to fly to the station later this year, and all three robots will then get their perching arms. Meanwhile, Astrobees four and five (Melissa and Killer) will be down here on Earth, helping with troubleshooting, testing, and payload integration. NASA isn’t sure yet just when we’ll start getting Astrobee updates from the ISS, but as soon as we hear anything, so will you.
[ Astrobee ]