NASA's SPHERES smartphone-enabled robot travels on a bed of air when it's on the ground; it will navigate in three dimensions aboard the Space Station.
Last week, the New York Times Magazine published “Silicon Valley’s Youth Problem,” complaining that, among other things, the ability to crank out code is trumping other tech talents like expertise in semiconductors or data storage, and that all the cool kids want to work for the same sexting app. Author Yiren Lu pointed to “the vague sense of a frenzied bubble of app-making and an even vaguer dread that what we are making might not be that meaningful.” The takeaway question: Should we be worried that apps are taking over the Silicon Valley mindshare? Bill Gates quickly weighed in, telling Rolling Stone that we shouldn’t worry. I wasn’t quite sure what I thought about the debate.
Then on Monday, I found myself at NASA Ames Research, invited to trail along with NASA Administrator Charles Bolden on a quick tour of a few laboratories.
Now, NASA is the last place I thought I’d find folks developing smartphone apps. But find them I did, in a laboratory dedicated to turning SPHERES—free-flying basketball-sized satellites that have been on board the International Space Station since 2006—into robots. SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) are 18-sided polyhedrons that move themselves in a weightless environment by shooting out puffs of carbon dioxide; they can be navigated remotely from the ground and oriented using beacons strategically placed inside the Space Station.
Turning the Spheres into robots involves giving them sensors and more smarts. The team working on this project started out by making a list of what kinds of sensors they needed to add, jotting the suggestions down on their smartphones. Then they had a light bulb moment: everything they needed was right in front of them, in their phones. So they made some modifications to a standard Android phone (removing the cellular modem—that is, putting it permanently into “airplane mode”—and replacing the flammable lithium-ion battery with alkaline batteries) and docked it to a test Sphere. The gyros on the phone tell the Sphere its orientation, the accelerometer tells it where it’s going, and the camera allows it to do visual inspections, as well as navigate. The phone also lets the Sphere communicate by Wi-Fi, a capability it previously lacked.
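The division of labor among the phone's sensors—gyro rates integrated into orientation, accelerations integrated into motion—can be sketched in a few lines of Python. This is a hypothetical single-axis illustration of the general dead-reckoning idea, not NASA's flight software:

```python
def integrate_imu(gyro_samples, accel_samples, dt):
    """Dead-reckon one axis of orientation and position from IMU data.

    gyro_samples:  angular rates in rad/s (one axis)
    accel_samples: linear accelerations in m/s^2 (same axis, gravity removed)
    dt:            sample interval in seconds
    """
    angle = 0.0       # integrated heading, radians
    velocity = 0.0    # integrated velocity, m/s
    position = 0.0    # doubly integrated position, m
    for w, a in zip(gyro_samples, accel_samples):
        angle += w * dt            # gyro rate -> orientation
        velocity += a * dt         # acceleration -> velocity
        position += velocity * dt  # velocity -> position
    return angle, position

# A constant 0.1 rad/s turn and 0.2 m/s^2 push, sampled at 100 Hz for 10 s:
angle, pos = integrate_imu([0.1] * 1000, [0.2] * 1000, 0.01)
```

In practice, pure integration like this drifts quickly, which is one reason the camera and the Station's navigational beacons matter: they provide absolute position fixes to correct the accumulated error.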
A researcher working on the project told me that once this realization dawned and they had a phone attached to the test Sphere, developing the software was just a matter of writing an Android app. And with all the developer tools out in the world, that was fast and easy. In 2011, they sent the first smartphone, a Samsung Nexus S, up to the Spheres units on board the Space Station. This year, that handset will be replaced with a Project Tango (Google’s smartphone development effort) prototype that includes a 3-D position sensor.
NASA researchers expect the robotic Spheres to be able to freely move about the Space Station, beyond the small section delineated by navigational beacons. So far, one has been sent off to do visual inspections of the payload racks. Eventually, future generations (with different propulsion schemes and the sensors and smarts built in, instead of packed in a phone perched on the outside) will be able to go outside the Space Station to do external inspections, said Chris Provencher, Smart Spheres Project Manager.
So, what of that New York Times article, Silicon Valley’s youth problem, and all the effort going into apps? Like Bill Gates, I’m no longer worried. Because yes, young techies may be wasting energy writing the latest version of “Hot or Not” (the latest variant on that, FYI, is an app called Tinder. Or it was last week.) But far more energy is being saved by using apps and smartphones to jumpstart engineering development, instead of pulling together platforms and code from scratch. And the gap between a smartphone app and rocket science is turning out to be a lot smaller than it seems.