Anki to Release Impressive Feature-Packed SDK for Cozmo Robot

Cozmo's SDK offers access to some surprisingly sophisticated features on this little robot


Anki Cozmo Robot SDK
Image: Anki

When we first saw all of Anki’s PR for its forthcoming Cozmo robot, we were impressed—mostly. Anki seemed to be determined to overinflate an otherwise interesting and capable little piece of hardware into an expectation-laden “part of the family.” We’re not interested in expectations: We want to know what this robot can do, and how it’ll continue to hold our attention after the first 5 minutes.

Today, Anki is announcing what we have to look forward to in the SDK that’ll come with Cozmo. At first, this didn’t seem like that big of a deal—lots of companies release SDKs with their robots in the (usually futile) hope that developers will latch onto it and imbue their robots with all kinds of new and exciting features continually and for free. However, after speaking with Anki co-founder and president Hanns Tappeiner, we’re a bit more optimistic that Cozmo’s SDK might actually motivate you (and other people) to do some really cool stuff with this robot.

Our interview with Hanns Tappeiner, and some sample code, coming right up.

IEEE Spectrum: How is Cozmo’s SDK different from the SDKs you get with other robotic toys?

Hanns Tappeiner: Most of the company here are robotics geeks, so we’ve worked with a lot of robotics SDKs. One of the things we really felt we needed to do is open up not just the kind of functionality that’s normally available in robotics SDKs, things like turning on lights or moving motors, but all the stuff we’ve developed over the last four years that’s much higher level than that. For example, recognizing faces and tracking them in real time. Planning paths from point to point while avoiding obstacles. We’ve implemented full SLAM for the robot, so the robot can localize and figure out where he is while creating a map.

As robotics geeks, we’re interested in trying to figure out how we can advance the field of robotics overall. One of the big missing things is that while there is robotics software out there, it’s usually mostly accessible to people who are in the field of robotics. It’s not as accessible to people who might be good developers, but who don’t know enough about robotics to really use this kind of stuff. No matter how complex and how high-level the functionality is, we want to make it available to people with single lines of code. And that’s what the Cozmo SDK does.


Hanns isn’t kidding. He showed us a series of simple demos from the beta version of the SDK, along with the Python code behind them. Below are four video clips, followed by a gallery of screenshots of the code. It’s pretty slick how much capability you can leverage with these commands: for example, one demo has Cozmo autonomously driving around a trio of cubes and picking up the farthest one. All you have to do is tell the robot “go pick up that block” using the “cozmo.PickupObject” command, and it does all of the path planning, motion, orientation, and manipulator control for you:

Images (4): Anki
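To give a rough feel for what that looks like in practice, here is a minimal sketch in the spirit of that demo, written against the public Python SDK Anki later released; names like pickup_object and wait_until_observe_num_objects come from that release and may not exactly match the beta code shown above:

```python
import cozmo
from cozmo.objects import LightCube

def pick_up_farthest_cube(robot: cozmo.robot.Robot):
    # Look around until the vision system has spotted three light cubes.
    lookaround = robot.start_behavior(cozmo.behavior.BehaviorTypes.LookAroundInPlace)
    cubes = robot.world.wait_until_observe_num_objects(num=3, object_type=LightCube, timeout=60)
    lookaround.stop()

    # Pick the cube farthest from the robot; the single pickup call handles
    # the path planning, driving, orientation, and lift control.
    def dist_sq(cube):
        dx = cube.pose.position.x - robot.pose.position.x
        dy = cube.pose.position.y - robot.pose.position.y
        return dx * dx + dy * dy

    farthest = max(cubes, key=dist_sq)
    robot.pickup_object(farthest, num_retries=2).wait_for_completed()

cozmo.run_program(pick_up_farthest_cube)
```

The point of the demo is the last two lines: everything between seeing the cubes and holding one happens inside a single action call.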

The press release includes a bulleted list of things that you can do with Cozmo’s SDK; here they are, along with some extras that we added based on our conversation with Hanns:

  • Use the computer vision system to track and recognize faces and facial expressions, and estimate their position and orientation in 3D space (see the sketch after this list).
  • Tap into the localization system with access to the robot’s internal map and all objects in it.
  • Utilize path and motion planners with obstacle avoidance, etc.
  • Explore Cozmo’s behavior system and execute high-level behaviors such as look around, findFaces, findCubes, etc.
  • Use the entire animation system of the robot with access to all animations and sounds our character team has created.
  • Cozmo’s personality engine can be turned on and off. When it’s on, Cozmo interjects its cute little behaviors into whatever you program it to do.
  • The SDK gives you access to all of the raw sensor data, if you want it.
  • Cozmo can recognize (and tell the difference between) cats and dogs, although it can’t identify them individually.
  • Cozmo can also recognize other Cozmos. This isn’t a feature that will be enabled as part of the default behaviors on launch, though.
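As an example of how little code the face tracking requires, here is a minimal “follow my face” sketch, again written against the public Python SDK; the calls wait_for_observed_face and turn_towards_face come from that release rather than from the beta we were shown:

```python
import cozmo

def follow_a_face(robot: cozmo.robot.Robot):
    # Tilt the head up so the camera can see faces.
    robot.set_head_angle(cozmo.robot.MAX_HEAD_ANGLE).wait_for_completed()

    face = None
    while True:
        if face is not None and face.is_visible:
            # The SDK tracks the face's pose in 3D; turning toward it is one call.
            robot.turn_towards_face(face).wait_for_completed()
        else:
            # Block until the vision system observes (or re-observes) a face.
            face = robot.world.wait_for_observed_face(timeout=30)

cozmo.run_program(follow_a_face)
```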

We were particularly interested in how Cozmo does SLAM, since it’s pint-sized and has just a single camera on it:

Hanns Tappeiner: The robot has one camera in his face, which is the only camera we’re using. The camera is the main sensor, but we have a very high quality IMU, so even when we don’t see landmarks, we’re still keeping track of how the robot is oriented. We’re using the wheels to estimate how far we’re driving, which is fairly approximate. The robot might drive around blind for a minute or so, and the longer he does that, the more his actual location will be off, but the moment he sees a landmark like any of the cubes, he’ll know exactly where he is again.


IEEE Spectrum: So the SLAM with the single camera works because the cubes and charging dock are a known size, so the robot is able to tell how far away they are?

Hanns Tappeiner: That’s correct. And what makes this really nice for us is that each cube has an accelerometer in it. One of the problems with landmarks is that when you see them again, you don’t know whether you moved, or they moved. The moment one of the cubes moves, you know it through the accelerometer, and you can remove it from the map. He’ll also map out things that he doesn’t really understand: the moment he runs into something, he’ll register it as an obstacle, it’ll be added to his map, and next time he’ll plan a path that will not run into obstacles like that.
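For a sense of why known-size landmarks make single-camera range estimation work, here is the back-of-the-envelope pinhole-camera relationship; the cube size and focal length below are made-up illustrative numbers, not Anki’s actual specs:

```python
def distance_to_marker(focal_length_px: float, marker_size_mm: float, apparent_size_px: float) -> float:
    # Pinhole-camera model: a marker of known physical size that appears
    # smaller in the image must be proportionally farther away.
    return focal_length_px * marker_size_mm / apparent_size_px

# Illustrative numbers only: a 44 mm cube that shows up 80 px wide in a camera
# with a 300 px focal length is roughly 300 * 44 / 80 = 165 mm away.
print(distance_to_marker(300.0, 44.0, 80.0))
```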

IEEE Spectrum: When should we expect to be able to start playing with the SDK?

Hanns Tappeiner: We’re releasing the SDK as a beta version right at the launch of the product in October. This was actually a fairly big conversation here: should we release the beta, or should we wait for another six months until we have a full SDK? We decided we definitely want to release the beta, because there are so many different kinds of developers out there and we wanted to get it out there with all of the functionality in it for people to comment on.

Once that is done, our guess is that within a few months after launch, before Christmas, we’re going to start the K-12 education part. We’re not going to be developing our own graphical programming language because there are already some out there, like Scratch and Swift, which are doing a really good job at this. We’re going to spend a lot of resources making sure that Cozmo is integrated into languages like that as well as it possibly can be.

IEEE Spectrum: Have you thought about working on integration with ROS?

Hanns Tappeiner: I love ROS. My co-founders and I all used ROS in grad school. We’re not going to be developing a huge amount of code for ROS, but that’s not necessary. One of our engineers, I think in 30 minutes or so, used the current SDK to create a ROS node for it. We’re going to publish that on GitHub, and we’re going to publish the other stuff too. But developing more code inside of ROS, that’s what we want researchers and makers to do. I feel like ROS is the perfect kind of tool when you’re a researcher, but it’s not accessible enough for someone who is, for example, a game developer. We feel like we need to go one level farther, where people literally without any knowledge of robotics can use this high-level functionality.
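To illustrate how thin such a bridge can be, here is a hypothetical sketch of a minimal ROS node that forwards standard velocity commands to Cozmo. This is not Anki’s published node; the topic name, wheel-base figure, and unit conversions are illustrative assumptions:

```python
import rospy
from geometry_msgs.msg import Twist
import cozmo

WHEEL_BASE_MM = 45.0  # assumed track width, illustrative only

def cozmo_bridge(robot: cozmo.robot.Robot):
    rospy.init_node('cozmo_bridge')

    def on_cmd_vel(msg: Twist):
        # Map a standard ROS Twist onto differential-drive wheel speeds.
        v = msg.linear.x * 1000.0   # m/s -> mm/s
        w = msg.angular.z           # rad/s
        robot.drive_wheels(v - w * WHEEL_BASE_MM / 2.0,
                           v + w * WHEEL_BASE_MM / 2.0)

    rospy.Subscriber('cmd_vel', Twist, on_cmd_vel)
    rospy.spin()

cozmo.run_program(cozmo_bridge)
```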

IEEE Spectrum: Do you think that Cozmo will be useful for research?

Hanns Tappeiner: Yes. I can’t talk about it much yet, but we’re fairly well connected in the research robotics area, and I have a lot of friends at Carnegie Mellon, MIT, Stanford, and so forth who we’re meeting with in the next few weeks. There’s a lot of excitement around Cozmo.

Despite our initial pessimism (which we still don’t think was entirely misplaced), we’re optimistic about this SDK. Cozmo, along with the beta SDK, should be on sale in October for $180, or $160 if you pre-order.

[ Anki Cozmo SDK ]
