At the InnoRobo conference in Lyon, France, last month, I got the chance to speak with Colin Angle, CEO of iRobot, in a very candid interview about his views on the robotics industry, his vision for AVA, a new robot platform his company is developing, and how he sees things shaping up in the coming years.
Cool over function
In keeping with a presentation Colin gave earlier in the day, he opened our conversation by noting that hundreds of millions of dollars have been spent on making cool demos -- but relatively little on solving high-value business needs.
To illustrate his point, he cited the incredible effort that has gone into the development of humanoid robots. He calls this an exercise of "cool over utility." As he explained it, building a system that supports bipedal legs and actually executes walking and balancing has been a costly adventure. Even the most exciting systems often have a team of scientists walking behind them, a mean time to failure of about 45 minutes, and limited performance -- all at a cost of millions of dollars.
Compare that to the iRobot Warrior, which Colin feels is the first practical human-sized robot ever designed. Handling drops of up to 6 meters [20 feet], it's able to carry payloads of over 90 kilograms [200 pounds] and navigate rough terrain to go where human-sized systems should go -- in other words, the Warrior shows you don't need a bipedal system to solve a high-value mobility problem.
Thoughts on remote presence
So, in keeping with my focus on remote presence systems, I steered the conversation to remote presence and how he saw their AVA prototype [photo right] potentially accomplishing this. Colin quite nicely broke down the problem and how AVA is an attempt to resolve the puzzle.
First and foremost, he wants to deliver an experience better than being there yourself -- regardless of the travel time. He wants to mimic "presence" in such a way that the experience for you (the pilot) is rich, deep, and intuitive. And in keeping with many people in this space, he does not feel that remote-controlled webcams or the Cisco telepresence solutions are solving this.
To achieve ubiquitous remote presence, a remote-controlled webcam is not effective, since the pilot has limited ability to truly understand the environment. While a person could learn the environment over time (e.g., where the offices are, where the conference rooms are), wouldn't it be better to have the remote presence system know the entire layout, let you request a destination, and simply take you there? Cisco telepresence solutions fall short in other cases by the very nature of the systems themselves -- limited in freedom, tied down to a single location, and very limited in being able to represent you outside of the magic screen.
Colin's vision is a surrogate "you" -- one that could be present in any location and do the things you normally would do: go to the room you wish to visit, carry on a conversation outside a room, be aware of who is around and where they are spatially, and go to them with minimal effort.
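The "know the layout and take me there" idea reduces to a route-planning problem the robot solves on the pilot's behalf. Here is a minimal sketch of that behavior; the room names, floor map, and `route_to` function are all invented for illustration (a real system would build its map from the robot's own sensing and mapping):

```python
from collections import deque

# Hypothetical floor map: each node is a named location, edges are
# traversable paths. Invented for illustration only.
FLOOR_MAP = {
    "lobby": ["hallway"],
    "hallway": ["lobby", "conference_a", "office_12"],
    "conference_a": ["hallway"],
    "office_12": ["hallway"],
}

def route_to(start, destination, floor_map=FLOOR_MAP):
    """Breadth-first search for the shortest path between two named places."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for neighbor in floor_map.get(path[-1], []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route found

# The pilot just names a destination; the robot plans the route itself.
print(route_to("lobby", "conference_a"))  # → ['lobby', 'hallway', 'conference_a']
```

The point of the sketch is the division of labor: the pilot expresses intent ("take me to Conference A") and the platform handles the spatial reasoning, rather than the pilot driving a webcam around by hand.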
How AVA fits in
Colin was amused that people thought AVA was iRobot's entry into robotic telepresence -- he sees AVA as a "generic platform" supporting all of the robotic functions necessary to enable remote presence, the functions iRobot is known for ("Do what iRobot is great at"). For instance, as we discussed the functional components of AVA, he pointed out the various features the robot platform supports:
Downward facing IR for cliff detection
Braking systems to ensure the system is not going to fall
Small physical footprint (on the order of a human) to ensure fast turning radius and strong stability
Bumpers and upward-facing sonar for detection of objects that could potentially collide with the head of the device
Two PrimeSense sensors to enable a better understanding of the world through 3D mapping both of the navigation environment (downward facing) and the environment in front (on the camera assembly)
A LIDAR component that he wants to reinvent to bring the cost down (most expensive piece of the system)
Control surfaces (bumper pads on the neck) that let participants move the system without physically pushing it, improving management of the system
Telescoping neck (via lead-screw) to ensure a lower center of gravity for movement/motion while affording a variable height for engagement with participants either standing or seated
Positioning control for the neck/head component
A rail mechanism on the back of the neck for adding manipulators to the system
A platform for application development
Colin said that iRobot's primary focus is on the "robotic functions" for a "generic platform" -- to help others overcome the liability issues. iRobot has done a lot of work -- through its previous designs and its own operating system (AWARE2) -- to make its platforms as safe and reliable as possible. Rather than trying to build a specific platform for remote presence, Colin said it is iRobot's intent to build the platform and let developers/designers create a solid system.
I got somewhat confused here -- it sounded like he was suggesting that iRobot would not compete in application development and would not build systems for specific purposes, like remote presence. When I pressed, he clarified that iRobot would not get in the way of something like Pad-to-Internet-to-Pad communications (e.g., FaceTime, Qik, Skype), but in terms of a navigation interface for the pilot (e.g., a web front-end for piloting the system), iRobot might offer a solution. As with Apple, iRobot's solutions for various applications could sit alongside any third-party solutions -- enabling developers to build a better interface/application that works with AWARE2 and controls the AVA platform. Here's how he put it:
Yes, it is our intention to develop apps for AVA alongside other developers, as we need to, as you say, "prime the pump". As we look at the way things are likely to play out, iRobot is committed to being best in the world at autonomy/navigation software, platforms, manipulation, and the integration of 3rd party hardware -- while we aspire to be one of many application developers.
But for remote presence, the idea of a tablet with a camera and a large screen (like the Motorola Xoom or the iPad 2) connecting to the AWARE2 API would readily support the creation of a remote presence system and allow developers to rapidly iterate versions. And with an extendable head and telescoping neck, placing the pilot's face at a natural height would be easy, allowing remote presence to become real.
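To make the tablet-app idea concrete, here is a minimal sketch of the kind of command envelope a pilot app might send down to the platform. AWARE2's actual API is not public, so the message shape, field names, and action strings are all assumptions invented for illustration:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical command a tablet-based pilot app might send to the robot.
# The schema below is invented; AWARE2's real interface may look nothing
# like this.
@dataclass
class PilotCommand:
    action: str            # e.g. "go_to", "set_neck_height", "turn_head"
    target: str = ""       # named destination or object, if the action needs one
    value: float = 0.0     # scalar parameter (height in meters, angle, etc.)

def encode(command: PilotCommand) -> str:
    """Serialize a command to JSON for transport (e.g. over a websocket)."""
    return json.dumps(asdict(command))

# The pilot taps "Conference Room A" in the app; the app sends:
msg = encode(PilotCommand(action="go_to", target="conference_a"))
print(msg)  # → {"action": "go_to", "target": "conference_a", "value": 0.0}
```

The design point is the one Colin describes: the app layer stays thin (intent in, video out), while the navigation, safety, and mapping intelligence live on the platform side.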
Other juicy bits
From other conversations, I learned that a number of AVA prototypes are already out in the field, in the midst of development against various problems. I could see how Colin's vision allows for an augmented reality for the pilot -- a click-and-response action within the view of the remote presence system (e.g., opening doors, tagging people in a meeting, setting vision points to track where people are and responding to them rapidly by turning the head). How this comes about will be an interesting exercise in the coming years.
This article appeared originally at Pilot Presence.
Sanford Dickert is a technologist and product manager focusing on the intersection of engineering, collaboration, and team dynamics. He's held numerous senior positions in engineering, product development, and digital marketing. He writes about remote presence systems at Pilot Presence.