Anki, Jibo, and Kuri: What We Can Learn from Social Robots That Didn't Make It

It's been a tough few years for social home robots: Where do we go from here?

Photo-illustration: IEEE Spectrum; Original photo: Anki
Sad robot: Anki, maker of social home robots, shut down abruptly this week.

This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.

News of Anki’s shutdown has spread like wildfire through the social robotics research community. Following the demise of Jibo and Kuri less than a year ago, it now seems that three of the most viable contenders to lead the budding market of social home robots have failed to find a sustainable business model.

Anki’s shutdown was arguably an even bigger blow to the hopes of social robotics enthusiasts than the loss of Jibo or Kuri. Anki was a well-funded company, having reportedly secured over US $180 million in total investment, and was selling its Cozmo robot (and a newer model, called Vector) at a competitive price point. When Jibo and Kuri tanked, some attributed it to their high price tags, but now that Anki, with its $250 product, is also on its way out, perhaps it is a sign that something more fundamental is wrong with the concept of social home robots.

I have worked in the research field of social robotics and human-robot interaction for over 15 years, designed several companion robot prototypes, and co-founded a social robotics startup in 2012, which shut down two years later. In my circles, there is much discussion about what happened, and why. Some say that the technology is just not ready for the user experience that these companies promised. Others have argued that robotics research tends to over-promise and under-deliver when it comes to technological capabilities. Some mention the lack of a “real need” for a social robot, a missing “killer app,” or the impossibility of competing with disembodied voice agents. But I am not convinced. I actually view the closing of these companies not so much as proof that they were doing something wrong, but rather as an opportunity to learn important lessons for the next generation of social robots.

In my analysis, the current moment on the social robotics timeline is akin to the era following the failure of the Apple Newton, long before today’s ubiquity of smartphone devices. The Newton, introduced in 1993 and pronounced dead in 1998, was the first commercial handheld computing device. It was a massive technological feat, but was priced too high for its usefulness and did not work well enough in its most common applications. That said, almost immediately after Apple’s cancellation of the Newton product line, rival Palm dominated the early 2000s with a very successful line of handheld devices almost identical to the Newton, leading directly to the development of smartphones within five years.

Like the Newton, Jibo, Kuri, and Cozmo were courageous trailblazers into a virtually unknown use-case. Like the Newton, these products were conceived based on some research but also on a lot of intuition. And like the Newton, the first generation of social home robots ended up not making much sense to consumers.

Despite its commercial failure, the Newton was the first to experiment with many concepts that still have value today, in some cases almost unchanged from the original design. These include taking notes on the move, checking your digitally updating calendar, and synchronizing contacts between your handheld device and your computer. The Newton even let users write their own custom apps. I remember being a freshman CS student in the 1990s, thrilled by the possibility of so easily programming a computer that fits in my pocket. One thing that did not work, though, was using handwriting as an input method, and that is still true today.

For years to come, developers and product managers looked at the Newton as the foundation on which they built their own designs and systems. They learned both what worked and what to avoid. In retrospect, the Newton played a pivotal role in our technological present. 

I believe that Cozmo, Kuri, and Jibo (disclosure: Jibo was founded by my Ph.D. advisor Cynthia Breazeal) will play a similar role on the path towards successful social home robotics. If that is true, what exactly can we learn from their experience? Here are four lessons I have personally drawn from closely following these first attempts to put the promise of social robotics research into commercial practice:

Lesson 1: Long-term engagement is the holy grail, and the Gordian knot

All of these social robotics companies struggled to sustain a long-term use case for their products. Critics often said that this kind of product may be fun to use for a while, but that its tricks get old quickly. That made success especially difficult, given the upscale price point of some of the devices.

One big part of the longevity problem is the robots’ inability to escape the single-turn structure of an interaction. There is only so far you can go with a single round of conversation, even when you stack a thousand single rounds back-to-back. At some point you want to follow up and back-refer (“Remember when I asked you about Florida yesterday? I think I’m ready to commit.”); you want your conversant to make connections across conversations (“That is so similar to the story you told me about how your boss talks to you!”); and you want to be able to speak over each other while still understanding what’s going on.

Paul Pangaro recently called for a “Turning Test” of intelligence for systems that can overcome this problem, and much of my own research has been dedicated to studying fluent and overlapping interactions as opposed to turn-by-turn “chess matches” between the human and the machine.

Lesson 2: We need artists

There are clear technological barriers for social home robots. Realistic, non-repetitive gesture generation is a largely unsolved problem, and dialog algorithms are not sophisticated enough to sustain meaningful relationships. With Cozmo, Kuri, and Jibo, the combination of the two resulted in a sense of canned simplicity that prevented long-term engagement.

I would like to make the argument that we do not necessarily need a lot of complexity or intelligence for successful long-term engagement. I can think of at least two categories of long-term engaging products that capitalize on their simplicity rather than on complexity: board and card games, and soap operas. If we look closely at the structure of these two genres, we can see examples of very specific interaction patterns that are known to keep people engaged for years, despite mechanistic and repetitive components.

Perhaps what the social robotics industry needs is professionals who excel at storytelling, emotional engagement, and structured repetition. Perhaps what this industry needs is artists. Sophia Efstathiou claims in her research that artists are crucial for technology development because they are uniquely positioned to simplify complex ideas and concretize them. I have also recently commented on the need for art in HRI on the HRI Podcast.

Just think of the effect professional screenwriters had when they entered the gaming industry. We moved from narratives created by programmers and engineers for a handful of geeky fans in the 1980s (myself included) to today’s sagas that enthrall millions. With the gaming, streaming, and podcast industries providing us with an unprecedented renaissance of master storytellers, it is shocking that we have seen so little of that profession enter the social robotics industry.

I believe that once we have artists creating captivating storylines for social home robots, we will discover their true magical potential, providing us with what I sometimes jokingly call “the real home theater.” Your robots would have long and intricate storylines, modeled on known classics or modern variations thereof. They would be involved in love triangles, money heists, data hacking, or the building of empires. Imagine if your robot were part of all of that and drew you into the plot whenever you interacted with it.

Lesson 3: Embodiment does create emotional bonds

In conversations, I sometimes hear that social robots are just voice agents with little added value. But when you hear existing users report on their emotional reactions to social robots, this does not seem to be true. Researchers have demonstrated this emotional response in prior laboratory experiments and field studies, but we now have much more evidence of it happening in real people’s homes.

On the recent RoboPsych podcast on the demise of Jibo, Tom Guarriello sets aside his admitted cynicism and speaks of Jibo as having had an emotional effect on him and his partner. He even got choked up when Jibo said goodbye (his partner shed a tear), and his language describing the robot is often passionate and emotional. Many others have shared similar reactions.

This is not just a novelty effect. Having been around social robots for so long, I can anecdotally support what we know from a large body of academic research: a physical thing in your space, moving with and around you, provides an emotional grip that no disembodied conversant can. Seasoned engineering professors smile when their robots make an odd gesture, and often cannot help but suspend their belief that this collection of motors and control signals is merely that. I have found that very few engineers remain indifferent when watching the famous Boston Dynamics BigDog kicking video, even if they have worked with robots for decades.

For some reason, the first social robotics companies were not able to translate this effect into a long-standing product. Customer responses indicate to me that there is a vast untapped potential there, and I bet it will not be long before some clever entrepreneur figures out how to translate this treasure into a functional product.

Lesson 4: Design matters

The work of translating emotional and cognitive potential into a viable product is not a novelty of the 21st century. In fact, it is precisely the role of the field of design. There is a long-standing tradition of design research, but the social robotics industry (and academia) only flirts with it in a cursory manner. Our industry is still dominated by engineers and, sadly, too many engineers think that design is something you can just add on at the end to make your product more attractive, whereas nothing could be further from the truth (just ask Steve Jobs in 1996). Design is a front-loaded activity that studies human behavior and works in artful ways to combine elements of history, aesthetics, ethics, psychology, and engineering to create products we want to use. 

Importantly, designers do not think about products in isolation, and I believe that isolation was one of the biggest issues that virtually all of the social robotics start-up companies suffered from. Social home robots were mostly imagined as islands, engaging in their own right, with all the focus on them as both the sales agent and the product that the agent was trying to sell. In contrast, much of the success of a product like Echo comes from the fact that it is part of a huge service ecosystem, owned and managed by Amazon. Perhaps a social robot designed as part of a larger product, be it a football league, a coffee chain, or a cruise ship, may be a more viable business possibility. 

All in all, I predict that when designers start their own social robotics companies and hire engineers, rather than the other way around, we will finally discover what the hidden need for home robots was in the first place.


The future use cases for social home robots may be as surprising to us now as the Tinder app would have seemed to owners of the Newton 25 years ago. But if academic labs and industrial research groups can find the wisdom to invest in interdisciplinary design, pull together knowledge from the arts and storytelling, and promote empirical and AI research that specifically targets long-term engagement, the long-held promise of social home robots may not be as far off as it seems.

Guy Hoffman is an assistant professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell University. He heads the Human-Robot Collaboration & Companionship lab, which studies the computational, engineering, and social aspects of interactions between humans and robots. His designs include Blossom, Vyo, Shimi, and Kip. Hoffman is also on the board of directors of Intuition Robotics, which is developing a social robotics product.
