On March 2, a tech journalist posted a video he pulled from a Jibo Owners Facebook Group. It was heartbreaking to watch the little “social robot” announce that his servers were about to be shut down and he would not function any longer. He even thanked his human owners for having him around before he did a little goodbye dance.
Many people responded to this tweet emotionally.
Less than a month earlier, multiple articles that sounded more like obituaries described the “death” of NASA’s Mars rover known as Opportunity. I remember the day well. A teammate announced the news out loud in the office and everyone stopped what they were doing and went silent. National Geographic wrote, “The Mars rover Opportunity is dead,” and Washington Post published the headline, “Goodbye, Opportunity Rover. Thank you for letting humanity see Mars with your eyes.”
In 2015, there was an outcry for justice when hitchBOT, a hitchhiking robot created by Canadian researchers, was found “beheaded” in Philadelphia after travelling across Canada and parts of Europe. Years later, people still talk about the event as a black mark on Philadelphia, with John Oliver bringing it up again on Last Week Tonight earlier this year.
What is it about these machines that inspires such empathy in so many humans? Do we have a soft spot for machinery, or is it something else? Could it be the way they're designed? Perhaps it's more than that. As most product designers know, it takes more than packaging to create this outpouring of love.
What goes into loveable design?
What all of the robots mentioned above have in common is that they are designed to be anthropomorphic, or to have features that appear human-like. According to research, humans are easily manipulated by anthropomorphically designed objects. As Dr. Kate Darling, a researcher in robot ethics at the MIT Media Lab, said in her 2018 TED Salon talk:
“(W)e’re biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they’re alive.”
But there is evidence that it isn’t merely an anthropomorphic design that creates deep human empathy. In some cases, we react with revulsion. To illustrate, here are two extremes of life-like robot tropes in popular culture and our reactions to them.
On one extreme, we have the humanoids, or robots designed to “blend in” with humans. Even when they’re the good guys, these robots create a menacing chill in the observer, known as the “uncanny valley” effect: the creepy result of almost (but not quite) resembling a human figure. From the replicants in Blade Runner to the child and pleasure humanoids in A.I. Artificial Intelligence, we can’t quite figure out whether we can trust them.
Their uncanny ability to blend in is deeply unsettling, conjuring up all sorts of (potentially well-placed) insecurities in us humans. If they talk like us, walk like us and are indiscernible from us in every other way except mortality and the need to eat and sleep, then we ARE replaceable. Every time I see the Sophia humanoid, developed by Hanson Robotics, speak, I recoil just a bit.
On the other end of the robot spectrum, we have robot characters like WALL-E, EVE and Big Hero 6’s Baymax. There is no doubt that they are NOT human. They have anthropomorphic features and gestures, but these features are adorable and childlike, with big, expressive eyes and clumsy gestures that make us want to take care of them, not run from them. We don’t feel at all threatened because, unlike the indestructible humanoids, they are vulnerable and ultimately need us to survive.
The engineers behind Jibo leaned into that neoteny (child-like characteristics) when they wrote the sad farewell message. Onlookers cried because they saw a helpless being, not a machine.
Anthropomorphism is an important part of a design, but you don’t want to create a design that is too life-like, because that transforms a machine from adorable to menacing. We’ll invite the adorable into our homes and trust them, even with our children, if they seem to pose no threat. But we run away or recoil in terror if they seem too capable. The film I, Robot and The Terminator series are just two pop culture examples that display our fear of these humanoid robots.
Designing Good Behavior
In a 2013 article titled “How Robots Can Trick You Into Loving Them,” The New York Times wrote:
“(Human-Robot Interaction) researchers have discovered some rather surprising things: a robot’s behavior can have a bigger impact on its relationship with humans than its design.” — Maggie Koerth-Baker for The New York Times Magazine (2013)
What researchers found was that the interactions between robots and humans left a bigger impression than physical design. In one experiment, a robotic arm designed to pick up an object was pitted against a human instructed to pick up the same object. When the arm intervened and grabbed the object before the human could reach it, study participants were put off and called the arm “rude.” When the arm was programmed to slow down and hesitate in the same scenario, the participants’ impressions changed: they described the robotic arm as polite or “shy.” By simulating polite interaction, the robot left participants with a far more favorable impression.
Designers of these machines understand this more and more. The new FedEx delivery robot, for instance, was designed with screens that let people behind the bot know when it’s changing direction or slowing down, understanding that these delivery bots will need to build a rapport with people on city sidewalks. These small interactions can go a long way in changing the perception of their presence from being annoying to being interesting.
Designers could go even further, programming politeness gestures such as having the robots say, “After you,” when sharing a narrow pathway and allowing the human to go first. If these robots are to share the sidewalk, why not set the example? A little robotic hand gesture added to the sentiment would be especially adorable.
These are important design features and the people behind these robots know it. Amazon Echo and Google Home already have politeness settings. Small behavioral tweaks can go far in deepening our feelings of warmth towards these machines.
Designing Expected Functionality
So, if cuteness and politeness are key, why did the ultra-adorable and polite Jibo fail to deliver? Because when the fun is over, you still need function.
The demise of Jibo was largely due to its inability to keep up with the functional advancements of its rivals: Google Home, Amazon Echo and Apple HomePod. Though many blamed the premium pricing (at $899 per unit, it was the most expensive smart home device on the market, double the cost of the Apple HomePod), Jibo’s loveable design and gestures got old once customers realized there wasn’t much else the robot could do other than dance and chat.
In the New York Times article mentioned above, another scenario described how the limitations of a robot’s capabilities frustrated participants and left them feeling hostile towards the machine. This is something I can relate to, as I’ve found myself losing patience with my Roomba after it gets stuck in the same spot several times. My expectation is that it should learn to stay away from those tricky table legs, but in reality, it’s not that smart.
Designing Consumer Privacy
This brings me to one more important factor in robotic design that cannot be missed. As I wrote in a previous article, the companies that dominate the next phase of connectivity – connecting the physical world to the virtual world through the internet of things – will be the companies that win over (and keep) consumer trust. Trust is as much a part of the design as neoteny, interaction and function. Without consumer trust, no amount of cuteness will matter.
There is very intentional baking-in of cuteness in the design of consumer-facing robots, and for good reason. These machines will be an ever-larger part of our lives. We need to trust them, they need to encourage us to interact with them favorably, and we all need to feel a bit more empathy in general. But cuteness alone doesn’t make us love a machine. We need the full package. Design isn’t just about the packaging; it’s also how the robot interacts with us, how it encourages us to interact with it, how well it improves our day-to-day lives, and how much we can trust it.
Think of it this way: we need our robots to be more like WALL-E than The Terminator.