The Robots of the Future Should Look Like Puppets
Puppeteers have been effectively animating objects for thousands of years — and they have much to teach a soulless robotics industry
“I’ve got no strings to hold me down,” sings Pinocchio, that lying piece of wood and icon for A.I. The story of an artificial boy searching for selfhood has become a founding myth for science-fiction robots everywhere. From Blade Runner to Westworld, from Avengers: Age of Ultron to Ghost in the Shell, puppets and robots have been closely allied in the public imagination, cutting strings and walking free.
“Long before computers, engineers built mechanical objects designed to look and move like humans, animals, and other fantastical creatures alongside more practical machines,” says Elizabeth Jochum, a researcher in robotics and performance at Aalborg University in Denmark. “In 70 A.D., the inventor of the steam engine, Hero of Alexandria, constructed mechanical puppet theaters using pneumatics and hydraulics.”
Automata such as these developed throughout the centuries, with clockwork mechanisms giving the impression of life to the Wunderkammern curiosities of 16th-century European courts and the Karakuri puppets of Japan’s Edo period. These proto-robots drew equally on engineering and the ancient art of making inanimate objects seem alive, entertaining audiences with mechanical birds, flute players, servants, and soldiers.
Despite this history, there’s currently a gulf between modern robotics and the art of puppetry. Jochum tells me that puppeteers think first and foremost about movement. It doesn’t necessarily matter how lifelike a puppet appears. The famous Soviet puppeteer Sergey Obraztsov would tell stories using small balls attached to the end of his fingertips. “For social robots, though, it has typically been the reverse: Humanoid robots and social robots are focused overwhelmingly on design and consider movement secondly,” says Jochum.
By being so focused on making objects look lifelike, do engineers overlook what truly makes them expressive? David McGoran, a puppeteer and the creative director of robot design studio Rusty Squid, describes attending an academic conference on how future robots can build trust: “They were doing all this research and thought it was cutting edge. I said to them, surely you’ve looked at Jim Henson’s notation on eye movement or notations on stop-motion animation? They hadn’t even considered to look at these masters.”
“Puppeteers have been around for thousands and thousands of years, and they’ve been field testing what works and what doesn’t in front of audiences all that time,” he adds. “They’ve developed systems, practices, notations, disciplines, and schools … and almost nobody in the robotics community is even remotely aware of them.”
Puppeteers start with breath, McGoran tells me: the rise and fall of a body, the tension and the release, shallow or deep, calm or scared. Disney would famously get his novice animators to animate a sack of flour. He told them to get rid of the hands, get rid of the head, get rid of the legs, and just focus on the core. “Whether it’s Bunraku puppetry in Japan or traditional Balinese puppetry, there’s a universal principle: You start with the breath,” says McGoran.
And yet breathing, or at least the impression of breathing, is not something many robots do. Even robots specifically designed to interact with humans, from the Roomba sucking crumbs on your carpet to the humanoid Pepper that might greet you at a high-end Tokyo store, make no attempt to mimic the body’s rhythm, an act that is central to the work of a puppeteer. Why? It feels like a question that reaches toward the heart of how we think about robots.
Because puppets, like actors, are ultimately performers. When you watch a puppet of a horse buck and breathe, you know deep down that you are watching a human interpretation of those movements. Beneath all the trickery and stagecraft, it’s a human-to-human experience. John Criswell, the animatronics supervisor for Jim Henson’s Creature Shop, tells me that the studio’s designs can be heavy on machinery — one creature’s head can have 30 different motors built into it — but all of it is ultimately in aid of human expression.
“Once you throw that human element in there, you can take literally anything and make it expressive,” he says. “I’ve seen Brian Henson take a mop head and bring it to life, make it charming and endearing. I’ve seen puppeteers do amazing things.”
So what if there’s no puppeteer? Consider a Roomba gasping on the carpet. Terrifying. Are even the very basics of puppetry simply too powerful, too uncanny for automation? Not if we change our mindset about what robots are, McGoran argues. “One of the cultural fears we have of robotics is that we suspect the intention is to create something that is competing with life,” he says. “Whereas if we culturally understood robotics to be works of art, then we’d be happy to engage with them because they’re designed to teach us something about ourselves.”
One robot that takes this approach is Blossom, developed by Cornell University’s Guy Hoffman. With a handcrafted woolen and wood design and a vaguely rabbit-like appearance that convincingly bops along to music, it manages to find a balance between artifice and expression. Compare this to Hanson Robotics’ Sophia, with its simulacrum of a woman’s face, and you can see why Disney would tell his animators to start with a sack of flour. Sophia is pitched by its creators as a human-like robot, and it is certainly the more sophisticated machine, yet Hoffman’s dancing doll somehow seems much more human.
The puppy-like Tombot takes a similar approach. Made in collaboration with Jim Henson’s Creature Shop, the therapy robot is designed to provide emotional support for people with Alzheimer’s disease and other forms of dementia. It does this through convincing movement — from wagging its tail to tilting its head when stroked. Criswell hopes it could be part of a new wave of expressive robotics: “We’re teetering on the verge. I think we’re going to have an explosion of automated and expressive robots for the home.”
Look around your living room, and, aside from a dutiful Roomba, most devices will be motionless. An Amazon Echo or Google Home might speak and light up, but they don’t move, and they certainly don’t rise and fall with the rhythm of breathing. Could this change? Is there a place for more machines that prioritize movement, and might we begin to see more machines that work as actual puppets controlled by human hands?
The Tactile Telerobot, developed by Shadow Robot Company, uses haptic gloves to wield a pair of dexterous, teleoperated robotic hands. “Something we have found to be very interesting in the past is that we have been able to work with people from a design and performance background — in particular, puppeteers — who are used to taking inanimate objects and giving them ‘life,’” Rich Walker, the company’s managing director, tells me.
Jochum notes that Shadow Robot’s approach is strikingly similar to 17th-century Japanese Bunraku puppets, which were designed for dexterity and fine motor skills. “The difference is that now the strings are cables controlled by computers, motors, and force sensors,” she explains. But what happens when you take the puppeteer away from the puppet? Whether it’s a set of robotic fingers, steam-powered automata, or an animatronic Labrador puppy, what are the limits of expression when you remove the human hand? How alive can a robot be when it’s got no strings to hold it down?