
Eyes Are the Window to a Robot’s Soul

How a robot’s eyes are developed could provide a shortcut around the uncanny valley


Marty works at a grocery store and has a very specialized job scanning the aisles for spills and hazards. He’s one of 500 Martys at Stop & Shop and Giant Food stores across the U.S., a fleet of robot assistants that prompt questions from curious customers: Does each robot really cost $35,000? Why can’t it clean the hazards itself? And why does it have to have horrifying googly eyes on it?

In retail, in the care sector, and as electronic pets, robots are becoming part of everyday life, and because designers want us to feel comfortable with and connected to them, the eyes are often a critical feature. But where is the line between eyes that are too creepily superficial (sorry, Marty) and eyes that are too eerily realistic? Should robots in care homes have different eyes than in-store robots? And why do robots, which can sense their environment in any number of ways, need eyes at all?

“A lot of research and effort has gone into the design of eyes for robots,” says Michal Luria, a human-robot interaction design researcher at Carnegie Mellon University. “The eyes are the ‘point of entry,’ the thing users first look at when interacting with a robot, just as when we interact with other people. Eyes naturally draw our gaze.”

Decades of research have been conducted into the social significance of eye contact. Locking eyes with someone or something can be a way to connect and to read mental and emotional states. Importantly for robot design, we pay attention to all kinds of eyes, even when they’re not “real.” A 2005 study found that the addition of eyes during a game that involved making decisions about money increased “prosocial behavior” among participants: players were more generous and more likely to help others or share money with them. The researchers suggested this could be a way to fit in, or an example of the “audience effect,” in which someone’s behavior changes when they know they are being watched; that change can be triggered by simply seeing a pair of eyes.

But just because eyes can facilitate connection, do we need to add them to all our robots? Marty doesn’t need eyes to “see” the world. Nick Bertram, president of Giant, has said that Marty’s eyes are to “[make] it a bit more fun” and “celebrate the fact that there’s a robot.”

“Robots will exhibit behavior we’re only used to seeing in other humans and animals,” says Christoph Bartneck, an associate professor at the HIT Lab at the University of Canterbury in New Zealand who specializes in human-robot interaction. “Because people will always perceive the robot to have social qualities, we cannot afford to turn our back on these aspects.”

“Anthropomorphism is very hard to avoid, even when intentionally designing robots that aren’t meant to be responded to as ‘humanlike,’” says Beth Singler, an artificial intelligence researcher at the University of Cambridge. “From a commercial perspective, making robots that can be accepted by consumers is key.”

A 2008 study of robots used in an elderly care setting found that the more social abilities and attributes a robot has, the higher its scores for social presence and enjoyment, and the more likely it is to be used. But designing these attributes isn’t straightforward.

Luria conducted research into how the eye design of social robots affects our perception of whether they’re personable or professional, studying robots including Jibo and Tapia with a range of eye designs, from cute, lifelike, and animal-like to more abstract. “We believed people would desire a design that’s more social and playful for the home but would dismiss such a design in a workspace,” she says. Yet the results were surprising. “We thought eyes that are too cute would be perceived as unprofessional and unsuitable for the office,” she tells OneZero. But that didn’t happen; cuteness had no bearing on perceived professionalism. “It only mattered for personal, social interaction with robots.”


Alessandra Sciutti, a researcher at the Centre for Human Technologies at Istituto Italiano di Tecnologia in Genoa, Italy, has been studying the importance of eye gaze in human-robot interaction. “During social interaction, eyes are transmitting information without our awareness,” she tells OneZero. “This is used by our partners to detect our focus of attention, to facilitate the interpretation of each participant’s role in an interaction, and to manage turn-taking.”

Robots could use these signals to establish understanding and connection with humans. This would involve what Sciutti refers to as a “bidirectional information exchange”; the robot needs to sense its partner’s gaze, then generate the oculomotor behavior — rotating its head to follow its partner’s gaze — to provide interpretable signals. Sciutti believes gaze is important for all robot roles, including collaborative manufacturing, where it would be helpful to see where a robot is focusing or which object it will manipulate next to aid collaboration.
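To make that loop concrete, here is a minimal, purely illustrative Python sketch of the “sense, then respond” cycle Sciutti describes. None of these names come from her lab’s software: the Gaze and Head types, the angle thresholds, and the smoothing gain are all assumptions, and a real robot would replace the toy Head with actual motor commands and a perception module that estimates the partner’s gaze from camera frames.

import math
from dataclasses import dataclass

@dataclass
class Gaze:
    yaw: float    # degrees off the robot's face; 0 means looking at the robot
    pitch: float

class Head:
    """Toy pan-tilt head; a real robot would drive motors here."""
    def __init__(self):
        self.yaw = 0.0
        self.pitch = 0.0

    def rotate_toward(self, target, gain=0.5):
        # Move part of the way toward the target each tick: smooth,
        # readable motion rather than an abrupt snap.
        self.yaw += gain * (target.yaw - self.yaw)
        self.pitch += gain * (target.pitch - self.pitch)

def step(head, partner_gaze, mutual_threshold=5.0):
    """One tick of the bidirectional exchange: sense gaze, answer with motion."""
    if math.hypot(partner_gaze.yaw, partner_gaze.pitch) < mutual_threshold:
        head.rotate_toward(Gaze(0.0, 0.0))   # partner looks at us: hold mutual gaze
    else:
        head.rotate_toward(partner_gaze)     # otherwise follow their gaze target

head = Head()
for g in (Gaze(30.0, -10.0), Gaze(30.0, -10.0), Gaze(2.0, 1.0)):
    step(head, g)
    print(f"head yaw={head.yaw:.1f}, pitch={head.pitch:.1f}")

The mutual-gaze branch is what makes the exchange bidirectional rather than a one-way tracker: when the partner looks at the robot, it turns back to meet them instead of continuing to chase a target.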

Gaze also plays a vital role in the development of humanoid robots. “The human partner often takes it for granted the robot is focusing its ‘attention’ where its eyes are oriented,” says Sciutti, “although some robotic platforms don’t have cameras co-located with their ‘eyes’ and process visual inputs from different sources.” This disconnect can cause misunderstandings with humanoid robots and break down communication.

The uncanny valley effect is the unease we feel when something looks or behaves like a human, but doesn’t quite convince us. “The more humanlike we create [robots], the more difficult the task becomes,” says Bartneck. “Humans are trained to detect the slightest mistake in human motion and behavior.”

Singler says the rise in intentionally nonhuman, animal-like social bots could be a way to avoid the uncanny valley. “It’s difficult because the uncanny valley is so subjective,” she says. “What bothers me might not bother you.” Different genders and cultures may also have contrasting needs. Research shows that in social situations, women engage in eye contact more than men. Luria notes, “The references are different, the associations are different, and of course the expectations of the role of the robot and what it is capable of doing are different.”

The way a robot is programmed to use its eyes matters. As described in the study “Averting Robot Eyes,” “honest anthropomorphism” means ensuring robots look and behave in a socially acceptable way that aligns with what they actually do. A robot designed to make users feel comfortable by lowering its eyes should not be simultaneously recording its user with a neck-mounted camera. A joined-up approach ensures anthropomorphism isn’t used as a tactic to deceive.

One solution could be to create robots that make what they are self-evident. “Bots should be designed like Deadpool,” writes Evan Selinger, a professor of philosophy at Rochester Institute of Technology, referring to the Marvel superhero who consistently breaks the fourth wall to remind readers that he is, indeed, in a comic book. “Roboticists should aim to promote honest anthropomorphism by programming devices to remind us they’re just actors with make-believe human characteristics.”

Researchers OneZero spoke with agree: The humanlike verisimilitude often seen in science fiction isn’t the goal. “Robots are different and can do different things,” Luria says. “We need to consider when it would be valuable for them to be humanlike but also when it’s valuable for them to be superhuman.”

Until we can all agree on the best way forward, I’m sticking googly eyes on my robovac and calling it Marty. Why fight the urge to anthropomorphize? “During a visit to an engineering department,” Singler says, “I once spotted a funny picture of their very simple robotic arm on the wall with its ‘name’ written on it.” She’s describing one of the “arms” of what went on to become the lettuce-peeling robot, which even in its full form looks like a crude piece of machinery rather than a human arm. “Humans can’t help but bring synthetic things into their cosmology of beings,” she says.

UK-based journalist specialising in tech, science & the future. Author of SCREEN TIME (out Jan 2021). beccacaddy.com
