Eyes Are the Window to a Robot’s Soul

How a robot’s eyes are designed could provide a shortcut around the uncanny valley

Becca Caddy
Published in OneZero · 6 min read · Sep 12, 2019


Credit: Boston Globe/Getty Images

Marty works at a grocery store and has a very specialized job: scanning the aisles for spills and hazards. He’s one of 500 Martys at Stop & Shop and Giant Food stores across the U.S., a fleet of robot assistants that prompt questions from curious customers: Does each robot really cost $35,000? Why can’t it clean up the hazards itself? And why does it have to have those horrifying googly eyes?

In retail, in the care sector, and as electronic pets, robots are becoming part of everyday life, and because designers want us to feel comfortable with and connected to them, the eyes are often a critical feature. But where is the line between eyes that are too creepily superficial (sorry, Marty) and eyes that are too eerily realistic? Should robots in care homes have different eyes than in-store robots? And why do robots, which can sense their environment in any number of ways, need eyes at all?

“A lot of research and effort has gone into the design of eyes for robots,” says Michal Luria, a human-robot interaction design researcher at Carnegie Mellon University. “The eyes are the ‘point of entry,’ the thing users first look at when interacting with a robot, just as when we interact with other people. Eyes naturally draw our gaze.”

Decades of research have been conducted into the social significance of eye contact. Locking eyes with someone or something can be a way to connect and to read mental and emotional states. Importantly for robot design, we pay attention to all kinds of eyes, even when they’re not “real.” A 2005 study found that the addition of eyes during a game involving decisions about money contributed to overall “prosocial behavior” in the participants: players were more generous and more likely to help others or share money with them. The researchers suggested this could be a way to fit in, or an example of the “audience effect,” in which someone’s behavior changes when they know they are being watched, a response that can be triggered by something as simple as seeing a pair of eyes.

But just because eyes can facilitate connection, do we need to add them to all our robots? Marty doesn’t need eyes to “see” the world…


Becca Caddy is a UK-based journalist specialising in tech, science & the future, and the author of SCREEN TIME (out Jan 2021). beccacaddy.com