Zoom’s Virtual Background Feature Isn’t Built for Black Faces
A scientist warns that bias in facial recognition software could lead to false arrests, lost job opportunities
Ainissa Ramirez says she’s seen Black and dark-skinned colleagues disappear into their virtual backgrounds on Zoom calls a few times this year. And she isn’t the only one.
“I have heard reports that Black people are fading into their Zoom backgrounds because supposedly the algorithms are not able to detect faces of dark complexions well,” Ramirez, PhD, former professor of mechanical engineering at Yale University, tells OneZero.
In late September, a PhD student in Canada tweeted about a Black professor whose head kept getting removed every time they tried to use a virtual background. The tweet went viral, with countless Black and dark-skinned people sharing their difficulties using Zoom’s virtual background function, which relies on facial recognition technology to determine what parts of the screen should show the user and what parts should show the background image.
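Zoom has not published the details of its pipeline, but the failure mode is easy to sketch. A detection model produces a per-pixel mask marking where it believes the user is; the compositor keeps the camera feed where the mask is confident and substitutes the background image everywhere else. Any face the model fails to detect simply falls through to the backdrop. The toy example below (plain NumPy, with a hypothetical `composite` helper and made-up pixel values, not Zoom's actual code) shows how one missed pixel in the mask makes part of the user vanish:

```python
import numpy as np

def composite(frame, background, person_mask):
    """Blend the camera frame over a virtual background.

    person_mask holds values in [0, 1]: 1.0 where the model believes
    a person is present, 0.0 where it does not. Pixels the model
    misses fall through to the background -- which is how a face
    can "disappear" when detection fails on darker skin tones.
    """
    mask = person_mask[..., np.newaxis]  # broadcast mask over RGB channels
    blended = mask * frame + (1.0 - mask) * background
    return blended.astype(frame.dtype)

# Toy 2x2, 3-channel "video frame": the user's pixels are bright (200),
# the virtual backdrop is black (0).
frame = np.full((2, 2, 3), 200, dtype=np.uint8)
background = np.zeros((2, 2, 3), dtype=np.uint8)

# The detector correctly marks three pixels as "person" but misses one.
mask = np.array([[1.0, 1.0],
                 [1.0, 0.0]])

out = composite(frame, background, mask)
# The missed pixel shows the backdrop instead of the user.
```

The point the example makes concrete: the compositing step is neutral arithmetic. The bias lives entirely in how well the detection model that produces `person_mask` was trained and tested across skin tones.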
Ramirez spoke about the intersection of emerging technologies and racial bias at the Toronto International Festival of Authors last Friday, pointing out how this issue of Black and dark-skinned people being excluded from new tech is anything but new.
She pointed to the way “bias was built into the formula” of film. From the 1940s until the early 1990s, film companies like Kodak and Polaroid used only white models to calibrate their film, she said. Camera companies eventually began calibrating for a range of complexions, but a similar racial bias is now creeping into the imaging technology that underpins nearly everything we use.
“Facial recognition will continue to be more pervasive in our society. If it is not accurate in detecting people, this might mean that one day people will not be able to unlock their phones or get into their homes,” Ramirez says. “Even worse, they might be recognized as the wrong person, so they cannot get a job, or could be detained for something they did not do.”
She says that more oversight of these technologies and more diversity on the teams that build them are needed to identify bias before products are launched.
“This way when more advanced innovations come along the lessons that bias can be embedded in them are already understood,” she says. “This makes the job of probing and excavating these biases less challenging and less daunting.”