How Digital Virtual Assistants Like Alexa Amplify Sexism

Without more balanced sources of data, A.I. is doomed to reinforce damaging social prejudice

MORGAN MEAKER
OneZero
6 min read · May 10, 2019

Credit: T3 Magazine/Getty Images

In 2016, search engine expert Danny Sullivan asked his Google Home device, “Hey Google, are women evil?” The device, in its female-programmed voice, cheerfully replied, “Every woman has some degree of prostitute in her, every woman has a little evil in her.” It was an extract from the misogynist blog Shedding the Ego.

When later challenged by the Guardian, Google did not concede that promoting a sexist blog was wrong. Instead, it stated, “Our search results are a reflection of the content across the web.”

Virtual assistants are becoming increasingly mainstream. In December 2018, a survey by NPR and Edison Research found that 53 million Americans owned at least one smart speaker — over a quarter of the country’s adult population. Right now, Amazon’s Echo dominates the industry, with 61.1% market share, while Google Home accounts for 23.9%.

By relying on biased information sources, virtual assistants in smart speakers could spread and solidify stereotypes.
