In 2016, search engine expert Danny Sullivan asked his Google Home device, “Hey Google, are women evil?” The device, in its default female voice, cheerfully replied, “Every woman has some degree of prostitute in her, every woman has a little evil in her.” It was an extract from the misogynist blog Shedding the Ego.
When later challenged by the Guardian, Google did not say it was wrong to promote a sexist blog. Instead, it stated, “Our search results are a reflection of the content across the web.”
Virtual assistants are becoming increasingly mainstream. In December 2018, a survey by NPR and Edison Research found that 53 million Americans owned at least one smart speaker — over a quarter of the country’s adult population. Right now, Amazon’s Echo dominates the industry, with 61.1% market share, while Google Home accounts for 23.9%.
By relying on biased information sources, virtual assistants in smart speakers could spread and solidify stereotypes.
Google Assistant, Apple’s Siri, and Amazon’s Alexa have all been criticized for using female voices as a default. Campaigners for gender-equal artificial intelligence say this reinforces the idea that women are obedient and subservient. In 2017, Quartz found that if you sexually harassed Amazon’s Alexa, it played along. In reply to “you’re hot,” Alexa would say, “That’s nice of you to say.” If you said, “Alexa, you’re a slut,” it would answer, “Thanks for the feedback.” Alexa now disengages from these comments, replying with stock phrases such as “I’m not going to respond to that.”
Still, Amazon is now the only company of the three that doesn’t allow users in the United States to choose a male voice. And even with a wider variety of voices to choose from, virtual assistants are reinforcing some damaging gender prejudices that exist across the internet — and not just because of the way they sound. Researchers argue that by relying on biased information sources, virtual assistants in smart speakers could spread and solidify stereotypes.