OK Google, Black History Month Is Over. What Now?

A commercial and four flawed algorithms


During the Grammys this past January, Google released a commercial highlighting some of “the most searched” terms in honor of Black History Month. The 90-second ad featured footage of notable Black figures such as Beyoncé, LeBron James, Whitney Houston, Lil Nas X, and Serena Williams. When the commercial aired, Black Twitter was not surprised that Google would try to capitalize on the contributions of the Black community. Black people regularly contribute to the success of others without adequate or equitable compensation or consideration (see the National Football League or the Democratic Party).

Despite the benefits Google has received from the Black community, the company has refused or been slow to correct discriminatory algorithmic practices across its products, including YouTube, its language filter, its ads, and its search algorithms. Whether intentional or unconscious, all of these biases have harmed the Black community. For some people, Google is the internet. Civil rights considerations must be central to big data and the platforms it drives. Google should not celebrate the contributions of Black people without also making its platforms welcoming to them.

Google’s YouTube algorithm

For several years, advocates have warned that white supremacists use Google’s YouTube algorithms to spread white nationalist propaganda to recruit new members. Google’s response was slow, and reports alleged that Google executives left some content up to capitalize on engagement and views. The platform finally acknowledged the communities of hate thriving on YouTube with its policy banning white supremacist content. However, the policy has not been effectively enforced, and Google has not shut down white supremacist channels on the site.

YouTube must become inhospitable to the alt-right. In 2018, more than two-thirds of Black adults used YouTube. Fostering communities of hate on a site with such a large Black user base increases the chances that Black users will encounter racist content or harassment. Black women, for example, are more likely to face online harassment and to be pushed off social media sites as a result. By tolerating that environment, Google has sent a clear message about whom it actually values.

Google’s language filter algorithm

Google must also closely evaluate the tools it uses to address the problem of racial and cultural bias. A Google algorithm designed to flag offensive content labeled 46% of inoffensive tweets by African-American users as offensive. Google’s Perspective tool was originally designed to stop abuse and harassment, but the filtering technology built on it could entrench prejudice against African-American Vernacular English (AAVE). The failure to incorporate AAVE into its algorithms, while at the same time releasing that celebratory commercial, suggests that the accomplishments of the Black community matter but that Black dialects do not. Technology will not be the silver bullet that solves content moderation. Neither will sensitivity training or diverse hiring alone. Dismantling these structures will require racial literacy and more multifaceted changes.


Google’s ad algorithm

Dr. Latanya Sweeney, a computer science professor at Harvard and former Chief Technologist at the Federal Trade Commission, found that results from a Google search of her name ran alongside ads suggesting she’d been arrested. Her research revealed that a Google search for a “black-identifying” name is 25% more likely to appear alongside an ad suggesting an arrest record. Google’s ad delivery is engagement-driven: ads that attract more clicks are shown more often. While her paper does not prove definitively why this phenomenon occurs, it notes that these ads would only appear more frequently if they received more engagement alongside black-identifying names. Such ads could further entrench stereotypes that Black people are more likely to commit crimes, transferring dangerous stereotypes into digital architecture.

Google’s search algorithm

Dr. Safiya Noble’s book Algorithms of Oppression opens with her surprise that between 2010 and 2012, a Google search for the phrase “black girls” would return links to pornography. She argues that Google’s failure to examine the data and algorithms fueling its search engine risks transferring “old media traditions into new media architecture.”

In 2009, the first Google Images result for the term “Michelle Obama” was an image of the former First Lady altered to include monkey features. Google was reluctant to remove the picture from search results, despite the racist history of comparing Black people to animals, monkeys and apes in particular. Google initially released only an apology ad; the image disappeared from results only after the blog hosting it finally took it down. If Google and its peers do not remedy and prevent the spread of centuries-old stereotypes and propaganda, we will see their continued proliferation into the digital age, fueling racist and bigoted understandings of the Black community.

Noble closes her book by calling for institutions to once again earn the trust of the public and further a fully inclusive democracy. To do so, search engines and online platforms need to prioritize accurate and trustworthy results and recommendations. To say the internet has a huge impact on our society is an understatement. And the data and privacy missteps committed by Big Tech disproportionately affect historically marginalized communities. Until Congress passes meaningful privacy and algorithmic accountability legislation that centers on civil rights principles, companies like Google will need to ensure their platforms aren’t benefiting from the engagement of Black users without protecting their interests.

she/her. attorney. equity, civil rights, labor, media, + tech. (views my own)
