The Enduring Anti-Black Racism of Google Search

How neoliberalism, Google, and the U.S. porn industry coded searching for ‘black girls’ with racism

Photo source: Jesper Klausen/Science Photo Library/Getty Images

When Algorithms of Oppression was published in 2018, it was a landmark work that interrogated the racism encoded into popular technology products like Google’s search engine. Given that many Americans are currently using Google search to try to understand racism after the national uprising sparked by the murder of George Floyd, it’s a good time to remember the architecture they are using to do so is itself deeply compromised — and how that came to pass. This excerpt, from Safiya Umoja Noble’s enduring work, explains why anti-Black racism appears, and endures, in tech products we are told to view as neutral.

On June 28, 2016, Black feminist and mainstream social media erupted over the announcement that Black Girls Code, an organization dedicated to teaching and mentoring African American girls interested in computer programming, would be moving into Google’s New York offices. The partnership was part of Google’s effort to spend $150 million on diversity programs that could create a pipeline of talent into Silicon Valley and the tech industries. But just two years before, searching the phrase “black girls” surfaced “Black Booty on the Beach” and “Sugary Black Pussy” to the first page of Google results, out of the trillions of web-indexed pages that Google Search crawls.

In part, the intervention of teaching computer code to African American girls through projects such as Black Girls Code is designed to ensure fuller participation in the design of software and to remedy persistent exclusion. The logic of new pipeline investments in youth was touted as an opportunity to foster an empowered vision for Black women’s participation in Silicon Valley industries. Discourses of creativity, cultural context, and freedom are fundamental narratives that drive the coding gap, or the new coding divide, of the 21st century.

Neoliberalism has emerged and served as a framework for developing social and economic policy in the interest of elites, while simultaneously crafting a new worldview: an ideology of individual freedoms that foregrounds personal creativity, contribution, and participation, as if these engagements were not connected to broader labor practices of systemic and structural exclusion. In the case of Google's history of racist bias in search, no linkages are made between Black Girls Code and remedies to the company's current employment practices and product designs. Indeed, framing the lack of participation by African Americans in Silicon Valley as a "pipeline issue" posits the lack of hiring of Black people as a matter of people being unprepared to participate, despite evidence to the contrary.

Google, Facebook, and other technology giants have been called to task for this failed logic. Laura Weidman Powers, of CODE2040, stated in an interview with Jessica Guynn at USA Today, “This narrative that nothing can be done today and so we must invest in the youth of tomorrow ignores the talents and achievements of the thousands of people in tech from underrepresented backgrounds and renders them invisible.” Blacks and Latinos are underemployed despite the increasing numbers graduating from college with degrees in computer science.

Filling the pipeline and holding "future" Black women programmers responsible for solving the problems of racist exclusion and misrepresentation in Silicon Valley or in biased product development is not the answer. Commercial search prioritizes results predicated on a variety of factors that are anything but objective or value-free. Indeed, there are infinite possibilities for other ways of designing access to knowledge and information, but the lack of attention to the kind of White and Asian male dominance that Guynn reported sidesteps those who are responsible for these companies' current technology designs and their troublesome products.

The problems are framed as "pipeline" issues instead of as issues of racism and sexism, which extend from employment practices to product design. "Black girls need to learn how to code" is an excuse for not addressing the persistent marginalization of Black women in Silicon Valley.

Who is responsible for the results?

As a result of the lack of African Americans and people with deeper knowledge of the sordid history of racism and sexism working in Silicon Valley, products are designed with a lack of careful analysis about their potential impact on a diverse array of people. If Google software engineers are not responsible for the design of their algorithms, then who is?

These are the details of what a search for "black girls" would yield for many years, even though the words "porn," "pornography," and "sex" were not included in the search box. In the text for the first page of results, for example, the word "pussy," as a noun, is used four times to describe Black girls. Other words in the lines of text on the first page include "sugary" (two times), "hairy" (one), "sex" (one), "booty/ass" (two), "teen" (one), "big" (one), "porn star" (one), "hot" (one), "hardcore" (one), "action" (one), "galeries [sic]" (one).

In the case of the first page of results on “black girls,” I clicked on the link for both the top search result (unpaid) and the first paid result, which is reflected in the right-hand sidebar, where advertisers that are willing and able to spend money through Google AdWords have their content appear in relationship to these search queries.

All advertising in relationship to Black girls for many years has been hypersexualized and pornographic, even if it purports to be just about dating or social in nature. Additionally, some of the results, such as the U.K. rock band Black Girls, lack any relationship to Black women and girls. This is an interesting co-optation of identity, and because of the band's fan following, as well as possible search engine optimization strategies, the band is able to find strong placement for its fan site on the first page of Google's search results.

Published text on the web can have a plethora of meanings, so in my analysis of all of these results, I have focused on the implicit and explicit messages about Black women and girls in both the texts of results or hits and the paid ads that accompany them. By comparing these to broader social narratives about Black women and girls in dominant U.S. popular culture, we can see the ways in which search engine technology replicates and instantiates these notions.

This is no surprise when Black women are not employed in any significant numbers at Google. Not only are African Americans underemployed at Google, Facebook, Snapchat, and other popular technology companies as computer programmers, but jobs that could employ the expertise of people who understand the ramifications of racist and sexist stereotyping and misrepresentation and that require undergraduate and advanced degrees in ethnic, Black/African American, women and gender, American Indian, or Asian American studies are nonexistent.

One cannot know about the history of media stereotyping or the nuances of structural oppression in any formal, scholarly way through the traditional engineering curriculum of the large research universities from which technology companies hire across the United States. Ethics courses are rare, and the possibility of formally learning about the history of Black women in relation to a series of stereotypes such as the Jezebel, Sapphire, and Mammy does not exist in mainstream engineering programs.

We need people designing technologies for society to have training and an education on the histories of marginalized people, at a minimum, and we need them working alongside people with rigorous training and preparation from the social sciences and humanities. To design technology for people, without a detailed and rigorous study of people and communities, makes for the many kinds of egregious tech designs we see that come at the expense of people of color and women.

Search engine results perpetuate particular narratives that reflect historically uneven distributions of power in society. In order to fully interrogate this persistent phenomenon, a lesson on race and racialization is in order, as these processes are structured into every aspect of American work, culture, and knowledge production. To understand representations of race and gender in new media, it is necessary to draw on research about how race is constituted as a social, economic, and political hierarchy based on racial categories, how people are racialized, how this can shift over time without much disruption to the hierarchical order, and how White American identity functions as an invisible “norm” or “nothingness” on which all others are made aberrant.

The reproduction of racial hierarchies of power online is a manifestation of the same kinds of power systems that we are attempting to dismantle and intervene in — namely, eliminating discrimination and racism as fundamental organizing logics in our society. Tanya Golash-Boza, chair of sociology at the University of California, Merced, argues that critical race scholarship should expand beyond simply marking where racialization and injustice occur; it must also press the boundaries of public policy so that the understanding of the complex ways in which marginalization is maintained can substantially shift. Michael Omi and Howard Winant, two key scholars of race in the United States, distinguish the ways that racial rule has moved "from dictatorship to democracy" as a means of masking domination over racialized groups in the United States.

In the context of the web, we see the absolving of workplace practices such as the low level of employment of African Americans in Silicon Valley and the products that stem from it, such as algorithms that organize information for the public, not as matters of domination that persist in these realms but as democratic and fair projects, many of which mask the racism at play. Certainly, we cannot intervene if we cannot see or acknowledge these types of discriminatory practices. To help the reader see these practices, I offer here more examples of how racial algorithmic oppression works in Google Search.

On June 6, 2016, Kabir Ali, an African American teenager from Clover High School in Midlothian, Virginia, tweeting under the handle @iBeKabir, posted a video to Twitter of his Google Images search on the keywords “three black teenagers.” The results that Google offered were of African American teenagers’ mugshots, insinuating that the image of Black teens is that of criminality. Next, he changed one word — “black” to “white” — with very different results. “Three white teenagers” were represented as wholesome and all-American. The video went viral within 48 hours, and Guynn, from USA Today, contacted me about the story. In typical fashion, Google reported these search results as an anomaly, beyond its control, to which I responded again, “If Google isn’t responsible for its algorithm, then who is?” One of Ali’s Twitter followers later posted a tweak to the algorithm made by Google on a search for “three white teens” that now included a newly introduced “criminal” image of a White teen and more “wholesome” images of Black teens.

What we know about Google’s responses to racial stereotyping in its products is that it typically denies responsibility or intent to harm, but then it is able to “tweak” or “fix” these aberrations or “glitches” in its systems. What we need to ask is why and how we get these stereotypes in the first place and what the attendant consequences of racial and gender stereotyping do in terms of public harm for people who are the targets of such misrepresentation. Images of White Americans are persistently held up in Google’s images and in its results to reinforce the superiority and mainstream acceptability of Whiteness as the default “good” to which all others are made invisible.

There are many examples of this, where users of Google Search have reported online their shock or dismay at the kinds of representations that consistently occur. Meanwhile, when users search beyond racial identities and occupations to engage concepts such as “professional hairstyles,” they have been met with the kinds of images seen above. The “unprofessional hairstyles for work” image search, like the one for “three black teenagers,” went viral in 2016, with multiple media outlets covering the story, again raising the question, can algorithms be racist?

Understanding technological racialization as a particular form of algorithmic oppression allows us to use it as an important framework in which to critique the discourse of the internet as a democratic landscape and to deploy alternative thinking about the practices instantiated within commercial web search. The sociologist and media studies scholar Jessie Daniels argues that it would be more potent and historically accurate to think about White supremacy as the dominant lens and structure through which sense-making of race online can occur. In short, Daniels argues that using racial formation theory to explain phenomena related to race online has been detrimental to our ability to parse how power online maps to oppression rooted in the history of White dominance over people of color.

Often, group identity development and recognition in the United States is guided, in part, by ongoing social experiences and interactions, typically organized around race, gender, education, and other social factors that are also ideological in nature. These issues are at the heart of a “politics of recognition,” which is an essential form of redistributive justice for marginalized groups that have been traditionally maligned, ignored, or rendered invisible by means of disinformation on the part of the dominant culture. You cannot have social justice and a politics of recognition without an acknowledgment of how power — often exercised simultaneously through White supremacy and sexism — can skew the delivery of credible and representative information.

Because Black communities live in material conditions that are structured physically and spatially in the context of a freedom struggle for recognition and resources, the privately controlled internet portals that function as a public space for making sense of the distribution of resources, including identity-based information, have to be interrogated thoroughly. In general, search engine users are doing simple searches consisting of one or more natural-language terms submitted to Google; they typically search with only a few keywords rather than in a broad or deep manner, and they rarely look past the first page or so of search engine results. Search results as artifacts have symbolic and material meaning.

How pornification happened to “black girls” in the search engine

Typically, webmasters and search engine marketers look for key phrases, words, and search terms that the public is most likely to use. Tools such as Google’s AdWords are also used to optimize searches and page indexing on the basis of terms that have a high likelihood of being queried. Information derived from tools such as AdWords is used to help web designers develop strategies to increase traffic to their websites. By studying search engine optimization boards, I was able to develop an understanding of why certain terms are associated with a whole host of representational identities.

The pornography industry closely monitors the top searches for information or content, based on search requests across a variety of demographics. The porn industry is one of the best-informed industries, with sophisticated use of SEO. A former porn-industry SEO director has blogged extensively on how to elude Google and maximize the ability to show up on the first page of search results. Many of these techniques include long-term strategies to co-opt particular terms and link them over time, and in meaningful ways, to pornographic content. Once these keywords are identified, variations on them are created through what are called "long tail keywords." This allows the industry to have users "self-select" for a variety of fetishes or interests.

The U.S. dominates the number of pages of porn content, and so it exploits its ability to reach a variety of niches by linking every possible combination of words and identities (including grandmothers, as previously noted) to expand its ability to rise in the page rankings. The U.S. pornography industry is powerful and has the capital to purchase any keywords — and identities — it wants. If the U.S. has such a stronghold in supplying pornographic content, then the search for such content is deeply contextualized within a U.S.-centric framework of search terms. This provides more understanding of how a variety of words and identities that are based in the U.S. are connected in search optimization strategies, which are grounded in the development and expansion of a variety of “tails” and affiliations.

The information architect Peter Morville discusses the importance of keywords in finding what can be known in technology platforms — and draws attention to what cannot be found, by stressing the long tail phenomenon on the web. This is the place where all forms of content that do not surface to the top of a web search are located. Many sites languish, undiscovered, in the long tail because they lack the proper website architecture, or they do not have proper metadata for web-indexing algorithms to find them — for search engines and thus for searchers, they do not exist.

Such search results are deeply problematic and are often presented without any alternatives to change them except through search refinement or changes to Google’s default filtering settings, which currently are “moderate” for users who do not specifically put more filters on their results. These search engine results for women whose identities are already maligned in the media, such as Black women and girls, only further debase and erode efforts for social, political, and economic recognition and justice. These practices instantiate limited, negative portrayals of people of color in the media — a defining and normative feature of American racism.

Search engine design is not only a technical matter but also a political one. Search engines provide essential access to the web both to those who have something to say and offer and to those who wish to hear and find. The web reflects a set of commercial and advertising practices that bias particular ideas. Those industries and interests that are powerful, influential, or highly capitalized are often prioritized to the detriment of others and are able to control the bias on their terms.

Many people say to me, “But tech companies don’t mean to be racist; that’s not their intent.” Intent is not particularly important. Outcomes and results are important. In my research, I do not look deeply at what advertisers or Google are “intending” to do. I focus on the social conditions that surround the lives of Black women living in the United States and where public information platforms contribute to the myriad conditions that make Black women’s lives harder.

The nature of representation in commercial search as primarily pornographic for Black women is a distinct form of sexual representation that is commercialized by Google. Pornography is a specific type of representation that denotes male power, female powerlessness, and sexual violence. These pornographic representations of women and people of color have been problematized by many scholars in the context of mass media. Rather than offer relief, the rise of the internet has brought with it ever more commodified, fragmented, and easily accessed pornographic depictions that are racialized. Biased traditional media processes are being replicated, if not more aggressively, around problematic representations in search engines.

Google changed its algorithm in late summer 2012 and suppressed pornography as the primary representation of Black girls in its search results; by 2016, it had also modified the algorithm to include more diverse and less sexualized images of Black girls in its image search results, although most of the images are of women and not of children or teenagers (girls). However, the images of Black girls remain troubling in Google's video search results, with narratives that mostly reflect user-generated content engaging in comedic portrayals of a range of stereotypes about Black/African American girls. Notably, the White nationalist Colin Flaherty, whose work the Southern Poverty Law Center has described as propaganda to incite racial violence and White anxiety, is, at the time of writing, the producer of the third-ranked video representing Black girls.

Porn on the internet is an expansion of neoliberal capitalist interests. The web itself has opened up new centers of profit and pushed the boundaries of consumption. Never before have there been so many points for the transmission and consumption of these representations of Black women’s bodies, largely trafficked outside the control and benefit of Black women and girls themselves.

Reprinted with permission from NYU Press

