We Must Fight Face Surveillance to Protect Black Lives

An urgent letter from the Algorithmic Justice League

Joy Buolamwini
OneZero


re(Sisters) Unite. Photo: $han

The Algorithmic Justice League is an organization that combines art and research to illuminate the social implications and harms of artificial intelligence. Our mission is to raise public awareness about the impacts of A.I., equip advocates with empirical research to bolster campaigns, build the voice and choice of the most impacted communities, and galvanize researchers, policymakers, and industry practitioners to mitigate A.I. bias and harms. More at https://ajlunited.org.

AJL Family,

We are holding space to grieve, to mourn, and we are also full of righteous anger. The murders of George Floyd, Breonna Taylor, Ahmaud Arbery, Nina Pop, and Tony McDade are only the latest in what feels like an endless chain of police and vigilante violence against Black men, women, children, trans folks, and nonbinary people.

Lines from Joy Buolamwini’s poem “Pressure on the Neck”

At the same time, we recognize the intense power and possibility in the massive wave of multiracial mobilizations that is sweeping the country, even in the midst of the pandemic and in the face of brutal police repression. People everywhere are organizing to demand structural transformation, investment in Black communities, and deep and meaningful changes to policing in the United States. We know that criminal justice reforms alone are not enough to transform our society after hundreds of years of slavery, segregation, overpolicing and mass incarceration, disinvestment, displacement, and cultural, economic, and political erasure, but they are an important piece of the puzzle.

Yet even now, as we mobilize for Black lives, local, state, and federal police — as well as other agencies such as the Drug Enforcement Administration (DEA), Customs and Border Protection (CBP), Immigration and Customs Enforcement (ICE), and various U.S. military forces — are deploying a wide range of surveillance technologies to collect, share, and analyze information about protesters. Many people understand that mobile phones double as surveillance tools. But law enforcement agencies are also gathering photo and video documentation of protesters by filming them with body cameras, smartphones, video cameras, and drones; using both government and commercial software systems to scrape social media for photos and videos; gathering footage from CCTV systems and media coverage; and more. Many police departments, including the Minneapolis police, then analyze that footage — both in real time and in the days and weeks after protests — and use facial recognition technology to attempt to identify individuals.


Face surveillance, the use of facial recognition technology for surveillance, thus gives the police a powerful tool that amplifies the targeting of Black lives. In addition, performance tests, including the most recent gold-standard government study by the National Institute of Standards and Technology, have found that many of these systems perform poorly on Black faces, echoing earlier findings from the Algorithmic Justice League. So not only are Black lives more subject to unwarranted, rights-violating surveillance, they are also more subject to false identification, giving the government new tools to target and misidentify individuals in connection with protest-related incidents.

We are in the midst of an uprising of historic magnitude, with hundreds of thousands of people already participating and potentially millions taking part in the days and weeks to come. At these scales, even small error rates can result in large numbers of people mistakenly flagged and targeted for arrest. Since many of these systems have demonstrated racial bias with lower performance on darker skin, the burden of these harms will once again fall disproportionately on Black people, further compounding the problem of racist policing practices and a deeply flawed and harmful criminal justice system.
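To make the scale concern concrete, here is a minimal back-of-the-envelope sketch. The error rate and crowd size below are hypothetical placeholders, not measurements of any particular system; they simply illustrate how a seemingly small false match rate translates into many misidentified people when applied to footage of a mass protest.

# Hypothetical illustration: how a small false match rate scales with crowd size.
# These numbers are placeholders, not measurements of any real system.

def expected_false_matches(num_faces_scanned: int, false_match_rate: float) -> float:
    """Expected number of people incorrectly flagged as a match."""
    return num_faces_scanned * false_match_rate

# If footage from a single large protest yields 100,000 distinct faces scanned
# and the system falsely matches 1% of them, that is 1,000 people wrongly flagged.
print(expected_false_matches(100_000, 0.01))  # -> 1000.0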

Police are deploying these increasingly sophisticated surveillance systems against protesters with little to no accountability and too often in violation of our civil and human rights, including First Amendment freedom of expression, association, and assembly rights. These harms disproportionately impact the Black community based on historical patterns of discrimination and overpolicing by law enforcement. Such patterns already lead to more frequent stops and arrests on the lesser standard of reasonable suspicion; compounded with other forms of discrimination against Black lives in the criminal justice system (by judges, prosecutors, and risk assessment tools, among others), they further lead to higher incarceration rates, lost job and educational opportunities, loss of livelihood, and loss of life.

The Algorithmic Justice League therefore urges that we rein in government surveillance of Black communities in general and police use of face surveillance technology specifically. We have gathered resources to help organizers include demands to halt police use of face surveillance technology within broader campaigns for racial justice at municipal, state, and federal levels.

We are calling on every community, organization, and politician who is serious about racial justice to specifically include a halt on police use of face surveillance technology, among other broader and sorely needed transformational policies.

This may seem daunting, but the tide is rising. In cities and states across the country, people have organized to successfully block the rollout of face surveillance technology:

On the municipal level, San Francisco became the first city to ban government use of facial recognition technology in 2019, with Oakland and Berkeley following suit. In Massachusetts, Somerville, Brookline, Northampton, Cambridge, and Springfield have successfully halted government use of this technology. Next week, Boston is poised to join the list. This progression shows how successful advocacy in one city can have a domino effect, influencing change in neighboring communities.

On the state level, California has enacted a three-year moratorium prohibiting police from using facial recognition with body cameras, which went into effect in January 2020. New York is currently considering similar legislation prohibiting facial recognition in connection with officer cameras and has also introduced legislation proposing a moratorium on all law enforcement use. Massachusetts is actively considering a broader moratorium on all government use of facial recognition, which would cover use by police as well as by other state agencies and officials.

On the federal level, since May 2019 the House Committee on Oversight and Reform has held three public hearings on facial recognition technology (linked below) to examine how the technology impacts our rights and to highlight its discriminatory impact on Black lives. At the first hearing, Neema Singh Guliani of the American Civil Liberties Union (ACLU), Claire Garvie of the Center on Privacy and Technology at Georgetown, David A. Clarke School of Law professor Andrew Ferguson, and Dr. Cedric Alexander, former president of the National Organization of Black Law Enforcement Executives, all testified along with Algorithmic Justice League founder Joy Buolamwini, who stated:

These tools are too powerful, and the potential for grave shortcomings, including extreme demographic and phenotypic bias, is clear. We cannot afford to allow government agencies to adopt these tools and begin making decisions based on their outputs today and figure out later how to rein in misuses and abuses.

Following this series of hearings, Congress is currently considering several initiatives that would limit the use of facial recognition technology, including legislation that would place limits on the use of face surveillance by federal law enforcement agencies.

Now is the time to build on the momentum of these successful initiatives.

Given the extent to which police power has been militarized and systematically weaponized against Black lives, it is more imperative than ever that we ensure that law enforcement cannot deploy face surveillance technology to suppress protests or infringe on civil rights and liberties.

If you have a face, you have a place in this conversation. The people have a voice and a choice, and we choose to live in a society where your hue is not a cue for the dismissal of your humanity. We choose to live in a society that rejects suppressive surveillance.

We choose to beat the drum for justice in solidarity with all who value Black lives.

Joy Buolamwini, Aaina Agarwal, Nicole Hughes, and Sasha Costanza-Chock for the Algorithmic Justice League

Press contact: comms@ajlunited.org

Resources

Educational materials

Model legislation

Congressional hearings on facial recognition technology

Ongoing campaigns

Organizing toolkits
