Dr. Timnit Gebru, a leading A.I. researcher, was let go from her job at Google earlier this week. This response is written by Joy Buolamwini, the founder of the Algorithmic Justice League, a group focused on equitable and accountable A.I.
Before the headlines, the covers, the blockbuster papers, the awards, and Coded Bias, the feature-length film that glimpses our friendship, Dr. Timnit Gebru, Deborah Raji, and I locked arms in sisterhood. This was a sisterhood formed knowing that as outsiders in academic institutions and emerging researchers exploring the limitations of artificial intelligence, we would need each other.
This week Timnit was ousted from Google for demanding research integrity, and Deborah was featured in the 2021 Forbes 30 Under 30. These cases are examples of how, as highly visible and accomplished Black women, we live at the intersections of privilege and oppression, praise and evisceration. The contrast between the Forbes recognition and Google’s Gebrugate reminded me of how we were attacked by Amazon for showing that they, like their peers, sold biased A.I. products, despite the impact and recognition of our prior research.
Achievement and acclaim, we have learned many times over, do not provide immunity to racism, sexism, misogynoir, intimidation, censorship, or haterade.
The spotlight both shines and burns, and it is imperative that we rally behind one another at all stages of our careers. I am forever indebted to Timnit, who showed up for me long before others understood the value of my work. As I wrote on Twitter, she has always had my back.
When I had the experience of coding in a white mask to have my face detected and decided to change research directions to explore algorithmic bias, it was then that Timnit, at the time a late-stage PhD candidate at Stanford, reached out to me. She gave me the encouragement and validation to continue my exploration, even as some colleagues around me at MIT were indifferent to the work or questioned my abilities. “Are you sure you want to do that kind of work?… It involves a lot of math.” This kind of questioning seemed to dismiss my degree in Computer Science from Georgia Tech, earned with highest honors. Why should I be afraid of math?
I repeatedly heard, “Why are you focusing on dark-skinned faces?” “Why Black women?” These statements revealed how unusual it was to some of my colleagues to center people of color and Black women in particular in A.I. research. Why not center people like me, people like Timnit, people like Deborah?
When I connected with Timnit, she made me feel both seen and heard. Situated at one of the leading computer vision labs in the world, she took me under her wing and helped me refresh the computer vision knowledge I had gained as an undergraduate. She didn’t question my abilities. She expanded them. She didn’t approach my ambitions with doubt. She became first an intellectual companion and soon a dear friend. I am not alone.
Along the way, Deborah (whom I call Agent Deb) reached out to me on Facebook after seeing my TED Talk. She shared her emerging interest in issues of bias and representation in computer vision, especially as she dove into datasets and saw what I was seeing: pale male data being used to represent all of society. Having just experienced Timnit’s generosity as a master’s student, I found it easy to pay it forward and mentor Deb, then an undergraduate. Agent Deb and Timnit were invaluable supporters as I brought my MIT master’s thesis to completion in the fall of 2017. It was a joy to co-author Gender Shades (JB & TG, 2018) and then Actionable Auditing (DR & JB, 2019) with these Face Queens, building upon my thesis concepts, methodology, dataset, and analysis. Beyond talking about algorithmic bias and gawking at datasets, we also showed up to talk each other off the ledge.
“Joy, you can be a poet and do computer science. Don’t let their laughter discourage you.”
They were right. I went on to produce works like “AI, Ain’t I A Woman?,” featured in art exhibitions around the world; sign a book deal with Penguin Random House; and combat symbolic annihilation through ad campaigns with iconic brands like Levi, Apple, and Olay, all while doing impactful research, influencing legislation, and founding the Algorithmic Justice League.
“Timnit, you have to finish your PhD. You are so close, and the field needs you.”
Dr. Gebru is an all-star researcher whose pioneering work on Datasheets for Datasets and Model Cards for Model Reporting, along with our joint work on facial analysis technology and much more, has shifted how entire industries and research communities approach artificial intelligence. As a co-founder of Black in AI, she continues to pay it forward by ensuring there are meaningful opportunities for Black people, who are often marginalized in A.I.
“Deb, don’t take their criticisms and dismissals as anything more than jealousy. You killing it!”
At 24, Agent Deb has co-authored some of the most visible and impactful papers on algorithmic auditing. Before finishing her undergraduate degree, she was the lead author of the Actionable Auditing paper that put Amazon on notice. We had so much fun working together on the Black Panther Scorecard, and we continue to collaborate as she leads algorithmic harms research for AJL’s CRASH project.
I have been so fortunate to share the journey of working towards algorithmic justice with Timnit and Agent Deb.
Our sisterhood demonstrates that we can choose collaboration over competition.
We can choose to affirm each other when others fail to acknowledge our worth.
We do not have to cower to institutions.
And as brilliant Black women, we will always own our power.
We encourage you to do the same.