Facial Recognition Makes Changing Your Name Pointless
Whatever precautions I’ve felt necessary to protect my privacy, hiding my face from the world has never been one of them
Almost 20 years ago, I started a new life for myself. I was 18, in college, and newly enamored with a trend known as “indie porn” — and I jumped headfirst into exploring the scene, becoming a part of a vibrant online community that was challenging assumptions about what erotic media could be.
For all my enthusiasm, I was very aware that my interest in pornography — however feminist, thoughtful, and social justice-minded it might have been — wasn’t particularly on-brand in my other life as an Ivy League college student. But that didn’t feel like too much of a hurdle to overcome: All I had to do was just pick a different name, one that would separate my online self from the woman carving out a life in college.
Throughout my adult life, I’ve used different names to explore different parts of myself. In my indie porn days, I was Lux Nightmare; as a roller derby skater, Joey Hardcore was my name. And then, of course, I built a career as the writer Lux Alptraum. All of these women have lived separate lives, bleeding together only when I wanted them to. And yet they’ve all shared the same face. Whatever precautions I’ve felt necessary to protect my privacy, hiding my face from the world has never been one of them.
But all these years after that initial rebranding, I’m less secure, less safe, and less in control of my identity than ever before. In mid-January, the New York Times published a profile of Clearview AI, a company whose facial recognition technology helps ID unknown people through their online images. The practice has the potential to, as journalist Kashmir Hill put it, “end privacy as we know it.” But it’s not just privacy I’m worried about losing, in the sense of wanting to keep certain things out of the public eye. I’m terrified that the ability to present myself to other people on my own terms, the ability to control how much information I’m sharing about myself when I put myself out in the world, has been utterly eradicated.
Most of us are aware of the concept of “code-switching,” which describes our tendency to present different versions of ourselves in different situations. Even people whose identity shifts aren’t as extreme or as bifurcated as my own nevertheless understand that there are different selves that we reveal in different contexts and communities. The recent Dolly Parton Challenge meme, in which participants share four photos that represent the radically different ways that they present themselves on LinkedIn, Facebook, Instagram, and Tinder, perfectly encapsulates this concept. We all want the ability to be slightly different people in different parts of our lives. But that freedom to self-define and redefine the self is directly at odds with the world that major tech companies have built for us.
The internet that I valued in my youth, the one that let me iterate endless identities without worry that they’d automatically be linked to one another, is directly at odds with the infrastructure of the modern internet, which places a high premium on being able to track us as we move across the online world and out in the real one, connecting our various iterations into one cohesive unit. It’s not merely because corporations want to monetize our data or better tailor their advertising (although it is partly because of that). It’s also because keeping tabs on us is one of the best ways these organizations can think of to promote safety and crack down on abuse.
“There’s a very valid argument to be made that [tracking users] is necessary for the prevention of abusive behavior,” says a senior product strategist who has worked for major tech companies and asked that his name be withheld. In his work, he’s seen the way that companies are drawn to personally identifiable information as an abuse prevention strategy. An abusive user who’s identified only by an email address can easily create a brand new email address once they’ve been banned from your service. It’s much harder, however, to create a brand new legal name — indeed, the ability to monitor abusers has been one of the justifications that Mark Zuckerberg has put forth to defend Facebook’s real name policy.
The flip side of this strategy is that every mechanism that makes it easier to track bad actors — whether it’s a real name policy on Facebook, the numerous sites online that require a Facebook account for login, or law enforcement using Clearview AI’s facial recognition software to crack cases and track down suspects — also makes it easier to track and follow people who are vulnerable to harassment and abuse.
It is not surprising that in discussions of technology like Clearview AI, this risk is often overlooked. “The people who are most contributing to tech right now are contributing with hetero, cisnormative, often white perspectives,” says Morgan Klaus Scheuerman, a PhD student in information science at the University of Colorado Boulder whose research focuses on the ways that technology design can harm transgender people and other marginalized populations. Although major tech companies are slightly more diverse than they were a few years ago, white people still dominate the field, and women make up barely a quarter of employees at companies like Apple, Google, Amazon, and Microsoft.
It’s not just that the people at these companies aren’t representative of the U.S. at large. As a generally privileged group, they’re also less susceptible to stigma and abuse, and thus less likely to understand the threats posed by the inability to escape being tracked.
“There’s this idea that if you have nothing to hide… you shouldn’t be concerned about [tracking],” says Scheuerman, noting that this view ignores the realities of what it means to go through life as a marginalized person. And as tracking systems become harder and harder to escape, shifting from easily abandoned email addresses to names to the very faces that define who we are, our ability to control our own identities slips through our fingers like grains of sand.
My penchant for swapping names depending on my circumstances may seem like a frivolous one, something worth sacrificing if it means gaining the ability to solve major crimes. But for many people, the ability to operate under a different name can come with far higher stakes — like escaping an abusive partner, or protecting one’s children from the stigma of one’s work or personal pastime, or being able to engage in protest movements without fear of being immediately outed and identified. If software like Clearview AI’s is integrated into our everyday lives, many more of us will lose the ability to determine who we are in each and every moment, to decide for ourselves who gets to see which part of us in any given situation. And that’s something we shouldn’t be willing to give up.