On a good day, Blake’s job as a Pinterest moderator meant clicking through an endless dragnet of fetish material, adult nudity, and “sexy lady” miscellany. Hide, delete, strike, or ban? These decisions took mere seconds. In fact, the pornography label was considered so straightforward that new hires were reportedly trained on it.
But one day in 2017, Blake discovered something horrifying. Within a network of sexual child images sat an innocently named Pinterest board. “Something like ‘My Pics,’ or ‘My Computer,’” Blake told OneZero. The board was set to secret, visible only to its owner. It contained photos of an infant and a toddler, “nothing violating in any sense, but photos you’d take if you were babysitting,” Blake says. But because the board orbited other accounts that posted child abuse content, Blake suspected these children were at risk of sexual exploitation. They escalated the issue to a manager, asking if Pinterest should investigate the user’s IP or report the photos to law enforcement.
But to their dismay, Blake said they were told that “no imminent threat” was present. “I remember being like, ‘What the fuck.’” The account was ultimately banned for violating Pinterest’s policy against inappropriate images of minors, yet “no external or even further internal actions were taken.”
OneZero spoke to three former Pinterest moderators, or trust and safety specialists, who worked at the company between 2016 and 2019. These individuals served as frontline workers, keeping harmful content at bay. By their accounts, they were given inadequate resources and minimal mental health support. They worked full-time as contractors on Pinterest’s Trust and Safety team, earning between $25 and $32 per hour. All three signed nondisclosure agreements (NDAs) with Pinterest, and for that reason OneZero is referring to them by pseudonyms.