Pinterest, a website best known for inspiration boards, has also been hosting sexualized photos of young girls, conspiracy theories, and white supremacist propaganda, a OneZero investigation has revealed.
In the past, the social media platform has been praised for its proactive moderation efforts, but OneZero found numerous examples of harmful content that not only evaded Pinterest’s abuse filters but was actively recommended by its search algorithm. These findings underscore the risks of the site’s moderation approach, which “hides” dangerous material rather than removing it from the platform.
“We want Pinterest to be a place for inspiration and that means we need to be deliberate about creating a safe and positive space for our users,” a Pinterest spokesperson told OneZero. “If people find content that shouldn’t be on Pinterest, we encourage them to report it to us. We rely on the reports from users to evolve our standards as well as using proactive tools to find and remove content. Additionally, we seek advice from outside experts on how we might improve our policies and enforcement.”
For years, Pinterest has sought to distance itself from other platforms by insisting it isn’t a social network, but “a catalog that’s hand-picked” for users, according to CEO Ben Silbermann. While Facebook and Twitter have opted to combat disinformation by removing millions of harmful posts and accounts, Pinterest chose another route, launching a search ban on “polluted content” in 2018 that started with anti-vaccination terms, cancer cures, and other health misinformation.
Instead of completely eliminating this type of content, Pinterest’s search engine blocks results for keywords likely to produce misinformation, according to the Wall Street Journal. These efforts expanded last year, when Pinterest began serving verified health facts in response to vaccine-related searches (a move that echoed Facebook’s and Twitter’s approach, The Guardian reported) and banned hundreds of URLs and groups related to vaccine conspiracy theories. It similarly restricted “plantation wedding” content last year after civil rights advocates called on the company to stop glorifying the heritage of slavery, according to a BuzzFeed News report.
Now, a Pinterest search for “vaccine” returns a disclaimer about medical hoaxes, and surfaces dozens of pins linking to helpful information from the World Health Organization and Centers for Disease Control. Users looking for “plantation wedding” ideas may find a warning that “people have reported Pins from this search,” meant to discourage them from thoughtlessly engaging with the material. As OneZero reported last October, Pinterest also developed a recommendation algorithm to improve its homepage experience.
These decisions have resulted in considerable goodwill for the company. Although Pinterest has been in the news recently over alleged discriminatory, retaliatory, and racist remarks by managers within the company, it has so far avoided Facebook-level scrutiny. Instead, the company has received positive coverage for its moderation and curation efforts from outlets like OneZero, The Guardian, and other tech publications.
“Since the reckoning over social networks began in 2016, a popular genre of content has emerged that I like to call Hey, These Search Results Are Bad,” The Verge wrote last year. “Clearly, certain subjects… lead to more stories about bad search results than others. And so I was delighted to see that Pinterest had taken note of this phenomenon — and taken a surprisingly bold step to protect against it.”
Pinterest’s moderation approach isn’t foolproof, however. OneZero found numerous pins in clear and potentially harmful violation of Pinterest’s community guidelines. On Thursday, Caroline Orr, a reporter at the National Observer, also tweeted images of BDSM products next to pins of children’s furniture. Although sometimes hidden from Pinterest users, this material was often easily discoverable elsewhere online.
Some problematic content is organically served to Pinterest users through recommendations based on their browsing behavior. But much of this content can be located through searches conducted while logged out of the platform, and on search engines like Google.
“We block results entirely if we believe they are more likely to be unsafe, cause significant harm or pose an imminent danger,” Pinterest’s spokesperson said. “This blocking happens at the search query level, and doesn’t result in the removal of the underlying content, since not all of the results may violate our policies.”
Pinterest does not allow searches on its homepage until a user has logged in. But this restriction can be circumvented by navigating to an existing user page, such as Pinterest’s own account, which contains a search bar. Google search offers another workaround, as it indexes a major archive of Pinterest content. Boolean search strings (combinations of keywords and operators that narrow web search results) can reveal a host of content that might otherwise be caught in Pinterest’s keyword filters. Although most people may not browse Pinterest this way, these loopholes expose the potential risk of Pinterest’s decision to mask, rather than delete, harmful content.
Pinterest told OneZero that logged-out searches “should feature the same blocking and redirection to authoritative resources as logged-in searches.” A logged-out search for “coronavirus” directs users to vetted medical resources, for instance. The company said it also works with search engines to expedite the removal of content that has been removed from Pinterest, but clarified that some harmful material may fall through the cracks.
“While we work hard to identify and take action on this content so it won’t be discoverable, some violating content will sometimes appear in search engine results, even in instances where we have requested that it be removed,” Pinterest’s spokesperson added.
Using these search methods, OneZero discovered sexualized images of young girls hosted on the platform, some of whom appeared to be children. “Sexualization or sexual exploitation of minors, like grooming, sexual remarks, or inappropriate imagery” is prohibited by Pinterest’s guidelines. OneZero did not confirm the ages of the individuals in these photos, which were posted under a known code name for child pornography. Regardless, OneZero has reported the images to Pinterest and the National Center for Missing & Exploited Children.
The Department of Justice notes that generic terminology is often used to conceal child pornography and “prolong [its] existence” online. In 2013, Utah’s Internet Crimes Against Children Task Force — part of a national network of agencies working to prevent child exploitation online — began investigating 12 cases of people posting child pornography to Pinterest boards after the company reported the material to the National Center for Missing and Exploited Children. As of May 2013, the last time these investigations were publicized, charges in those cases were pending, and the Internet Crimes Against Children Task Force did not immediately respond to OneZero’s questions about their current status.
In addition, OneZero found pins promoting a range of medical misinformation: claims that vitamins can prevent cancer, that 5G causes the coronavirus, and propaganda for the “medical freedom” anti-vaccination movement. This includes links to personal health blogs, a YouTube video falsely claiming that Bill Gates is responsible for the coronavirus, anti-vaccination merchandise, and entire Pinterest boards dedicated to the myth that vaccines cause autism, despite the company’s hard-line stance against medical misinformation. Earlier this year, Pinterest rolled out custom search results to combat hoaxes about the coronavirus, and the platform says it does not allow users to save (and thereby amplify) certain health advice. While these restrictions may shield users from seeing certain material on Pinterest itself, the company is still hosting that material, which means it can be encountered through a Google search. A Google Images search for “5G” and “coronavirus” content hosted on Pinterest reveals a deluge of images and links to YouTube videos that propagate this particular myth.
OneZero also found YouTube videos, SoundCloud streams, and merchandise on Pinterest promoting QAnon ideology. Facebook has removed QAnon-affiliated pages for “coordinated inauthentic behavior,” but this type of content remains readily available on Pinterest. While the company appears to have blocked the acronym “WWG1WGA” (referring to the QAnon slogan “Where we go one, we go all”), it has not blocked the alternative “WWG1WGALL,” which Pinterest’s search bar actually autocompletes when a user is logged in. Pinterest also hosts books and imagery related to Tempel ov Blood, a group associated with the Order of the Nine Angles, a Satanic, neo-Nazi organization linked to threats of extremist violence.
In its guidelines, Pinterest prohibits “Support for hate groups and people promoting hateful activities, prejudice, and conspiracy theories.” However, a significant amount of white supremacist and far-right content has been shared on the platform, ranging from racist commentators to anti-Black and Islamophobic imagery to anti-Asian “kung flu” memes. For example, numerous pins link to videos from Stefan Molyneux, a Canadian podcaster who routinely espouses “race science” and other white supremacist ideology. Molyneux was recently banned by YouTube and SoundCloud for hate speech and suspended by Twitter for spam and platform manipulation.
Pinterest’s recommendation algorithm also appears to elevate racist content. After OneZero viewed a board containing anti-Black and pro-Ku Klux Klan pins (while logged into an account), our homepage surfaced a link to an article glorifying the Confederate flag and pins of Confederate flag imagery. As OneZero has previously reported, Pinterest’s algorithm allows users to control what appears on their homepage by “turning off” boards, histories, and topics that influence these personalized feeds. For power users, this is a useful step, but casual users may be less inclined to customize their experience, and there are other ways that Pinterest places harmful content at people’s fingertips.
“Generally speaking, we limit the distribution of or remove hateful content and content and accounts that promote hateful activities, false or misleading content that may harm Pinterest users or the public’s well-being, safety or trust, and content and accounts that encourage, praise, promote, or provide aid to dangerous actors or groups and their activities,” Pinterest’s spokesperson said of the company’s guidelines. Pinterest does not have specific policies regarding hyperpartisan content, however, which may or may not fall under the aforementioned rules.
Mike Caulfield, a digital literacy expert at Washington State University Vancouver, has documented problematic content on Pinterest over the years. In 2018, he showed how a search for cooking “the perfect egg” led to pins about fringe conspiracy theories within 14 minutes of browsing. Caulfield theorized this was due to the way Pinterest uses boards to construct a model of a user’s various interests. Indeed, users have gamed this system by sharing benign content alongside more extreme material, or, as The Daily Beast reported in 2018, pancake recipes next to QAnon memes. When OneZero viewed a pin linking to a video about 5G and the coronavirus, Pinterest recommended related content including “deepstate secret kill lists,” “Satanic elite bloodlines,” and biological warfare.
Pinterest’s autocomplete feature also nudges logged-in users toward troubling content. Typing “Chinese virus” autocompletes with “memes” and “jokes.” Meanwhile, an entry for “AOC” suggests “stupid memes.” Pinterest is aware of these effects. During the 2016 presidential election, it was accused of proliferating disinformation that originated on other platforms, such as Facebook and Twitter. It was again blamed for hosting polarizing and hyperpartisan memes during Canada’s 2019 federal election. The Atlantic Council’s Digital Forensics Research Lab, which studies disinformation online, revealed that “by clicking on only a single hyperpartisan and often hostile meme, the platform would recommend other politically intense memes.” Clicking on a single anti-Justin Trudeau meme led to a much larger network of far-right conspiracy theories.
The upcoming presidential election will be a stress test for Pinterest’s algorithms. Even now, a search for “election fraud” surfaces content that overwhelmingly supports false claims about mail-in ballots.
“Political content is not a popular use case on Pinterest, and we don’t want to become the go-to place for political news and debates,” Pinterest’s spokesperson said. The platform does not allow political campaign ads, and says it’s working to address “adversarial behavior.” Earlier this year, Pinterest also updated its policy to ban misinformation about voting and the census, reported the Washington Post.
In the past, Pinterest has sought advice from companies like Storyful, a social media intelligence firm, which helped inform its decision to block certain cancer-related searches. As OneZero previously reported, Pinterest has also welcomed the expertise of researchers like Caulfield, whose findings led to additional measures for combating vaccination misinformation.
Will Oremus contributed reporting to this story.
Have a tip about Pinterest? You can contact Sarah Emerson securely on Signal at +1 510 473 8820, or email email@example.com.