How to Keep Child Predators Out of Virtual Playgrounds Like ‘Fortnite’ and ‘Minecraft’

There’s no perfect fix, but experts on moderation, sexual exploitation, and internet law tell OneZero there is hope

Will Oremus
Published in OneZero
13 min read · Jan 16, 2020


Illustrations: Ana Kova

Online games that are wildly popular with kids, like Fortnite, Minecraft, Clash of Clans, and Roblox, have become hunting grounds for pedophiles. Recent reporting suggests that at least thousands of kids, and perhaps far more, are being groomed, cajoled, tricked, intimidated, or blackmailed into sending sexual images of themselves or others to predators who trawl gaming platforms and chat apps such as Discord for victims.

While there is often an element of moral panic at play when a new trend or technology poorly understood by adults is identified as a threat to children, an investigation by the New York Times made clear that the problem is real, widespread, and growing. If you're still skeptical, a firsthand account from an employee at Bark, a startup that develops parental controls for online apps, describes what it's like to hunt predators and illustrates just how pervasive the problem is on social media and messaging platforms. It's impossible for a parent to read these stories without coming away alarmed about kids' safety and well-being in the world of online gaming.

What the reporting so far has not made fully clear, however, is what can actually be done about the threat. Most of the companies mentioned in the Times story — including Roblox, Discord, Sony (maker of PlayStation), and Microsoft (maker of Xbox and Minecraft, among others) — pointed to at least some measures they have put in place to protect underage users, but few could demonstrate meaningful success. Nearly every approach discussed had obvious shortcomings. And Epic Games, maker of Fortnite, didn't respond to requests for comment from either the Times or OneZero. Supercell, maker of Clash of Clans, did not respond to OneZero, either.

Experts on content moderation, online sexual exploitation, and internet law told OneZero that there is hope for addressing the problem in meaningful ways. It just won’t be easy — and some argue it won’t happen without changes to the bedrock law that shields online platforms from many forms of liability.
