‘Among Us’ Has a Moderation Problem

The game’s sudden popularity caught the small development team by surprise — and now there’s a lot of work to be done

Eric Ravenscraft
OneZero


Photo: IGDB

At any given moment, tens of thousands of people are playing Among Us on Steam, the largest platform for PC gaming. Many more join them on mobile phones and Nintendo Switches. All told, half a billion people reportedly play the game every month. It’s a sprawling player base built largely around text interactions — you complete tasks as a team of colorful astronauts, convening every now and again to root out impostors attempting to sabotage your mission.

It’s also almost entirely unmoderated. Innersloth — the four-person team behind the game — is working with limited resources to build crucial moderation tools and keep bad actors out. Despite promises that such tools would be implemented, Among Us currently contains no basic reporting mechanism or blocking tools to keep gamers — especially teenagers and children — safe.

Online multiplayer games from publishers with larger staffs and more years of experience tend to use robust moderation tools to prevent abuse on their platforms (with varying levels of success). Roblox uses an automated system to filter harsh language and intervene when it appears that a player is trying…


Eric Ravenscraft is a freelance writer from Atlanta covering tech, media, and geek culture for Medium, The New York Times, and more.