Twitter is giving the devil his due.
After a OneZero article about a Satan parody account whose owner felt his tweets were being unfairly hidden from public view, the company acknowledged late last week that a feature intended to limit trolls had been mistakenly affecting some legitimate accounts, including his. Twitter would not say how many other users were inadvertently caught up by its algorithm, which limits the public visibility of certain accounts based on their behavior and other signals, even if they haven’t broken the platform’s rules.
“Upon further investigation, we realize some of these systems were impacting people using Twitter in a healthy way and so we adjusted them,” a Twitter spokesperson said in a statement. “Thank you for surfacing this to us, we’re always working to improve.” The owner of the Satan account, a 26-year-old British man named Michael, confirmed to me that his account had been restored to good standing and that it had been rapidly gaining followers since.
The snafu highlights how Twitter’s efforts to automate troll-fighting can quietly go awry, with little recourse for those affected. And it isn’t the first time.
Twitter has been besieged for years by spam bots, porn bots, and human users who use the site to spew hate or harass people. Its human content moderators have proven unable to stem this tide of ugliness, so last year it tried something new: an automated filtering system. The goal of the system — which Twitter has never given a formal name — is to make the site feel friendlier and healthier by downgrading accounts that show signs of obnoxious or spammy behavior. It can downgrade them in different ways, such as burying their replies behind a warning at the bottom of threads, or preventing them from showing up in search results by default. Their followers can still see their tweets, but they become largely invisible to the average user. (When it launched, I dubbed it “Twitter purgatory,” a term that Twitter PR didn’t appreciate.)
Notably, Twitter doesn’t notify users affected by the system, and it won’t disclose its inner workings. That became a problem just weeks after its launch, when Vice reported that some prominent conservatives, including several Republican members of Congress, were not showing up in the platform’s search suggestions. Twitter quickly fixed the issue, but not before the report sparked an outcry and even Congressional hearings, in which Twitter was accused by conservatives of political bias — a charge it has repeatedly denied.
Vice called the practice “shadow banning,” and the term has stuck among Twitter’s conservative critics, even though it’s slightly misleading: The users affected don’t become invisible; they just become harder to find.
Last month, a series of direct messages from the popular Satan parody account (Twitter handle @s8n, 887,000 followers and counting) prompted me to look into why the company’s systems appeared to be hiding him from replies and search results. The account’s owner, Michael, was desperate, having spent weeks trying, without success, to contact anyone at Twitter who could tell him what was going on or why. Without commenting directly on his account, Twitter implied to me, at the time, that Satan had been limited because he was linked to other accounts that violated the platform’s rules — an accusation that Michael denied.
But it appears that story eventually prompted Twitter to take a closer look at how the feature was affecting certain accounts, including his. Two weeks after it was published, I got another direct message from Michael: “I have some good news, I’m pretty sure my account has been fixed.” A quick check confirmed that Satan once again showed up in public search results, and that his replies to other people’s tweets were no longer hidden behind a warning message at the bottom of the thread.
Twitter confirmed to me last week that there had been a mistake. Specifically, the company said it found that too many accounts were being flagged by an aspect of the software that looks for potential links to other spammy or abusive accounts. That is, accounts that hadn’t actually broken Twitter’s rules themselves could still find themselves filtered from public view through a kind of guilt by association. It seems that some of those links to other, more problematic accounts turned out to be rather tenuous, prompting Twitter to adjust the system.
Twitter said it believes the platform’s systems are now working properly, but that it continues to monitor them.
Still, the recurrence of a problem similar to the one that sparked the “shadow banning” outcry suggests that the company’s opaque filtering algorithm has a penchant for overreaching. And the lack of transparency means that such missteps can go unnoticed and unaddressed even as those affected cry out for help or an explanation. In both instances, Twitter fixed the system only after the problems were reported in the media, which happened only because they involved high-profile accounts.
The owner of at least one other popular parody account, a cat-centric account called @catsu with close to 900,000 followers, told me he believes it, too, has been unfairly affected by the search-suggestions filter. “I don’t think it’s particularly fair to hide my page from people searching,” he told me via direct message. “I mean, I literally just post cats!”
The owner of another, smaller account, @TweetOfSpirit, told me they too had been limited for reasons they didn’t understand. Both said they were still being affected, as far as they could tell. Twitter could not immediately comment on either one.
Michael, meanwhile, told me his Satan account had stagnated for months due to the filtering, but has taken off since returning to Twitter’s good graces. It has gained some 40,000 followers since the OneZero story was published on May 3. The additional engagement allowed Satan to help a friend reach a $6,000 GoFundMe goal to pay for a life-saving blood transfusion for her cat.
While Satan told me he appreciated Twitter getting the issue sorted out, he added that it was “a shame it took five months and an article to fix things.” At the risk of playing the devil’s advocate, I agree.