Facebook Groups Can Be Fixed

The company must find a solution that balances discussion and dialogue with the need to protect users and advertisers from hate speech

Photo: SOPA Images/Getty Images

This week, Sheryl Sandberg announced that Facebook would publish the results of its long-awaited civil rights audit. According to a post by Sandberg, the two-year-long audit process “has had a profound effect on our culture and the way we think about our impact on the world.”

Facebook’s system of closed Groups is central to that impact. Since the company announced its pivot to privacy in 2019, Groups have emerged as one of the platform’s most popular — and most touted — features. In a best-case scenario, they preserve some of the weird, wonderful spirit of the early internet. In a worst-case scenario, they serve as bastions of bigotry, racism, and white supremacy — and as a convenient way for Facebook to learn what you care about, and use that knowledge to hit you up with targeted ads.

“Can anyone sex my duck?”

In most contexts, that kind of statement would attract raised eyebrows, and possibly a call to the authorities. But on the BackYard Chickens Facebook group — which has 513,000 members and attracts north of 1,000 posts per day — it’s standard fare (the original poster was looking to determine if their baby duck was male or female).

Many people have responded to the Covid-19 pandemic with a newfound focus on the domestic — growing their own food, baking bread, and even embracing home organization. My family already did all those things pre-pandemic. So in an attempt to up the ante — and to help my three-year-old through months spent in isolation — we bought Anna and Elsa, two four-week-old chickens.

I had chickens growing up before they had the hipster-chic appeal they do today. Back then, there were no chicken diapers or $10,000 designer coops. And if you wanted to learn how to care for your flock, you referenced printed tomes like Chickens in Your Backyard, with drab covers and prose written for farmers, by farmers.

Today, things are different. Any time of the day or night, you can turn to a Facebook group like BackYard Chickens for answers to all your poultry-related queries. Want to know the best kind of chicken water dispenser? Concerned you mistakenly purchased a rooster? The group’s members are all too happy to offer guidance, often in intense detail.

There’s a strange, somewhat meditative appeal to these groups. At a time when filter bubbles bombard you with highly spun political news, and every other article seems to share some horrifying update about Covid-19, it’s oddly comforting to start a vigorous, friendly debate among strangers about what breed of chicken you’ve purchased (Anna is almost certainly a Barred Rock, but the jury is still out on whether Elsa is an Orpington or an Americana).

In many ways, Facebook’s Groups hearken back to an earlier, seemingly more innocent time in the internet’s history. I’m too young to have experienced Usenet groups. But I clearly remember the early days of AOL, when using the internet meant playing in the company’s walled garden of chat rooms and interest groups.

As the internet shifted in the 2000s — first toward private websites, and then toward social media — much of the funky, grassroots appeal of its early days was lost. Services like Twitter and Facebook prioritized public, announcement-style speech over private discussions and dialogue.

In the late 2010s, however, there was a major backlash against the internet’s new direction. Scandals like Cambridge Analytica’s Facebook-mediated meddling in the 2016 presidential election highlighted the extent to which social networks had been overwhelmed with sponsored, polarized, and sometimes entirely fake content. Mark Zuckerberg’s grilling by the Senate was a highly visible expression of this collective angst and made Big Tech’s maleficent side a topic for dinner table discussion.


Against that backdrop, Facebook announced a major overhaul of its platform. The company would refocus its efforts on content from friends and family and place more emphasis on private discussions. In Zuckerberg’s own words, Groups would be “at the heart of the experience.” The company saw them as embodying the spirit of genuine, person-to-person communication and as a way to lure users back to its platform.

Facebook began to push Groups aggressively. In 2019, the company launched its More Together ad campaign, with Groups as a major focus. The campaign emphasized real groups, highlighting everyone from marathoners to dog lovers. Its ads featured smiling, glossy portraits of each group’s members, playing up their quirky, authentic, down-to-earth appeal.

At first, the campaign seemed to be working. By late 2019, the platform was enjoying consistent growth in both its user base and its revenues. But despite appearances, Facebook Groups are not always the well-meaning, wholesome places the company likes to depict. In the last month, major problems with Facebook’s Groups have begun to emerge. They’ve resulted in a flight of advertisers and businesses that could threaten the company’s very existence.

According to a major investigation by Wired, Facebook Groups have been exploited by entities ranging from white supremacists to purveyors of extreme political agendas, domestic terrorism, and health misinformation. Covid-19 added fuel to this fire, revealing how Groups are used to coordinate traffic to fake news websites and boost their reach. One report found that Groups were peddling fake suicide pills to people at risk of taking their own lives. Groups have also been shown to silence Black members and promote racist attitudes.

Wired’s conclusion was that Groups’ private nature is ultimately their undoing. Taking conversations out of the public eye removes an important element of scrutiny and accountability that’s otherwise present on competing networks like Twitter. Groups also lack moderation and curation, something present even in the early days of AOL’s chat rooms, when the company deployed armies of moderators to keep things civil.

According to a report in the Wall Street Journal, Facebook has known about these tendencies of the Groups platform since at least 2016. Only now — with a public outcry and a mass boycott by more than 400 advertisers tired of their messaging appearing beside conversations among Nazis and white supremacists — is the company beginning to act. Much of this may be driven by its own staff: according to a survey on the anonymous workplace network Blind, 42% of Facebook employees are concerned about the advertiser exodus. The company has started by removing more than 220 so-called “Boogaloo” groups, which it says incite violence.

The culling will almost certainly continue. Beginning in 2017, YouTube faced a similar advertiser exodus, which video creators like me still refer to as the Adpocalypse. Practically overnight, YouTube parent company Google cut millions of channels from its Partner Program and began using A.I. and human moderation to weed out inappropriate content.

It was a major shock to the millions of creators who rely on the platform for their income. But over time, the changes worked. Advertisers came back, and the removal of low-quality channels meant more revenue for professional creators. My own CPM, the rate I’m paid per 1,000 views of my videos, more than doubled.


Problems remain with YouTube’s new system. Creators who focus on political topics like LGBTQ rights risk being demonetized. And Google’s A.I. can be flawed, flagging videos like my review of a popular vacuum cleaner as inappropriate because they use words like “trigger” or include sudden noises that the platform mistakes for gunshots. But overall, the changes at YouTube have been successful, and they provide a blueprint that Facebook will likely follow for Groups.

Will it work? I’m dubious. YouTube is fundamentally a public-facing platform. Recovering from the Adpocalypse meant making tweaks to the platform’s criteria for acceptable content, not reconsidering its very reason for existence. Actually fixing Groups would require the latter. Why? Because even in the best of times, Groups don’t exist for their members. They exist for Facebook.

Unlike the chat rooms of the internet’s early days, every Group you engage with is monitored, and your Group memberships are recorded by the company. These engagements are then used as ever more grist in Facebook’s advertising mill, allowing the company to learn about your desires and preferences. It then uses this data to target ads to you.

Advertisers are even offered the option to target their ads to members of a specific group. If I create a chicken-related product, I can easily build a Facebook ad campaign that targets my advertisements to verified members of the BackYard Chickens group. I can drill down further, for example, targeting group members who live in California and are a certain age and gender.
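To make that drill-down concrete, here is a minimal, purely illustrative sketch in Python of an audience filter that narrows a list of users by group membership, location, age, and gender. The data structures, field names, and values are hypothetical; this is not Facebook’s actual advertising API.

```python
# Purely illustrative: a toy audience filter showing how group membership,
# location, age, and gender could be combined to narrow an ad audience.
# Field names and data are hypothetical, not Facebook's advertising API.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserProfile:
    user_id: str
    state: str                              # e.g. "CA"
    age: int
    gender: str                             # e.g. "female"
    groups: List[str] = field(default_factory=list)

def build_audience(users: List[UserProfile], group: str,
                   state: Optional[str] = None,
                   min_age: Optional[int] = None,
                   max_age: Optional[int] = None,
                   gender: Optional[str] = None) -> List[str]:
    """Return IDs of group members who match the optional demographic filters."""
    audience = []
    for u in users:
        if group not in u.groups:
            continue
        if state is not None and u.state != state:
            continue
        if min_age is not None and u.age < min_age:
            continue
        if max_age is not None and u.age > max_age:
            continue
        if gender is not None and u.gender != gender:
            continue
        audience.append(u.user_id)
    return audience

# Example: BackYard Chickens members in California, women aged 25 to 45.
users = [
    UserProfile("u1", "CA", 34, "female", ["BackYard Chickens"]),
    UserProfile("u2", "NY", 41, "male", ["BackYard Chickens"]),
]
print(build_audience(users, "BackYard Chickens", state="CA",
                     min_age=25, max_age=45, gender="female"))  # ['u1']
```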

Joined a Facebook Group to discuss a new sport or hobby during Covid-19? Expect to start seeing related ads within minutes or hours of joining. As of 2016, Facebook also used its targeting data to reach users with ads beyond its own platform, though it’s unclear whether it still does so. So if you start discussing Quidditch in a Facebook Group today, you might see ads for broomsticks and Bludgers the next time you read a news article or browse an unrelated blog.

This fundamental purpose of Facebook’s Groups — mining your interests for ad data — will make it hard for the company to reform, even if it weeds out the hate speech and other related dreck. Even when your Group memberships are entirely wholesome, they’re still primarily there to inform Facebook about your interests and help it sell you products.

Facebook could potentially change this aspect of Groups. Its WhatsApp platform already uses end-to-end encryption, which means that even Facebook can’t access the contents of the conversations taking place there. The company has promised to bring end-to-end encryption to more of its products (including Messenger) but has yet to do so. Even without encryption, Facebook could commit to not using Groups data to target ads. In 2017, Google made a similar move, committing to stop scanning Gmail conversations for ad targeting.
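For a sense of what end-to-end encryption means in practice, here is a minimal sketch using the open source PyNaCl library: only the two participants hold the private keys, so a relay server (the role Facebook’s servers would play) sees nothing but ciphertext. This is a conceptual illustration, not WhatsApp’s actual Signal-based protocol.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Conceptual illustration only, not WhatsApp's Signal-based protocol.
# Only the sender and recipient hold private keys; a relay server such as
# Facebook's would only ever see the ciphertext.
from nacl.public import PrivateKey, Box

# Each participant generates a keypair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Is Elsa an Orpington or an Americana?")

# The server relays `ciphertext` without being able to read it.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'Is Elsa an Orpington or an Americana?'
```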

With Groups occupying an increasingly important place in Facebook’s overall business model, though, it’s unlikely the company will make such a move. It has likely already spent a fortune advertising Groups and pushing users toward them.

Cutting off such a crucial source of user data would also be a major hit to Facebook’s ad targeting models. Advertisers might come back to the platform if Facebook can guarantee that their ads won’t accompany racist or otherwise inappropriate content, as they did with YouTube. But they’re unlikely to come back if Facebook’s ads stop working because they’re poorly targeted, and removing Groups data would likely make the company’s targeting algorithms less effective.


How can Facebook clean up its Groups while keeping its users and advertisers happy? As Wired suggests, it could begin by making all large groups public. If you’re forming a book club or taking your support group online during Covid-19, your group could remain private. But if you’re coordinating the conversations of tens of thousands of users, Facebook could make those conversations public by default. This would shine the cold light of public scrutiny on racist or bigoted content, adding a crucial element of accountability.

It could also make members more aware that their conversations are taking place in a public forum and could be used to sell them products or otherwise reach them with commercial messages, both by Facebook and by others. Ad targeting is not necessarily a bad thing — many users (myself included) are perfectly willing to accept targeted ads if they mean access to free online services.

But this should be done transparently, so that users at least know what they’re trading away. Facebook could even commit to not targeting ads based on tiny, personal Groups. Join a Group to plan Grandma’s virtual birthday party, and you wouldn’t see ads for Zoom. But join a discussion of the show Normal People, and you might see ads for Hulu.
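As a rough sketch of what such a policy could look like, the hypothetical rule below excludes small Groups from the pool of ad-targeting signals. The member-count cutoff and field names are invented for illustration, not a real Facebook policy.

```python
# Hypothetical policy sketch: drop small, personal Groups from the pool of
# ad-targeting signals. The cutoff and field names are invented for
# illustration; this is not a real Facebook policy or API.
PERSONAL_GROUP_MAX_MEMBERS = 50  # assumed cutoff for "tiny, personal" Groups

def targetable_groups(groups):
    """Keep only large, interest-style Groups as ad-targeting signals."""
    return [g for g in groups if g["member_count"] > PERSONAL_GROUP_MAX_MEMBERS]

user_groups = [
    {"name": "Grandma's 80th (virtual!)", "member_count": 12},
    {"name": "Normal People fans", "member_count": 48000},
]
print(targetable_groups(user_groups))  # only the Normal People group remains
```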

I hope that Facebook can find a solution that balances discussion and dialogue with the need to protect users and advertisers from hate speech — and to avoid violating their privacy. Groups reveal some thorny problems with the modern internet. But they also preserve a crucial element of its reason for existence — bringing together disparate individuals to engage over the topics they’re passionate about.

I don’t want to stumble on Nazi propaganda while scrolling through my news feed. And I don’t want my activities used to target ads without my knowledge. But I do want a friendly, constructive place to discuss my chickens, analog film cameras, and the other esoteric things I care about.

There are millions more users like me. If Facebook can address our needs, Groups will truly be a unique and game-changing addition to the company’s business model and today’s internet.

