
Anatomy of a Facebook Privacy Scandal

How Facebook reacts to privacy scandals on the inside and why they persist

It had been a turbulent few weeks for Facebook when Mark Zuckerberg addressed his employees last Thursday. WhatsApp, the crown jewel of Facebook’s messaging empire, was under fire. A poorly worded privacy update had sent its users flocking elsewhere. Millions downloaded competitors like Signal, a messaging alternative that jumped from 20 million users to 40 million just this month.

Zuckerberg’s weekly internal Q&A, the audio of which Big Technology obtained, was an opportunity to identify the root cause and plot Facebook’s path forward. The WhatsApp privacy changes were relatively minor, yet the reaction was explosive. So Facebook’s employees naturally wanted to discuss why the company was so distrusted and whether there was anything it could do about it.

“The Facebook brand has become toxic,” said one employee, through a moderator. “What are we doing to improve our brand?”

Zuckerberg listened, paused for four seconds, and then began his answer. “Sure,” he said. “So, a number of things.”

What followed was a revealing look into why Facebook seems stuck in an endless cycle of privacy scandals. The company is willing to introspect but only to a certain point. It wants to build trust with users, but it’s unwilling to reexamine the fundamentals of an advertising business many find intrusive. So the beat goes on.

Zuckerberg, in his answer, first addressed the actual WhatsApp change. “People,” he said, “are understandably sensitive about how their data is used and how it gets connected between different services.” The line touched on a notification alerting WhatsApp users that Facebook was building a tool for businesses to host their chats with customers. Businesses had used their own tools to do this for years, so the ensuing freakout seemed overblown. But the employee question wisely covered Facebook’s brand, not the privacy change itself, forcing Zuckerberg to grapple with people’s unease with Facebook itself.

Then Zuckerberg got into it. He said that when people feel negatively about Facebook, they tend to attribute those feelings to the corporate Facebook brand. And when they have good experiences, they attribute them to satellite brands like WhatsApp and Instagram. “It’s not sustainable if people attribute all of the good stuff to somewhere else and all the bad stuff to Facebook,” Zuckerberg said. The company had, however, been working on linking its apps to the mothership, including labeling them “by Facebook.”

Zuckerberg then said Facebook’s brand benefits when people use its products. “When people naturally use our products, they understand the value that they bring,” he said. He cited the early days of Covid as one example when people used Facebook a lot and its brand metrics improved. Let the products do the talking, in other words, and Facebook will rise. “But when there is such an important day-to-day scrutiny of the company,” Zuckerberg said, “it is very difficult to build a positive brand in that environment.”

The two answers’ focus was telling. One noted that Facebook’s satellite brands could lift its core brand, even though the company’s recent problem took place on WhatsApp. The other essentially argued that the way to improve Facebook’s brand was for people to use it more. Left unsaid was the glaring reason behind Facebook’s poor reputation: ever since it introduced the News Feed in 2006, the company has made a habit of having your information show up in places you don’t expect.

The core of Facebook’s reputational problems can be traced to its ad system and associated data collection, which wasn’t lost on its employees. In internal posts about the WhatsApp update, some demanded to know more about the WhatsApp data Facebook uses and why it uses any at all. “Most folks I know,” wrote one Facebook employee, “are angry because fb did make a blanket commitment that whatsapp data will never be shared with fb and we didn’t honor it.” In the Q&A, another employee asked whether Facebook collects WhatsApp data for advertising purposes and, if so, how. Zuckerberg said that if people were wondering if the company could read WhatsApp messages, “the answer to that is no.”

Advertisers can still use the phone numbers of WhatsApp users who interact with them to build custom audiences in Facebook’s ad platform, but the conversation didn’t dwell on that. Nor did it discuss whether that practice was wise. That millions of people downloaded competing products without sparking a reckoning on that front was telling.
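For a concrete sense of how that works: Facebook’s publicly documented Marketing API lets an advertiser upload SHA-256 hashes of customer phone numbers to a “custom audience,” which Facebook then matches against its own users for ad targeting. The sketch below is illustrative only; the audience ID, access token, API version, and exact payload fields are assumptions based on the public documentation, not anything disclosed in the Q&A.

```python
import hashlib
import json
import urllib.parse
import urllib.request

# Hypothetical values for illustration; a real integration would use the
# advertiser's own custom audience ID, access token, and a current API version.
GRAPH_URL = "https://graph.facebook.com/v12.0"
CUSTOM_AUDIENCE_ID = "1234567890"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"


def normalize_phone(number: str) -> str:
    """Simplified normalization: keep digits only (Facebook's matching rules are stricter)."""
    return "".join(ch for ch in number if ch.isdigit())


def hash_identifier(value: str) -> str:
    """Custom audience uploads expect SHA-256 hashes of normalized identifiers."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()


def upload_phone_numbers(phone_numbers):
    """POST hashed phone numbers to the custom audience's /users edge."""
    payload = {
        "schema": "PHONE_SHA256",
        "data": [hash_identifier(normalize_phone(n)) for n in phone_numbers],
    }
    body = urllib.parse.urlencode({
        "payload": json.dumps(payload),
        "access_token": ACCESS_TOKEN,
    }).encode("utf-8")
    request = urllib.request.Request(
        f"{GRAPH_URL}/{CUSTOM_AUDIENCE_ID}/users", data=body, method="POST"
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

# Example (would require valid credentials):
# upload_phone_numbers(["+1 (555) 010-0000"])
```

The point isn’t the code itself; it’s that the matching happens on hashed phone numbers advertisers already hold, which is exactly the kind of data flow users say they don’t expect.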

Facebook could be a great, trusted business if it meaningfully pared down its targeting. In a 2018 press call, I shared this sentiment with Zuckerberg. He replied: “People tell us that if they’re going to see ads, they want the ads to be good.”

Today, people are telling Facebook another thing as well. They’ve grown weary of the way it uses data, sometimes to the extent that they’ll look for alternatives. Most often, they come back. WhatsApp downloads picked up again after a few days. But at a certain point, they may not.

Facebook knows its reputation on privacy is a competitive disadvantage. This concern was the impetus behind Andrew Bosworth’s Big Shift memo, which encouraged Facebook “to become the undisputed leaders in providing privacy aware software.” But so long as it glosses over its ad system when these moments arise, the distrust and endless privacy issues will persist.

Links of the week:

Like many of you, I’ve been obsessed with the GameStop scenario unfolding in the stock market. Here are the two pieces I found the most interesting:

Both of these articles look beyond the exuberance to who actually gets hurt. Worth a read, in my opinion.

Dissent: Does TikTok log our keystrokes?

Earlier this month, I spoke with BuzzFeed’s Ryan Mac on the Big Technology Podcast about Facebook and Twitter’s decisions to ban Donald Trump. Anupam Chander, a Georgetown law and technology professor who is focused on this issue, wrote in to set the record straight. I appreciate him sharing his thoughts, and am thrilled to publish them here:

I’m really loving your podcast. In your interview, Ryan Mac said he was spooked by TikTok’s “keylogging.” I’ve tried to explain in a variety of venues that that concern was a bit off the mark. The original story is based on work by security researchers Talal Haj Bakry and Tommy Mysk, who reported here that multiple popular apps (e.g., Weather Network) were constantly reading the iOS clipboard. TikTok immediately stopped doing this and explained that it was an anti-spam measure. To this day, according to those researchers, apps like NPR and Reuters still do it.

Thanks, Professor Chander!

This week on the Big Technology Podcast: Tech After Trump: A Conversation With Marshall Kosloff and Saagar Enjeti of ‘The Realignment’

Trump is out of office, and now the U.S. political system is in for a reset. The Republican Party will decide whether it’s the party of those who objected to the election or those who did not. The Democratic Party, which controls the White House, Senate, and House of Representatives, will have a chance to solidify its identity while grappling with its own divisions.

Marshall Kosloff and Saagar Enjeti, hosts of the podcast The Realignment, speak about the shifting state of U.S. politics on their hit show each week. And while they focus on political change, they spend a surprising amount of time focusing on technology, recognizing the industry’s power. The two joined this week’s Big Technology Podcast to discuss how they see the U.S. political system realigning after Trump and what that will mean for the tech industry.

You can listen on Apple Podcasts, Spotify, and Overcast and read the transcript on OneZero. You can also subscribe to their Substack here: therealignment.substack.com. (PS: I am appearing on their show tomorrow as a guest to talk about GameStop.)

I write the Big Technology newsletter. Sign up here: https://bigtechnology.substack.com.
