How Facebook Can Fix Itself

A former spokesperson for the social network says the stakes are too high for employees not to advocate for change

Barry Schnitt
OneZero


Dear Facebook employees,

Honestly, I want the best for Facebook. I poured my heart and soul into the company from 2008 to 2012. I also want the best for the world, and I know you do, too. I don’t think those things have to be in conflict. I’ve written some thoughts on how to bring them into closer alignment, in case it’s helpful to you. None of this is revolutionary; quite the contrary. Much of it has been said by others, but I don’t think it hurts to restate it.

I’ll start with my own admission. In 2009, I said, “We believe in Facebook’s mission that giving people tools to make the world more open is a better way to combat ignorance or deception than censorship.” It turns out that I was wrong. First of all, it’s a false choice; there are more options than just “openness” and “censorship.” More importantly, it has become obvious in the 11 intervening years that the opposite is true: the more successful Facebook is in accomplishing its mission, the more ignorance and deception there appears to be in the world. There is definitely a correlation here. Unfortunately, I also believe there is more than a little causation.

How did we get here?

In part, Facebook’s biggest strengths are also its biggest weaknesses. Early on, Facebook focused on the connection. If you get a person connected to another real person they actually know, a lot of other problems go away. Friends are much less likely to scam you, be inappropriate, or annoy you than strangers. Also, there is a good chance you’ll be interested in the content they share. Unfortunately, it doesn’t solve everything, because, you know what? Your Uncle Daryl isn’t a doctor, doesn’t know shit about vaccines, and is easily misled by others on the topic. This is doubly bad because of the connection. You’re more likely to believe misinformation from Uncle Daryl than from strangers.

Pouring gasoline on the fire is Facebook’s sophisticated content-ranking system. Using signals from billions of people and untold pieces of content, it knows what content people will find engaging. You know what’s engaging as heck? Wild conspiracy theories and incendiary rhetoric. Combine content that comes to you from a trusted source (i.e., your friend) with a system that makes sure you see the most tantalizing stuff, and you get viral misinformation. That’s why Facebook’s system is so susceptible to it and why it spreads so quickly. When the integrity of your entire system rests on the quality of the connection and not the quality of the information, the forces of misinformation see a vulnerability, and they are exploiting it aggressively. The Plandemic movie was a recent and tragic example.

It has been said that a lie gets halfway around the world before the truth has a chance to put its pants on. Now, Facebook’s speed and reach make it more like a lie circles the globe a thousand times before the truth is even awake. This is no accident. Ironically, the one true conspiracy appears to be that malevolent nation-states, short-sighted politicians, and misguided interest groups are deliberately using conspiracy theories to misinform the public as a means of accomplishing their long-term strategic goals. The same could be said for those deliberately using incendiary and divisive language, which is similarly allowed to propagate through your system.

Why isn’t Facebook doing more to address this?

Unfortunately, I do not think it is a coincidence that the choices Facebook makes are the ones that allow the most content, the fuel for the Facebook engine, to remain in the system. I do not think it is a coincidence that Facebook’s choices are the ones that require the least resources and that outsource important responsibilities to third parties. I do not think it is a coincidence that Facebook’s choices appease those in power who have made misinformation, blatant racism, and incitement to violence part of their platform. Facebook says, and may even believe, that it is on the side of free speech. In fact, it has put itself on the side of profit and cowardice.

You don’t have to be on that side, though. Facebook has seemingly limitless resources at its disposal. Some of the smartest people in the world work at Facebook; I know, because I’ve worked with them. You’ve developed some of the most advanced technology in history and have mountains of capital. As one example, since just 2017, the company has committed as much as ~$34 billion to stock buybacks. The main ingredient you lack is the will.

How to find the will?

First of all, it’s helpful to realize that the world has changed and so has Facebook. In the four years I worked at Facebook, a lot of precedents were set that are still playing out today. Some of them made sense for the world of 2008 but don’t make sense now. In 2008, the professional arbiters of truth, the press, were much stronger in terms of both resources and distribution. In 2008, Facebook’s reach was growing, but it touched only a small percentage of the population. In 2008, people used Facebook more to keep up with friends than as a news or information source. Today, all of that has changed dramatically.

Newsrooms have been decimated, and the press’s overall distribution has been similarly reduced. Meanwhile, Facebook has become a primary source of news and information for billions of people. In short, when we decided that Facebook would take a hands-off approach to content, the world didn’t need Facebook to fact-check or contextualize information. Now it desperately does.

I still believe that Facebook does more good than harm. There has been no better example than the emotional support the platform allows users to offer one another during the current health crisis. The value of connection with family and friends during this time is incalculable. However, just doing more good than harm is not enough.

If you think of Facebook as the place where people get their information, it’s like the one grocery store in a town. Everyone shops there, and its shelves are mostly filled with food that is nutritious, fun, entertaining, engaging, and so on. Sprinkled through the shelves, however, are items that look like the regular stuff but are actually poison. I’m not talking about junk food with frivolous or empty calories. I’m talking about food that literally poisons the mind, turning people against science, facts, and one another. If you accepted that there was poison in the aisles, would you spare any resource in rooting it out? Is there any risk you would not take? At the very least, you would not hesitate to put warning labels on the poison.

That’s not the way Facebook has thought in the past, though. Instead, I believe there is an inherent intent bias within Facebook. That is, you know your intentions are good, and therefore you focus on the good outcomes and dismiss the bad. I was definitely guilty of this. It’s easy to do, especially when detractors have the opposite bias. That is, they see some bad outcomes and assume bad intentions.

It would be helpful for Facebook to cut through all of that and be honest with itself. If you believe that productive information on Facebook can create a sisterhood of truckers, sell Clif bars, start revolutions in the Middle East, and defeat a terrorist organization, then you must also believe that misinformation you host and distribute can destroy lives, incite violence, torture those who have already endured unspeakable tragedy, and convince people to make devastating health choices.

Promoting free speech shouldn’t be used as a “get out of tough choices” card. Yes, people have the right to express ignorant or misinformed views, but that doesn’t mean you are prevented from providing context on those views or required to give them distribution.

For centuries, the main way people received the free speech of others was through publishers and the press. Some people heard speeches in person, but the vast majority read about them in the paper, where they were put in context. Even with the advent of radio and TV, the actual video or audio of a speech was followed by commentary from reporters. These employees of for-profit private companies provided context and attempted to arbitrate truth. Was it perfect? No, but it helped keep the forces of misinformation and divisiveness largely at bay.

That system has been disrupted, in large part, by you. You have a responsibility to take an active role in fixing it or finding a new system that works better. The Facebook Journalism Project and the support of fact-checkers are a great start, but they are band-aids, and we are hemorrhaging civility and truth. A real solution will require scale and sophistication that are orders of magnitude greater.

What should Facebook do?

I don’t have a silver bullet, but I know you need to build trust. You need to show the world that you are not putting profit over values. Therefore, I would suspend the stock buyback program. As I mentioned, you’ve committed ~$34 billion to stock buybacks. It looks like you’ve spent about $20 billion, which leaves roughly $14 billion (please check my math). I’d devote the equivalent resources to the goal of better informing users. You’d be showing that you’re literally choosing users over profit.

What’s the metric? I don’t know, but I’m confident you can figure it out. You have swung the pendulum all the way toward enabling expression. Let’s move it toward the quality of information, toward the outcome of an accurately informed public. Success there would be infinitely more valuable to your investors than artificially propping up the stock with buybacks.

I’d put the company in lockdown. We did it in 2011, when Google was launching Google+. They had orders of magnitude more resources and engineers and the largest distribution platform in the world, and they had committed everything to squashing Facebook. We worked day and night, and we kicked their ass. We humiliated them. This challenge is even more daunting but also infinitely more important. I know you can do it.

It will be hard, though. You’ll need courage, money, and brainpower. You’ll also need to cast aside long-held beliefs. Just because a specific action could be a “slippery slope” doesn’t mean it’s wrong. Just because a solution isn’t currently “scalable” doesn’t mean it’s unworkable or that you couldn’t eventually scale it. Just because something is an “edge case” doesn’t mean it’s irrelevant.

In case it is not clear, the stakes are high. We are in the midst of a global pandemic. Nearly 400,000 people are dead. Many more are likely to die, and that risk is being made worse by content you host. Every. Single. Day. The only way the stakes could be higher is if we were on the brink of a world war. Thankfully, we are not. However, I encourage you to ask yourself where a concerted and systematic undermining of science and truth, combined with rampant divisiveness, ends if left unchecked. A lasting peace? I doubt it.

Whatever you do, I can promise you this: You will continue to be criticized. People will always say that you are doing both too much and not enough. That is the price of leadership. I used to tell Facebook colleagues who complained about criticism to go work at MySpace. No one bothers to criticize them. You don’t work at MySpace, though, because we trounced them, too. You work at Facebook, and you can beat misinformation and divisiveness, too. I’m rooting for you. We all are.

Your friend,

Barry


Former Pinterest, Facebook, and Google communications and marketing. Currently, consulting and advising. https://www.linkedin.com/in/bschnitt/