Who Criticizes the Tech Critics? A Meta Talk With ‘Real Facebook Oversight Board’ Members Carole Cadwalladr and Yael Eisenstat

The two join Big Technology to discuss tech criticism, Cambridge Analytica, and how Facebook can begin to right its wrongs

Carole Cadwalladr and Yael Eisenstat

OneZero is partnering with the Big Technology Podcast from Alex Kantrowitz to bring readers exclusive access to interview transcripts — edited for length and clarity — with notable figures in and around the tech industry.

To subscribe to the podcast and hear the interview for yourself, you can check it out on Apple, Spotify, or wherever you get your podcasts.

Carole Cadwalladr and Yael Eisenstat are two of the most prominent Facebook critics worldwide. Cadwalladr is the journalist who broke open the Cambridge Analytica story for The Guardian and The Observer. Eisenstat, a former CIA officer, worked on election integrity inside Facebook for six months before quitting and speaking out against the company. The two join the Big Technology Podcast to discuss the growing disenchantment with tech criticism, the role of Facebook’s Oversight Board, and how the company might fix its product.

Alex Kantrowitz: Hi Carole and Yael, let’s start with the criticism of tech criticism. I read a post by Lee Vinsel that argues tech critics are overselling what these companies are capable of. Essentially, he says they’re flipping the companies’ press releases and adding a dystopian spin. What do you think about that? For instance, could Cambridge Analytica really manipulate us based on the data it collected?

Carole Cadwalladr: That goes to the heart of so many questions. I mean, doesn’t it? Part of the problem is Facebook doesn’t let anybody in. Because it’s proprietary data and methodologies and all the rest of it… we’re not able to send in independent researchers to make those assessments. And that’s one of the huge problems at play.

I think it’s a stretch to suggest that Facebook’s targeting technology doesn’t work, considering that advertising is a trillion-dollar industry across the world. I personally find it a stretch to believe that advertising doesn’t work, given the huge amounts of money that go into it. And I find it a stretch to think that the more information you have about individuals, and the more precisely you’re able to target them, it doesn’t have an impact. I can’t see how we can’t take these concerns seriously.

Some folks took the Cambridge Analytica story and spun it into this idea that Facebook let go of 87 million user records, and that’s why Trump won. What do you think of that narrative?

Carole Cadwalladr: For me, the most troubling thing was that the data had come from Facebook, and then Facebook was also being used by the same company to target people. It was this twofold process in which Facebook was directly implicated: it hadn’t kept people’s data secure, and then it allowed itself to be used for targeting purposes. And so, again, for me, it was never about… There was no evidence around the efficacy. There was a very interesting story by the British Channel 4 News ahead of the 2020 U.S. election. They’d got the entire database from the RNC and went through the data, looking at the voters marked for suppression and at how Black voters had been specifically targeted by Cambridge Analytica, using Facebook’s own tools, to suppress their vote. And there was some very compelling evidence around that.

Part of my suspicion around Cambridge Analytica was that it didn’t have to be that sophisticated to be effective. Things like voter suppression, for example, don’t take a highly sophisticated approach. It’s just about scaring people, deterring them. That’s an easier thing than tweaking this and that lever in their brains. There are some blunt tools which, when you have this amount of data, you’re able to deploy.

Yael, what do you think about this growing line of criticism of tech criticism?

Yael Eisenstat: What concerns me about the piece is that it lumps every single person into one general bucket of tech critic. Yes, he goes into more detail in the piece, but each so-called tech critic comes with their own set of experiences and background, and I think each one should be evaluated on their own merits.

I would caution against too easily lumping all tech critics into some large bucket, and I’d ask what’s behind saying something like that. Is it because they’d rather silence some of these so-called tech critics who are making very well-reasoned, well-researched points from their own experience?

How about the rise of authoritarianism across the globe? Some say Facebook is responsible. Is it?

Carole Cadwalladr: The leaders who’ve been most adept at using social media are particularly using it to fan fear; it’s very effective for fear-based messaging. What we’re also seeing, because this is a blunt tool, is that it doesn’t have to be used in a hugely sophisticated way. There are a lot of questions. There are a lot of unknowns. These are historical forces. But I think one of the things we have seen, which has been very troubling, is the alignment between these leaders and the Silicon Valley companies, and particularly Facebook.

We saw that very clearly in the relationship between Trump and Zuckerberg, and in Facebook’s unwillingness and reluctance to upset the ruling party. And we see it very directly, for example, in India, where the relationship between Facebook and the ruling party has been unpicked, particularly by the Wall Street Journal.

You see it’s an alignment of interests and an alignment of power. And it’s receded slightly in the United States, we don’t know for how long, but it hasn’t in other parts of the world. And we very, very strongly feel that the United States has a real duty to other countries which are still facing this and have absolutely no traction, access, or means of leverage over these American-based companies.

Yael Eisenstat: I hate to talk about my own work, but when I put out my TED Talk, which is very much about how Facebook is contributing to radicalizing people, based on my having worked on counter-extremism issues my entire career, the first reaction from so many people was, “Oh, well, what about Fox News?” Or what about this, that, and the other? So let’s be clear. I am stating emphatically: no, I do not think Facebook is solely responsible for all of our woes in the world. That said, we have an industry which, due to the permissive environment in the United States that let the internet flourish, was given free rein to scale as recklessly and quickly as it wanted, to dominate the entire global landscape of what one might call discourse or the public square, or whatever terminology you want to use, without any checks and balances. It is not in any way liable in the way that traditional media is for certain content, nor in any way responsible for the actual consequences.

Again, the consequences not of the speech that is posted on their platform, but of their tools and what they decide to do with that speech. Without going into a long explanation of this, you have platforms that are still, to this day, as much as Facebook likes to pretend it’s not true, predicated on keeping people engaged, growing their daily and monthly active users, and making sure that people remain on the platform so they can continue to hoover up our data and keep refining their targeting practices to sell to advertisers, whether or not those targeting tools are perfect.

And that’s a whole other conversation, but it’s still what they’re selling to advertisers. They’re selling this ability. And so there are so many practices here, from how the recommendation engines are steering people, and they are steering people towards more and more extreme content, to how you might get a pop-up recommending that you join a group. And the next thing you know, in that group, you’re meeting other people who share white supremacist ideas. And then you’re hatching a plan and going off to Oakland to murder an officer. Which happened; those two men met on Facebook.

And so there are so many things happening within these platforms that we are not allowed to touch, because it’s all being mischaracterized as being about free speech. And I’m not saying that Facebook… Listen, anger, hatred, polarization, divisive content, political rhetoric, all of that has always existed. But now we have an entirely different environment, where there are cheap tools to engage in information warfare and propaganda.

Even if a platform connects a predator to a young girl, the company will claim that you can’t take it to court: because of Section 230, it’s not liable. Even if it’s the platform’s own tools that recommended that that person connect with that young girl. These are all the things that get lost when we start using excuses like, “Oh, but it’s always been bad,” or, “Oh, but technologies always come and disrupt what happened before.” Sorry, that was a little bit high level, but it’s very frustrating to me when people say, “Oh, but Facebook isn’t the only reason people are acting the way they are.”

So it’s a contributing factor, but not the whole deal.

Yael Eisenstat: I think right now it’s the biggest contributing factor, but yes, it’s a contributing factor.

Facebook created an Oversight Board to review some of the content moderation decisions that they make. You’re both part of an opposition to the board called the “Real Facebook Oversight Board,” which is essentially an advocacy group. Is there an issue with Facebook letting the public make content decisions?

Yael Eisenstat: As a high concept, the idea of an oversight board, an external group of experts who have the ability to actually look under the hood and really think through speech issues, is, in theory, a really good idea. However, it’s another thing to pull together a group of experts who, whether we like it or not, were handpicked by Facebook and are essentially paid by Facebook, and then to give them this very limited remit of “you can only really overrule us on content that we took down,” when I would argue the more dangerous stuff is the stuff that is still up.

This group of people is not accountable to any of us; they weren’t chosen by the public. I really wanted to believe in the idea in theory, but it really is passing the buck on accountability and responsibility to this external board, so that Mark Zuckerberg can say, “It wasn’t me. They made this decision.” A) It’s passing responsibility. B) It doesn’t address any of the issues that I really care about, which are the systemic underlying issues: how the platform is monetized and what that is doing to our cognitive capabilities. All of the tools, the targeting tools, the algorithms, the recommendation engines, none of that is in the oversight board’s purview. So, bottom line: in theory, interesting, but do not for a second confuse it with some sort of true governing body that is really tackling the issues for which Facebook has, for so long, evaded any responsibility. That’s my quick take.

What’s the main criticism of turning these decisions over to the public though? Because directionally, it seems to make sense.

Carole Cadwalladr: Facebook is desperate to figure out how to do this self-regulation. I really do think that what we see here is accountability theater. It sets up this fake supreme court, which is using fake law, essentially, borrowing the tropes and mechanisms of a nation-state and applying them to a private company. At the same time, we’ve got no idea how Facebook actually does content moderation, because it won’t explain its metrics. It won’t explain why it took down Trump when it did, or why it didn’t before. It’s very unclear. So instead of just cleaning up its own house and doing more of what Twitter has done, showing its workings, at least having some level of transparency about this, it’s just outsourced it to a politically convenient body that is going to take the flak one way or the other.

There’s a highly politically contentious decision to be made about Trump, and Facebook has incredibly conveniently just got that off its plate. It’s not going to be facing the music. And what is deeply troubling to me is the way that the oversight board is being taken seriously. It is being treated as a sort of legitimate supreme court. There are legal scholars who are endlessly writing about it. And actually, Alex, I saw your post yesterday about how the big tech companies essentially capture the think tanks, the incredible soft power that they exert, and how that has an impact. I think we see that on so many levels. You look at the amount of money that Facebook has to spend on lobbying and PR, and you see the way the machine works, the way it plays favorites with reporters, and the way it gives access to data to academic institutions.

There is no well-funded body that is able to be the counter-Facebook disinformation unit, essentially, which is what we need. There’s a democratic need for that, and the press, of course, is able to do some of it, but there’s so much of it. There’s been this amazing reporting, particularly in the last 18 months, incredible tech reporting coming out of the States, but that’s still just the tip of the iceberg. We know that there are so many harms, so many problems. The reporting makes some difference, but we see how hard it actually is to land a punch on these tech companies.

The idea that this faux judicial body is being taken as a legitimate authority is incredibly troubling. There are some amazing individuals on the oversight board with brilliant reputations, absolute experts in their fields. I cannot understand why they would allow their reputations to be captured and harnessed by Facebook for its own political purposes. I know that they’ve gone into it with the best of intentions, but I really do worry about it, I have to say.

Yael Eisenstat: Alex, you’re right that it sounds, in a way, better than letting Facebook itself make all these decisions, but let’s also be frank. A lot of this is because the U.S. government has not caught up to figuring out what guardrails should be in place around companies like Facebook. So we’ve continued to allow Facebook to self-regulate. That’s part of the problem too. There are three major buckets, though obviously, and I’m going to be very clear, there’s no one magical fix that’s going to suddenly make a healthy information ecosystem where trust is restored and truth wins out.

The buckets are data privacy regulation, what to do about antitrust, and how to actually decide what these platforms are and are not accountable for. Until the U.S. government figures out some of that, Facebook gets to continue to claim that it has this supreme court making these grand decisions. And you’re right, I don’t want Mark Zuckerberg to be the one to make a decision about what is truth and what is fiction, but we’ve also given him free, unchecked power to do so. So when he claims this is a supreme court, you see all these brilliant academics and journalists scrambling to plead their case in front of this oversight board about how it should handle its decision about Trump. It’s just fascinating to me that all these people are now giving legitimacy to this board, which will make one of the most consequential decisions for the future of our democracy, and its members are not accountable to any of us. That’s just very concerning to me.

Who do we want making these decisions in the end? Can a board like this exist where we’re happy that neither the government nor Facebook is making them? For context, the board is now deliberating on whether Trump should be permanently banned.

Yael Eisenstat: Right, but they’re deliberating based on two posts that Facebook decided to flag, which weren’t even the most troubling. And by the way, all of Trump’s older posts still remain up on the Facebook platform, and you can still engage with them. That in and of itself is troubling. I’ll just reference someone else here: one of the best letters, and best cases for what should be done with the oversight board, that I saw was the one from Jameel Jaffer. He wrote a brilliant letter to the oversight board about why they shouldn’t actually rule on this at all, with his recommendations for how they should go back to Facebook and say, “Before we rule on this, here are the things we actually want to see you do.” So I recommend people look that one up.

Carole Cadwalladr: I totally agree. The people I’ve spoken to who understand the makeup of the board best think that they are going to overturn the Trump decision. So the sense I’m getting is that they are going to reinstate Trump.

That, in many ways, solves Facebook’s problems, doesn’t it? It wasn’t their decision; it was their independent supreme court that made the decision. In a sense, Facebook has got a win either way, but the prospect of them having a win and Trump being back on Facebook is really deeply problematic. That’s why I find the coverage of the board to date, which is essentially paving the way for that, so troubling. And that’s why I think we’re having this conversation.

What do you think? What’s your view?

I do find the criticism that it gets Facebook out of discussing some of the harder problems very legitimate. I also don’t want Facebook making these choices, and I don’t want the government making these choices. So the public, to me, seems to be the best way to do it.

Okay, now I want to talk about your organization, the “Real” Facebook Oversight Board. Obviously, you’re not going to be making decisions on content. What is the idea behind the organization?

Carole Cadwalladr: It was an emergency response to increasing alarm and horror at the fact that Facebook was refusing to take action over Trump’s blatant use of Facebook to subvert the election.

What we could see was that there were all these brilliant academics and civil rights leaders making this case separately, and it was this idea of trying to bring them together, to bring coherence, and in a way authority, to that. To put pressure on Facebook. And to some degree it worked: the board agreed upon three demands, and Facebook conceded to two of them.

You can never say exactly why, but at some level, like I said, it’s the civil rights leaders with this moral authority, and these brilliant scholars in the field, and people like Yael, who’s got this amazing hands-on experience, all brought together to raise the profile of the problems, and then to go forward. I mean, it’s very much up in the air, but there’s this idea of trying to do a shadow governance structure. Where the oversight board is providing oversight in this very, very narrow way, around very, very narrow topics, and won’t consider a whole other spectrum of Facebook harms, that’s where we see the Real Facebook Oversight Board potentially playing a role in modeling what independent oversight could look like. I think the other thing about it, which was very much front and center at launch, is the slightly silly name. Being a pain, being a thorn in Facebook’s side, was very much part of the thinking.

Considering all the stuff they tried to do to shut us down, I think it was actually quite effective. The confusing name is confusing, but that’s not accidental. I mean, the point was that Facebook hadn’t launched the oversight board; it had announced it and then said it wasn’t even going to launch before the election. So we thought, “Sod it! We’ll appropriate it, we’ll subvert it.” And in that, it might be slightly silly, but it worked, I think.

Yael, Facebook seems to be addicted to engagement. Take us inside: how is it viewed internally, and what did you see when you were trying to solve this problem?

Yael Eisenstat: Sure, and I’ll give you two concrete examples, but let me start by saying they love to push back on this talking point about how they’re all about engagement. I would just keep asking them, “How come, on your quarterly shareholder reports, you continue to use daily and monthly active users as your metric of success? As long as that’s the metric you’re reporting to your shareholders, I’m going to continue to believe that’s your priority.”

Clearly, I was part of what you would consider the risk side of things, right? So already, in and of itself, we’re not revenue generators; we’re considered the cost centers, essentially. The ones who are trying to possibly make you slow down, who are highlighting things that are going to lead to problems in the future. That is usually never valued as much inside a company that obviously has to keep growing and keep monetizing, especially the way they do. I’ll give you two quick examples. The very first thing, one of the first questions I asked, was in one of the internal “tribes,” as they call them at Facebook. I don’t remember exactly how I worded it. This was in 2018. I was preparing for the U.S. midterm election, and I just asked, “Why are we not using the fact-checking program that we’re using on organic content for political ads? Clearly, we have the technical capability to have some sort of program to do this, and we’re actually taking money for ads. So I would assume that the bar is even higher, because this is where we’re profiting, and we’re adding these disclaimers, like the ‘paid for by,’ which make it appear as if we’ve already validated these ads. So it might give them even more credibility. And your average Facebook user doesn’t necessarily know the difference between content and ads. They just see it in their feed.”

So I was asking all these questions: “Why aren’t we at least making sure that if we’re taking money for a political ad, that ad is not engaging in blatant disinformation?” It was really interesting, because a bunch of the PMs and different people on our teams all started chiming in, whether in the tribe itself or in conversation with me, and they got excited. They knew I was hired to run this team, and they knew I was asking the question. It was like, “Yes, let’s do this,” and they started putting together plans for what they could do. Then suddenly the conversation went silent. I never heard anything about it again, and I came to learn later that, of course, Mark Zuckerberg had already decided he would never fact-check Trump. So he’d already decided that they would never do this, but that’s exactly what I was hired to do.

An even better example is from when the civil rights audit was going on at Facebook. We knew it was going on, so my team coordinated with a bunch of other teams to put together a plan that was technically, actually, somewhat easy: just to ensure that no ad that made it through our system engaged in blatant voter suppression. Meaning it couldn’t lie about voting procedures, it couldn’t lie about where to vote or the date, all the different categories. All these things we were already doing for organic content, apparently. So we worked with the teams who had built the ML systems to screen content for voter suppression, and we decided that we would run the ads through the same systems. That was rejected too, and it was rejected on all sorts of weird narratives about, “Well, that won’t scale globally.” And I was like, “Of course it won’t scale globally. Every single election has its own norms, values, political realities, and laws in its country. I get that Facebook wants everything to scale globally, but do not say you are trying to protect actual elections if you’re not.”

And I said, “One ad that gets through and engages in voter suppression is more dangerous than all of these other things we’re looking at right now.” And it was just completely rejected. In the long run, I would say, once again, it’s because they knew it would mean that ads from certain politicians wouldn’t be approved, and that would be very politically difficult for them. So there are things that could have been done quite easily that they refused to do. Then the other thing is, anything you do that’s going to build friction into the system is going to be looked at skeptically, right? Because today we want everything to be first, fast, and free.

I would argue that if you are trying to figure out how to protect democratic debate, and trying to get to a point where blatant disinformation is not at the top of everybody’s feed and is not beating out actual, wonkier fact-based content, you might have to build even the tiniest bit of friction into your system. So it’s a question of what you value more. And right now Facebook continues to value growth over protecting democratic discourse and protecting trust in information. There are things that could happen, and they’re refusing to do them.
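
Both of Eisenstat’s examples come down to the same routing question: screening systems already existed for organic content, so extending them to paid ads was, in her telling, technically easy. Here is a minimal sketch of that kind of pipeline reuse; every name is hypothetical, and nothing here reflects Facebook’s actual internals.

```python
# A minimal, hypothetical sketch of running paid ads through the same
# voter-suppression screening already applied to organic posts. Names and
# logic are illustrative only, not Facebook's actual code.

from dataclasses import dataclass


@dataclass
class Content:
    text: str
    is_paid_ad: bool  # the same check runs whether or not this is an ad


def voter_suppression_score(text: str) -> float:
    """Stand-in for the ML model that screened organic content.

    A real system would call a trained classifier; this placeholder just
    flags a few blatant lies about voting procedures.
    """
    red_flags = ("vote by text", "election is on wednesday", "polls are closed")
    return 1.0 if any(flag in text.lower() for flag in red_flags) else 0.0


def review(item: Content, threshold: float = 0.5) -> str:
    """Apply one policy check to organic posts and paid ads alike.

    The plan's premise: the check already existed for organic content,
    so extending it to ads was a policy decision, not a technical hurdle.
    """
    return "reject" if voter_suppression_score(item.text) >= threshold else "approve"


print(review(Content("Polls are closed? Vote by text instead!", is_paid_ad=True)))  # reject
print(review(Content("Bring ID to your polling place.", is_paid_ad=False)))         # approve
```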

Do we need to focus more on stuff like the share button? It’s not a question of whether you should take content down or not; it’s a question of whether you should slow it down.

Yael Eisenstat: Right. So content moderation is hard. There’s no question, and there are no super-easy, clear-cut answers on content moderation. But the bigger issue to me is, again, the tools, and it’s frictionless virality, not just virality. I would be really interested in seeing some of the data from when Twitter tried to introduce friction into how you could engage with certain tweets. I think there’s an interesting experiment there that I’d love to see.

Twitter put a speed bump in front of the retweet. That caused retweets to drop somewhere around 20%. Twitter used that as a reason to say, “Okay. Well, people don’t like it. We’re going to put it back.”

Yael Eisenstat: Right. So that counters the whole point. See, they’re going for emotional reaction, and things that make you react emotionally are what you’re going to engage with most quickly. I know we’re running out of time, but I want to be very clear, because people like to use free speech, which I also think is a bit of a red herring, to say you can’t do anything.

I really think it’s about the tools. Our U.S. government has no business talking about what speech should stay up or come down, except for illegal speech, of course, but what it can do, I think, is dig in much further into the tools. Let’s even just look at January 6. I would like to know: some of the people who have been charged in the insurrection, did they go looking for Stop the Steal content? Did they go looking for QAnon? Did they go searching, per this whole “mirror to society” excuse Facebook likes to give? Or did the recommendation engines prey on some of their existing vulnerabilities and steer them towards this content, recommend certain QAnon groups to them, connect them? Those are the things that you will never hear Facebook talk about. They will always flip it to be about speech, because they don’t want you to look under the rug at the actual tools that they are using.
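
The “frictionless virality” point above is concrete enough to sketch. Below is a minimal, hypothetical contrast between a one-tap reshare and the kind of speed bump Twitter tested, where prompting for a comment before retweeting reportedly cut retweets by roughly 20%; none of this is any platform’s real code.

```python
# Frictionless resharing vs. a small friction gate, in the spirit of
# Twitter's 2020 quote-tweet prompt. Purely illustrative; the only real
# number is the ~20% drop in retweets cited in the conversation above.

def amplify(post_id: str) -> None:
    """Stand-in for pushing the post into followers' feeds."""
    print(f"post {post_id} amplified")


def reshare_frictionless(post_id: str) -> None:
    # One tap and the post spreads: reflexive, emotional shares cost nothing.
    amplify(post_id)


def reshare_with_friction(post_id: str, get_comment=input) -> None:
    # A single deliberate step before amplification. Even friction this
    # small measurably reduced resharing in Twitter's experiment.
    comment = get_comment("Add your own thoughts before sharing (Enter to skip): ")
    if comment:
        print(f"quote-share with comment: {comment!r}")
    amplify(post_id)


# Simulated run, so the sketch executes without waiting on stdin:
reshare_frictionless("abc123")
reshare_with_friction("abc123", get_comment=lambda prompt: "my two cents")
```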

In terms of funding for the “Real” Facebook Oversight Board, are Facebook’s enemies funding this, or is it publicly funded?

Carole Cadwalladr: We were very fortunate in that Pierre Omidyar’s Luminate gave us some money at the beginning. Then what happened is that they found themselves under a barrage of calls from Facebook, not from the Oversight Board, you’ll notice, pressing them about why they were funding us and suggesting that they shouldn’t be. It was really quite extraordinary. I think there’s an increasing number of journalists, in particular, and some academics, who find themselves in this adversarial relationship with Facebook, where it doesn’t really act like a normal company.

So, for example, I think you’re both probably familiar with Andy Stone, that Facebook spokes-dude, as I call him, who’s out on Twitter not acting like a normal corporate spokesperson, using very Trumpian sorts of tactics and going in for this sort of hand-to-hand combat with journalists. I do find it quite peculiar and disorientating, as I do the idea that this multibillion-pound company is going out and trying to heavy its critics out of very, very modest amounts of funding. But, as I’ve said, on the plus side, it does make you think, “Well, we must be doing something right.”

Yael Eisenstat: Funding for the Real Facebook Oversight Board does not mean funding the people who are members of it. One of the things I’m very clear about is that I’m not making money off of speaking about these things. I’m speaking about them because I’ve spent my entire life defending our democracy, and I’m not ready to give up that fight yet.

Carole Cadwalladr: Yeah. This is the point about Facebook’s Oversight Board: the members were handpicked by Facebook, and they’re getting six-figure sums for doing it. It’s a very, very different ball game; the people on the Real Facebook Oversight Board, as you say, are not paid in any capacity at all. You look at the absolute millions that Facebook has at its disposal, the legions of press officers and lobbyists and all the soft-power techniques. You really are taking a knife to a gunfight. But still, I do feel we’ve nonetheless got to get out there with the knives and do what we can.

I write the Big Technology newsletter. Sign up here: https://bigtechnology.substack.com.
