OneZero is partnering with the Big Technology Podcast from Alex Kantrowitz to bring readers exclusive access to interview transcripts — edited for length and clarity — with notable figures in and around the tech industry.
Over the past week, both Facebook and Twitter suspended President Donald Trump’s account. These companies don’t take such aggressive action lightly, and it took Trump sending a mob toward the U.S. Capitol, which they eventually breached, to force the issue.
For years, BuzzFeed News senior reporter Ryan Mac and I have been watching these companies' every move, previously as colleagues at BuzzFeed. Mac joined me this week on the Big Technology Podcast for a discussion on whether the social platforms' moves were merited, where they go from here, and how he thinks about all the internal Facebook communication he's obtained in his reporting.
Alex Kantrowitz: Let’s talk about the access you have to Facebook employees’ communications. You seem to know what’s going on inside the company before most employees. How do you do it?
Ryan Mac: With the access to Facebook, it’s funny because I can literally see messages about the access that I have on Workplace. Workplace is an internal social network at Facebook where employees post, talk about things, chat, flag things in groups for other people. And when they flag, for example, “Hey, Ryan Mac is live-tweeting our all-hands meeting, do we know about this?” I think it’s funny, and I see that stuff almost in real time.
How do you decide what to share with the public vs. what you think is worth keeping closer to the chest? Do you ever learn something that you feel does not meet the bar to publish?
I would differentiate between what we put in stories versus what I tweet. When you’re tweeting in the moment, things can go by really quickly, you want to catch everything. And sometimes I do have a tongue-in-cheek tone to my tweets.
But some of it I'm going to put in a story. We've gotten these internal memos and we're like, holy shit, this is an incredible story. We got a memo from an employee, a data scientist named Sophie Zhang, who writes, "I have blood on my hands. The decisions that I made at Facebook were life and death decisions in places where we didn't even pay attention. In places like Azerbaijan or Central America or in places we didn't action political interference." And you're reading these things, and you're like, this is an incredible story. This is newsworthy. You immediately drop everything and you start writing it. But not everything rises to that level. We don't write a story about everything that we see internally. A lot of it makes it into tweets, a lot of it is on the cutting room floor, a lot of it I save for later. But yeah, there's always that kind of discussion of what's newsworthy and what's not.
There was a tweet this week accusing you of corporate espionage, saying you tweet everything that’s said inside Facebook, even the stuff that’s not important. And therefore, there’s less transparency inside the company. Do you ever worry about that effect?
Yes, more than people probably think I do. And I get their perspective. Their perspective is, the more reporters write about our company and what's said internally, the less people are going to be willing to share. The less our executives are going to engage with us if they know that a reporter is watching. I understand those sentiments. At the same time, I don't think it's my responsibility to cater to those sentiments when there's a general public whose interests outweigh those of an internal corporate environment.
But I get it. I mean, these people are still working at Facebook, they’re embattled. A lot of them are so proud to work there; a lot of them still think they’re making a difference. And it’s annoying when a reporter is reporting out things that are said from their colleagues.
On the flip side of that, I’ve seen internally, at Facebook, a lot of people defending the reporting process. A lot of people understanding why things are leaked. I think things are leaked because there’s a frustration that executives are not doing enough. They’re not paying attention, they’re not listening to what mid-level or low-level employees are talking about and what they’re seeing on the ground. And we’ve heard that sentiment multiple times. Sophie Zhang’s memo is one where she mentions that the company doesn’t react to a task that’s filed by a low-level employee, it’s really when there’s a PR crisis. When there’s a PR crisis, all hands are on deck.
And you’re the PR crisis, often.
And I am the valve of the PR crisis. People come to me to, I don't know, start a PR crisis. That's not how we'd phrase it, but they know that things will get attention. I report, I hold companies accountable, and they know that's the way Facebook acts. Look at this week — what did it take to suspend Donald Trump? A riot?
And a lot of coverage.
So do you think that was the appropriate move, to suspend the president?
I do. In that moment, yes.
It’s interesting that you put it the way you did. They waited till this moment to suspend the president. When would you have done it?
I don’t know if there’s a day that I could point to where I would have done it. And I think there’s a difference between banning him completely and banning the algorithmic spread of his statements. There’s ways to keep him on the platform without allowing him to be reshared or engaged with or liked or commented on. I think there’s that level of gray area or that granularity where it’s not just up or down. But in that moment where there are literally people in the Capitol building, they’re attacking police officers, they’re storming offices, people are literally fearing for their lives, and you have someone posting on Facebook… I don’t even remember what he said.
He said, “I love you, you’re special.” Which is astonishing, given the fact that they were trying to overthrow a democratic government, but he also said, “Go home.”
He eventually said, go home.
But anyone who’s watching that would have gotten the message, “I should go home from the Capitol, but he probably doesn’t care if I stick around too much, given what he’s just said about me.”
And we're talking about the video that was allowed on Facebook and was later taken down. It said that the election was stolen. So it fed into the reason why they were there: "Oh, he's saying it's still stolen."
So would you have banned him a week before or a couple of months? It’s easy for people to say, why did they wait until now? But it’s also a difficult call to make, to ban the president of the United States from your platform. What’s the line in the sand for you?
It’s got to be the incitement of violence. And you can go back to that May “looting and shooting” tweet. I mean, there was a tweet, but there was also a Facebook post.
I just remember the reaction that caused internally at Facebook. I wasn't even paying attention internally at Facebook that much. I had other stories I was covering. It's not my mandate to cover Facebook on a day-to-day basis. I cover other companies: I write about Tesla, Twitter, Google. It's more of a corporate accountability thing. But when that happened in May, everything just switched. And you're like, holy shit, these employees are pissed. They're very mad. And they were so mad that they did a virtual walkout, which in retrospect is silly. They didn't show up for work one day, they changed their avatars to a black and white fist. But that had never been done before at Facebook. These people were pretty happy for the most part for the last, I don't know how many years.
Yeah, I think something did switch when you saw that Facebook post, “When the looting starts, the shooting starts” in the middle of the Black Lives Matter protest.
I mean, it has great historical context. That’s when I actually started to pull threads at the company.
And Facebook kept that post up?
Facebook kept that post up, and Zuckerberg defended it. He said it’s not clear what he meant, it could be interpreted this way. I think there were multiple posts that he made about it.
Right as the ban went into effect on Trump after the storming of the Capitol, I went back and read the blog post about Zuckerberg’s “we stand for free expression” Georgetown speech. And it was like, well, you might’ve tried to stick with this as long as you could, but at a certain point, they did draw that line.
Funny thing about that Georgetown meeting is that, on the same trip, he had dinner with Trump at the White House.
I certainly have felt over the years that if Trump lost there would be a situation like the one we just experienced. And it did put Zuckerberg in this losing position after his previous statements.
One of the things I wanted to talk about also was, at Facebook, there's this idea to assume good intent. It comes up every now and then on the forums or on Workplace: assume good intent from people, never assume the worst. And with Trump, they just continued to assume good intent and assume good intent. At some point, how many times is that person going to bite you? Maybe that presumption is just wrong, or maybe you shouldn't start with that operating principle.
It’s been an issue for the company. They have been notoriously techno-optimistic for a long time, an issue I wrote about in my book. They’ve tried to address it by hiring more adversarial thinkers. But having those “inputs” in your feedback system only works if you listen to them.
Yeah, you and I reported on this memo that was written by Facebook executive Andrew Bosworth. I think he's the number three or four person in terms of rank at Facebook. And that memo was called "The Ugly," and it detailed this idea that connecting the world makes it a better place, no matter what happens. And the examples he used were like, if we connect people and terrorist attacks happen on our platform, and suicides are broadcast on our platform, he effectively said it doesn't matter; connecting the world makes it a better place.
He said, “We believe connecting is de facto good.”
De facto good. Right. That was from 2016. We found that in 2018, we published it, he said to us, “Oh, I was just saying that in jest. It was meant to be controversial and spark debate.”
I think what he was trying to say was that he was putting that memo out there as a caricature of some of the beliefs he saw on Facebook. In order for people to reckon with the fact that, if this is the extreme of where our beliefs can go, that connecting people is de facto good no matter the consequences, then perhaps that’s something we should reckon with.
I mean, that’s what he said after the fact.
But the point is, whether he believed it himself or he saw it as a caricature, it pointed to a very serious problem inside the company, which is that that belief was there.
Yeah. And I've talked to employees who said that they built products under that belief. Let's assume good intent, that he threw that out as a joke or as a devil's advocate kind of post. It doesn't matter, because people ended up believing it anyway. People in and under his chain of command, or people within Facebook who joined and maybe read it later, believed it. And we reported that in 2018, and I remember Zuckerberg giving us a statement that was a condemnation of that post.
And then soon after that, you saw his tone change around connecting the world. And then I followed his public statements, he’s acknowledged that connecting the world is not always a good thing, there’s bad things that happen in connecting the world. And treating it as this plus-minus column or weighing the scale of, as long as there’s more good, we’re fine here because it’s a net good. The world just doesn’t work like that. And I think they started to recognize that that’s just not a position they can take, it’s not worth their time.
It’s weird because it is their business model, to connect people. But I want to get your take on why you think Zuckerberg finally did ban Trump. Was it the employees? Was it pressure from outsiders? Was it his own conscience after watching it happen?
I think it was a combination of all of the above, and also seeing other companies react. You can never underestimate the role other companies' decisions play—
Twitter did suspend Trump, Twitch banned him, but Facebook had stood by Trump’s ability to post for a long time.
Yeah, I talked to Alex Stamos this week about that. Alex Stamos, former chief security officer at Facebook, who's now teaching at Stanford. And we talked about this and we were like, okay, why now? What happened here? What changed Mark Zuckerberg's mind? And for Stamos, it was the realization that the reason we have these principles of standing for free expression is to uphold liberal democracy. But if you're upholding that principle for someone who is going to undermine liberal democracy with his statements, then it just falls apart. If you're holding up this free expression ideal and the person using it says, well, we should just tear this whole thing down, we should delegitimize the vote, we should doubt everything about this process, then you're nowhere. You're at square one.
There’s an argument that it doesn’t make sense to preserve free expression if the result is going to be the loss of free expression.
That’s a really good way of putting it. And I was like, that makes sense. I’ve danced around the idea of when I would have banned him, but that assessment, at least for Facebook, should have been discussed out in the open. They have this oversight board that was supposed to go into place. I mean, it is going into place, but we have heard very little from it.
They’ve been incredibly ineffectual, very disappointing.
It's a third-party board that's supposed to be Facebook's supreme court in terms of decision-making. They could have had a public debate about this. They could've argued this in public. They could have gamed this out. They could've really been out in the open with us and discussed, okay, if this happens and this happens, and played out these scenarios. People that work at Facebook are supposed to be some of the smartest people in the world. They're hired from top universities around the world. Why can't they think about this? There are 50,000 people who work there. And so when it comes to a decision like that, when it actually happens in the moment, it just looks so reactive. Kevin Roose at the New York Times had a great tweet about this. Even if they had the scaffolding in place and were preparing for this, it still felt extremely reactive in the moment, and that's all people are going to remember. They're not going to care about the processes that got them there.
Another theory has been, the Senate was surprisingly won by the Democrats that morning when Ossoff took the second seat in Georgia. And Zuckerberg is looking at the fact that you’re going to have a Democratic White House, a Democratic Senate, a Democratic House. And he just made this decision, what’s best for Facebook right now. Do you think there’s any merit to that?
I don’t know if I should say anything. I’ve seen people talk about that a lot.
Well, I’ll speculate. I think that that’s definitely the case. You’ve done all this reporting about how Zuckerberg’s courted conservative power over the past four years. So it doesn’t seem out of the realm of possibility that that was on his mind when he made the choice.
For me, the moment where I came to that realization was when we reported this story out where he made a secret phone call to Trump right after Trump had won the election. Remember that Trump Tower meeting? He sent Sheryl Sandberg instead. They were like, oh, Mark’s busy. But Jeff Bezos went. Larry Page went.
Yeah. It’s the president of the United States. What’s he doing, reviewing the latest iteration of Messenger?
They all went up the golden elevators in Trump Tower in New York and Zuckerberg was the main person who didn’t go. And it was like, oh shit, he’s taking a stand. But then we found out he placed a phone call to him to congratulate him and we reported that out. He’s a savvy political actor.
Well, this was his top user and it seems like conservative content performs really well on Facebook.
Yeah. And then you get these stories about him texting with Ben Shapiro and hosting grievance meetings with the Glenn Becks of the world.
But he should be hearing from both sides of the political spectrum.
But he doesn't hear from the other side. I think it's one thing to meet with Glenn Beck and have Ben Shapiro over for dinner, and another to get yelled at by Color of Change, for example, which is an activist group.
You don’t think Zuckerberg meets with any left-leaning personalities?
I don't know if he's in those meetings. I know the person they usually send out is Sheryl. I'm sure he takes a couple when there's an extremely bad moment for the company, like after George Floyd and during that Facebook ad boycott. I'm not even sure where he was in those moments. But it's weird, because he used to hate that stuff. The reason why he hired Sheryl was to deal with that stuff. And he hired a whole political arm in D.C. so he didn't have to touch it. And now he's transformed into a political actor over the last four years, and his first main task as a political actor was to court the right. I mean, I guess the other example, of course, is China, when he became the diplomat for the company and went to China and learned Mandarin and spoke at universities there. He's a savvy political actor when he needs to be.
It seems like an impossible position to be at the helm of one of these companies. After sharing the news of Facebook blocking Trump, my mentions just became a stream of people calling Zuckerberg a fascist and a Nazi. And I was like, wow. Democrats and Republicans both call Zuckerberg a fascist and a Nazi.
He’s fallen into this awful middle where he is a bogeyman on the right, he is a bogeyman on the left. At Facebook, someone wrote, “We’re pissing off both sides, we must be doing something right.” I’m like, cool, sure, if that’s how you want to play your cards.
Just as we’ve been speaking, Twitter permanently banned Donald Trump from its service. What’s your reaction?
I could sense a disturbance in the force and then had to check Twitter. I mean, I guess it was expected.
I thought they would follow soon after Facebook. It’s just the pattern with these companies, that they go one right after the other.
And again it’s going to be the indecisiveness that people talk about. Right? They suspended him for 12 hours, let him back on, he put a video up yesterday, went kind of quiet, tweeted a couple of times, and then he got knocked out again.
So Zuckerberg is obviously hated by both sides, Jack Dorsey will certainly be in that camp after this move. Is being in charge of a platform with a couple of billion people just an impossible job and there’s no right way to go about it?
I think that's kind of the wrong way to look at it, to be honest with you. Liked and disliked is not the spectrum you should be grading these CEOs on. I mean, you want them to be principled. You want them to have firm principles that they stand for and adhere to in spite of pressure, regardless of whether they're going to be perceived as liked or disliked. You're always going to offend someone.
At least if you adhere to rules or standards that are clear and open, then hopefully you'll garner some respect. And I think the problem with these companies is that their rules change on a monthly, weekly basis. There's this kind of moving the goalposts here, shifting that there, and it's never clear. I report on Facebook for a living, and I couldn't tell you what their rules are around political actors. There seems to be some kind of new wrinkle every week, and that's just frustrating.
It’s interesting that you say they shouldn’t be responsive to public pressure because in the first half you talked about how you are the public pressure.
The reason why they listen to me is that employees have realized that the only thing these CEOs listen to seems to be public shaming — bad stories in the press. But if they adhered to those principles and stuck to their guns, maybe there would be less of that. It's kind of a chicken-and-egg problem. Maybe if Zuckerberg just stopped listening to me and led based on the principles set out by his company.
But there are no easy decisions on this stuff. You try to set principles, but you’re challenged in directions that you might not have anticipated. So I wonder if the act of setting the rules is impossible.
Twitter’s approach is like “our rules are a living, breathing document.”
You’re not into that because that basically means they can do whatever they want.
I'm open to changes, but these things seem to change every week, or there seems to be no standard. For example, we reported earlier this year that at Facebook there are these conservative pages: PragerU, which puts videos together that often have misinformation, and Charlie Kirk, the head of Turning Point USA. These pages can be fact-checked by third-party fact-checkers.
In certain instances, they did share misinformation that was fact-checked by third-party fact-checkers. After a certain number of strikes, you get penalties assessed to your page. You get demoted in terms of your reach: you're not seen by as many people, the algorithm doesn't serve you up to as many users as normal. And sometimes you get prevented from running ads. What we reported is that those strikes were being assessed. They were actually sharing misinformation, some about climate change. I forget what some of the others were, but these Facebook executives who were on the policy team went in and advocated that those strikes should be removed — there's kind of this tweaking of the rules there. And that allowed these pages to continue on and act as if nothing had happened. I think that's a prime example of Facebook not sticking to its guns. Their statement to us in that story was insane: "Yes, fact-checkers have the ability to deem things as misinformation and to label them, but only we can assess the penalties. We decide the penalties." Which means that their whole third-party fact-checking process is kind of a sham.
Sometimes I think case-by-case makes more sense, but what you’re talking about here is a really serious issue I don’t think many people pay attention to.
And they come full circle on that. I mean, the reason why we reported that story is because people internally were pissed about that, they were mad. They came to us. To illustrate your point of governing by PR crisis.
Which has always been the case for Facebook. It’s interesting because, at the beginning of this year, Zuckerberg very explicitly said, “I’d rather be understood than liked.”
I don’t think he’s either.
When Trump’s account goes away, his supporters don’t just leave the internet. They go somewhere. Alternative platforms can become echo chambers, and you might end up even less able to see those you disagree with as people. What’s your take on that?
The funny thing is, right before we started recording this, we published a story about Parler. Apple sent Parler a letter telling it to start moderating content better and prevent violence from being incited on the platform, or face expulsion from the App Store. So I don't know how long Parler's going to be in the App Store.
This is an interesting out-of-left-field content moderation decision we didn’t expect.
But like you’re asking me, are people going to moderate these places, and are these places going to exist and how are they going to act? And how is it going to interact with Twitter? Is it going to be some kind of upside-down Twitter?
What are the second-order effects going to be?
I don't know what Trump looks like not on Twitter. Maybe he goes to Parler and starts going crazy on there. Because so much of his Twitter is about owning the libs; that's the brand of a lot of people on Twitter. When they get banned, they don't have the same, like, "I tweeted something bad, now I'm going to let the engagement roll in," which then creates an anger-wave cycle.
But there’s more to it than that. Donald Trump’s account is used to communicate with his supporters, but there’s also the factor of him having an immediate line into any newsroom that would build a story around his tweet.
But there's also the other effect of him no longer being president, right? The reason why we paid attention to his Twitter is because he could start a war with North Korea with a tweet. But if he no longer has that power, do I care? If he's like, "In Section 230, we, we will destroy the…" and he's doing it on Parler—
What we should be thinking about is the fact that there’s lots of supporters. We saw many of them marching to the Capitol over the past week and they will be listening.
That is a concern, that a place like Parler or Gab can be a breeding ground. That's already starting to happen in private places like Telegram that aren't regulated or monitored as much as a place like Facebook would be. Do I think Donald Trump is going to start a Discord channel and 70 million people are going to be in that? I don't know.
No, it definitely won’t be the same amount of people.
30 million people, 20 million people?
That’s a sizable amount.
Look, man, I didn’t anticipate people storming the Capitol this week. I can’t tell you what’s going to happen next week.
That was surprising to you?
I should rephrase that. As someone who saw these groups, I mean, it was in the realm of possibility. Did I think they would actually do it, as opposed to LARPing and walking around in camo and carrying Don't Tread On Me flags? They did it in Michigan, and they did it with guns actually, which is more terrifying. But the scenes this week, even if you were monitoring it, were just shocking.
Some people would say the networks shouldn’t do these bans because after they set that precedent they’re going to be subject to pressure campaigns from their employees, who have seen them take this action and try to get them to do it again to their political opponents in the future. What do you think about that?
I think that’s a worry. I think there’s now a precedent that’s set, not just for the U.S. You’ve got to think of other places around the world. I’m just thankful I’m not in the position to make those decisions.
It’s pretty remarkable that a large percentage of the world’s speech has migrated to two websites.
I would say Facebook more so than Twitter, but yeah. The Facebook universe.
Democrats hate Facebook, Republicans hate Facebook. For years I’ve been confident that Facebook could weather any storm and now I’m not so sure. So what do you think happens here?
I don't know. You have a new president who went into the election saying he doesn't like Mark Zuckerberg; he's on record saying that. At the same time, the administration is hiring former Facebookers left and right who have worked in various capacities at the company. I think these antitrust lawsuits mean a lot. I'm also paying attention to the Google antitrust lawsuit led by the Texas attorney general, because that has a lot of Facebook stuff in there as well, which was awful PR for the company. The collusion-on-ads allegation.
Alleging that Facebook and Google made a secret deal, that Facebook would win a set number of ad auctions, which if it’s proven out in court is very bad for both companies.
Beyond the legal challenges, this is just going to be an awful PR year for Facebook because they’re going to be constantly in the news with these legal challenges, with the—
Yeah. 2020 was bad for them. 2018 was bad for them, 2019 was kind of a respite, and then back into it with the election in 2020. It just seems to be never-ending for them. I’m sure they have a plan or some kind of process in place on how to deal with this stuff. We reported, for example, that they went through and did this online training course for all employees on how to handle antitrust discussions and not put stuff in writing. I thought it was very funny, and just a sign of the times.
A better option is don’t do crimes.
Step one, don’t do crimes. Step two, repeat step one. No, there’s literally, “Don’t put stuff in writing when you can get on the phone. If you need to put stuff in writing, have a lawyer on it.” I was like, “What?”
How to evade accountability.
And Zuckerberg has said multiple times on Workplace, as have a bunch of other execs, “Do not talk about antitrust stuff on Workplace, you will be removed.” They’re just terrified of discovery on all this stuff. They’ve created a platform where employees post their every thought and argue every point, and now they’re telling them to shut up and it’s just kind of antithetical to their ethos.
How do you think the antitrust stuff plays out? Do you think they get broken up or is it just a ceremonial fine from the FTC?
I think there'll probably be some settlement. It feels like there's always some settlement.
A meaningful settlement or another one they can shrug off?
I don't know. I don't know what the government has. I'm definitely gonna try and find out. I don't know what they have from discovery. Honestly, that Google stuff has been a lot juicier than what's been revealed in the Facebook antitrust suits.
It’s not just the FTC, it’s not just the DOJ in Google’s case, it’s all these state attorneys general. My perspective in the past was that these companies were able to weather whatever it took, but maybe not.
Maybe Facebook just becomes a conglomeration of law firms this year.
Is that your 2021 tech prediction?
That’s the biggest threat and Facebook reacts to threats, right? They buy threats, they kill them, they outcompete them, they copy their functions, and now the threat is the government. So they’re going to throw everything they have at this.
And they have a lot to throw. The FTC's annual budget is about $330 million—
Facebook made $3 billion in profit last quarter.
And revenue is $16 billion or something like that.
Yeah, they’re going to do $80 billion this year.
In a year, versus $330 million for the FTC. They’ll have expenses and stuff, but it’s a tough fight.
It’s a good time to be a lawyer in D.C. right now.
Now in terms of the product, they’ve been losing teenage users. TikTok is ascendant.
I deleted mine.
Because of security fears or because—
I got creeped out by the key-logging and then, I just couldn’t do another platform. I like it when people post TikToks on Twitter. I think they’re funny, but I don’t want to seek them out anymore. I’ll see the good ones that get posted.
The bottom line is that it’s picking up momentum and it’s a real challenge to Facebook. What do you think’s going to happen with Facebook’s business? Is there anything exciting on the way?
I reported out Facebook's year-end meeting in December, where they had a product roadmap for everything. They talked up virtual reality, Oculus. They've had a lot of success with Portal, which has thrived in part because of the pandemic, and according to Katie Notopoulos, my colleague and your former colleague, it's a lovely product. I think she might be the number one fan of Portal. We're going to get this podcast a Facebook Portal sponsorship.
I don’t take ads from Facebook.
After this discussion, I doubt they will ever want to fund anything with us.
So they like VR. They're talking about AR. They talked about their innovations in A.I. I don't know if you saw that they had this tool that they were going to deploy to summarize news articles. I think they'll continue chugging along as they can, but the main blue app is dying.
I wouldn’t say dying. It’s growing, but it has less luster than it had in the past, especially with teens.
Don't you feel like those notifications that are like, "Jimmy just posted a three-minute song that he recorded"? It's just like, I don't need to see this shit. You're just getting desperate.
I see what you’re saying.
They're leaning into WhatsApp. We can talk about WhatsApp a little bit. They're starting to do the business stuff on there, they're starting to roll out payments, especially in India.
They have scale.
They have scale, they have 2 billion users. And I think you get to a point where you protect that scale by copying, killing, and legalizing. They’re going to have to focus more on the legalizing front this year.
No doubt they’re going to have to spend more energy on the legal stuff than ever before. What do you think about Facebook? Do you think it’s good for the world? If it went away tomorrow, how would you feel?
I would have nothing to report on. It'd be so boring. I'd just be twiddling my thumbs, tweeting dumb shit all day, microwaving fish. I think something would probably replace it. People have been conditioned to spend a lot of their time online now; for better or worse, their activity would probably go elsewhere.
But how would you feel?
How would I feel? I don't use Facebook that much. But I think it has different impacts in different places. This is the example everyone uses, but during the Arab Spring, everyone felt good about the network spreading democracy and people posting about it. That was a great time for this whole network. Then, fast forward a couple of years, and you have genocide in Myanmar. That felt pretty shitty; it was awful. I don't know if I answered that question. How would I feel? Fine, I guess. It's hard to say. I just don't know where people would focus their time and energy, what they would throw themselves into next. Is it TikTok? Do we just start watching TikTok videos and dueting with people and shit?
You say dueting like it’s the worst thing in the world?
Maybe that is the end of the world, like the next iteration, we just start singing over each other’s songs. I think it’s better than organizing a genocide or storming the Capitol. We say that and then in 20 years there’s going to be—
The duet rebellion.
Do you think that a lot of the negative stuff that we see in the world today is a direct result of social media?
Some of it is, definitely. I think we fall into this trap of having to gauge social media in its totality: if we add this thing to the positive side and that thing to the negative side, where does the balance lie? Is it a net good or a net bad? As a reporter, it's hard to look at the totality of it, but you can report on individual instances where things are good.
Those are the things Facebook puts out press releases for, for example. A woman in India started a business, connected with local merchants and buyers, and built it up. She's now out of her village and has made a living for herself. You hear those stories from Facebook all the time.
But as a reporter, you can always find the instances where something terrible happens on these social networks. A village in India lynches people because a rumor on WhatsApp said they were child molesters. We've written that story. Buddhists in Myanmar chase Muslims out of the country because they're riled up by leaders spreading rumors on Facebook Messenger that Muslims are dogs.
So it's hard to say, "Oh, Facebook is good" or "Facebook is bad," but I don't think that should be my role anyway. My role should be to hold the platforms accountable in the instances where there are negative externalities, and in writing about those, hopefully they can improve and prevent that from ever happening again. In an ideal world, anyway. But it seems like a broken record at this point; things keep repeating over and over.
Right, that shouldn't be your role. And the podcast is a great place for us to hear more about your mindset and how you tackle these things, because the way you come into it matters too.