Big Technology

Does YouTube Radicalize? A Debate Between Kevin Roose and Mark Ledwich

The New York Times’ Kevin Roose and researcher Mark Ledwich join the Big Technology Podcast for a debate on whether YouTube actually radicalizes its users

OneZero is partnering with the Big Technology Podcast from Alex Kantrowitz to bring readers exclusive access to interview transcripts — edited for length and clarity — with notable figures in and around the tech industry.

To subscribe to the podcast and hear the interview for yourself, you can check it out on Apple Podcasts, Spotify, and Overcast.

In June 2019, New York Times reporter Kevin Roose wrote The Making of a YouTube Radical, a story about how a 26-year-old man, Caleb Cain, was radicalized through YouTube. For the story, Roose examined Cain’s entire YouTube history and plotted the path he took toward radicalization. Software engineer and researcher Mark Ledwich took issue with the story, citing his own research and claiming the notion that YouTube could radicalize was a myth.

Instead of yelling at — and past — each other, Ledwich and Roose came together for a moderated debate on the Big Technology Podcast, where both stated their points of view, got a chance to respond to each other’s arguments, and asked each other questions.

Alex Kantrowitz: Kevin, can you introduce us to Caleb, the man you met who said he was radicalized through YouTube?

Kevin Roose: I’ve been looking into online extremism and radicalization for a number of years. After the 2019 shooting in Christchurch, New Zealand, the one that was streamed on Facebook, the shooter had this very online manifesto. It was awful, it referred to PewDiePie, and that sort of catalyzed for me what became the next year of reporting. I was really trying to answer the question: How does this happen? How do people encounter extremist views online? What role do platforms play in introducing them to new and compelling extremist voices, and what are the forces that power that process?

I was really looking for a case study, someone who would talk to me about their journey, their process. I started looking around and eventually found this guy, Caleb Cain. He was 26 then; he’s probably 28 now. He was from West Virginia, and he was basically an Obama-supporting liberal who had dropped out of college and was having some real troubles in his personal life. Not a lot was going right for him. He started looking on YouTube for self-help videos, things that might help him feel better, and he stumbled onto this network of creators, including people like Stefan Molyneux, who really helped him feel better. They had videos about self-confidence and meeting people and getting a job. Almost like life-coaching.

He started watching these videos, and started watching other videos that were recommended from those videos. He ultimately became pretty far right. He considers himself alt-lite, not fully alt-right; that’s how he characterizes it now. But at the base level, he said that he was radicalized into the far right through his YouTube recommendations and eventually started agreeing with many of the sentiments of people like the Christchurch shooter. And then he actually came out of that. He was de-radicalized by watching videos from this other cohort of YouTubers, known as BreadTube, or the left wing of YouTube. They do a lot of counter-programming of the alt-right section of YouTube.

He eventually got out of it and started making videos and talking about how he had gotten out of it. I was just fascinated. I was really interested in meeting him. I went to West Virginia, talked to him for days, and then he provided me with his entire YouTube history: 12,000 videos spanning four years. It was this big, giant HTML document.

You were able to download the whole thing?

Kevin Roose: Exactly. There’s this Google Takeout function where you can download all your data, so he just downloaded his whole YouTube history and sent it to me. We were able to go through and retrace his journey and all the people that he met along the way. That became the story, “Making of a YouTube Radical,” that Caleb was featured in. It’s about him and his journey, but it’s also about what was going on at YouTube at the time, and the way those things interacted: the changes YouTube was making to its algorithm, reprogramming it around deep neural networks, changing some of the key metrics it was optimizing for, moving from clicks to watch time. How all of those factors helped create the environment in which Caleb was radicalized.

Did you ever put a finger on how many Calebs there were, and whether this was a one-person problem or something larger?

Kevin Roose: Since the story and the podcast came out, I’ve heard from thousands of people who have stories similar to Caleb’s, or whose family members or friends do. It’s impossible to quantify, but this is not a single-person problem. This is a problem that was big enough that YouTube felt it had to address it by changing its policies on white nationalism, hate speech, and borderline content, and also by changing its recommendation algorithm.

When was the last time you heard from someone who came to you with an issue similar to Caleb’s?

Kevin Roose: It happens almost every day. There seems to have been this golden era for right-wing YouTube from about 2014 until about 2019, when YouTube started making changes to tamp down the influence of some of these channels. A lot of it took place in that five-year window, but there have been other rashes of it this year. We’ve seen a lot of people becoming radicalized during the pandemic. QAnon has grown like crazy. Some of the communities adjacent to that have become quite large. It’s not just happening on YouTube, it’s happening all over the internet.

Mark, what’s the counterargument? You seem to believe that people aren’t being radicalized on YouTube. Where’s your proof?

Mark Ledwich: We did a study where I’ve been collecting data about YouTube’s recommendations since late 2018, and we saw those changes, some of them at least. We weren’t collecting at the time when it was really promoting clickbait-type material. When we started, it was fairly neutral in terms of the number of views that videos got; recommendations were pretty much proportional to that. In early 2019, YouTube really clamped down on borderline content and conspiracy content, so I think we’re aligned in the way we understand what’s happened there.

More broadly, I think the influence of recommendations, in terms of someone radicalizing, is just a very small part of a larger process. With the Christchurch shooter, I read that report, and there were so many other factors. As they say in the report, it’s often a one-off, highly personalized journey. No one model can explain the different ways people radicalize to extremism.

My analogy for recommendations is that they’re a gentle breeze that’s able to be controlled by YouTube, but there are larger factors at play in the environment that are more like a storm, pushing in all sorts of directions at different times and for different people. The recommendations are much gentler than that, but they’re something that’s in YouTube’s control.

What you’re saying is that people do get recommended videos and they can go down these rabbit holes, but you have to look at that as just one factor of many?

Mark Ledwich: It’s one factor of many, and I wouldn’t describe the recommendations people get as rabbit holes. We found that on average, recommendations go towards more mainstream content, at least since early 2019. It may have been different back then, but we don’t have data on that, and I haven’t seen anyone else with good data around that either, so we’re speculating as to whether the recommendations were pushing in that direction or not.

So there are two arguments here. One is that we shouldn’t put too much emphasis on YouTube alone. Two is that these recommendations don’t always take people down a rabbit hole. Kevin, what’s your response?

Kevin Roose: To the first argument, I agree with Mark that there are other factors. In the story about Caleb Cain, I talk about the larger forces at work here. He was living in an economically depressed area. He didn’t have a lot of career prospects. He had a shitty family life. He did not arrive at YouTube as a blank slate. He was coming in with a number of different personal traits that made him especially vulnerable to this.

I don’t think that we can discount YouTube’s influence, especially now, when all of us are just experiencing the world through our screens. YouTube hasn’t released data in a while, but the last we know, as of a couple of years ago, people were watching a billion hours a day of YouTube. Something like 15% of all internet traffic goes through YouTube. This is not a small part of people’s media diets, especially for the people I started studying, who are maybe a few years younger than I am, people in their teens and twenties. For them, YouTube is media. It is culture. It is politics. YouTube is an all-encompassing frame of reference for everything. For those people, it’s more than just one input in a system of inputs; it is the primary input by a long way.

On the second point, the rabbit hole or pipeline argument: I’m interested to hear more about this from you, Mark, because I’ve read your study and I’ve read some other studies about this. There have been some studies that have found strong evidence for the migration of viewers from more centrist and alt-lite videos to alt-right and hard-right content over time. Whether that’s through recommendations or through other forces is a question that probably can’t be answered, except if you work at YouTube.

Do you think it is possible, with the data that is currently available to us as non-YouTube employees, to account for and quantitatively study radicalization on YouTube?

Mark Ledwich: I think we can do much better than we’re doing now. But no, I think right now there’s a lot of room for opinion. The data that we have right now isn’t definitive on the more holistic question: Is YouTube, all things considered, influencing people towards extremism? Are they making that worse? We don’t know that yet. There are definitely studies coming, and some that have just come out, that are quite good.

There’s one recently that uses panel data from a representative group of Americans with real traffic. They’re looking at actual click-throughs on recommendations and which direction they’re going. They have issues with the classifications, but they’re fixing those, so I think that will be really good information.

The study you referenced, I think that’s the “Auditing Radicalization Pathways” study by Ribeiro and others. In that, they looked at people commenting on videos and whether, over time, they moved towards the… They classified channels as Intellectual Dark Web, alt-lite, and alt-right, and looked at whether there was movement in that direction. I thought that was quite clever, but they only looked at that one direction. They didn’t look at people moving from Stefan Molyneux to Jordan Peterson to something more centrist. They didn’t look at that direction.

Mark, your research itself found that a lot of the recommendations pointed people to more centrist content?

Mark Ledwich: Yes, towards mainstream content. It’s got an authoritative bias.

That’s after 2019?

Mark Ledwich: That’s since April 2019. That’s the recommendations, but if you’re thinking about radicalization, you do have to look at it more holistically. That’s just one input. It could be that you blame YouTube just for the existence of certain content, or that video itself is engaging enough that it’s more likely to radicalize than a book. I don’t believe those things, but they’re in the question when we’re thinking about this more holistically.

Something we should talk about is scale. Even if the vast majority of YouTube recommendations go to centrist or authoritative videos versus radical videos, you still have that margin at the end: people who end up going down that path and becoming radicalized. Kevin, can you riff off of that and give your take on it?

Kevin Roose: There’s a scale question and there’s a prevalence question. The prevalence argument is one that YouTube loves to make. Facebook also loves to make it, where they say, only 6% of the content on our platform is political. Or, borderline content is less than 1% of all content on YouTube. I have no reason to doubt that that’s true. We can’t audit it; they don’t make data like that available to the public. If you think about all the many things that people use YouTube for, like figuring out how to fix their broken toilet or tie a bow tie or listen to music, there are people who only use it for those things. It’s totally plausible to me that this might be a small percentage of overall usage of YouTube.

But the denominator is so big. If you have people watching at least a billion hours a day, and I would guess that’s increased substantially since that figure came out, then 1% of a number that big is still a fairly big number, especially when you’re looking at the possible outcomes here.

It’s not just that people are getting radicalized and posting a bunch of memes on 4chan. It’s that, in some cases, they’re going out and conducting mass shootings. They’re becoming violent. They’re engaging in coordinated harassment. If it were just people getting sucked into Bigfoot conspiracy theories or whatever, that to me doesn’t really register as a grave harm. But, there’s a real danger here.

I’d be curious to hear Mark’s take on this scale question. Even if it’s true that a small number of people, relative to overall YouTube consumption, are experiencing this pipeline, I do think the existence of the pipeline is something that we need to study and something we need to get more transparency from YouTube about.

I’d love to hear Mark’s answer on that also.

Mark Ledwich: I agree with YouTube that it’s a small percentage, but, yeah, it’s a huge platform. It skews young, so I’m expecting it to grow much, much larger than it is now. I did some comparative numbers: I compared Fox News on cable versus YouTube, and cable’s still bigger in terms of the number of views they supposedly get. But I think in the next five years we’ll see that flip.

Mark, if you’re writing this paper pushing back, and Kevin is saying there is big scale here and maybe this is happening to some percentage of people, isn’t that something we need to consider in this discussion?

Mark Ledwich: I’m doubting that it’s a YouTube-specific problem, because people are going to be watching video content on the internet no matter what.

Let’s say YouTube got rid of all right-wing content, the extreme right, everything; there would just be another platform that people watch. We saw a big migration to Rumble when the QAnon crackdown happened. They’re getting millions of views on Rumble now. It’s not as if this pressure to remove content or really change recommendations will have a massive effect.

Okay, but let’s focus on YouTube because that’s the topic of discussion. Can you make your most convincing argument that YouTube, itself, is not doing what Kevin has argued?

Mark Ledwich: Apart from those points, the way I look at YouTube, the content reflects what people want to watch. I think there’s an intuition that it’s more radicalizing for someone who has just watched mainstream media: when they come onto YouTube, there’s more edgy stuff. I think it’s intuitive to think it’s a rabbit hole because it’s reflecting the population’s demand for content more directly than mainstream news does, although mainstream news is starting to become more like YouTube in that way. I can see the intuition there, but I’d say the type of information that I’m looking for doesn’t exist to show that there’s a radicalizing effect.

And also, the over-concern about it is like when we talked about… When the discussion of Islamic radicalization comes up, a lot of the same arguments apply to this, which is that we’re talking about small numbers. More people die from other means, alcoholism’s more important, things like that. I think of those things when this comes up as well.

You did research, so you’re coming to the conclusion that there’s no real issue here. What in your research led you to believe that?

Mark Ledwich: My research just focused on recommendations. What we saw in recommendations was a mainstream influence, overall. We’re doing more research to look at what effect personalization has on that. That’s the area of research I’ve done. On the wider question, I’m just speculating, much like everyone else is, about what causes radicalization.

Kevin Roose: I’m curious, Mark and Alex, if I can jump in with a question for Mark?

Yeah, please.

Kevin Roose: Mark, I remember when your original study came out, I got a lot of people emailing it to me, asking what I thought about it. It was sort of interpreted as a response to my story, although I know you were looking at this before. You seemed pretty angry about the narrative of the YouTube rabbit hole, and I think you were saying things like, this is a conspiracy that the mainstream media is using to repress the influence of social media, these narratives aren’t trustworthy, this is a self-serving thing. I guess I’m just curious, and I don’t want to attribute any arguments to you that you aren’t comfortable making, but I’m curious what you think is behind the meta-narrative here. If this radicalization pipeline doesn’t exist and, in fact, YouTube is pulling people in a more mainstream direction and has been for as long as we’ve had the data, why would people be coming out with these stories?

Mark Ledwich: There’s an elite culture which looks down on popular culture, and that’s especially true at the New York Times, which I find to be a very small subculture of that. I was listening as Ezra Klein talked to another reporter from the New York Times, who said he felt he was incentivized to write articles like this, reflexively anti-tech-platform articles, and that it benefited him at the New York Times. That’s definitely the background to what I think the incentives are inside the place where you work.

The rabbit hole meme took on legs in a lot of places, and I just wasn’t seeing anything more than stories. When you tell those stories, you’ll get more of them. And it’s the same on the other side: if you’re a more right-wing or anti-woke YouTube channel, you’ll also get lots of stories of people feeling like the places they work at are stifling free expression, and things like that. I think when you’re in the public arguing about these things, you naturally attract a certain side that you keep hearing from, and it’s hard to maintain perspective.

Kevin, before the break, Mark was making some arguments about the New York Times and how the media is incentivized to write these anti-tech stories, and it seems like he’s saying there’s no room for the other side. There was a lot said there; the floor is yours to respond.

Kevin Roose: I think there are two issues here. One is, what are the incentives of reporters generally? I think that often, people who aren’t close to the media, who don’t work in the media, who don’t have a lot of experience in the media, tend to think that it’s clicks, it’s traffic, it’s attention. I would say that’s true of some outlets and less true of others. I certainly don’t feel like I’m motivated by traffic.

Then there’s the argument that journalists are motivated by prestige and by stories that win prizes. You don’t win a Pulitzer for the story that investigates the Wall Street bank and finds no evidence of wrongdoing. I would grant that our incentives skew toward holding institutions to account and finding instances in which the public is being manipulated or taken advantage of. I often tell people, we don’t write about the Boeing planes that land on time and safely. But that’s always been part of the news business, and I don’t think that’s changed meaningfully.

There was this other argument that I think Mark and some of the people who agree with him make, which is that the New York Times and other media institutions are being mean to YouTube because they want YouTube to promote their channels more, to boost authoritative mainstream sources, and to disappear independent creators. I heard this after I profiled PewDiePie. I’ve heard for years that there’s an institutional incentive for media organizations specifically to be mean to YouTube because they want their own content to be favored on YouTube, in the algorithm, on the home page, in trending, wherever. I guess I would just love to hear Mark talk about what he thinks that incentive is. Why do you think YouTube gets criticized by mainstream institutions? Do you think it’s more… I don’t know, why do you think that happens?

Mark Ledwich: I think it’s largely political and cultural. YouTube represents a more right-wing, more scrappy, low-quality-information type of platform. Maybe not everyone, but I think people like Tristan Harris and yourself want to have a narrative about yourselves, that you’re a large player in saving people from these problems. That influences the way you describe or think about these systems, so that you can say, oh, here’s this one problem that, if I can just shine a light on it, we can fix. That’s where the bias comes in.

Kevin Roose: Do you think YouTube is better for these changes it has made since people started paying attention to issues like radicalization? They say their algorithm changes have resulted in 70% fewer views of borderline content, conspiracy theories, stuff like that. Do you think YouTube is a better platform today than it was, say, two or three years ago?

Mark Ledwich: Yeah, I think it’s better. This is where I definitely give credit to you and others for pointing this out early, like Zeynep Tufekci as well. It’s good that they’re not recommending or promoting things like conspiracies or far-right ideology. I think that’s definitely a good thing. But with the changes, it’s five steps forward, two steps backwards. I think they’ve curtailed some really good independent YouTubers, people like David Pakman or Laci Green. They don’t get recommended anywhere near as much as they used to because of these changes. I feel like they’ve got very high-quality content, better than a lot of mainstream content, but because of this blanket heuristic, they’re being disadvantaged.

Kevin Roose: That’s interesting. I often wonder about that. I think YouTube would love nothing more than for there to be a universe of YouTube-native creators who do straight news, for lack of a better term. People like Phil DeFranco. He’s doing some opinion too, but people who could just do what TV people on the news do, but do it in a YouTubey way. I think that would make them very happy, because they love promoting their own creators, and my sense is that they promote mainstream news sources because they don’t have to stay up at night wondering if NBC is going to publish some crazy conspiracy theory. It’s sort of a proxy for how much they can be trusted to report as close to the truth as they can. I do think they would like there to be a universe of creators who do that kind of thing. I just don’t think YouTubers are incentivized to do that kind of thing, because it’s not good for views.

The demand for content is much more for opinion than it is for straight news. I think you guys find that between your departments as well. Do you worry that the criticism from the media has made YouTube defensive, so they won’t take risks in terms of what content they promote?

Kevin Roose: I think they’re very sensitive to elite opinion, and media is part of that. I think they’re sensitive to politicians. I think they’re sensitive to their peers in the tech world. They want to be seen as the good guys, and so I think they want to be small-c conservative with respect to… If they take it as a given that their algorithm is going to throw x billion views to a set of channels, they want to make sure that those channels are not going to be the next Alex Jones. It’s sort of like how now, if you search for 9/11 conspiracy theory or moon landing videos, they’ll give you a little thing from Wikipedia on the video. They’re outsourcing trust to Wikipedia because they don’t want to write their own little blurbs.

I think of what they’re doing with this authoritative news push as a version of that, where it’s like they don’t recommend conspiracy theory videos, but they know that they have this algorithm that needs to recommend something. They feel more comfortable creating a safe bucket of things they know are not going to be extremist or contain hate speech and recommending those. I do think there’s a danger of making that bucket too small and not inclusive enough. But I also think that problem is more easily solved than the question of radicalizing people. I think what they don’t want to do is create a situation like they’ve had for the past couple years where they end up, by negligence, recommending these awful videos, to hundreds of millions or billions of people, and they don’t even really know that they’re doing it.

Mark Ledwich: Another aspect to that is, they do have competitors, and in terms of the engagement of the platform, how entertaining those recommendations are matters as to whether they’re going to lose people to TikTok. I feel like TikTok did a better job with their recommendation algorithm, which gets much more of the views than on YouTube. They have to think about that. If they make the recommendations extremely bland, then that’s an opportunity for competitors that aren’t doing that.

Kevin Roose: Alex, I’m cognizant of the fact that you promised your listeners a debate, so I feel like we should, like, amp it up.

No, this is good. I’m letting this breathe. If you want to amp it up, by all means, but I think people are going to find this fascinating.

Kevin Roose: I agree with Mark about the fact that this is a broader phenomenon than just the YouTube algorithm, and, in fact, one of the things I’m looking at now is what this demand-side part of the equation is. What we see happening now—

You mean there are people involved in these decisions too?

Kevin Roose: Yeah, obviously. People make choices. The thing that I push back on is this idea that people have total control over their information environments, even when things are being recommended to them. There’s been a lot of academic research about the power of recommendations, the psychological power. There have been some interesting studies around things like music recommendations, that people actually like a song more when they know that it’s gotten five stars from Spotify or ended up on some personalized playlist for them. They trust the things that are fed to them by an algorithm. In the case of YouTube, what YouTube’s recommendation algorithm does is it not only recommends videos, but it dramatically constrains the universe of possible videos. There are billions of videos on YouTube, but you are only going to see a couple dozen of them on any given day because those are the ones that are going to appear in your sidebar and on your home page.

I think people have free will and free choice in some respects, but part of the mistake people make in the opposite direction is assuming that people are in total control of their choices. Because we know these platforms make billions of dollars a year by trying to change people’s minds in the form of targeted advertising. They know how influential their recommendations are, and that’s a huge part of what’s made platforms like YouTube so successful.

Mark Ledwich: I’m not arguing that recommendations are totally controlling people. In your article, you called it “steering” people. I feel like that’s too strong a word for the influence that it’s having. That’s why I use the gentle breeze and storm analogy. I think the election fraud issue, which you looked at quite closely, shows this, in that YouTube was curtailing recommendations to videos promoting the election fraud narrative. You’ll find exceptions, but on average those videos were recommended a lot less. Despite that, the content promoting the narrative did really well. Fox News, which is promoted by the algorithm, was mostly disputing the election fraud narrative, and they lost views at the same time as places like Newsmax, NTD, and One America News Network gained a lot of views, despite not being recommended as much. NTD was an exception, but for the others that’s true.

Kevin Roose: Right. I think, really, if we’re talking about the YouTube algorithm, we have to separate it into two: algorithm 1.0, or, depending on how you count, the pre-2019 algorithm, and then what’s happened since. I do think that anecdotally, and also from studies that have come out, it does appear that YouTube is much more likely to be recommending things from big professional media organizations than it used to. A lot of the “worst people” have been de-platformed. Stefan Molyneux is no longer on YouTube. Richard Spencer is no longer on YouTube. These people who were… Stefan Molyneux was not a marginal figure. He had hundreds of thousands of subscribers. He got hundreds of millions of lifetime views. These were some of the most popular political voices on the platform. Not only is the algorithm different now, but the pool of available videos it’s picking from is different in some meaningful ways.

Kevin, do you ever think about the fact that the stuff that you’re reporting on might lead to a crackdown for YouTube that gets folks who shouldn’t be demonetized, demonetized — and independent creators never get a real chance to get off the ground?

Kevin Roose: The goal of my reporting is not to get YouTube to take down stuff. That is not a metric that I am aiming toward. My goal is to report on what’s happening on YouTube, and if that leads them to want to take down stuff and to feel pressure to take down stuff, then that’s that. But that’s not my goal.

Right, but you have to know that that’s a very clear potential outcome when you write a story like this.

Kevin Roose: Of course, of course. I’m not naive about that. I guess I worry about the false positives less than the false negatives, to be sure. Something like this happened with the crackdown on QAnon. A podcast that I really love, QAnon Anonymous, which is sort of an anti-QAnon podcast, got swept up in the crackdown because it had the name QAnon in the title, which sucked. I was like, I don’t want to miss this podcast, this is a great podcast. They made a stink about it and got it restored. There are avenues for redress of grievances in the case of a false positive, so I guess I worry less about the overbroad application of these rules. Really, we’re still in a phase where these companies are being flooded with misinformation. We are nowhere near the point of having a totally clean house as far as information integrity goes. I just think it’s a little premature to worry about whether we’re sweeping up too much stuff, whether YouTube is sweeping up too much stuff when it goes after white nationalism and neo-Nazis.

Do you think that the fact that all these folks are getting banned will actually make their message resonate, maybe with a smaller group of people, but they’ll use it as proof to say look, we’re right, and big media and big tech don’t want you to know?

Kevin Roose: Well, let’s just look at what actually happens to people after they get de-platformed. When was the last time you heard from Alex Jones?

When he shows up on other people’s podcasts.

Kevin Roose: I guess he did just go on Rogan.

He has a smaller audience than this show, but not by much.

Kevin Roose: Right. But Stefan Molyneux is a great example. He still puts up his videos on BitChute and other minor video platforms. But he’s struggling. He’s not happy that he was de-platformed from YouTube. I think that people who have been banned by these platforms generally don’t… It’s really hard to rebuild that audience on a smaller platform.

If I was ranking harms, I definitely worry about these sorts of concentrated spaces, like Parler or Gab, where there’s a concentrated amount of extremist activity. But I actually worry about that less than the sort of contagion effect, where you have people who go onto YouTube or Facebook looking for self-help or parenting advice or health information, or watching boxing videos or whatever, and then they come across this universe of extremist content that pulls them in. That, to me, is more worrisome than the small clusters of extremist activity.

This is kind of a ridiculous scenario, but… if YouTube pulls back, maybe TikTok uses the dark arts of social recommendation engines and grows, and we end up with a company based in China that’s crushing YouTube and is less responsive to legitimate concerns. What do you think about that, Kevin?

Kevin Roose: I think TikTok and YouTube are different products. They’re both video platforms that are driven by recommendation engines, but TikTok is very short-form video. I don’t know what the median length of a YouTube video is, but it’s probably significantly longer than a TikTok video. I think TikTok is an interesting case of moderating in the other direction. YouTube started off as a total free-for-all and then gradually winnowed down: we don’t want clickbait, we don’t want nudity, we don’t want neo-Nazis. They shrunk the pool over time, whereas TikTok started out very constrained. There was basically no politics on TikTok at the beginning. That was a conscious choice, because ByteDance wanted it to be a fun, lighthearted place. They didn’t want people talking about oppression and injustice. They wanted to keep it light. They wanted teenagers renegading in their bathrooms or whatever.

They have expanded what is considered acceptable over time: they started allowing political videos and then broadened out from there. I think the diversity of content is probably higher on TikTok now than it was a year ago. I think they’re probably going to end up in a pretty similar place, but they’re approaching it from different directions, which is a really interesting thing to me.

Mark, do you have any last questions or thoughts that you would like to ask Kevin?

Mark Ledwich: I still have a thought. I just want to downplay the influence of recommendations a little bit. The best estimate I’ve seen is that about 40% of the views coming to political videos come from recommendations. There’s a large amount coming from links and search and other means.

People are often on social media, in private chats with each other, or they’re on Facebook or Twitter, and they click on links through that, among other means. There are lots of ways for people to get to the content that’s in demand. That will change depending on what you do with the recommendations. If you make your recommendations particularly bland, I think that would actually change that makeup; it will go down to even less than 40%.

Generally, I’ll watch one genre of video so much and then YouTube will recommend something new, and then I’ll just start going directly to it. I think that probably mirrors most YouTube users: the recommendation plays one part, and then you start going direct. It’s a process.

Kevin, do you have any last thoughts?

Kevin Roose: I’m really appreciative of all the research being done by Mark and others on this. I think it’s super important for the academy, for engineers and data scientists, and for journalists all to be looking at these problems simultaneously. Even though you disagree with my reporting, I’m really glad that you’re looking at this and doing the work.

I write the Big Technology newsletter. Sign up here: https://bigtechnology.substack.com.
