
Rest of World Wants to Set a New Path for International Tech Coverage

Sophie Schmidt and Louise Matsakis tell the Big Technology Podcast that North America and Europe must look outward to fix their problems at home

Sophie Schmidt and Louise Matsakis

OneZero is partnering with the Big Technology Podcast from Alex Kantrowitz to bring readers exclusive access to interview transcripts — edited for length and clarity — with notable figures in and around the tech industry.

To subscribe to the podcast and hear the interview for yourself, you can check it out on Apple, Spotify, or wherever you get your podcasts.

In 2019, Sophie Schmidt founded Rest of World, a news site dedicated to telling technology stories about what’s happening outside of North America and Europe. After watching her father, Eric Schmidt, run Google for 10 years, she became convinced that the rest of the globe would drive tech’s next era. And so understanding how it was developing there was imperative. Schmidt joins Big Technology Podcast this week with Louise Matsakis, a senior editor at Rest of World, to discuss the site’s work to date, and where it’s heading from here.

Alex Kantrowitz: Welcome, Sophie and Louise. Companies use the term “Rest of World” to describe regions outside of North America and Europe. What made you interested in explicitly covering the parts of the world they deem less important?

Sophie Schmidt: We have big intractable problems in the tech and society category: misinformation, disinformation, surveillance, privacy, you name it. We’re creating panels, and commissions, and we’re shaking our fists at big platforms and saying, “Please fix it.” And it feels a little bit helpless. But the thing that’s not coming up is that every other country in the world is also dealing with it in slightly different ways.

What if the solutions to our problems lie in the sharing of those experiences, and ideas, and learnings? Expanding the dataset. It’s honestly baffling. We have billions of people in the world all using technology all the time. I think the last data I saw said there’s almost 5 billion people online. And depending on how you count Western versus non-Western, something like 80% of all humans live outside of the Western bubble.

That means that you have an almost infinite number of parallel experiments playing out simultaneously all around us, just outside our view. So, why aren’t we comparing experiences? Why is it such a radical suggestion to say that maybe the person who has an answer for us is in Nigeria, or Morocco, or Bangladesh? Do you know what it takes to build a business in Bangladesh or to become a top engineer in Lagos?

The very things that make a lot of countries unattractive to a company actually forge extraordinarily talented problem solvers. But we’re not listening to them. At Rest of World, we are looking for them. If you know some of them, come tell us.

Sophie, your father, Eric Schmidt, ran Google for 10 years. How did watching him influence your perspective on the gap in coverage Rest of World is trying to fill?

Sophie Schmidt: I remember getting to tag along sometimes when my dad would make international visits, and it was as close to a rockstar reception as you could get. It was foreign dignitaries falling all over themselves saying, “Oh, my God, Google, please come here. Please, please.”

It was a different era, when everyone thought that Silicon Valley had the keys to El Dorado. I don’t think that happens now. But it gave me a really interesting perspective on tech’s power. Steve Coll wrote a book about Exxon called Private Empire. It had a big impact on me. He described a universe where Exxon became so powerful in the countries it operated in that it developed, essentially, its own sovereignty.

It didn’t have to listen to the State Department. It didn’t have to listen to its home country. I could feel that tech was developing that, a different kind of power that didn’t necessarily have to answer to anybody. I found that fascinating, and troubling in different ways. But it signaled, I think, the world that we live in now, where you have a lot of different actors.

The interesting thing, really, is that the next decade belongs to rising tech giants that are not here, not Western. And that next generation of them is not looking to Silicon Valley for inspiration any longer. And they’re not looking for permission. So, the game has changed, but I think the nature of tech power is only getting stronger over time.

And you can see that in the way that states sometimes carry out internet shutdowns. These are really complex systems now. Every country on Earth has complex systems between governments, and telecoms, and big tech players, and pro-democracy activists. Everybody is armed. So, the game has only gotten more complicated.

You grew up with a tech background and now you’re on the media side. What do you make of current tensions between the two?

Sophie Schmidt: I think the current state of affairs is a shame. And it makes me sad because I think that there are a lot of well-intentioned people in tech and in media who want to build productive relationships, who want to be able to share information, who understand that it’s not going to be puff pieces.

And journalists understand that they’re not going to necessarily get all the inner workings. But they understand the function. They understand that it’s an important part of how corporate power is expressed in the world and checked. But now, you have a tough problem of essentially loudmouths on Twitter from both sides, who don’t represent the majority.

And because you have this iced-out tribalism happening, I think it’s very hard for people who are not part of either scene to understand that the loudmouths don’t represent the feelings of everybody else. It’s very hard to understand how we get out of it, but I’m hopeful. I think it’s a shame, and I hope it can change.

Louise, you had a pretty good job at Wired and you left it for Rest of World. Did Sophie pitch it to you, or did you reach out to her?

Louise Matsakis: I totally reached out to her. So, this actually started for me in 2019. I started looking into a story about China’s social credit system, which is this memefied thing at this point in the West. And it’s totally misunderstood. And it’s often compared to Orwell; if I see another 1984 quote in connection with this program, I’m going to throw up.

And I went into it with all of the same biases. And I was going to write another one of these problematic stories about what China was doing. That’s not to say that China’s surveillance state is not horrifying, what they’re doing in Xinjiang is really troubling. But as I started to look into the story, I realized just how many misconceptions there were, and how this stuff is so distorted.

I started to realize, basically, that so many Western journalists misunderstood China, and as a result, misunderstood so many other places. And I realized that there was this huge asymmetry. Americans and people in the West know so little about people in the rest of the world. Whereas, I think a lot of people in the rest of the world know exactly what’s going on in the U.S.

I’ve met people in Myanmar, I’ve met people in Thailand, I’ve met people in Ghana, who can tell me more about the Senate than probably the average American on the street. So, I ended up writing this feature, actually, the headline was “What the West gets wrong about China’s social credit system.” And that really kicked off for me a big interest in China.

And I started to realize that it wasn’t just China. There were misunderstandings about basically every country, and these distortions were refracted through tech coverage. And I also was watching over the last four or five years, as Trump… everything became about the Trump administration. And everything became about four or five tech companies.

And I saw that so many stories were being missed. Most of them in other countries, or about smaller tech companies, who were often, frankly, doing stuff that was way more problematic, or that was hurting considerably more people than the latest Facebook scandal. That’s not to say that I think that these big tech companies should get away with what they’re doing. But there were so many other problems that I thought were really being missed because everyone was zeroing in on the Trump administration, and four or five companies. So, I really decided that I wanted to do more foreign reporting. And when I saw Rest of World, I was really excited about what they were doing. It seemed like a huge opportunity.

And I was also really interested in the approach. I could have gone to one of a number of international outlets that have bureaus all over the world, and I could have been essentially a parachute journalist. This is a long tradition in a lot of ways of doing international reporting. It’s colonialist: you send in a white reporter, and they tell the story. Maybe they have some stringers behind the scenes who don’t get bylines, or a translator, or a fixer, but the white person is still at the center of the narrative. And what I was really excited about at Rest of World was that we were elevating local voices. I’m working a lot with very young, green reporters in places like Nigeria and across South America. And we have reporters on staff too, who are in New York.

But that was really, really compelling to me, letting the people on the ground tell the story themselves, and really nurturing those relationships, and nurturing those voices. So, yeah, you’re right, I left a good gig at Wired. I have a lot of respect for the folks there, but it just felt like an unparalleled opportunity to do something different, and to do it differently.

I was reading through some of the stories as research for this. And there’s this one great story on OKash, a mobile lending company in Africa.

Sophie Schmidt: My favorite.

And I looked at the byline, and it’s from Morris Kiruga, who’s a Kenyan writer, researcher, and blogger. And I was like, “Oh, this is cool. This is different.”

Louise Matsakis: Yeah, exactly. That’s a great story. OKash is this fintech company that’s partially owned by Opera, actually, the alternative browser that a lot of people like. And I don’t think that many people who think that Opera is the more ethical option over Google Chrome, or Safari, or something actually know that they’re up to this.

So, it’s this fintech company that lets people take out microloans, basically. And what they were found to be doing was that if you didn’t repay those loans, they would contact literally everyone in your contact list, because when you took out the loan, you gave the app access to your phone’s contacts.

Which is totally nuts. You’re late on the loan. And now, your aunts and uncles know that you missed a payment.

Louise Matsakis: It’s gamifying social shame. It’s brilliant, actually, because I’m sure it works. I think Morris told us that the solution people found was to have two phones: one for their friends, and one for their loans.

And I think it’s important to talk about the context of why this happened. Kenya has extremely regulated banks for good reasons. They’re trying to avoid economic turbulence. They’re trying to keep the economy stable. But there’s no regulation in fintech. And we’re seeing this now in the U.S. with apps like Robinhood.

This problem is global in a lot of ways. So, the fintech companies have no system for getting people to repay the loan. So, this is the system they came up with in that vacuum because there’s no regulation.

Sophie Schmidt: And here’s the really interesting thing about that, too. These things don’t happen in isolation. It felt to Morris and to the people who were using OKash like this extreme one-off, “Oh, my God, that could never happen here.” We did some digging and the Philippine SEC had banned, I think, 24 similar types of apps just a few months before that. And those ones were inspired by something that came out of China.

The beauty of covering international tech is that you can trace individual episodes of things all the way back to a kernel of an idea in another country. Maybe it was someone sitting in Palo Alto who came up with a slightly smarter way to nudge their friends to buy one more pair of shoes. And that one idea, built in a safe environment in a regulated space, can travel in ways that we are only beginning to understand.

It can mutate, and adapt, and ping-pong around the world, until all of a sudden it lands in Nigeria and, oh my God, it comes out as this totally extreme example. Because the things that can happen outside of regulated environments tell us what the limits of tech are. They tell us what tech can be. That’s why it’s such an exciting space to watch.

I like how your publication just digs a little deeper.

Sophie Schmidt: The knowledge gap is enormous. I still have people pitching these things like, “Hey, I was driving around Africa, and oh, my God, they’re using mobile money.” M-Pesa launched more than 10 years ago. And that gap in reporting, the gap in infrastructure of reporting, is what we’re trying to build.

The reason Louise is spending time training and developing writers in their own countries is that most of the countries that we cover have no tech journalism tradition at all. Maybe they’ll come up in one story once. But that’s a big problem because these countries are changing incredibly quickly. And I think of how much attention we put on our tech experiences. And we’re only scratching the surface of what it’s doing.

The thing that makes me really, really excited about our commitment to working with and training local writers is that if we’re successful, 10 years from now, we will have trained a generation of tech journalists in their own countries. Maybe they write for us a couple times. And then, they go off, and they go work at Dawn, or they go work at The Nation. The more capacity we can build in different regions beyond just the TechCrunch trade journal version of this, the better off we’ll all be.

Louise, you wrote a story for Wired about how our conception of the social credit score in China was off. Can you explain what the social credit score is and how we’re misunderstanding it?

Louise Matsakis: Yeah. So, this was a really interesting experience for me. I actually read this academic paper, and I was going to do one of these classic Wired blogs where I was just going to talk to the researchers and then maybe talk to one other person, and write up a little blog. And it was about how the social credit system was working.

And it was actually written by researchers who didn’t speak Chinese. So, I reached out to a Chinese legal expert and asked about it. And he came back and was like, “You’re completely wrong about this. These researchers have no idea what they’re talking about.”

So it’s like a regulation system, actually. I would compare it almost to restaurant ratings, or Better Business Bureau ratings. It’s not actually an algorithmic tabulation at all. It’s more like a way in which the FDA of China can tell the Department of Agriculture that a supplier is fraudulent, or that a supplier has engaged in some sort of corruption in the past.

It’s basically a way for the government to keep track of how individuals and corporations are behaving. The misconception is this Black Mirror idea that there’s going to be a number from zero to 100 on your back, and if the number dips below 99, you’re not going to be able to board a train or something like that. That’s the fear.

I’m hearing what you’re saying, and I’m like, “Okay, well, isn’t it weird that someone can’t buy a plane ticket if they get a bad score on this?” Let’s correct that in real time: why isn’t this what people are saying it is?

Louise Matsakis: The idea is, “Okay, if you broke a law, the social credit system will then report that to other agencies.” So, let’s say you owe someone $6,000 or something. The idea in China is, okay, if you’re not repaying that loan, and you’ve never responded, and they gave you a couple of chances to pay it off, why should you be able to buy a plane ticket? That is the argument. Do I think that that’s fair? No.

But that’s very different from some sort of score that’s being tabulated as you walk around constantly. It’s more of a pretty abstract, obtuse legal argument. I think, honestly, it’s definitely troubling in some ways, but it’s a question more for constitutional legal scholars or something like that. It’s not this Big Brother surveillance tool people think it is. And there are plenty of other things like that in China, but for some reason, I think —

Just not this one.

Louise Matsakis: Yeah. And also, what’s funny about it is that it reminds me a lot of credit scores, which are very—

It sounds like the credit score, or maybe a background check like we have in the U.S.

Louise Matsakis: Right. And some of the Chinese researchers I talked to were like, “We got this idea from you. I have no idea why you guys are so upset about it.” We’re trying to build a trustworthy system, and they’re doing it their own way. And there are definitely problems with it in authoritarian countries, for sure. But there was no one who was willing to look at the details.

And I think a lot of reporters sometimes just jump to assumptions, assumptions that are xenophobic, or that are rooted in colonial ideas, or that are rooted in misunderstanding of what was actually going on here. And I think that that happens a lot.

When you’re working with writers across the globe, how do you ensure that you’re not taken by some form of propaganda or a reporter in a country that has an agenda?

Louise Matsakis: So, I think there’s a couple of things. The first thing is that we have regional editors in place who have deep reporting experience in each of their regions. So, if I’m working with a reporter in India, I can go to my colleague who is our South Asia regional editor, who has worked in Pakistan, who speaks Urdu, who has worked in the region.

And I can say, “Hey, this seems a little fishy to me, or I don’t really understand the political context.” And she can help me with that. We have intentionally hired a team of reporters and editors who have worked all over the world, and who speak dozens of languages. So, that really helps. That’s one part of it. Another part of it is ensuring that you’re doing the right research.

Sometimes they need to be working with a co-reporter, or they need to be working with a different editor. There are different standards of journalism all over the world. So, it’s spending time with the writer saying, “Okay, this needs to be an exact quote. I need you to record this conversation. I need you to double-check this, or I need you to make sure that you go to the company with this accusation.”

It’s making sure you’re super clear about your standards. And then, I think the last thing is also that we have a really, really robust fact-checking operation. The vast majority of our stories are fact-checked, which is pretty unusual for a digital publication, I think. We’re very careful about how we go about telling these stories for a lot of reasons. But to me, like you said, it’s totally worth it. It’s worth putting that investment in.

You wrote about emotion recognition in China. And we know that Amazon is patenting something like that here in the U.S. Can you talk more about what that is and whether you expect it to be widespread?

Louise Matsakis: I think that this is seen as an evolution of facial recognition in some ways. They’re really different technologies, but I think that that’s how some of the people who are building these tools are looking at it. So, the difference between facial recognition and emotion recognition is that facial recognition is matching: does this person match the images and videos we previously identified as being that person?

Emotion recognition is about discerning, based on body movements, tone, and the facial expressions that somebody makes, what their internal state is. Can we look at someone’s face and say whether they’re happy or sad? That’s the idea. The problem with emotion recognition is that it’s basically junk science. It’s not based on any fundamental facts about human behavior.

We all have been really sad and still smiled. That’s a very universal experience. And there are also differences across cultures. If you showed the same video to people in 20 countries, they’re not necessarily going to make the same facial expressions. They’re not necessarily going to make the same movements in their face. Now, facial recognition has a lot of issues of its own.

There are a lot of biased algorithms in it, but the underlying technology is real. We all open our iPhones now with our faces. It’s clear that facial recognition is rooted in some sort of reality, but emotion recognition isn’t. And this report, which was really interesting, we had a semi-exclusive on it, was basically showing that there are a lot of companies across China who are trying to use this faulty science for a number of different applications.

Whether that’s monitoring students or trying to identify suspicious people at the border. What was really important to me about writing that story was to show that China is not the only country doing this. They’re taking a lot of these cues from big tech companies in the U.S., from companies and other places. I think basically, every big tech company in the West has tried some version of this. Sometimes, they’ll have a caveat. I think Microsoft has a caveat that’s like, this isn’t intended to discern the internal states of people. Well, then what is it for?

Why are companies and countries trying to figure out what people’s emotions are?

Louise Matsakis: One application that we talked about a lot in the story, which is pretty widespread and seems to be growing pretty rapidly, is monitoring students. Can you assess whether a student is paying attention? And we saw this a lot, actually, in the pandemic, with a lot of these proctoring surveillance tools that universities and high schools are now using for test-taking. We’ve seen a lot of problems with them.

It’s very similar. Can you discern whether a student is not cheating? Can you discern whether a student is happy, or sad, or engaged in the lesson or not? Not really; the issue is that you can’t discern that without talking to them. But the idea is, can you automate some aspects of evaluating students, or evaluating incarcerated people, or evaluating travelers at the border?

You don’t think we’re going to get there at some point?

Louise Matsakis: I don’t think so, honestly. I think that it’s just not based on a fundamental truth. We know how to mask how we’re feeling on our face. That’s something we all do every day. So, I don’t think so necessarily. And I think that there are a lot of problems with it. I think that maybe we could approximate it. And there are maybe some applications of this technology that might actually be legitimate.

But I think it’s incorrect to assume that it’s just the next generation of facial recognition, which is what some of these companies in China are advertising it as.

Sophie Schmidt: It’s also worth noting that different cultures express emotions very differently. So, none of the insights, even if you solve it in one country, would scale. And that’s one of these really interesting breakdown moments when you try to distribute one product or one tool across different people groups.

Louise Matsakis: I think there’s also just a lot of money to be made in selling snake oil. And I think that that’s what you’re seeing in a lot of these circumstances. And it’s important to point that out where you see it.

Let’s move on to South Korea, where, perhaps with the introduction of smartphones, some young people have become totally comfortable being alone rather than being part of the community.

Sophie Schmidt: Yeah. This is a phenomenon you see across East Asia, where young people who are up against enormous societal pressure opt out. They’re undone, and they go into their homes, and they live life as hermits. Now, in South Korea, what was really interesting was that we commissioned the story well before the pandemic.

And it was so interesting on its own, that you had this community of people who had opted out, but were also using technology to crowdsource tips on how to better seclude themselves. There was actually a whole online community of these people. So, they weren’t entirely cut off. It was a modern take on the hermit.

And there were now enough of them that this whole micro-economy developed to serve a new consumer segment, with single-serving delivery options. It was great on its own, but then the pandemic hit, and suddenly the entire world was required to adapt to the lifestyle they were already living. And so, it’s one of those great moments where you think, can we call them? Can I ask them how to do this better? But again, we get in our own way and think that we’re the first people to ever experience something, when in fact, if we can find a way to communicate across borders, and across customs, and all those things, I would love to get their tips.

I would love to understand how to make it more palatable. We’re a year into a pandemic, living through daily reminders that the choices, and preferences, and behaviors of people very far away from us directly impact our lives. It’s a perfect example of how many advanced versions of what we’re living through already exist in the world.

You also had a story on Clubhouse. Is there a sense, on your end, that Clubhouse might change the way that politics works globally? I’ve heard that authoritarian countries have pretty big spikes in Clubhouse use.

Louise Matsakis: Yeah, totally. We are definitely watching that really closely. So, I think it could go one of two ways, actually. Right now, it’s facilitating these conversations that could never happen before. One of the most interesting examples of this to me was in Saudi Arabia, actually. We published a really great story by our fellow there, Mehr.

She talked to a number of people who were like, yeah, we’re having these political discussions about feminism, about women in the workplace, that really couldn’t happen in a Saudi café or something like that. They couldn’t happen anywhere else. There’s no analog equivalent of this. But at the same time, a Saudi official then joined the platform, and it’s like, what is going to happen there?

So, I think, yes, this could change politics, this could change things in a lot of countries. But I think the second option, which we’re already seeing glimmers of, is that it’s going to get banned in more places. And in the places where it doesn’t get banned, it’s going to be very closely surveilled. And I worry that not just Clubhouse, but a lot of companies like this that expand internationally really quickly, still don’t necessarily think about these problems, which is shocking after the last four years.

They’re complicated questions. And I think that what we’re starting to realize also, and I think a big theme in my reporting, and not just on Clubhouse, but on a lot of apps like this, is that we probably need different solutions for different cultures in different contexts. I’m not necessarily convinced that the same standards, the same community guidelines can apply globally.

It’s a nice idea, and I think that there are some principles that should be definitely followed everywhere. But a one-size-fits-all approach maybe isn’t going to work. But I think it’s really interesting what’s happening with Clubhouse. We’re working on another story right now about politicians in South Korea who are using it. It’s definitely really interesting.

And I think that it has more potential than people realize. One of the stories we did that I really liked looked at Clubhouse in four different places: Japan, Nigeria, India, and Hong Kong, which I guess is a city, or a territory. And it was just really interesting to see how it was influencing culture in each place, and how the conversations were so different in so many ways. So, I was surprised to see that. I was really glad that we did that story. And we actually hosted an event on Clubhouse to talk about it, which was really fun.

So what does the future of the global internet look like? There seems to be a U.S. vision, which is that the internet will be dominated by American companies. And other visions are emerging: one in China, another in India, and elsewhere. Where does all this go?

Louise Matsakis: I think it’s probably a mix of the two. I think that this idea that there is a binary is sort of false. It reminds me of the argument of, are China and the U.S. going to decouple? Not really. I think that the idea that a decoupling is possible is naive. And I think, similarly, the idea that this is really a battle where one vision is going to win out is not really true.

I think that Facebook is going to continue to be used by so many countries. I think that Google is going to continue to be a huge browser provider in many places. And that’s not going to change anytime soon. But I do think that we’re going to continue to see the rise of local alternatives. And I think that these platforms are going to get bigger than people realize.

This is especially true in countries like India. TikTok was banned there, and quickly there were half a dozen alternatives. And those might seem easy to brush off. But some of them have close to 100 million, if not more than 100 million, users, which is a lot, even for a country like India. So, I think that we’re going to see maybe a more balanced internet in a lot of ways.

Where these big Western tech companies still play a relatively large role, but there’s many more local alternatives. And in a lot of places, those local alternatives are going to win out. I don’t think that every country is going to be using WhatsApp Pay, for example. I’m not convinced that Facebook is going to win the TikTok short-form video battles with Reels.

I think that we’re going to see a lot more of a mix. But in a lot of places, you’re still going to be using one or two Facebook apps, or you’re still going to be using Chrome as your main browser. But the space is going to become more crowded. And I think that these big players that have dominated will maybe not dominate quite as much.

Sophie Schmidt: I think the other thing that’s worth noting, again, when you compare across countries, is that the less sexy thing to talk about is infrastructure. We treat the internet as this magical thing that exists up in the universe, where lots of things are happening all the time. But the reality of how the internet is built is that there are entry and exit points that governments control.

Governments can order telecoms to shut things down, and they’re getting better at it. They can hire their own hacker teams to intercept all sorts of encrypted tools that people are using on the ground. When we talk about a balkanized internet, when we talk about internet nationalism, we also have to remember that there are still practical ground realities when it comes to tech.

There are towers, and there are systems, and there are engineers. And ultimately, governments hold, whether or not we like it, a whole lot of the cards. That’s not to say that the democracy activists in Myanmar are not getting better and smarter. Something like the Milk Tea Alliance is fascinating and so encouraging. But it doesn’t change the facts. The fact is the internet still has to exist somewhere.

It has to have access to an undersea cable. And that’s a big challenge, I think, because we can get very excited about the potential of internet transformation. You had Wael Ghonim on your show a few weeks ago. He’s a fascinating character, and his views have really changed in the 10 years since he was a symbol of the incredible democratizing power of smartphones. We have to live in a more nuanced version of reality if we’re going to start to understand what the future of the internet looks like.

I’d like to know how you get this stuff noticed. Because these are all important stories, but they don’t neatly inject themselves into the news cycle.

Sophie Schmidt: Yeah. That’s a great question. We face all the same headwinds as anybody creating any content on the internet. And I think the answer is that we have to go low and slow for a long time, because there isn’t a premade constituency for this type of story. Many of the people that we reach with a story for the first time literally didn’t know what was happening. They had never heard of this thing.

And so, that creates a different path for us, I think, to build an engaged audience over time. Because we cover a lot of countries too, it’s a pretty small number of people who are agnostic across countries. There are people that like China, people that like Brazil; maybe they’ll stay for a story in Mongolia if it’s really compelling.

We need to be smart enough to understand the way people want to read international news, and then find a way to surf that over time. And it’s going to take time. We’re going to have to experiment, and hopefully, partner with other publications to meet new audiences continually. Because the headwinds are strong, everyone knows that.

I noticed that there are no ads on the site and you don’t ask for subscriptions. How is this thing going to sustain itself financially?

Sophie Schmidt: So, I’m the primary funder right now. And I’ve made a 10-year commitment to fund us at our current level. So, we’re pretty happy at the moment. We’re structured as a nonprofit for a few reasons. One, I think, just candidly, I didn’t know how I could monetize this when I’m giving people something that they didn’t know that they wanted.

I think there’s also a lot of promise in nonprofit models, and I admire things like the Texas Tribune and The Marshall Project for being able to fill voids in newsgathering that don’t have a market imperative. That’s a really important role, I think, for a publication like ours. I think we’re open to outside funding if we want to expand. But also, having seen lots of nonprofits over the years, one mistake they often make is that they think that they’re companies, that they need to grow forever, that they need to keep growing. And they end up veering out of their core competency to justify more grants and all that stuff. I want us to be really efficient, really good with what we have, before we start needing to expand.

Louise Matsakis: Just to answer your previous question about getting readership: for me, I think it’s two things. First of all, it’s making global connections wherever you can. We published a story this week about retail investors in Korea, who hate short sellers just as much as the Robinhood investors in the U.S. do. So, it’s making these global connections wherever we can.

But I think the second thing is, you know better than anyone that if there is a compelling scoop, no matter what website it’s on, everyone’s going to read it there. So, finding the stories that people need to read to know what’s going on is another big thing that I think about. It doesn’t matter if it’s on Medium. If it’s on some guy’s random blog, you’re going to read it if it’s the story that everyone needs to know.

Before we head out, let’s make sure that folks who are listening can get in touch with you. What’s the best way to do that?

Sophie Schmidt: Our website is restofworld.org.

I write the Big Technology newsletter. Sign up here: https://bigtechnology.substack.com.
