The Man Whose Science Fiction Keeps Turning Into Our Shitty Cyberpunk Reality
A Q+A with the novelist Tim Maughan, whose disturbing future predictions have had an unfortunate habit of coming true
Why I Made This Future is a recurring feature that invites speculative fiction authors, futurists, screenwriters, and so on to discuss how and why they built their fictional future worlds.
There is nothing boring about Tim Maughan’s works of speculative fiction, which concern, for example, the total destruction of the internet as we know it, the insidious possibilities of monetized augmented realities, and the full collapse of global supply networks. He writes such dramatic devastations, he says, to better examine the digital injustices that are perpetrated on ordinary people every day, and which can look rather boring on paper: data profiling and automated trade networks and surveillance capitalism and other drab inflections of our shitty cyberpunk present.
Maughan’s three books, the short story collection Paintwork, the novel Infinite Detail, and the newly released collection, Ghost Hardware, all take place in the same near-future world — the TMCU, as I like to call it. Therein, hyper-accelerated digital capitalism has thrust us all into a world that is perpetually and wholly online, and accessed via spex, augmented reality glasses that are as common as cellphones. Then, the internet is destroyed. It’s 15 minutes into the future, with the rug pulled out from under it. Maughan uses the conceit to investigate the consequences of having so much of our lives hosted, controlled, and in thrall to for-profit digital platforms and systems.
The Guardian called Infinite Detail the best science fiction book of 2019 for a reason — it gets under the skin of the future to expose the ugly guts of the present. Maughan is, above all, a critic — he writes to expose the disturbing trajectories of today’s technologies, prejudices, and corporate impulses. I’ve worked with Tim for years, editing his fiction, barging into his audiobook recordings, and lamenting the future on various corporate social media platforms. So, as prediction after prediction of his has glitched into being this year, it seemed a fine time to talk about surveillance, communication breakdown, and resistance to all of the above.
This may be the second installment of Why I Made This Future, but it could as well be called Why I Warned Against This Present.
The interview has been edited for length, clarity, and obscenity reduction.
Brian Merchant: So, Tim Maughan—
Tim Maughan: That’s me.
Brian Merchant: Wonderful. Let’s start with the big one: Why did you feel compelled to create the future of Infinite Detail and Ghost Hardware — and what are, in your mind, its key cornerstones?
Tim Maughan: For me, I’m creating a fictionalized version of the world that we live in. The three books I’ve published so far, ironically, in a very science fictional kind of way, are all set in the same world. In Paintwork, I wanted to talk about technology and the privatization of public space — so augmented reality became this really good tool for doing that. That spills over into another story I wrote in the same setting, and augmented reality is a key theme in Ghost Hardware.
Brian Merchant: I appreciate your critique of pervasive digital consumerism through the spex. In both Infinite Detail and Ghost Hardware, spex are basically the new iPhones — ubiquitous AR glasses that have their own operating system — everyone’s got one, before the world collapses.
Tim Maughan: They were very fictionalized but very… I would call it generic. I’m certainly not the first person to write about these technologies. They’re AR glasses that you put on that overlay a digital network over the reality you exist in, which is not an alien concept to most people. For me, it’s a really exciting literary device because a lot of the stuff I’m interested in talking about — network culture, digital systems, social media, even supply chains — these are not things that are easy to write about in literature. They’re incredibly boring to write about. So generating a very visual metaphor for dealing with them was what I was looking for in using AR.
Brian Merchant: And spex are a useful mechanism to interrogate surveillance practices, digital commerce, and so on. There’s a memorable scene that takes place after the city updates its policy so that you need spex to get credits for recycling cans. Can you talk a little bit about that, and then what all this mass digitalization and commoditization ultimately leads to?
Tim Maughan: Anybody who lives in the city has seen canners. They go through recycling, both the public and the residential recycling, take items that can be redeemed, go and recycle them, and get micropayments for each one. You glance at them and you think they’re homeless, but in fact the majority of people doing this work are not homeless, it turns out. In New York, many are cab drivers struggling to compete with Uber.
And the idea in the book is that there’s this new technology being brought in, where there are RFID tags on every bottle and some facial recognition and machine learning going on in stores. The system, the network, which is in New York in the book, knows when a can has been bought, who it’s been bought by, and who puts it in a recycling bin. And then it gives that payment directly to that person, the idea being that it streamlines recycling and gives people more of an incentive to recycle.
It’s a cool idea, but with these really horrible implications. I actually originally got the idea, and you’ll probably remember this: it was in the Bay Area, in San Francisco, maybe 2013, 2014. There was a story about a bunch of guys, Latino guys in the neighborhood, who every Sunday would get together and play five-on-five soccer on these small pitches in their neighborhood. It was a tradition that had been going on for decades.
One Sunday they turned up there, and the field was completely overrun by white guys wearing Dropbox T-shirts. And they were like, “Well, this is our field.” And the guy turned to them and said, “No, but we booked it.” And he said, “Well, what do you mean? What do you mean you booked it?” And he pointed at the sign, a flyer that had been stuck up at some point in the previous week saying, “In the future, if you want to use these playing fields, you have to book them. You can do it by downloading this app to your smartphone.” And the guy that was talking to him goes, “I haven’t got a smartphone. This is my culture. This is what we do.” And the guys go, “Well, I’m sorry, but that’s how it works now. You need a smartphone.”
Someone had come up with a solution. That’s so often the Silicon Valley way: find a problem, then build a solution to it so you can make a product, and think you’re being helpful without really considering the wider implications of what you’re doing. And what you’re actually doing is building this kind of extra-stratified society. You’re limiting access to only those people who have smartphones. Arguably, smartphone penetration is much higher now than it was five years ago, but at the time it was a particularly stark and infuriating example, because the people who’d done this, honest to God, did not think they’d done anything wrong.
They really only see that they were generating solutions for stuff. And that’s what I write about, this technology. As you well know from your own work and your own journalism, what’s most scary about it is the idea that it’s providing universal solutions to things that have been decided are problems. It just ends up making them worse.
Brian Merchant: In the example that you just mentioned, it’s telling that the control is being handed over to the people who have the benefit of wielding, developing, or programming the technology in the first place — owning the technology.
Tim Maughan: The owning is the key point there, because most people in this space have been sold book after book — hundreds of self-help books and courses and incubators and stuff that say, “Hey, look, you want to succeed in this industry? You need to find a problem that you can solve.” You get to the point of almost creating a problem to be solved. In order to show off your technical skills and your ability as a problem-solver, you need to find a problem. And that problem may not even exist, or it’s a different kind of problem: it isn’t a problem that needs a technological solution. It’s a problem that needs a political solution. It’s a problem that needs a community solution. It’s a problem that’s solved by fixing capitalism, not fixing technology. So many of our problems are byproducts of other systems. But that’s not what is of interest to people. People are interested in becoming an entrepreneur, in becoming a successful CEO, a successful founder.
Before Infinite Detail came out, I was at a workshop incubator thing in Brooklyn. And there were a couple of kids there, and they said, “Oh, yeah, we’re making a smart trash can.” And I had started working on the recycling bit in Infinite Detail; I had written that chapter at the time, and my heart just fell. I said, “Right, so what is that? How does it work?” And they were like, “Well, it’s got a screen on the side. And when you put a can in, it thanks you for putting it in.” And I said, “Well, what’s the point? Why?” And they said, “Well, because it would encourage kids to recycle more. It’d be like little video games they can play.” And I’m thinking, “Okay, is it really that hard to get kids to recycle? I don’t feel like it is, but, anyway, whatever.”
I said, “What’s your business model?” He said, “Well, hopefully, we’ll get cities invested in it.” And I said, “Yeah, but…” And I knew exactly what his answer was going to be, and I kept pushing him on it. He said, “Well, yeah, eventually we do want to monetize the data it collects. Yeah, eventually we could be monitoring who’s walking past from the IDs on their phones, or the footfall.” And that’s it. People are not even interested in fixing these problems. They’re interested in finding “solutions.” They’re finding other trajectories, other vectors for data collection, that’s it, because they’ve literally all been told data is the new oil, and they’ve fully fucking bought into it.
They don’t even know what the data is for. They’re not even interested in collecting data for specific reasons. They’ve just been told the data will have some value in the future, so grab and own and hoard as much of it as you possibly fucking can. And it’s such a disruptive model. It’s literally ending with people dying. When you start looking at surveillance technology and smart technology and how they so often don’t actually work, they’re not just disrupting communities or breaking industries. They’re literally leading to miscarriages of justice, this misuse of data.
Brian Merchant: While some mainstream criticism is finally emerging to counter some of these tendencies — which for so long got a free pass in the media — it could still serve to be much sharper. That it’s not just a whoopsie, or even a series of whoopsies. That actually, it’s a systemic pattern of impulses, including racist or misogynist ones, that ends up excluding people from crucial services and disrupting the ways that they make ends meet. That these platforms and services are actually displacing people, as you just pointed out, that they are systems of control.
Tim Maughan: And it’s tricky to do in fiction as well because it’s very easy to fall into that kind of Black Mirror trap of saying, here’s an example of when surveillance went particularly bad and the wrong person was killed or the wrong person was implicated, or a stalker was following you, your nemesis or your ex-boyfriend was stalking you on social media. And these things are incredibly, incredibly serious problems, and they’re incredibly real problems. But they’re also exceptional problems and not day-to-day, longer-term systemic problems — where it’s easier for us to be upset by video of someone in Starbucks using the N-word than it is for us to be upset by systemic racism because it’s harder to identify.
Brian Merchant: It’s harder to fictionalize the more boring, more prevalent kinds of surveillance capitalism in a satisfying way.
Tim Maughan: I talk about surveillance to people who don’t think about surveillance all the time like I do and you do… And you walk in the house and they’ve got an Alexa. And you say, “I don’t like the Alexa because it’s a surveillance machine.” And they say to you, “Well, I haven’t got anything to hide. I haven’t done anything wrong. It’s not a problem to me. It doesn’t matter if they’re listening to me. I’ve got nothing to hide.”
And it’s like, actually, the reason I dislike it isn’t that I’m worried they might be listening to me right now — it’s monitoring my behavior, and that’s what I’m worried about. I don’t care if it overhears what I say, whether an algorithm is listening to it or even someone in an offshore call center. Even if they’re listening, that privacy thing isn’t what worries me. What worries me is that they’re modeling my behavior, and they’re making judgments based on that, which might not be the right judgments for everybody. And they’re using that model to make decisions about their users, and about people who aren’t even their users, too.
It becomes a thing of, well, okay, what information can we collect from Alexas about a neighborhood, or just from their Amazon use? What decisions can Amazon make geographically, in physical spaces? This neighborhood in South Brooklyn I used to live in, East Flatbush, is gentrifying. And I’m sure Amazon can pull up a map of where all the Alexas are, where all the Amazon Prime accounts are, and go, “Well, this is a neighborhood which is increasingly likely to be gentrified” — aka, more white people.
Tech workers are moving into the neighborhood. What can we do in that neighborhood for them? And suddenly you’re changing the nature of the neighborhood. When I lived in East Flatbush, my neighbors and everybody would get on the train and go to Trader Joe’s, which was five or 10 stops away, because the local grocery stores weren’t great. But then Amazon put a Whole Foods there. It wasn’t worth putting a Whole Foods there two years ago when there was “only” a Black and Latino population, but now that there’s a white tech worker population moving in, oh, well, they’ll like a Whole Foods.
And that kind of decision-making is really dangerous to me. And these datasets are being collected, not necessarily by Amazon; datasets about behavior and demographics are being collected from Facebook. They’re getting sold to development companies. They’re getting sold to real estate agents. They’re getting sold to architectural companies who are looking at these neighborhoods and going, “Oh, I can build a condo there. Let’s fuck this neighborhood even more. Let’s increase their rents by 50% by building a condo block there.”
You don’t want to fall into that thing of writing a thriller about someone that gets stalked through their Alexa. I’m unfortunately taking on the task of trying to write fiction about the real implications of technology like that. It’s an incredibly stupid, foolhardy thing to do.
Brian Merchant: It can also be hard to write speculative fiction about the mundane trends that are already in motion. Most of us aren’t actually going to interact with, say, an asteroid mining colony. We are going to have to deal with the fact that our data profiles are proliferating in ways that we don’t even fully understand, that we can’t understand because they are hidden from us. We need fiction writers and speculators to try to give us some tools for deciphering these situations, which is what it seems like Infinite Detail aims to do.
Tim Maughan: The book is primarily centered around what might happen if the internet was destroyed or disappeared, or stopped working in some way — if it just went away. But it came to me around, I guess, 2012, 2013, when the Anonymous stuff was going on, the WikiLeaks stuff was going on. And there was a real sense that you might wake up some morning and the internet wouldn’t be working. It’s jarring to not have Google Docs or to not have Gmail. It was the first time it felt jarring to not have internet services, and that stuck in my head. And then a bit later on I went on a trip to China and saw how the supply chain is intrinsically linked to the internet. I spent some time on a container ship. Do you want me to tell the container ship story?
Brian Merchant: You know I do.
Tim Maughan: I did this trip to China, some reporting for the BBC, where we did this reverse supply chain trip. We started in South Korea, got on a container ship, went to China, stopped at a bunch of ports, then traveled overland in China. It’s a huge Maersk container ship, about 10,000 containers on this one ship, a crew of about 20 people. And I was sitting on the bridge with the captain one afternoon, chatting to him, and there’s a beeping sound. He said, “Excuse me,” and he walks over to the computer. Looks at it, types on it — just a standard beige-box PC, nothing interesting about it. Walks back, starts talking to me.
I said, “Hang on. I’m meant to be understanding what’s happening on this ship. What just happened?” And he was like, “Uh, oh, well, I had an email from Maersk in Copenhagen saying to slow the ship down.” He got off the computer, got on the phone, called the engine room, and told them to slow the ship down. And I said, “Wait, what?” And he said, “Yeah, they emailed me and told me to slow the ship down.” And I said, “Why?” And he said, “I don’t know. They don’t tell me. It’s an automated email that the supply chain algorithms are sending me,” something like that.
He said it probably means they know there’s a choke point further up the supply chain, either at the port, meaning we can’t get into a berth, or even a traffic jam or something, which means everything is going to be slowed down. So if we get there on time now, it’s inefficient, we’re just wasting fuel. We might as well slow down and get in an hour or two later, when the port will be ready for us. He said, “But they don’t tell me that. They just send me an email.” And I just had this weird moment. I’m like, “Oh, hang on. This whole ship is just a node in the network.”
A lot of the containers on the ship are called reefers, which are refrigerated, climate-controlled containers. They’re amazing pieces of technology. Load unripe bananas into them in the Caribbean somewhere, and the reefers take them from plucking to a supermarket in Berlin or wherever: knowing where they are by GPS, they gradually alter the temperature inside the box over time so that the bananas are perfectly ripe by the time they arrive on the grocery store shelf.
So, we had internet on the ship. It was very slow. It was almost unusable at times. It was satellite uplink. And the captain said, “It’s not really for us. It’s for the reefers.” And I was like, “What do you mean?” He said, “Well, most of the staff on the ship, actually, their responsibility is to make sure the reefers are running, make sure they never break down. If there’s a problem with them, then they have to fix them.” He said, “When one breaks down it emails Copenhagen, then Copenhagen emails me. And then I tell a member of my crew to go and fix this.” And he said, “Really, we’ve got a little bit of bandwidth we’re allowed for emails and stuff, but we’re piggybacking on the satellite network that’s been built just for these reefers to communicate with the network.”
I had already started working on Infinite Detail at that point. I had already started thinking about writing a post-internet book, because it just seemed really interesting to take that premise and blow it apart: the idea that the internet was permanent, that it’d always be there. But what if it isn’t? And this was just like a bomb going off. It’s like, “Well, hang on, none of the humans I’ve met know how this system works, but the internet, this networked algorithmic system that lives on the internet, apparently does. And that’s what’s getting us everything from iPhones and plastic crap to really important medicines and food. That’s what’s making sure it turns up at the right port.” It’s really scary.
Just as I was selling the book, me and my agent were taking it out to publishers, the NotPetya thing happened. It was this awful malware that hijacked people’s computers, threatening to wipe them if you didn’t pay some bitcoin, and it was physically wiping hard drives anyway, writing zeros over them. And it hit the shipping industry really hard. I didn’t know until after the book came out, when someone at Maersk read it and talked to me about it. The day that NotPetya hit the Maersk offices was apparently one of the scariest things that’s happened to people in that company. Because if they lost that, and they didn’t have any backup, they would have no idea where their containers were, no idea where their ships were, anything.
It goes to show that from the moment the banana comes off the tree and goes into one of these containers to the moment you scan it through the automatic checkout when you’re leaving the grocery store, that banana is being monitored through this global information system that no human understands. There’s no single human, or even a team of humans, who can comprehend the complexity of these flows. I was on this container ship that had 10,000 containers on it, and at any point in the day I could look out the window and, with the naked eye, see another container ship of the same size on the horizon. With a pair of binoculars, I could see another five.
And the scale of what’s happening, we’ve bought into this system. It’s not only incredibly fragile because of how it’s linked with technology; it also only exists to exploit labor on the other side of the world. The reason we don’t make our own iPhones is that it’s much cheaper to get poor people in other countries to do it for us. So we’ve built this whole incredible, sophisticated, partly A.I., partly machine-learning, complicated network just to exploit our neighbors.
Brian Merchant: It’s a good point, and it carves out this interesting and necessary contrast with the kind of discourse around A.I. that gets most of the attention these days. There’s this cottage industry in Silicon Valley of people warning about the dangers of A.I., of omniscient AGI — A.I. that will become sentient and make all these terrible decisions. That kicks the threat onto the horizon, and it obscures the fact that we, basically, have a crude patchwork form of automated systems that are already serving as this neanderthal A.I. that could go offline, and sometimes does, and cause disruptions that we never think much about.
It’s kind of miraculous that more disasters haven’t happened at-scale yet, given the fragility that you’re describing. It’s this automated, slapdash global and internet-linked supply chain that could go off the rails, as you described.
Tim Maughan: And the supply chain is a scary one because it impacts us really directly. But you start looking around, and it’s everywhere — and it’s nothing compared to high-frequency trading on the stock markets. These are things that people literally do and do not understand. They laugh and put their hands up. I talk about this in Infinite Detail a little bit.
I’ve talked to these people, and it’s like, “LOL, yeah, I get it, but I made half a million this morning. I lost a quarter of a million, and then I made another million on top of it, so I’m actually up: in 20 minutes this morning, which was the only trading I bothered doing today. Because I bought these algorithms from a company where the salesman comes in, and they’ve all got names like Vector and Phoenix and Hunter. And I buy these algorithms, and I don’t really understand what they do, but they go off and make value for me.”
It’s terrifying. How many people, 20 million, whatever it is, lost their jobs in the last few weeks because the stock market tanked? Those places closed because they had to close, but you’re looking at the stock market, and it controls itself. And I stare at it. I’ve been doing this for like 20 years; this isn’t a new thing. I’ve been staring at it for 20 years because I don’t understand it, since before I even started to understand how automated and how weird it was. And looking at it, going, “Why? Why is this being allowed to make all these decisions for me?” We don’t have control over this.
This is apparently the most important thing in the world. It defines whether you’re rich or poor, whether you get a job, whether you can afford to go to college, whether you own your house, whether you can afford to buy a house, whether you get to keep your house, whether countries are invaded, what the price of oil is. All these things are calculated and decided by this system, but nobody understands it. Nobody seems to have any control over it. Why?
Why are we ceding control to this system, as if it were some wild animal? It’s like literally bringing a bull into a china shop and going, “Well, I hope it doesn’t break most of the things, but we’ll see which things it doesn’t break. And then when we’re done, maybe we can sell more of those.” You wouldn’t bring a bull into a china shop. And you wouldn’t allow something as complicated and as important, or that we’ve decided is as important, to just run itself.
We’re talking about big structural change. Obviously, I’m not talking about something that we can all change or we can make quick adjustments to. But I’m just horrified by this idea that something so important we don’t understand, and we’re told not to understand it. We’re told just to have faith in it. It’s a cliché to call it a religion, but it’s starting to feel even more like it.
Brian Merchant: So, in the book the bull in the china shop, it kicks everything down. In the aftermath, what happens to society?
Tim Maughan: In the U.K., there is, basically, a military coup. There is a central government, but it’s been very much sidelined. The land army has regional centers and camps, work camps. The British army has staged a soft coup by stepping in. You’re looking at a government failing to deal with a crisis. At some point, the generals go, “Well, fuck this. We can do a better job of this.” And that’s, basically, what’s happened. Whatever government was in place after the collapse in Infinite Detail has failed, quite understandably failed, to deal with this crisis, so the military has said, “Well, time to stop pussyfooting around. Let’s roll the tanks into the cities and start rounding people up to go out to the farms and grow stuff.”
The thing that was terrifying to me was that we were two weeks into lockdown, and there were stories in the U.K. There’s a Twitter account called U.K. Land Army. It’s a volunteer army asking for people, because with the lockdown they didn’t have the migrant workforce that the U.K. farming economy is completely fucking dependent on. Which, again, is why Brexit and restrictive immigration policies in the U.K. are so fucking terrifying. Food was dying on the vine. Strawberries were rotting on the plants, and the strawberry industry is a big industry. So they were like, “Well, the land army, sign up here. Join the land army. Go out and pick vegetables and stuff.”
In the book, there’s this experimental space before the collapse, which is just an art experiment, really, where they’ve blocked the internet for a few blocks, so it’s impossible to use the internet, and replaced it with this decentralized peer-to-peer network instead. People have done projects like this, and people in New York have done smaller-scale versions. And I said, “Well, what if you did a neighborhood-wide project?” And that’s all it’s meant to be.
After the collapse, people have plugged into this area, and then the cops come, and there are scenes that are very similar to the stuff we’ve seen on TV in the last few weeks: running battles with the cops. And so this faction has locked itself off, and barricades against riot police have gone up.
Brian Merchant: And you go further.
Tim Maughan: Yes. There’s a drone strike. My real-world dystopian obsession is when will we see the first drone strike on civilian territory in the West. Do you know what I mean? It’s going to happen at some point.
Brian Merchant: There is an actual Black Lives Matter protest in New York in a key scene in Infinite Detail.
Tim Maughan: There’s an actual Black Lives Matter protest in New York, and the most unrealistic thing, it turns out, about my depressing dystopian novel is that the police manage to fucking control themselves. It’s peaceful, though there is some suggestion of brutality, of panic and chaos, right at the end of that chapter. It’s based on Black Lives Matter marches that I went on when I lived in New York.
Brian Merchant: In the book, those scenes happen pretty close to the great crash. It feels like one of those moments when, like today, everything is coiling up — both the expressions of power and our resistance to it. So that moment, when they’re marching in the Black Lives Matter protest and the world’s lights are about to go off — was it intentional that all these lines are drawn up?
Tim Maughan: Yeah. The idea is that we’re the frog in the boiling water, kind of. I know that’s usually used as a climate example, but I feel like it’s a really good one for data stuff: this oppression and all these things we’ve talked about, the lack of access, the stratification of society, and then the very direct, violent, lethal misuse of data, particularly in policing.
That Black Lives Matter protest in the book is in… it feels horribly ironic to be saying this. It’s in response to a fictional killing of an elderly woman in Queens or the Bronx, killed by accident because predictive policing has failed. Some fucking roided-up cop ended up in the wrong stairwell with a gun drawn at the wrong time of day and shot an old lady, because predictive policing had suggested that a violent crime might be taking place in that building. It’s just a horrible fucking mess. And this march is a direct reaction to that. But the idea being that this stuff is going on all over, right?
People are getting to their wits’ end about how technology is creating a stratification. Again, the canner story, and even a story like Limited Edition.
Brian Merchant: Let’s talk about Limited Edition, which is a great story.
Tim Maughan: It was sparked by an act of police brutality, the street execution of a Black man [named Mark Duggan] in London by cops. He was killed in his car by armed police, which is, thankfully, still a relatively rare thing to happen in the U.K. But it was enough to spark riots in London, which then spread to other major cities in the U.K.
But what was interesting about that, where the story comes from, is how the media, and social media in particular, reacted to this. Some looting started, and as soon as that happened, the narrative stopped being about police brutality and started being about looting, about how these kids were spoiled and entitled and aggressive, and thugs, and all this stuff. Those are the discussions that were going on in the media at the time that I really wanted to unpick in that story, and it seemed like a cool thing to try to do using science fiction, as opposed to just an article or something. And you heard it again this time, mate: “Why are these people setting fire to their own community? Why are they burning this Target when they’ve got jobs in the Target?”
And my point is, these aren’t the community. The community is not built out of Targets. A community is not built from fucking shops and stores and shopping malls. And the character is even working through this in the book. He talks about how he sees products come into the community, get sold, and then the money from that goes straight out. Him and his friends are all still broke. They can’t afford to buy stuff in those shops. Even the guys that work in those shops hate it and aren’t making enough money to survive on, right?
So there’s no community if the money is going to large, multinational corporations. That’s not how a community works.
Brian Merchant: And I like the vessel — this online community where you can get credit for, basically, livestreaming a raid on a shop.
Tim Maughan: I wrote this before Twitch was a thing, man.
Brian Merchant: It demonstrates the fact that there is an audience, there’s a solidarity. It complicates this act of “looting,” and demonstrates that there’s a whole audience of people who both support this, who cheer it on–
Tim Maughan: And also who hate it.
Brian Merchant: The hate-streamers.
Tim Maughan: They’re watching because they’re racists. They want to see evidence that Black people are destroying their own communities. They’re excited to see that. And there’s an opportunity to sell advertising to them, and to sell advertising to the people that support the riots.
Obviously, it was very jarring to literally be watching livestreams of actual riots taking place, from Unicorn Riot and people like that, on Periscope and on Twitch and other platforms, with this interface of custom candy hearts floating everywhere with people’s names and icons and things. That was almost exactly how I pictured this shit. And I wanted to do it in the most ridiculously dystopian, over-the-top Running Man, RoboCop kind of way.
And now, it’s like I’m sitting there two weeks ago watching it fucking happen. And my livestream is being interrupted by advertising for some shit.
Brian Merchant: That seems to be a recurring theme here.
Tim Maughan: All this shit is actually happening.