The Modern World Has Finally Become Too Complex for Any of Us to Understand
Vast systems, from automated supply chains to high-frequency trading, now undergird our daily lives — and we’re losing control of all of them
Welcome to No One’s Driving — a column by novelist and tech writer Tim Maughan about how to understand a world governed by systems and technologies that are spiraling out of control.
One of the dominant themes of the last few years is that nothing makes sense. Donald Trump is president, QAnon has mainstreamed fringe conspiracy theories, and hundreds of thousands are dead from a pandemic and climate change while many Americans do not believe that the pandemic or climate change are deadly. It’s incomprehensible.
I am here to tell you that the reason so much of the world seems incomprehensible is that it is incomprehensible. From social media to the global economy to supply chains, our lives rest precariously on systems that have become so complex, and that we have yielded so much of to technologies and autonomous actors, that no one totally comprehends them.
In other words: No one’s driving. And if we hope to retake the wheel, we’re going to have to understand, intimately, all of the ways we’ve lost control. This is the first entry in a series — called, yes, No One’s Driving — that aims to do exactly that. Each month, we’ll examine a technological system that has grown too complex to be understood by, well, just about any one person, and break down how it has spiraled out of control, why that is dangerous, and what we might do about it.
Most of us do not spend a lot of time thinking about the huge, complex systems that keep our technologically dependent society running. And with very good reason. It takes a certain amount of faith and belief — in ourselves, in capitalism, in the digital platforms that mediate our interactions with it, and in the infrastructures that support all of the above — in order to wake up and get through every day. But eating breakfast, pulling on our business-casual Zoom-appropriate shirts — all those mundane acts are made possible by an almost unfathomably complex, algorithmically calibrated, partly automated, and partly sweatshop-labor-dependent global supply chain.
There are currently over 17 million shipping containers in circulation globally, and at any given time about 5 or 6 million of them are crossing the sea. The U.S. alone imports over 20 million shipping containers’ worth of products a year. While it’s common to talk about iPhones and high-end sneakers when we talk about imports from China and Asia, the truth is the vast majority of those containers are stuffed with much more mundane goods: socks, umbrellas, pencils, paper, packing materials, bedsheets, fruit, car parts, frozen food, pharmaceuticals — the endless inventory of physical items that make our modern lives possible.
Just as vast and complex, and intrinsically linked to the supply chain, is another sprawling but mostly invisible system: the global financial markets. It’s a vast, highly technologized network linking banks, government agencies, hedge funds, regulatory bodies, stock markets, dark pools, exchanges, news services, and millions of individual human traders and analysts. It has grown to a level of complexity that makes it unknowable by any single human intelligence — everything moves at far too great a speed and scale.
The average daily trading volume on the New York Stock Exchange generally spans between 2 billion and 6 billion shares; in 2013, the average daily trading value was approximately $169 billion. The only way to deal with a market of this size and complexity has been a relentless adoption of automation — and the increased handing over of day-to-day analysis and decision-making to software. And in an industry like finance, which is preoccupied entirely with growth, these systems have led to an exponential increase in complexity: while human traders would traditionally average five trades a day, high-frequency trading algorithms can make 10,000 trades every second.
And those platforms of technology and software that glue all these huge networks together have become a complex system themselves. The internet might be the system we interact with in the most direct and intimate ways, but most of us have little comprehension of what lies behind our finger-smudged touchscreens. Made up of data centers, internet exchanges, huge corporations, tiny startups, investors, social media platforms, datasets, adtech companies, and billions of users and their connected devices, it’s a vast network dedicated to mining, creating, and moving data on scales we can’t comprehend. YouTube users upload more than 500 hours of video every minute — which works out to 82.2 years of video uploaded to YouTube every day. As of June 30, 2020, there are over 2.7 billion monthly active Facebook users, with 1.79 billion people on average logging on daily. Each day, 500 million tweets are sent — or 6,000 tweets every second, with a day’s worth of tweets filling a 10-million-page book. Every day, 65 billion messages are sent on WhatsApp. By 2025, it’s estimated that 463 million terabytes of data will be created each day — the equivalent of 212,765,957 DVDs.
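If those scale conversions sound implausible, they check out. A quick back-of-the-envelope calculation (my own sketch, not from the figures' original sources) confirms the arithmetic:

```python
# Back-of-the-envelope check of the data-scale figures quoted above.

# YouTube: 500 hours of video uploaded every minute.
hours_per_day = 500 * 60 * 24             # 720,000 hours uploaded per day
years_per_day = hours_per_day / 24 / 365  # hours -> days -> years of footage
print(round(years_per_day, 1))            # ~82.2 years of video per day

# Twitter: 500 million tweets per day.
tweets_per_second = 500_000_000 / (24 * 60 * 60)
print(round(tweets_per_second))           # ~5,787 — roughly 6,000 per second
```

Run it and the numbers land where the article says they do: every single day, YouTube takes in more footage than one person could watch in a lifetime.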
So, what we’ve ended up with is a civilization built on the constant flow of physical goods, capital, and data, and the networks we’ve built to manage those flows in the most efficient ways have become so vast and complex that they’re now beyond the understanding of any single human, and arguably of any group or team of humans. It’s tempting to think of these networks as huge organisms, with tentacles spanning the globe that touch everything and interlink with one another, but I’m not sure the metaphor is apt. An organism suggests some form of centralized intelligence, a nervous system with a brain at its center, processing data through feedback loops and making decisions. But the reality with these networks is much closer to the concept of distributed intelligence or distributed knowledge, where many different agents with limited information beyond their immediate environment interact in ways that lead to decision-making, often without them even knowing that’s what they’re doing.
Back in 2014 I spent some time investigating the supply chains for goods from China, and I spent a week on a huge container ship. One of the most striking things I saw was how much day-to-day, minute-to-minute decision-making was mediated by technology. Every human in the supply chain — from crane drivers up to the captain of the ship I was on — was constantly receiving instructions from unseen, distant management algorithms. Displays told crane drivers which containers to pick up and where to place them, while the captain received automated emails about course corrections. What was fascinating — and slightly unnerving — was how these instructions were accepted and complied with without question, by skilled professionals, without any explanation of the decision processes behind them.
The captain of the container vessel would regularly receive automated emails telling him to slow down the ship. It’s impossible to know why the shipping company’s algorithms decided this was for the best — the captain himself didn’t know. But he could speculate: Maybe the staff or systems at the terminal ahead reported delays with off-loading, or a mechanical hitch. Or maybe the algorithm saw the delays coming in advance because the GPS trackers on other containers showed delivery trucks stuck in gridlock outside the port. Maybe it decided to slow everything down because a customer deprioritized their order. Or a change somewhere else in another link of the supply chain meant that getting their shipment from another source became a cheaper or quicker option. Or the cost of oil fluctuated just enough that burning it at the ship’s current rate became inefficient. Or maybe it was all of these reasons simultaneously, or none of them. The point is that we don’t know, the captain himself didn’t know, and nobody may ever know — but that didn’t stop the decision from being made.
Which is fine, right? There are a lot of people — executives, traders, investors, and tech developers chief among them — who would argue that it’s better than fine; it’s good. We should be happy about this. There’s food and clothes in the stores, money in the ATMs, stories on our Instagram. And yep, getting those things there was really, really complicated, but it doesn’t matter, ultimately, because humans didn’t have to worry about it all — it just takes care of itself. I mean, what could possibly go wrong?
Firstly, and most obviously, parts or all of these systems can simply fail. As we all know, the more complicated something is, the more ways it can go wrong. We’ve had a few opportunities recently to see examples of this happening. Just this year we’ve watched the supply chains struggle under the pressures of the Covid-19 pandemic, leading to shortages and misallocations of everything from protective masks to flour to toilet paper.
The fallout from Covid has had an even more devastating effect on the global economy — although we’ve seen the markets come close to collapse without the help of a global disaster, such as during the 2008 financial crisis. And the internet is certainly prone to failures, perhaps most starkly illustrated by the WannaCry ransomware attack of 2017, which is said to have affected more than 200,000 computers across 150 countries, causing billions of dollars’ worth of damage and lost earnings. Interestingly, one of the industries hit hardest by malware attacks that year was shipping, with attacks crippling parts of the supply chain as companies like Maersk were forced to watch their networks grind to a halt. What’s worrying is that while none of these were catastrophic failures, and the networks eventually recovered, in some cases it took years of expert analysis and debate to work out what actually went wrong — precisely because these systems are so complex.
On the other hand, we also have to worry about these systems working too well. These networks were built — or, perhaps more accurately, they evolved — to be as efficient as possible, and as we’ve seen from the above examples we’ve abdicated a lot of decision-making to them in order to achieve that goal. But what we have not given them is the ability to make ethical decisions and moral judgments along the way.
The global supply chains, with their mega-scale engineering projects and infrastructures, exist primarily because global wealth inequality makes it cheap to have stuff made in certain countries — even when you have to ship it halfway across the globe to sell it for a profit. By leveraging stark gaps in wages and standards of living as efficiently as it can, the supply chain network is actively enforcing that global inequality.
The same is also true for the global financial markets, which unblinkingly focus on creating wealth and growth, regardless of how many companies and their workers might fall by the wayside, how many pension schemes might be jeopardized, or even how many climate-wrecking carbon emissions are produced. And on the internet, streaming platforms like Spotify and YouTube provide us with unlimited entertainment content whenever we desire it, but at the expense of musicians and creators who struggle to earn a living wage. And then there are the baked-in biases and lack of transparency in algorithmic decision-making, and how that impacts everything from YouTube recommendations to student grading and predictive policing.
Ceding control to vast unaccountable networks not only risks those networks going off the rails; it also threatens democracy itself. If we are struggling to understand or influence anything more than very small parts of them, the same is increasingly true for politicians and world leaders. Like the captain of the container ship, politicians and voters have less and less control over how any of these networks run. Instead they find themselves merely managing very small parts of them — they certainly don’t seem able to make drastic changes to those networks (which are mainly owned by private corporations anyway), even though those networks have a very direct impact on their nations’ economies, policies, and populations. To paraphrase the filmmaker Adam Curtis: instead of electing visionary leaders, we are in fact just voting for middle managers in a complex, global system that nobody fully controls.
The result of this feels increasingly like a democratic vacuum. We live in an era where voters have record levels of distrust for politicians, partly because they can feel this disconnect — they see from everyday reality that, despite their claims, politicians can’t effect change. Not really. They might not understand why, exactly, but there’s this increasing sense that leaders have lost the ability to make fundamental changes to our economic and social realities. The result is a large body of mainstream voters that wants to burn down the status quo. They want change, but don’t see politicians being able to deliver it. It feels like they’re trapped in a car accelerating at full throttle, but no one is driving.
They may not be able to do much about it, but there are mainstream politicians and elected leaders who see this vacuum for what it is — and see how it provides them with a political opportunity. Figures like Donald Trump and Boris Johnson certainly don’t believe in patching up the failures of this system — if anything, they believe in accelerating the process, deregulating, handing more power to the networks. No, for them this is a political vacuum that can be filled with blame. With finger-pointing and scapegoating. It is an opportunity to make themselves look powerful by pandering to fears, by evoking nationalism, racism, and fascism.
Donald Trump has still not conceded the 2020 election despite Joe Biden’s clear victory, and he is exploiting the fact that the United States has a complex and sometimes opaque voting system, one most of the public doesn’t understand, to spread conspiracy theories about glitchy or malfeasant voting machines switching or deleting millions of votes. It’s perhaps no coincidence that some of the highest-profile figures on the right — like ex-Trump-adviser Steve Bannon or Brexit Party leader Nigel Farage — have backgrounds in the financial industry. These are political players who have seen how complicated things have become and can sense the gap in public comprehension, but they want to fill it with chaos and conspiracies rather than explanations.
So what’s to be done about all this? Over the coming months I’m going to both locate ways we can try to increase our knowledge of the seemingly unknowable and find strategies to counter the powerlessness and anxiety the system produces. Along the way I’m going to be talking to a lot of experts about everything from automated shipping and algorithmic trading to financial regulation and political resistance, as well as taking deep dives into how emerging technologies like artificial intelligence and quantum computing could make things better — or a lot worse. I hope you’ll join me as we explore how our systems work, how their complexities impact our lives, and how we can regain some agency within them.