Earlier this month, during a visit to a YMCA in London, Prince Harry argued that Fortnite should be banned, complaining that the video game is “created to addict, an addiction to keep you in front of a computer for as long as possible.” The prince is just the most recent in a long line of folks who worry about what digital technology is doing to our brains. That group increasingly includes some of the people who brought these products to us in the first place — last year, Facebook’s founding president Sean Parker, who left the company in 2005, complained that the “like” button exploits customers by creating a feedback loop of dopamine rewards.
It’s true that the most effective forms of digital media can be so compelling to some users as to feel addictive. That’s because they activate the brain’s reward system — or mesolimbic pathway — releasing the neurotransmitter dopamine and providing a sense of pleasure or satisfaction. But the concerns raised by critics like Harry and Parker actually represent a crude overgeneralization about how our brains respond to electronic media. Using this intellectual framework to describe what’s happening when we use our devices probably does more harm than good.
Some people have come to believe that smartphone notifications essentially function as Pavlovian stimuli — visual or auditory signals that lead to an involuntary biological response — buying into a pseudoscientific narrative that encourages us to imagine that simple behavioral conditioning is tantamount to shooting up heroin. That’s not the case. As Julie Jargon recently wrote in the Wall Street Journal, “accurately measuring dopamine in the brain is a challenge, and can involve injections of radioactive materials that can be tracked on a PET scan.” Nobody does that while developing digital user interfaces.
So what are software engineers referring to when they use terms like “dopamine hacking” or “brain hacking” to describe the way they design products to hook users? It starts with the acknowledgement that the experience of pleasure can be understood, from a neurochemical perspective, as the release of dopamine. This is true for every experience of pleasure, without distinction. Dopamine is released when you bite into a warm chocolate cake, when you kiss a lover, and when your favorite song comes on the radio. But when it comes to software design, the activity in the brain’s pleasure center is purely theoretical: it is inferred from behavior, never measured directly.
Developers use A/B testing, segmentation, and similar methods to collect data about how certain symbolic rewards — notifications, ringing bells, vibrations, animations, color patterns, etc. — perform in relation to others. These are not neurochemical measures, but rather, very simple controlled experiments. For example, different variations of the same digital content might be presented to users in order to determine which version is statistically more likely to produce a specific behavior. Engineers then use that data to iterate their products in ways that will presumably maximize the desired outcomes.
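To make the point concrete, here is a minimal sketch of the kind of statistics behind an A/B test. The scenario and all the numbers are hypothetical (a control notification style versus a variant), and real product teams use dedicated experimentation platforms rather than a hand-rolled z-test, but the underlying logic is just this: compare two click-through rates and ask whether the difference is bigger than chance would explain.

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant B's click rate different from A's?

    Returns the z statistic; |z| > 1.96 means the difference is
    statistically significant at the conventional 95% level.
    """
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis that A and B are identical.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    # Standard error of the difference between the two observed rates.
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical engagement data: variant B uses a brighter notification badge.
z = two_proportion_z(clicks_a=120, views_a=2000, clicks_b=165, views_b=2000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests the badge change drove more clicks
```

Note that nothing here measures dopamine or any other brain chemistry; the experiment observes behavior (clicks) and nothing else, which is exactly the gap between the marketing language and the method.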
Savvy designers probably began suggesting that their data was a measure of dopamine rewards because it helped to sell their products. It made it sound as if their psychometrics were a biological way of delivering messaging to consumers. The neurochemical framing was a great sales pitch. But that’s really all it is — a sales pitch.
One could also frame the pleasure you experience reading an article on Medium in terms of dopamine rewards. Good authors are always honing their writing styles to maximize reader engagement. They know that short sentences can create rhythms that are exciting. Fast-paced. Jarring. And long sentences, packed with metaphorical clauses, can feel like whitewater rafting: a reader tossed from side to side, words splashing up to create the thrilling mist of meaning.
Similarly, religious institutions have always iterated their rituals to maximize dopamine rewards. The music of the Catholic Mass kept changing. Stained glass was developed to make biblical stories more compelling. The best architects and engineers were hired to build increasingly transcendent cathedrals. These changes can all be understood as the result of a simple analysis of the behaviors parishioners manifested as the result of dopamine release. In fact, recent research confirms that the essential oils frankincense and myrrh — just like smartphone notifications — have a demonstrable impact on our brain chemistry. But that shouldn’t be surprising. All pleasure can be understood, or framed, as the result of an activated reward system in the brain. So why, then, are we so willing to grant digital media unusual powers in this regard?
Certainly, it’s true that computer processing power enables us to collect, analyze, and interpret engagement metrics with unprecedented precision and depth. “We can now build systems that adapt to user behavior in real time, maximizing the pleasurable experience,” says Keith Devlin, co-founder and former executive director of Stanford University’s Human Sciences and Technologies Advanced Research Institute. “But it’s equally accurate to say that this is what a good stand-up comedian does intuitively by ‘reading’ the audience.”
I don’t think anybody’s worried that John Mulaney is destroying human civilization by making us laugh. Likewise, the so-called dark patterns and gambling mechanics of video games and social media are a lot less scary than the fearmongers would have you think.
It’s certainly true that some folks can develop an unhealthy craving for a dopamine fix. Kids might be seduced into making far too many in-app purchases, and grown-ups might use Tinder in compulsive ways. “It is to be expected that in some individuals the pleasure-seeking urge could be pathologically high or low,” says Devlin. “For instance, there are sex addicts and chronic overeaters (as well as people who are sex-averse or anorexic). But we don’t blame sex or food.” Likewise, we shouldn’t automatically scapegoat digital technologies just because it’s possible for some people to overindulge. Sure, the designers of electronic media try to maximize their profits by taking advantage of those who are most susceptible — as do fast food franchises and internet porn sites — but that’s really a feature of consumer capitalism, not digital technology.
Still, there may be a good reason to worry — not about brain chemistry, but rather about the ethical implications of the tech industry’s dopamine hacking narrative. It presumes that an effort to maximize pleasure and engagement constitutes a worthy end in itself. More clicks, more eyeballs, and more users are often not only the objectives, but also the metrics of success and the incentives. It’s enticement for its own sake: to maximize the growth and scale of the persuasive apparatus. By contrast, we tend to presume that religious rituals and good books engineer engagement, pleasure, and feelings of contentment in the service of a thesis, a moral message, or a higher purpose.
But this concern isn’t unique to the digital era. Almost two and a half millennia ago, Socrates complained that the Sophists of ancient Greece prioritized rhetoric over truth. He argued that oration — effectively the social media of its time — that doesn’t aim to bring us closer to “the good” is a cause for concern. In Plato’s Republic, Socrates even warned that prioritizing attention over meaning would ultimately prove to be democracy’s downfall. When a citizen “lives from day to day, gratifying the appetite of the moment,” he tells Adeimantus, “there is neither order nor necessity in his life, yet he calls it pleasant, free, and blessedly happy.”
Unfortunately for Socrates, the men of Athens weren’t interested in his message. Instead, he was tried, convicted, and executed for employing exactly the kind of supposedly empty rhetoric that he accused them of using: “making the weaker argument the stronger.”
There’s nothing new about the fear and demonization of rhetorical skill. And that’s why the question we need to confront isn’t whether screens are too engaging, addictive, or manipulative. Instead, it’s about how we categorize and make meaning from the different kinds of symbolic representations that accompany the release of dopamine rewards. We need to teach our children, as elders always have, to distinguish between the good and bad sensations of pleasure.