Privacy Is Just the Beginning of the Debate Over Tech

Controversial ‘smart locks’ show how surveillance tech begins with the poor before spreading to the rest of us

Credit: Freer Law/Getty Images

In a weird case of who knew we had to fight for that particular right, last month a judge ruled that landlords in New York were required to “provide physical keys to any tenants who didn’t want to use the Latch smart locks installed on the building last September.” This legal ruling came after residents of an apartment building in Manhattan banded together and successfully sued to prevent their landlord from replacing the physical locks in the lobby with smart locks.

The Latch, and similar keyless locks opened using smartphones, are increasingly common features of apartment buildings across the U.S. In their opposition to this forced upgrade, tenants listed myriad reasons why the smart lock did not actually offer the enhanced convenience and security their landlord claimed it did. These issues ranged from some of the older residents not owning smartphones to larger concerns about the landlord being able to monitor and harass tenants. “I said I don’t want to be tracked, and the landlord laughed,” a 72-year-old resident of the building told CNET.

We are all aware that smart technologies — data-driven, internet-connected, automated — are rife with privacy issues. There are countless examples, and the list grows every day. Smart locks bring to life many of the worst fears of privacy advocates. Such fears-turned-reality include the collection of personal data, which is then used to (secretly) create detailed profiles about our identity, preferences, behaviors, and routines. In addition to fueling an economy built on personalized targeting, with this level of tracking comes a near-total loss of the “obscurity” that shields us from the disciplinary gaze of governments, corporations, and bosses.


It’s not surprising that our skepticism of smart things tends to start with privacy. The problem, however, is that it also often ends with privacy. This framing limits our understanding of what’s at play and what’s at stake. Instead, when we talk about technology, we should be thinking about power dynamics.

There’s a reason why tech companies are happy to discuss privacy, fund luxe conferences about privacy, concede some points to advocates, and even create new professions like “privacy and civil liberties engineering.” It’s because this framing does almost nothing to challenge the actual political economic system — based on extracting data and expanding control — that they have built and benefit from.

Sociologist Sami Coll argues that the concept of privacy has now become more like an “ally of surveillance,” rather than its antidote. Privacy has come to be defined by individualistic ideas and solutions centered on being responsible digital consumers who practice good cyberhygiene. But in a battle between plucky individuals who must protect themselves, and powerful institutions who want to poke, probe, and profile people, it is no mystery who wins. When we accept a framing that is friendly to — or has even been absorbed and appropriated by — the surveillance-industrial complex, then we end up fighting on their turf, by their rules. We might win some skirmishes, but we will lose the war.

So when tenants in rent-stabilized apartments are trying to prevent landlords from installing facial recognition, they are not simply “sparking a debate about privacy and surveillance,” as CityLab framed it. This isn’t just a case of people wanting a place to hide from prying eyes. Nor is it, as the landlords frame it, about a false choice between privacy versus security. And it is definitely not a knee-jerk panic response to innovation, as the anti-regulation, pro-corporate think tank the Mercatus Center argued.


Ultimately, this issue is about power: who has it, how do they wield it, what do they gain? And vice versa: who is excluded, targeted, squeezed?

The example of smart locks and facial recognition does a great job of highlighting what’s really at stake because it’s a clear-cut case of a less powerful group (tenants) resisting attempts by a more powerful group (landlords) to arm themselves with a dangerous arsenal. Tenants aren’t even trying to advance their position — they’re just trying not to lose ground.

These power dynamics are more common, and exert a greater influence over the design and use of new technology, than we realize. Technology materializes power, giving it a solid and durable form. What matters most is not the technology itself, but the people, values, and organizations behind it and the ways their interests are channeled through it. Technology is not neutral, inevitable, or totally malleable. Hidden inside every technology is a bunch of human choices about which problems should be addressed, how resources should be spent, where trade-offs should be made, whose views are represented, why the technology should be used, and many other decisions that boil down to doing X instead of Y or Z.

Much of the smart technology we’re now surrounded by is born from a system of profit and power that legitimizes itself through what the historian and critic Lewis Mumford identified, back in 1970, as the “megatechnic bribe.” The conditions of this bribe are simple: In exchange for accepting what the system produces with minimal fuss, some people are granted access to modest luxuries of convenience and comfort, while many others are subjected to enhanced degrees of exploitation, extraction, and exclusion. The benefits for some rely on the harms for many.

In other words, we get distracted by the privilege of cool gadgets and regular upgrades while a supercharged system of surveillance capitalism is integrated into society. None of us explicitly agreed to this deal, but just like with a software license or terms of service, we have been given no choice but to “accept.”

Examples of this bribe can be found everywhere: insurers offer discounted devices and other incentives as a way to harvest valuable personal data from our smart homes and everyday lives; tech companies sell solutions for optimizing urban systems and securitizing urban space as a way of taking over the operations, oversight, and ownership of cities. This bribe is the very basis for many of the platforms we have come to rely upon — Uber rides, DoorDash deliveries, and rooms from Airbnb. It’s so cheap and convenient that we’ll overlook those inconvenient issues of worker exploitation, cutthroat expansion, value extraction, and the billions in venture capital that make those services possible.

Bribery is a form of corruption. And as society is upgraded according to the needs of digital capitalism, the imperatives of profit and power have corrupted these smart technologies at their source. We might not have accepted the bribe in any meaningful way, but nonetheless it appears to be working like a charm.

As Virginia Eubanks demonstrates in her book, Automating Inequality, capitalism often tests its newest methods to extract, exploit, and exclude on the least powerful, most marginalized groups in society — those who don’t even get the privilege of being bribed — before rolling them out to the rest of us. We should heed the warning told to Eubanks by a woman on welfare whose life is tracked and controlled at every step: “You should pay attention to what happens to us. You’re next.”

Unless, that is, we follow in the footsteps of those tenants who rejected the smart lock and the power dynamics it materialized. They are not merely technological refuseniks. They are fighting at the frontlines of digital capitalism — and they need reinforcements.

Author of TOO SMART: How Digital Capitalism is Extracting Data, Controlling Our Lives, and Taking Over the World (https://mitpress.mit.edu/books/too-smart)
