Getting Cash for Our Data Could Actually Make Things Worse

Microsoft researchers suggest ‘data dignity’ will create a better digital society, but experts disagree

What’s our data worth? No one’s quite sure, but a team at tech giant Microsoft is certain that we should be paid for it. In “A Blueprint for a Better Digital Society,” published in the Harvard Business Review, Microsoft researchers Jaron Lanier and E. Glen Weyl propose a new tech concept called “data dignity.”

The basic premise of data dignity is that we should be paid for our data. By turning data into a form of property, we will be compensated for our data and be required to pay for services that require data from others. To make the process more transparent and bridge the gap between tech giants and individual users, groups of volunteers called MIDs (mediators of individual data) would negotiate data royalties or wages, engage in collective bargaining, and form industry standards, among other tasks.

This would spark what Lanier calls an “entrepreneurial democracy,” where individuals become more-equal actors in the marketplace, as opposed to just being surveilled for ad revenue and profit. In this model, we would pay to use Facebook, but Facebook would pay us as well, which would give people a modest, continuous stream of income.

Lanier and Weyl believe that our current model — one where the public uses free services in exchange for targeted advertising derived from surveillance, and where tech companies partner with third parties to optimize and, therefore, manipulate consumer behavior — is both undesirable and unsustainable. With data dignity as an alternative model, they hope to replace surveillance capitalism with direct buying and selling, allowing platforms to grow and thrive in a market of true competition, and to restore dignity to us, the data creators.

Tech companies make billions from our data. Why shouldn’t we be compensated, especially for our loss of privacy?

Lanier also argues that data dignity will help eliminate the economic incentive tech companies have to facilitate outrage engagement (since outrage reliably generates the best metrics, which generates the most ad revenue). This model places faith in the marketplace to produce positive outcomes — things tend to get better when you pay for them. Lanier often offers Netflix as an example. The DVD-by-mail service that pivoted to become a premier streaming subscription service was able to provide audiences with “peak TV” because we were willing to pay for it.

The data dignity blueprint also cribs some of its ideas from Radical Markets, by Weyl and University of Chicago law professor Eric Posner. Lanier and Weyl also think data dignity should be paired with better content and privacy regulation, with new checks and balances, and with a decentralized digital infrastructure.

The concept of data dignity, or data as property — the ability to own your data — sounds noble and intuitive. Tech companies make billions from our data, so why shouldn’t we be compensated, especially for our loss of privacy? Without us, these companies have no data and data has no value. Data dignity would also expose the shadow economy where we don’t know exactly how much data is worth and where tech companies take everything in exchange for unfair returns.

However, trusting our intuition on a big and unique idea like this without good-faith scrutiny can cause us to miss its flaws. Before we consider implementing data dignity at scale, we need to understand how turning data into property could open a Pandora’s box of other issues we can’t foresee. Scholars argue that turning data into property could actually widen the digital divide, give Silicon Valley even more control with property rights, and create new modes of exploitation without effectively addressing the underlying logic of surveillance capitalism, all for what would likely amount to a pretty small amount of money for users.

Data dignity operates under several premises: Changing data into tradable commodities would give people the incentive to better understand the terms of the deal — what value their data generates and what they are owed in profit, royalties, and so forth. It also assumes a flow of value from the companies that collect the data to the people who hand it over.

“I’m skeptical of basically every piece of that argument,” says Christopher Jon Sprigman, a law and economics professor at NYU who studies copyright, patents, trademarks, and other intellectual property. “There’s no indication that the data is going to be very valuable or that people are going to be able to value it correctly.”

In their blueprint, Lanier and Weyl say that “attempts to calculate the value of data suggest that many Americans could earn $500 to $1,000 a year.” But they argue that these figures are a lowball estimate, since most of the value of data is off the books. In Lanier and Weyl’s own estimate, an American family of four could make up to $20,000 in annual income under the data dignity plan. But Sprigman contends that it could just as likely amount to only enough to buy you an occasional beer.
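To see how far apart these estimates are, here is an illustrative back-of-the-envelope comparison (my own arithmetic, not from the blueprint itself), using the $500–$1,000 per-person figures and the $20,000 family-of-four estimate quoted above:

```python
# Compare the published per-person data-income estimates ($500-$1,000/year)
# with Lanier and Weyl's $20,000/year estimate for a family of four.
published_low, published_high = 500, 1000   # per person per year (external estimates)
family_income = 20_000                      # Lanier and Weyl's family-of-four estimate
family_size = 4

per_person = family_income / family_size            # implied per-person figure
multiple_low = per_person / published_high          # vs. the high published estimate
multiple_high = per_person / published_low          # vs. the low published estimate

print(f"Implied per-person income: ${per_person:,.0f}/year")
print(f"That is {multiple_low:.0f}x to {multiple_high:.0f}x the published estimates")
```

In other words, Lanier and Weyl’s own figure implies roughly $5,000 per person per year, five to ten times the attempts at calculation they cite, which is why Sprigman’s skepticism about the value of the data matters.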

“The money you get is likely to be crumbs,” Sprigman says. “But in return for these crumbs, what companies will get is the rhetoric of property.”

If we “own” our data and can get paid for its use, tech companies could also persuade — sometimes coerce — us to sell our data outright. The rights that come with property give them a stronger case to do whatever they would like with our data.

In their blueprint, Lanier and Weyl say that MIDs must not allow the permanent sale of a person’s data. But if data dignity were implemented, how long would it be before tech companies, their lawyers, politicians, and even some users lobbied for this right? How could MIDs successfully argue that a person cannot sell what they “own”? With its vast power in the marketplace, Facebook could offer discounts or free subscriptions in return for exclusive long-term licensing rights to a user’s data. It could put into its terms and conditions that the company gets first bid if you ever wanted to sell your data. Would MIDs have enough power to reject this part of the deal?

Another part of Sprigman’s reasoning is that even if the difficulty of calculating our data’s worth leaves room for higher income estimates, data dignity overlooks all the unforeseen transaction costs of buying and selling data.

“Contracting, payment infrastructure, termination fees. There’s going to have to be all the economic and regulatory apparatus that goes along with that,” Sprigman argues. “If I thought this would help a privacy problem or make the market more competitive, I might say, ‘Well, transaction costs are worth it.’ But I don’t think those benefits are worth selling your data.”

Sprigman also doesn’t think giving people property rights to data will create a vibrant competitive market — in fact, he thinks “quite the opposite.” Rather, he argues, this model would create new “opportunities for mischief, rigidity in the market, and anticompetitive problems.”

Once data becomes property, the market incentive would be for everyone to flock to Facebook, a platform that already has more than 2.1 billion people among its various offerings. If Facebook has to pay us for our data, the company would be incentivized to get a much firmer grasp on the data’s value, without diminishing its business incentive to pay us the least it can reasonably get away with. There are also complicated accounting questions: How do we verify what is our data? How can we “check the books” ourselves and not rely solely on MIDs?

“It’s like being in the market but having a blindfold on,” says Nick Couldry, a media and culture sociologist at the London School of Economics and Political Science and a faculty associate at Harvard’s Berkman Klein Center for Internet and Society. “You’re offered some money, but you have no idea what the data would be worth because the means of evaluating it is locked into the deep infrastructure you can’t access.”

But the issue of surveillance is bigger than social media. Tech CEOs often allude to the inevitability of A.I. and automation, but this is a kind of sleight of hand. It spreads the message that humans are becoming obsolete even though these machines need our data to learn and function. The CEOs would like to pretend they’re helping the public adapt to an impending reality, rather than admitting they’ve decided that breaking things (our economic and political systems) is the price of moving fast (innovation, increased productivity, and shareholder value).

Couldry and Ulises Mejias, a media and communication scholar and director of the Institute for Global Engagement at the State University of New York, College at Oswego, say data dignity is well-intentioned but misses the point: maybe we don’t need to live in a world where tech companies continue to collect our data.

“Just the act of requiring us to live our lives while continuously being tracked by corporations is itself an undermining of human dignity,” Couldry says.

In their new book, The Costs of Connection: How Data Colonizes Human Life and Appropriates It for Capitalism, Couldry and Mejias widen the scope, comparing data collection to historical colonialism to show how surveillance capitalism transforms as much of our social lives into data as possible. They argue that the unequal relationship between us and tech companies has already been imposed, and that data dignity would be tacitly accepting the “camera in our souls.”

Sarah Clarke, senior privacy consultant for the Irish firm BH Consulting and a guest lecturer at Manchester University, echoes this idea, questioning the premise of “data as property.”

“The information about you as an individual is not something that can be fundamentally separated from you to be owned by another individual,” Clarke says. “In the EU, there is no concept of data ownership. You allow temporary custodianship of your data but no transfer of ownership.”

Mejias argues that data dignity merely makes the current dystopian dynamic more palatable, not less exploitative and oppressive. He tries to illustrate this idea with an analogy: “Imagine that you suddenly discover that cameras have been placed to capture every moment of your life, including your most intimate moments. The company says, ‘Well, we are not going to remove the cameras, but we are going to pay you to continue to record all of this information.’ Instead of addressing the injustice, they offer you payment. I don’t think this is dignity at all. In fact, it’s the opposite of dignity.”


Mejias’ analogy shows how data dignity could unintentionally deepen existing real-world inequality. Compare an assistant professor in Boston to a Dalit factory worker in Mumbai. The professor’s data would likely be worth more up front, so she’d be more likely to join a MID with more power, influence, and control, and she would have an incentive to hold on to her data rights. The Dalit worker’s data is likely to be worth less, and she might live in the kind of economic precarity that incentivizes her to sell her data outright, since letting it go would bring more money, more quickly, than keeping it.

Sprigman says there are better solutions to income inequality than selling our data. “We need something more complicated — to rethink the purposes of human life away from 40 to 60 hours of work, better redistributing the wealth that those technologies create. Those are hard things, the politically difficult and less captivating things, but they are the things that would make a difference.”

Clarke thinks the alternative to paying for data could be a type of “data employment,” but she adds a caveat.

“That would entail all the same risks to worker rights, discrimination, and unfair compensation as any employment relationship with such a huge power imbalance between parties,” Clarke says. “All against the background of no laws or regulations really ready to deal with a concept like that.”

Instead of making data property, Mejias thinks we should challenge our assumptions about how to solve the digital divide — the idea that “if you are at the lower end of the economic scale, connection is what will allow you to climb out of that position” — and recognize that people at the lower end of the economic scale are more subject to surveillance and more vulnerable to their data being used in punitive, exploitative, or discriminatory ways.

While there are several potentially good uses for data collection in civic contexts — medical innovation, for example — turning data into property or a tradable commodity does little to reverse the power imbalance between tech giants and everyday people.

Clarke says data dignity is a bit of a misnomer. Couldry says we first need to change our imagination — to equip people to see that good-faith proposals like data dignity miss the key point of where the assault on dignity comes from. Mejias thinks we should first understand our position in this new capitalism and understand how data justice needs to be aligned with labor rights, environmental justice, and other social justice movements.

Even if it starts with goodwill, data dignity could be, in Sprigman’s words, “a really bad idea.”

Joshua Adams is a writer at Colorlines.com from Chicago. UVA & USC. Taught media and communication at DePaul & Salem State. Twitter: @journojoshua
