Who Owns Your Health Data?

Companies are denying people access to their own data as security risks run rampant


Today, much of our lives — and our health — is observed digitally. We discuss what’s for dinner with our families and our dieting apps. Athletes improve their performance with the help of their watches’ personal metrics. Mothers watch their babies grow both in the flesh and on their screens. People with diabetes can go about their day at ease, having precise knowledge of their blood sugar levels. And some people have even reported developing new senses from meticulously tracking their health.

We now have mobile health devices finely attuned to our bodies, capable of detecting subtle changes in our heart rates and sleep cycles. These devices offer a new window into our health on a daily basis, yet strict corporate data policies often stand in the way of our ability to make use of our data in the context of our own health.

Many people will use a device or download an app looking to address a specific health problem or develop a good habit. But in practice, users struggle to see what their data could mean for their health beyond post-workout reports and heart rate graphs. People with chronic conditions might want to use their devices to help track long-term changes in their condition but are forced to dig deeper to understand how to tailor an off-the-shelf tracking device so it can monitor their particular situation.


Hugo Campos, a heart patient, wanted to understand how everyday activities affected his heart and saw that he could use the data from his implantable cardioverter defibrillator (ICD) to inform his lifestyle decisions. He needed online access to his heart data, which his ICD didn’t display for him. When Campos made the request, however, the device’s manufacturer denied access, deeming him to be an illegitimate user of that data — his data. “A built-in computer running proprietary software monitors my every heartbeat,” Campos said in a 2015 piece for Slate. “But the data it records via sensors in my heart is entirely beyond my reach. It is wirelessly transmitted to a monitor set up in my bedroom and sent via telephone lines to the manufacturer’s data warehouse, bypassing me altogether.”

If you take your blood pressure at the doctor’s office, the results are protected under the Health Insurance Portability and Accountability Act (HIPAA). Take your blood pressure with a personal health monitor, however, and that information is sent back to the device’s manufacturer. The data collected by consumer health devices most likely becomes the company’s property, making the information less accessible to the people who generated it.

Companies can get away with claiming the data is theirs because consumer health data lacks the protections of patients’ data in clinical settings. And while many of these companies might see the potential in adapting their data products to consumers’ needs, social scientists Gina Neff and Dawn Nafus point out that their “business processes sanitize and genericize, making [them] unable to appreciate just how much self-tracking data could matter to people.” Companies simply aren’t in the business of understanding what data is worth to users, even though that is exactly how they market their data products.

And this fact doesn’t sit well with people like Campos who believe they could benefit from interpreting their personal health data. When companies deny people access to their own data, they are likely to rebel. To get around the manufacturer’s strict data policy, Campos bought a pacemaker programmer and took a two-week course to learn about heart rhythms so he could siphon the data from his device and interpret it for himself.

Some people also want to add functionality to their devices that can enhance their quality of life, which may likewise require data access. Dana Lewis, who has Type 1 diabetes, saw that she could use her blood glucose readings to build an algorithm that automatically administers her insulin, in effect an artificial pancreas. She used an older model of her continuous glucose monitor to get around data policies. Parents of children with Type 1 diabetes have also hacked their devices so they can monitor their children’s health remotely, allowing their kids to attend school and extracurricular activities without constant supervision.
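The closed-loop idea behind an artificial pancreas can be sketched in a few lines: compare the current glucose reading to a target, and compute how much additional insulin, if any, is needed. The sketch below is a deliberately simplified illustration, not the algorithm Lewis actually built (her open-source system models insulin activity over time and includes many safety checks); the target value, correction factor, and function names here are all hypothetical.

```python
# Illustrative simplification of one decision step in a closed-loop
# ("artificial pancreas") system. NOT a real dosing algorithm: the
# target, correction factor, and function names are hypothetical.

TARGET_MGDL = 110          # hypothetical target blood glucose (mg/dL)
CORRECTION_FACTOR = 50.0   # hypothetical: 1 unit of insulin lowers glucose ~50 mg/dL


def correction_dose(glucose_mgdl: float, insulin_on_board: float) -> float:
    """Return extra insulin units needed to move glucose toward the target,
    accounting for insulin already active in the body ("insulin on board")."""
    if glucose_mgdl <= TARGET_MGDL:
        return 0.0  # at or below target: no correction needed
    needed = (glucose_mgdl - TARGET_MGDL) / CORRECTION_FACTOR
    return max(0.0, needed - insulin_on_board)


# Example: glucose at 180 mg/dL with 0.5 units already on board
print(round(correction_dose(180, 0.5), 2))  # prints 0.9
```

The point of the sketch is that the decision logic is simple arithmetic over data the device already collects; what stands between patients and such tools is access to the readings, not computational complexity.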

Hacking your own device, however, comes at a cost. Many users buy parts overseas to modify their devices; others get around restrictive data policies by using older models of their health monitors. When users take control of their devices, they risk introducing vulnerabilities that malicious actors could exploit to gain access.


Cybersecurity researchers recognize the potential for hackers to harm patients through their devices. Although to date there have been no documented cases of hackers hijacking personal medical devices, it is possible. In 2018, security researchers Billy Rios and Jonathan Butts demonstrated how they could hack a Medtronic pacemaker and insulin pump. They remotely reprogrammed the pacemaker so that it could disrupt a patient’s heart rhythms, and showed how a wireless signal could instruct the insulin pump to deliver an incorrect dose of insulin. “If a pacemaker for a patient gets hacked,” Rios warned, “you can’t take that back.”

While users who hack their devices may expose themselves to outside attacks, the data from unmodified devices isn’t fully secured either, leaving all users exposed. Companies offer their services in exchange for personal information, participating in what Harvard Business School professor Shoshana Zuboff calls surveillance capitalism: companies profit from stockpiling as much data as they can. Their devices broadcast data back to corporate servers, bypassing the people who generated it. And to collect this data, companies must keep open the very channels that double as security loopholes. “As long as companies are free to gather as much data about us as they possibly can,” security expert Bruce Schneier says, “they will not sufficiently secure our systems. As long as they can buy, sell, trade, and store that data, it’s at risk of being stolen. As long as they use it, we risk it being used against us.”

Currently, there is no omnibus privacy legislation in the United States to prevent third parties such as insurers, employers, and credit card companies from discriminating against people whose data they obtain. Legal scholar Frank Pasquale recounts the case of Walter and Paula Shelton, who were denied health insurance based on Paula’s prescription history. Some employers have invested in corporate wellness programs to encourage healthier lifestyles among their employees and to reduce health insurance premiums. But employees who don’t meet their employer’s ideal health metrics can face additional costs. The tire company Michelin set health standards for factors such as blood pressure, glucose, cholesterol, and waist size; employees who didn’t reach the targets could lose out on an annual $1,000 credit toward their health insurance deductible. Similarly, CVS required employees to disclose their body fat, blood sugar, blood pressure, and cholesterol or pay $600 a year. These cases of data discrimination raise further concerns about how our health is surveilled today.

Many people already struggle to make their data work for them, much less to reconcile secondary issues of security and discrimination. Companies have kept the use cases for their devices intentionally ambiguous to appeal to a wide consumer base, leaving users to guess how to “make it work” for their lives. And even when people do make it work, revealing our health information through our devices has lasting consequences in an industry built on exploiting our data.
