Facebook Wants Your Medical Data, but Isn’t Legally Obligated to Protect It
The company promises its Preventive Health tool is private. Should users trust it?
Using information I handed over 14 years ago, Facebook just recommended that I get a flu shot, a cholesterol test, a blood pressure test, and a Pap smear.
The recommendations are part of Facebook’s new tool, called Preventive Health, which the company first announced in October. Over the past few weeks, many users have started seeing the tool in their feeds for the first time. (For now, the tool is only available to U.S. users.) The health recommendations are based on guidelines from the Centers for Disease Control and Prevention, American Cancer Society, and American Heart Association, which Facebook partnered with to develop the tool. Given my demographic, Facebook says I should get a Pap test and HPV test every five years.
The tool allows users to mark each recommendation as done, set a reminder when they’re due for a checkup, and find locations where they can get screened. That’s where the new tool starts to feel invasive: data you share in it isn’t protected in the same way your health records at a doctor’s office are, which raises a host of questions. Once you enter this information, what will Facebook do with it? How will the data be stored, and for how long? Who will see it?
Preventive Health represents a new foray into personal health for Facebook, which will need to answer these questions if it expects users to divulge sensitive health information. An NBC News/Wall Street Journal poll conducted last year found that 60% of Americans don’t trust Facebook to protect their personal data. Other tech giants are venturing into the health space, too, with Google recently acquiring Fitbit and Amazon launching a virtual medical service for its employees.
Facebook assures users that the information they disclose in the health tool won’t be shared with other Facebook users or third parties, like health organizations or insurance companies. Users can choose to show their Facebook friends that they’re using it, but doing so will not share specific information about their activity within the tool, like an upcoming doctor’s appointment. The company says a small group of Facebook developers who work on the tool will, however, have access to the data you input.
Facebook also says it won’t show users ads based on the information they provide in the tool. But Erin Egan, Facebook’s chief privacy officer, says in a blog post that “other actions that you take on Facebook could inform the ads you see, for example, liking the Facebook page of a health organization or visiting an external website linked to from Preventive Health.”
It’s well known that Facebook collects users’ browsing history and uses that information to sell targeted ads. So, if you search for health information on Google, you could wind up with targeted ads on your Facebook feed. (For a recent story, I was researching genetic screening for diseases common in the Jewish community. A few days later, Facebook showed me ads for JScreen, one such testing service, even though I never used Facebook in my search.) In spring 2018, Facebook announced a new feature to allow users to opt out of this kind of data collection for the first time. After a long delay, the company started rolling out the feature last year.
With the Preventive Health tool, Egan says the company is taking “extra steps” to protect users’ privacy and says the information users provide will be “securely stored.” But that promise isn’t very convincing, given the platform’s history of data breaches and privacy violations.
Another big privacy issue that consumers might not be aware of is that health information shared in apps, posted on social media, or collected by fitness trackers isn’t legally protected in the same way hospital and medical records are. The Health Insurance Portability and Accountability Act (HIPAA), a landmark U.S. law enacted in 1996, established national standards to protect individuals’ medical records and other personal health information, setting strict rules on who can access your health records and how those records can be used and shared. Under HIPAA, your doctors can share your medical records with other healthcare providers involved in your treatment but must use safeguards when doing so. Violators are subject to civil and criminal penalties.
But the law only applies to entities like health insurers, hospitals, and health care providers. It doesn’t extend to tech companies that ask users to voluntarily submit health information. Enacted more than 20 years ago, HIPAA was written at a time when the health care landscape was much less complex, long before the rise of social media and digital health startups. It’s why some advocates and digital health experts are now calling for an update to U.S. health privacy laws.
Companies that collect personal data from users are required by law to provide privacy policies, but the vast majority of people don’t read them because they’re far too long and loaded with legal jargon. Facebook also faces more constraints in this area than other tech companies: A 2019 settlement with the Federal Trade Commission, the government agency that enforces consumer protection laws, required Facebook’s board of directors to establish an independent committee to oversee consumer privacy. Facebook says it isn’t sharing users’ health data with third parties now, but it’s unclear if that will carry into the future. (The company did not respond to a request for comment.)
The FTC also levied a $5 billion penalty on Facebook last year to settle charges that the company deceived users about their ability to control the privacy of their personal information. While $5 billion is the biggest penalty the FTC has ever imposed on a single company, it represents less than a quarter of Facebook’s annual profit. And after the settlement announcement, Facebook’s stock price surged instead of sinking, prompting critics to argue that the FTC penalty wasn’t harsh enough.
Given Facebook’s track record with privacy, users might feel reluctant to use this new health feature — and future ones the company might introduce. Though Facebook doesn’t ask you to enter your test results into the tool, it could still glean what those results are based on your browsing habits. And there’s also the question of whether users will be able to view the data they enter into the tool, download it, or delete it from Facebook at any point.
In a blog post explaining Facebook’s impetus for creating the new tool, the company’s head of health care research, Freddy Abnousi, says, “tens of millions of people in the U.S. are missing out on recommended preventive care.” While it’s true that many Americans aren’t getting the preventive care that they should, unnecessary health screenings are also a big problem. Many people get screenings too often or at the wrong age. Experts say health screenings can sometimes do more harm than good, and medical groups disagree on the frequency of some tests, like breast cancer screenings. Instead of providing clarity, Facebook’s new Preventive Health tool could further confuse patients.
One feature of the Preventive Health tool allows users to find nearby locations to get a screening. On one hand, this could help people who don’t have a primary care doctor, or who live in rural areas, find providers near them. On the other, it could create needless costs if they end up seeing a provider they found in the app who doesn’t take their insurance.
Facebook says it’s starting with health checkups related to heart disease, cancer, and the flu, but it plans to add more to the Preventive Health tool. The question is whether people will trust Facebook enough to use it.