
Google Just Lost One of Its Hardest ‘Right to Be Forgotten’ Cases Yet

A once “negligent” doctor just won a landmark case against the search giant

How long should a mistake follow you? On the internet, where an embarrassing old profile pic is always just a Google search away, the answer can feel like “forever.”

This isn’t quite the case in the European Union, where citizens have the “right to be forgotten,” a principle established by a 2014 European Court of Justice ruling and later codified in the General Data Protection Regulation (GDPR). It gives people the ability to ask search engines to remove URLs containing “inadequate, irrelevant or no longer relevant” information.

In theory, it’s aimed at private citizens whose lives have been negatively affected by the dissemination of personal information (revenge porn survivors, for instance) or at those who have been subject to the spread of falsehoods. In practice, it’s unclear whether that’s the case: The definition of “inadequate, irrelevant or no longer relevant” information can feel ambiguous — and so can the criteria Google applies to grant or deny a specific request, at least until it’s legally compelled to change course.

What is clear, however, is that the right to be forgotten isn’t just for private citizens embarrassed by an old Flickr photo or two. In what is thought to be the first instance of a medical professional winning the right to be forgotten, an Amsterdam district court last week rejected Google’s objections and granted a surgeon’s request to delist search results related to her disciplinary suspension. The decision may have chilling implications for online freedom of expression, and it underscores how central Google has become to our search for information.

According to the Guardian, the surgeon was suspended four years ago over accusations of negligence while caring for a post-op patient. After she appealed, the suspension was made conditional and she was allowed to continue practicing medicine. Nonetheless, a “blacklist” website featuring the surgeon’s name appeared among her first Google search results, which led her to petition the search giant to remove the link.

Danny O’Brien, international director at the Electronic Frontier Foundation, a nonprofit devoted to protecting digital privacy and the right to free speech, says this sets a disturbing precedent. Taken to its logical conclusion, the judge’s ruling lays the groundwork for “a situation where if you search for doctors, you’d only get positive stories,” O’Brien says.

“And in some ways, that is as deceptive a statement as any other [inaccurate statement] that a website might say about someone,” he adds.

It’s not particularly surprising that a physician would want to remove evidence of past errors or careless behavior from the internet. Like most service providers in the age of Yelp, medical professionals are plagued by the fear of bad patient reviews, which are often among the first to show up in Google search results, says Jane Orient, M.D., executive director of the Association for American Physicians and Surgeons.

“Many physicians have been troubled by unfavorable comments,” she says, “because they can be from anonymous people; they can be from competitors.” With no way to vet such comments or identify their authors, physicians have little recourse, even though such reviews can seriously harm their careers.

Reviews are one thing, but the information collected on the blacklist website came from public records. As is the case in the United States, the Netherlands requires any history of disciplinary action to be noted in the physician’s entry in a national public database.

But in the ruling on the case, the judge argued that the title of the site, which implied the surgeon was not equipped to practice medicine, was inaccurate because the disciplinary board had overturned the decision. O’Brien says this is just splitting hairs. Delinking content of any kind, he argues, is like “only telling one side of the story. And what we’re seeing here is one side of the story disappearing.”

“The fact that she was on probation is a public fact,” he says. “A newspaper journalist can say that, and I’m fairly confident they could argue that was in the public interest. What this judgment is attempting to do is try to ensure that people can’t find that information very easily.”

To be clear, this doesn’t mean that people can’t find the information at all. One can always see whether a medical provider has been subject to disciplinary action on the Dutch BIG-register.

Still, search giant Google remains an easier point of entry — people may not know to access the BIG-register — and the information it surfaces takes on a certain power.

“People trust Google,” says Orient. “But Google has made the determination that these are the most reliable sites, and Google has made this determination without authority… There should be some accountability for things like blacklists that do tremendous harm, and people [outside of the EU] don’t really have a way to defend themselves against them.”

Ultimately, this is what both the pro- and anti-right to be forgotten contingents share: a distrust of platforms like Google. Though this particular case hinged on a judge’s ruling, it highlights the search engine’s power; there are arguments for and against removing the search results, but there doesn’t seem to be a comfortable middle ground.

Regardless, Google’s algorithm has already created a hierarchy of information. By prioritizing some search results over others, the platform is implicitly telling us which sources of information are more accurate and reliable than others. As it stands now, we have almost no idea what goes into making that determination.

Google’s transparency report shows how many URLs it has been asked to delist (almost 3 million) and how many of those it has actually delisted (44.1 percent). But there’s little insight into why some requests are granted and others are not, or why some information is deemed irrelevant or out of date.

“Prior to the internet, you’d have to go to the library to find specific information… and you’d have to go with some intent,” O’Brien says. “On the internet, you’re always one hop away from something. That’s how it’s designed: to find information really quickly… [with the right to be forgotten], the aim here is to recreate obscurity.”
