
An Algorithm May Decide Who Gets Suicide Prevention

A recent study on Google’s search results raises questions about the faith we put in algorithms — and the tech companies that use them

Jake Pitre
Published in OneZero · 5 min read · May 23, 2019


Credit: PhotoAlto/Alix Minde/Getty Images

To anyone unfamiliar with the suffocating feeling of suicidal ideation, the significance of a phone call with a stranger might seem dubious. In that moment, the lowest of lows, speaking with someone who understands what you’re going through can — quite literally — mean the difference between life and death.

While the effectiveness of suicide hotlines at preventing suicide remains under debate, there is ample anecdotal evidence that they have saved lives. Based on that evidence, in 2011 Google began showing suicide helpline numbers at the top of results for searches like “effective suicide methods.”

Like so much of our online lives, the decision to show this advice alongside results is determined by an algorithm. Facebook has done something similar since 2017, when it began using pattern-recognition algorithms to monitor posts for content about suicide or self-harm and then send those users relevant resources.

Yet while these steps are helpful, the algorithms involved do not perform consistently across the world, according to a study published earlier this year in the journal New Media & Society. The researchers, Sebastian Scherr at the University of Leuven and Mario Haim and Florian Arendt at the University of Munich, found that algorithmic bias is a growing challenge as the creators of algorithms struggle to confront the limitations of their programming. In this case, they found that Google’s algorithms contribute to a digital divide in access to health information online.

An algorithm, it seems, could determine, in some cases, who gets shown lifesaving information, and who doesn’t.

The researchers behind the New Media & Society paper set out to understand this odd quirk of Google’s algorithm, and to find out why the company seemed to be serving some markets better than others. They developed a list of 28 keywords and phrases related to suicide, Scherr says, and worked with nine researchers from different…
