The Risk Makers
Viral hate, election interference, and hacked accounts: inside the tech industry’s decades-long failure to reckon with risk
One spring day in 2014, Susan Benesch arrived at Facebook’s headquarters in Menlo Park and was ushered into a glass-walled conference room. She’d traveled from Washington, D.C., to meet with Facebook’s Compassion Research Team, a group that included employees, academics, and researchers whose job was to build tools to help users resolve conflicts directly, reducing Facebook’s need to intervene.
Benesch, a human rights lawyer, faculty associate at Harvard, and founder of the Dangerous Speech Project, a nonprofit that studies the connection between online speech and real-world violence, worked closely with the Compassion Research Team. She used this meeting to raise a serious issue that had come to her attention: the extensive sectarian violence in Myanmar.
Long before it was headline news, human rights groups were warning that the Burmese military and a segment of the population were orchestrating widespread abuses against civilians, particularly the country’s Muslim Rohingya minority: forced labor, sexual violence, extrajudicial killings, and the burning of villages. The attacks, amply documented but denied by the Myanmar government, were being coordinated online, often via Facebook, human rights activists said. Facebook came preinstalled on most mobile phones and, as a result, was the country’s primary source of news and information.
Activists in Myanmar and the United States were calling Benesch and asking for help. Facebook, they said, was letting dangerous speech proliferate without fully grasping the country’s political and cultural divisions or comprehending the danger, and their attempts to raise the problem with the company had not received an adequate response. In her meeting with the compassion team, Benesch relayed their concerns in blunt terms.
“You have this serious problem in Myanmar,” she told the group. “There is an appreciable risk of mass violence.”
To address the issue, Facebook began working directly with activists in Myanmar to flag dangerous content, and the company made some changes, such as translating its Community Standards into Burmese and…