Tech Firms Struggle to Balance Privacy With Security
As companies like Facebook move to make encryption standard in messaging, governments are fighting back
The United States has a long and complicated relationship with encryption. While free speech campaigners and many technologists defend encryption as a technological expression of the First Amendment, many in law enforcement argue that it protects criminals, terrorists, and child abusers.
Until recently, it was only the authoritarian regimes of Russia, China, and Turkey that demanded backdoors to encrypted communications, in keeping with their policies of censorship and information control. But in early October, U.S. Attorney General William Barr wrote to Facebook demanding that the company halt plans to expand encryption across its messaging services “without including a means for lawful access to the content of communications to protect our citizens.”
The letter, which was co-signed by the U.K. and Australian governments, came after an explosive New York Times investigation that reported an unprecedented 45 million images of child sexual abuse were found online in 2018 alone. Poorly resourced law enforcement is battling against sophisticated and powerful technologies that keep offenders anonymous, campaigners say.
“Security enhancements to the virtual world should not make us more vulnerable in the physical world,” read the letter. “Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes.” The letter followed reports that senior officials in the Trump administration had met in June to discuss how to outlaw any encryption that couldn’t be opened by the government — a move that would effectively make unbreakable encryption illegal.
A series of devastating terrorist attacks have also prompted greater scrutiny of how criminals are coordinating online, and the U.S. is just one of several Western governments exploring ways to give law enforcement access to the encrypted communications of suspects on popular services such as WhatsApp and iMessage. And what we are seeing, as a result, is a deliberate shift toward government control at the expense of our right to communicate in private.
For those defending free speech, the cost of introducing backdoors into encrypted software — even to help combat terrorism, crime, and child abuse — is just too high.
In end-to-end encryption, the sender’s software securely “locks” the contents of the message, video, audio, or document, so that it can only be opened by the recipient. The messaging apps WhatsApp and Telegram, which have 1.5 billion and 300 million users respectively, use end-to-end encryption. Facebook Messenger has more than 1 billion users and plans to introduce encryption, while its Secret Conversations feature is already encrypted.
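The “locking” described above rests on two ingredients: a key exchange that lets sender and recipient agree on a shared secret without ever transmitting it, and a cipher keyed by that secret. The Python sketch below illustrates the concept with a deliberately toy Diffie-Hellman exchange and an XOR keystream; real messaging apps such as WhatsApp use the Signal protocol (Curve25519, AES), and nothing in this illustration is secure for actual use.

```python
import hashlib

# Toy Diffie-Hellman parameters. Real systems use large, standardized
# groups or elliptic curves (e.g. Curve25519); these values are for
# illustration only and are NOT secure.
P = 0xFFFFFFFB  # the prime 2**32 - 5
G = 5           # a generator

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR the data with a SHA-256-derived keystream (toy cipher)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Each party keeps a private key and publishes only G**priv mod P.
alice_priv, bob_priv = 123456789, 987654321
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides compute the same shared secret; it never crosses the wire,
# so a server relaying the messages cannot read them.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared

key = hashlib.sha256(str(alice_shared).encode()).digest()

ciphertext = keystream_xor(key, b"meet at noon")  # sender "locks" the message
plaintext = keystream_xor(key, ciphertext)        # only the recipient "unlocks" it
assert plaintext == b"meet at noon"
```

Because the shared secret is computed independently on each device, any intermediary — including the service provider — sees only ciphertext, which is exactly why a “lawful access” mechanism requires deliberately weakening this design.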
The American government has long operated several known programs to break encryption. In 1993, the National Security Agency (NSA) designed the “Clipper chip,” a chipset that could be built into mobile phones to give the agency backdoor access to their communications, though it was never adopted.
In 2013, former NSA contractor Edward Snowden revealed an NSA program called Bullrun, which aimed at breaking the encryption of online communications. Snowden also disclosed that intelligence agencies could legally order Apple and Google to bypass encryption to access data stored on Android and iOS smartphones. In 2014, Apple and Google both responded by expanding encryption and closing that backdoor.
In early 2016, the FBI took Apple to court. It wanted Apple to bypass the encryption on an iPhone 5C used by one of the terrorists behind the December 2015 attack in San Bernardino, California that killed 14 people and injured 22. During negotiations, the FBI reportedly asked Apple to weaken security features in its next mobile OS. “It would be wrong for the government to force us to build a backdoor into our products,” Apple CEO Tim Cook said at the time. “Ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
The debate is not restricted to the U.S. Recently, the intelligence services of the so-called Five Eyes nations—the U.S., U.K., Canada, Australia, and New Zealand—called for spies and police officers to be given backdoor access to WhatsApp and other encrypted communications. In 2015, then-U.K. Prime Minister David Cameron proposed a ban on encrypted messaging apps, citing the danger of extremism. The subsequent Investigatory Powers Act of 2016 required all encrypted services to be accessible to law enforcement, although because foreign companies are not required to comply, no major service has been affected. This month the U.K. appeared to step up its demands with the joint letter to Facebook, which cited a National Center for Missing & Exploited Children estimate that 70% of child abuse reports from Facebook — 12 million global reports every year — would be lost.
Cybercrime, terrorism, and child abuse are strong arguments used in defense of internet censorship and backdoors in messaging apps, yet clumsy legislation can also go too far. A law passed in Australia in 2018 forces tech companies to give law enforcement access to encrypted material, but also bypasses protections for journalists; records show that Australian federal police accessed the metadata of journalists 58 times in the 2017-18 financial year. Press freedom advocates worry the new rules may have compromised journalists’ sources.
Omar Kaminski, a technology lawyer and campaigner, isn’t convinced we are seeing a more permanent shift away from privacy toward security. “From a political point of view… it seems more like a cyclical movement. But I believe that the freedom to use [privacy tools] comes into direct conflict with… claims that cryptography is used more by criminals than by citizens who want to guarantee their own privacy.”
Authoritarian countries with poor human rights records have long rejected encryption because it prevents them from monitoring communications. When the developers of Telegram refused to give backdoor access to the Russian government in April 2018, a court banned the app. According to Russian Federal Law 149-FZ, organizers of information distribution must decrypt messages when required to by the authorities.
China’s most popular messaging app WeChat is heavily monitored by Chinese authorities. WhatsApp has been blocked since 2017 due to its end-to-end encryption, and Apple’s iMessage is the only end-to-end encrypted app in the country, though it is unclear what concessions Apple has made to keep it operating. In Turkey, an estimated 75,000 people were arrested for downloading the encrypted ByLock messaging app in 2017.
For the West, the priorities for bypassing encryption are different — yet campaigners believe sacrificing the privacy of all citizens to try to resolve complex social issues like terrorism is misguided. “We can blame the lack of action by the authorities, the lack of technical capacity, intelligence, agility, as well as really effective policies to prevent terrorism,” says Yasodara Córdova, a World Bank agile/civic tech fellow and former senior fellow of Harvard’s Kennedy School.
Córdova points to 8chan as a site notorious for radicalization; even its founder Frederick Brennan asked for it to be shut down after it emerged the terrorists behind the 2019 attacks in New Zealand and El Paso were users of the site. “What we are seeing is a lot of young people radicalizing daily on platforms that are in plain sight.” Allowing law enforcement a backdoor in all private communications is likely to do very little to change that.
Those who defend the right to private, encrypted communications say that public attitudes to data collection, and even to privacy itself, have changed with the rise of consumer technology platforms built on their users’ personal data.
“We have a level of surveillance never seen before, and this new surveillance is blind,” says João Carlos Caribé, a researcher for the Digital Humanities Network Laboratory at the Brazilian Institute of Information in Science and Technology. “It is no longer focused on the individual as in the model consolidated in the 20th century, but is all about data. The panspectron model captures all possible data, even the so-called residual data, read from the interactions of individuals.”
Introducing backdoors would put many people around the world at risk, including political dissidents, social activists, and anyone with a legitimate need to protect their identity and communications. It could also be exploited by hostile nations.
“Authorities worried about fighting terrorism and crimes like sexual exploitation of children and women need to start thinking about using technology to their advantage,” Córdova says. “If authorities institute a backdoor for WhatsApp, terrorists will use other methods, finding the authorities’ blind spots again and again. The backdoor will be used to spy on people who have never done anything [wrong].”
For Córdova, the privacy afforded by encryption is a technological right. “The code works to implement rights, and not just features for the profit of companies,” she says. “It is a right decoded in a protocol, implemented in a software. This is why end-to-end encryption is important for ensuring privacy and freedom of expression.”