Reporter’s Brief: Attorney General’s Attempt to Weaken Facebook’s Encryption
By Yael Grauer, January 13, 2020
In early October, Attorney General William Barr wrote an open letter to Facebook, urging the company to abandon plans to expand its use of end-to-end encryption (which ensures messages can only be read by senders and recipients). The letter was also signed by then-acting Homeland Security secretary Kevin McAleenan and by officials in Australia and the United Kingdom.
Facebook already uses end-to-end encryption for WhatsApp, a messaging tool used by more than 1.5 billion people across the world. In March, the company announced plans to extend that protection to Facebook Messenger and Instagram Direct.
Business journalists covering technology may benefit from looking at this latest government push in the context of recent history. The Justice Department has long tried to weaken encrypted communication, arguing that doing so enables it to more effectively fight drugs, organized crime, terrorism and child sexual abuse material. An early attempt, in 1993, was the Clipper Chip, an encryption chipset whose keys were escrowed with the government so that agents could access private communications. Neither manufacturers nor users embraced the Clipper Chip, and a security researcher published a major design flaw in the system in 1994.
More recent government attempts to create encryption backdoors have taken place through legislation and through the courts.
The Compliance with Court Orders Act of 2016 was a bill that would have prohibited companies from designing strong encryption that law enforcement could not access. (The bill did not pass.)
In 2016, the FBI sought a court order to force Apple to crack open a terrorist’s locked iPhone in the aftermath of a mass shooting in San Bernardino, California. But the Inspector General’s office later reported that the FBI and Justice Department had not exhausted their own technical capabilities before engaging in legal action against Apple.
In the immediate aftermath of Barr’s letter, more than 100 organizations signed onto an open letter of their own, encouraging Facebook to continue increasing its security across its messaging services. That’s because end-to-end encryption protects users from abusive ex-partners, the prying eyes of authoritarian governments, criminals, identity thieves, and other malicious actors. And installing backdoors to give government access to any data it wants weakens encryption for all users.
One proposal that wouldn’t break encryption the way government-mandated backdoors would is content moderation, including known content detection and classifier-based content detection. Both could take place client-side (i.e., on users’ phones). With such moderation in place, Facebook or other providers could screen content on the sender’s device before it is encrypted, block transmission of content deemed abusive, and either flag private conversations for manual review or report them automatically.
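As a rough illustration of how known content detection could work on the client side, here is a minimal sketch in Python. The blocklist, function name, and exact-hash matching are all hypothetical simplifications: real deployments use perceptual hashes (such as Microsoft's PhotoDNA) so that near-duplicate images still match, not exact cryptographic digests.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known abusive content.
# (A real system would use perceptual hashes so near-duplicates match;
# exact hashing is used here only to keep the sketch short.)
BLOCKLIST = {
    hashlib.sha256(b"known-abusive-image-bytes").hexdigest(),
}

def screen_before_send(payload: bytes) -> bool:
    """Return True if the content may be sent.

    This check runs on the sender's device, before the payload
    is end-to-end encrypted and transmitted.
    """
    digest = hashlib.sha256(payload).hexdigest()
    return digest not in BLOCKLIST
```

In this sketch, the messaging client would call `screen_before_send` immediately before encrypting a message; a `False` result would block transmission (or trigger a report). The design choice that privacy advocates object to is visible in the code itself: the device must carry, and check against, a list of forbidden content.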
However, privacy advocates say that not only would content moderation be unable to prevent abusive content (which could be encrypted before transmission), it would also be a slippery slope leading to surveillance and censorship, with a chilling effect on privacy and speech.
“When you’re checking content against a blacklist (or fuzzily trying to predict whether content your system hasn’t seen before should be blacklisted), ultimately you are talking about a system that keeps a list of things that must not be said or shared, and that monitors and reports people if they do so,” Riana Pfefferkorn, Associate Director of Surveillance and Cybersecurity at Stanford Law School’s Center for Internet and Society, wrote in a blog post. “It is not reasonable for any government to demand that platforms build the ability to surveil and censor everyone’s private communications.”
Business writers should consider closely tracking government attempts to weaken encryption, either through political pressure or through legislation. Such attempts, especially if they are successful, could impact not just WhatsApp, Facebook Messenger and Instagram Direct, but also Apple’s iMessage, as well as other tech companies looking to adopt similar technology.
In May 2015, The Nation reported that fallout from Edward Snowden’s revelations about government surveillance led to losses as high as $180 billion as overseas customers abandoned domestic companies over privacy concerns. It’s possible that weakened encryption or privacy could drive business outside the U.S., too.