GeoLegal Weekly #56: Encryption Conniption
Apple's spat with the UK government is emblematic of an unruly world where the line between good and bad actors is unclear and companies are asked to do the technological bidding of governments.
The widespread use of strong encryption creates deep tensions between privacy, law enforcement, and national security. Intelligence agencies have warned that encryption blinds them to criminal and terrorist activity, but weakening it would make everyone more vulnerable to cyber threats. Does strong encryption in nearly all of our communications and data storage make the world more or less secure?
This is much harder to answer in an unruly world. Cybersecurity is often framed as a battle between the good team and the bad team, where the good team encrypts its valuable data and the bad team tries to steal it. But other times, the bad team encrypts all the information about the bad things it plans to do, and it’s the good team who is righteously trying to get into it. Reality is complicated, and the line between the goodies and baddies is blurred or can switch with a change in political power.
It’s no surprise that companies can easily find themselves stuck in the middle on issues like encryption and decryption. Is the right strategy to take a principled stand or to bend to national security needs?
When you are one of the most powerful companies in the world, however, you have a third option: Stop providing the service altogether.
Encryption for all
Encryption is an essential part of protecting your data. But from whom? As powerful encryption began finding its way into hard drives and messaging apps (where end-to-end encryption is now standard not only in security-focused apps like Signal but even in mass-market ones like Apple’s own Messages and Meta’s WhatsApp and Messenger), concerns mounted. Government intelligence and law enforcement agencies publicly worried about the degree to which this could blind them to terrorist, criminal, and drug gang communications. They pressured tech companies to provide a backdoor in order to fight crime and terrorism. Apple, in particular, having made privacy a central selling point of its products, pushed back in the US.
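For readers who want a concrete picture, here is a minimal sketch of the idea behind end-to-end encryption, written in Python with the third-party cryptography package (an illustrative choice on our part; real messaging apps use far more elaborate protocols). The point is simply that the key never leaves the user’s device, so whoever stores or relays the ciphertext cannot read it:

```python
# Minimal sketch of the end-to-end idea: the key lives only on the
# user's device, so a server that stores or relays the ciphertext
# cannot read it, and neither can the service provider, even under
# legal compulsion. Requires the third-party `cryptography` package.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()   # generated and kept on the device only
cipher = Fernet(device_key)

ciphertext = cipher.encrypt(b"meet at noon")  # all the server ever sees
print(cipher.decrypt(ciphertext))             # only the key holder can read it
```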
In recent weeks, Apple and the UK government have been in a shadow-boxing match about encryption, with the British government demanding that Apple build the capability to decrypt data stored on its iCloud servers. As it had previously done when various US law enforcement agencies demanded similar privileges, Apple appears to have refused, reiterating that its services have no backdoors and that creating one would present a significant security risk to all users. But this time, the technology giant went a step further and essentially took its encrypted ball and went home: Apple is making its strongest encryption feature, which it calls Advanced Data Protection, unavailable to UK customers.
The incident that provoked this clash is complex and not entirely clear from publicly available information. In 2016, the UK Investigatory Powers Act (IPA) came into effect, giving the British government the authority to demand that technology companies hand over particular users’ data and decrypt it as needed. The catch is that the IPA allows the British government to ask for the data not only of UK citizens and residents but of any individual anywhere in the world. We do not know whether other companies have complied since 2016, because the IPA also prohibits companies from disclosing that they have received requests under the act. Someone—and we do not know if it was someone affiliated with Apple or the UK government—leaked the existence of this request to The Washington Post. Apple, for its part, has made no statement specifically on the request, because doing so would constitute a crime in the UK. But UK users are now met with a notice in their iCloud preferences saying:
“Apple can no longer offer Advanced Data Protection (ADP) in the United Kingdom to new users.”
Existing users will presumably have to turn off ADP manually in the coming months; Apple says it cannot disable the feature itself without effectively decrypting the account, which it refuses to do.
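Apple’s claim is easier to see with a toy model. The sketch below (again Python with the cryptography package; the names are ours, not Apple’s) shows why moving an account from user-held keys to provider-held keys necessarily begins with a decryption step that only the user’s key can perform:

```python
# Illustrative sketch: disabling end-to-end protection means re-encrypting
# the data under a provider-held key, and that migration starts with a
# decryption only the user's key makes possible.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()                   # held only by the user
stored_blob = Fernet(user_key).encrypt(b"backup")  # what the provider stores

def migrate_to_provider_keys(blob: bytes, key: bytes):
    """Re-encrypt a blob under a fresh provider-held key; needs `key`."""
    plaintext = Fernet(key).decrypt(blob)  # the step the provider cannot do
    provider_key = Fernet.generate_key()   # a key the provider can access
    return Fernet(provider_key).encrypt(plaintext), provider_key

# Only the user, who holds `user_key`, can trigger the migration.
migrated_blob, provider_key = migrate_to_provider_keys(stored_blob, user_key)
```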
A metaphor for the unruly world
This encryption conniption involves many of the factors that make today’s world unruly. Countries and powerful tech companies are engaging in a struggle where technology is reshaping legal obligations and the limits of state sovereignty. The UK, a state with some of the most sophisticated espionage capabilities in the world, still must demand that a private company in the US assist it in spying on potentially any Apple user in the world. And the result—so far—of this clash is Apple punishing UK users, while the British government attempts to pressure Apple.
Contrast this with China, which is a far more important market for Apple than the UK. There, Apple stores Chinese users’ iCloud data on servers located inside China and operated by a state-owned company. Apple does not offer ADP to users in China, and while their iCloud data is technically encrypted, it sits on servers controlled by the Chinese government, and Chinese law requires all companies to hand over data on request. Sometimes, borders change everything about data security compliance; other times, threats across cyberspace transcend borders.
Beyond the real cybersecurity implications of withdrawing ADP from users in the UK, this episode shows how something as ubiquitous as encryption brings technology, national security, and business (for both Apple and Apple’s customers) together in increasingly unpredictable ways. It also points toward the hard choices that companies, individuals, and governments have to make regarding encryption, privacy, and legal obligations in the jurisdictions where they operate.
The example that first highlighted this challenge came in early 2016, when the FBI demanded that Apple assist in unlocking the iPhone used by one of the perpetrators of a 2015 shooting attack in San Bernardino, CA. Apple responded that while it would comply with any legal request to turn over data (presumably based on a court order or warrant), in this case—as with Advanced Data Protection—the government was not simply asking it to hand over existing data but to create a new tool that could be used as a backdoor into a locked iPhone. Arguing that creating such a “hack” would undermine the security of all iPhones, Apple refused. Ultimately, this clash over encryption, security, and privacy had a rather anticlimactic ending: the FBI turned to a third party that had developed a way to access data on locked iPhones, and nothing particularly relevant to the attack was found on the phone.
But the stakes appeared clear: encryption posed a challenge to governments seeking information vital to national security and public safety. In refusing to assist in a terrorism case, Apple—and by implication all tech companies—was on principled but still shaky ground. Of course, the analysis might look different if it were unfolding in real time rather than retrospectively; for instance, if an attack were imminent and a company refused to decrypt data that could stop it.
Since 2016, though, two things have happened.
First, the protection offered by encrypted devices has been undermined by tools like Pegasus, which allow intelligence and security services to compromise the device itself and read data after it has been decrypted there. Pegasus caused a massive stir in 2021 when it was revealed that the Israeli company NSO Group had developed a tool that could hack an iPhone simply by sending a message to it—even if the user never clicked a link or opened an attachment. Despite NSO Group’s claims that it restricted access to the tool to legitimate national security purposes, it became clear that Pegasus had been widely used to spy on journalists, activists, and even heads of state. This was a dramatic demonstration of the cat-and-mouse encryption game.
Second, at least within the US national security community, there is a growing belief that having US citizens and companies use strong encryption is, on balance, a net positive for national security given the scale of hacking—mostly from China, but also Russia and potentially others. While strong encryption makes investigating criminals, terrorists, and gangs more difficult, it also helps protect US companies’ data and IP from foreign adversaries. So why the shift? Probably a mix of pragmatism—encrypted apps have become ubiquitous—and genuine concern that Chinese hacking of US companies and individuals presents a meaningful national security threat.
As encryption evolves, so too will the battle between privacy, security, and state control. With the release of Microsoft’s new quantum computing chip, Majorana 1, the day draws closer when today’s widely used public-key encryption techniques—which rely on mathematical problems that are practically impossible for traditional computers to solve—become obsolete. But even if the quantum computing revolution upends the status quo, the fundamental tensions between tech companies, governments, and users will remain.
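To make the quantum threat concrete, here is a toy RSA example in Python with deliberately tiny primes. It is a sketch, not a real implementation (production RSA uses primes hundreds of digits long plus padding), but it shows that the scheme’s security rests entirely on the difficulty of factoring, the very problem Shor’s algorithm would make tractable on a large quantum computer:

```python
# Toy RSA with tiny primes, to show where the quantum threat bites:
# the whole scheme is only as strong as the difficulty of factoring n.
p, q = 61, 53                # secret primes (real keys: hundreds of digits)
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # computable only if you know p and q
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)          # anyone can encrypt with (e, n)
assert pow(ciphertext, d, n) == message  # only the holder of d can decrypt

# An attacker who factors n = 3233 back into 61 * 53 recovers phi, then d.
# Classically that is infeasible at real key sizes; on a large quantum
# computer running Shor's algorithm, it would not be.
```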
For companies outside the technology sector that presumably rely on third parties to handle their encryption needs, this all points toward a few key risks.
First, you may not provide encryption services, but you may control other services or chokepoints that governments see as useful against malicious actors. Consider your posture for when the government asks whether you will do its bidding. It is going to ask more frequently. How will you respond?
Second, strong encryption is now widely available, and all company communications and data storage should use it. But as the Apple case shows, you may think you have a solution only for it to evaporate politically overnight, so you need alternatives. And while internal cybersecurity practices are distinct from the broader encryption debate, it is worth repeating: cyber theft is far easier when an employee—often unknowingly—leaves a door open. Human error remains one of the biggest vulnerabilities in cybersecurity.
Third, borders matter. European data protection laws have made companies more aware of where certain types of sensitive data must be stored, but as encryption regulations evolve, the physical location of data will impact its security for political—rather than technical—reasons.
Fourth, as noted, US government guidance on encryption is shifting. As cyber threats continue to grow, regulations surrounding encryption levels, key management, and data access policies are likely to evolve. Companies will need to monitor these changes closely to avoid compliance risks.
-SW & DB