“Unbounded and Unpredictable”
A National Academies of Sciences Report Lays Out The Risks of a Crypto Backdoor Mandate
Blog Post
Aug. 22, 2016
“Unbounded and unpredictable.” That is how Matt Blaze, professor at the University of Pennsylvania, described the risks associated with mandating an encryption backdoor at a recent National Academies of Sciences (NAS) workshop, titled “Encryption and Mechanisms for Authorized Government Access to Plaintext.” The workshop featured a series of discussion sessions ranging from the policy trade-offs of mandating backdoors, to the economic and market impacts of such a mandate, to the technical feasibility and consequences, and the ever-nagging question: is a solution plausible, even if it is technically “possible”? This two-day exercise was attended by current and former government officials, technologists and cryptographers, and civil society experts, including OTI’s director Kevin Bankston. A report-out of their discussions is now online and is must-read material for anyone involved or interested in the ongoing crypto debate.
The report draws no conclusions from the convening; it merely summarizes what was debated. However, the warnings issued by many of the participants were clear: encryption backdoor mandates would be dangerous to the economy, to civil liberties and human rights, and to cybersecurity. The following is a summary of some of the major issues that were discussed at the workshop.
Law Enforcement’s Desire for a Backdoor
Chris Inglis, the former deputy director of the NSA, began the workshop with some introductory remarks, as summarized by the report, that concluded with the idea that “there has never been a perfect encryption system; even when the math is correct, implementation is never perfect, and as a result, encryption will never meet all expectations.” Despite his belief that there will always be an opportunity to exploit a vulnerability and bypass or hack encryption, he urged that the workshop seek common ground for a new system that provides sufficient security while also meeting law enforcement’s desire for a means of “exceptional access,” what we at OTI would call a backdoor.
James Baker, the FBI’s general counsel, followed up on Inglis’ comments, emphasizing the problem law enforcement faces as more of the communications it seeks to obtain as evidence for investigations are protected by more widely available encryption. He noted that the FBI is not seeking a specific technical solution and that the American people must decide what tools the FBI should have, but stressed that “alternative strategies [i.e. not backdoors] may slow down investigations, lead to larger resource requirements, or yield less complete information than would otherwise have been obtained.”
Inglis argued, and Baker agreed, that this debate about whether law enforcement should be guaranteed access to encrypted communications with legally mandated backdoors is “important enough to make it worth going beyond what seems practical and exploring what is possible.” However, the remainder of the workshop demonstrated that most participants, after carefully considering all alternatives that were put forth, believe that secure encryption backdoors are neither practical nor possible.
Human Rights Implications of Backdoors
Patrick Ball, the director of research at the Human Rights Data Analysis Group, kicked off the first session of the workshop with a discussion of how encryption backdoors would threaten human rights. Ball stressed that a government-mandated encryption backdoor would “increase the amount of government surveillance without making any of us any safer, and that the downsides of this increased surveillance would be experienced most acutely by vulnerable populations.” He warned that such a mandate would not only directly harm Americans, but would also harm U.S. policy around the world by undermining the Department of State’s funding of encryption and anonymization tools for use by human rights groups. It would also threaten the security and activities of journalists and activists around the world, particularly those in repressive regimes like Syria, Russia, Iran, China, and Venezuela, as well as throughout East Africa and parts of the Middle East.
Ineffectiveness of a Policy Solution
In addition to human rights concerns, Ball argued that any U.S. mandate for encryption backdoors would be ineffective in accomplishing law enforcement’s goals. The vast majority of encryption tools and services are either open source or developed abroad. This means that no matter what U.S. policy is, anyone who wants strong encryption, including sophisticated criminals and terrorists, will be able to access it. Indeed, as OTI’s Bankston pointed out at the workshop, eight out of nine of the encrypted messaging apps recommended by ISIS are either foreign-developed, open-source, or both.
Inglis acknowledged this difficulty, and conceded that while there may be no 100% solution, a fix that would get law enforcement most of the way there might still be possible. But he recognized that even a partial fix would require a large-scale international agreement on how to implement such a backdoor requirement. Orin Kerr, professor at the George Washington University Law School, was dubious that such an agreement would ever be politically feasible, and several technologists cautioned that secure implementation of such an agreement would be exceedingly difficult and likely impossible.
After discussing policy-based arguments for and against encryption backdoors, the workshop turned to the question of implementation: is a secure system of exceptional access even feasible? The answer of many of the workshop attendees was no.
Encryption for Me but Not for Thee
First, the attendees questioned whether it would be possible to segment encryption policy “vertically,” by allowing un-backdoored encryption only for certain types of users or types of data. This idea was shot down by Eric Rescorla from Mozilla, who questioned whether it is feasible to “‘wall off’ the use of strong encryption in specific sectors.” As an example, he noted that people use the same web browser to connect to medical and financial services providers, which we would want protected by stronger encryption, as they use to connect to social networking platforms, which law enforcement would argue should be protected by weaker encryption.
Marc Donner, formerly of Google Health, also noted that such a system could require institutions to build their own encryption tools, an unlikely prospect given that, currently, “encryption used in various sectors [is] based on standard, commercially available products.” Several technologists raised the concern that imposing a backdoor requirement would severely degrade the security companies choose to offer: it is unlikely that companies would devote the significant additional resources necessary to develop, maintain, and defend against attacks on inherently weakened encryption. Instead, they would more likely forgo using encryption to protect their products and their users altogether, lowering security for everyone.
Several participants, such as Rescorla, found the idea of segmenting strong vs. weak encryption by user “extremely unpleasant.” Participants also found that kind of segmentation was dangerous since there are a wide range of people who may be targets for nation-state and criminal hackers and who would need the protection of strong encryption, not just people who handle sensitive financial or health data.
Susan Landau, professor at Worcester Polytechnic Institute and one of the organizers of the meeting, argued that segmenting by user is impossible and any attempt to do so could have negative effects for national security since the Department of Defense is dependent on commercial products. Others noted that additional, non-governmental high-value targets include, but are not limited to, CEOs and anyone doing business internationally, further undermining the idea of preventing those parties from having strong encryption.
Building upon that concern, Bankston noted that “often the point of the attack is not to access an individual’s information but to use the compromise of an individual’s system as a platform from which to attack the valuable information of others.” This makes segmenting by user impractical. Since high-value targets are in all sectors, and each sector is only as secure as its weakest link, establishing a backdoor anywhere in the system creates an attack vector that could impact mailroom staff, CEOs, and government officials alike.
Landau then moved the group to assess whether it would be possible to segment encryption policy “horizontally” by technological layer, e.g. by requiring backdoors at the operating system or platform layer but not at the application layer, or vice versa.
Again, the conclusion was that this type of segmentation would be infeasible. ACLU Senior Staff Technologist Daniel Kahn Gillmor cautioned that if the U.S. mandates that all operating systems include encryption backdoors, someone will develop a more secure operating system abroad. Other security experts argued that platform-based backdoors would likely still be ineffective because applications could be designed to avoid the backdoor, for example by requiring a key that the user remembers rather than one stored on the device. Finally, there were concerns that such a system would require software engineers to undermine the integrity of programs by adding significant complexity, and with it, unintentional vulnerabilities.
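To make the app-layer workaround concrete: an application can derive its encryption key from a passphrase the user remembers, so no long-term key material ever rests on the (hypothetically backdoored) platform. The following is a minimal sketch using a standard key derivation function; the iteration count and passphrase are illustrative assumptions, not a vetted configuration.

```python
# Sketch: deriving an encryption key from a user-remembered passphrase with
# PBKDF2-HMAC-SHA256, so the key never needs to be stored on the device.
# The iteration count below is an illustrative assumption, not a recommendation.
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the passphrase into a 32-byte key; the salt is stored alongside
    # the ciphertext (it need not be secret, only unique per user).
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)

# The same passphrase and salt always reproduce the same key...
assert derive_key("correct horse battery staple", salt) == key
# ...while a wrong guess yields a different key entirely.
assert derive_key("wrong guess", salt) != key
```

Because the key exists only transiently, in memory, while the user is actively using the application, a platform-level backdoor has nothing durable to hand over.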
In sum, then, most of the experts agreed that requiring backdoors at a lower layer—like the hardware layer or the OS layer—would not accomplish law enforcement’s goal because targets could simply implement encryption at a higher layer, and would also have the side effect of making computing products inherently less secure in unpredictable ways. Therefore, the consensus of many attendees was that horizontal segmentation of encryption policy was not a reasonable solution.
New Approaches to an Old Idea: Key Escrow
Since vertical and horizontal segmentation would not provide reasonable paths forward, the workshop attendees turned to the question of whether a complex key escrow system, what engineers would call a “k out of n” escrow system, could be workable. This proposal would establish two layers of encryption. The first layer would protect user data and would include a backdoor, an “encryption key.” The second layer would use a set of “sealing keys” to protect that “encryption key,” with the corresponding “unsealing keys” held by multiple individuals around the world. The operating concept behind this system is that by splitting control among many key holders, the chances of the “unsealing keys” and the “encryption key” being stolen would be reduced, since a thief would have to compromise several holders at once. The chances of abusive use of the “encryption key” would also be reduced, since decrypting information would require the cooperation of at least k of the n individuals holding the “unsealing keys.”
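The “k out of n” idea is typically realized with threshold secret sharing, such as Shamir’s scheme: a secret is split into n shares so that any k of them reconstruct it, but fewer than k reveal nothing. The sketch below is a toy illustration of that mechanism over a prime field, not the workshop’s actual proposal; the field size, example key, and parameters are all simplifying assumptions.

```python
# Toy sketch of "k out of n" secret sharing (Shamir's scheme) to illustrate
# the escrow concept discussed above. NOT production cryptography: the field,
# key, and randomness handling are simplified assumptions for demonstration.
import random

PRIME = 2**127 - 1  # a Mersenne prime comfortably larger than the toy secret

def split_secret(secret: int, k: int, n: int, prime: int = PRIME):
    """Split `secret` into n shares; any k of them can reconstruct it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares, prime: int = PRIME) -> int:
    """Lagrange interpolation at x=0 recovers the polynomial's constant term."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

escrowed_key = 0xDEADBEEFCAFEF00D      # stand-in for the "encryption key"
shares = split_secret(escrowed_key, k=3, n=5)

assert recover_secret(shares[:3]) == escrowed_key  # any 3 of 5 holders suffice
assert recover_secret(shares[2:]) == escrowed_key  # a different quorum also works
```

Note that the math here is the easy part; as the participants went on to argue, the operational system wrapped around such shares is where the real risk lives.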
Matthew Green, a cryptography professor at Johns Hopkins University, cautioned that it is unlikely that the security of this type of system could ever be verified, and that it would be impossible to know when or if the keys had been compromised. Moreover, any system this complex would almost certainly have vulnerabilities similar to those seen in previously proposed key escrow schemes. Other security experts pointed out that because the system would exist to serve law enforcement needs, it would have to be regularly accessed by countless people, including compliance personnel from the relevant companies and the many federal, state, and local law enforcement officers seeking to get evidence decrypted. Any system that has to accommodate regular access by so many parties while also maintaining the confidentiality of all of the keys would necessarily be very complex and much harder to secure. Finally, participants emphasized that there would still be a significant risk that nation-state or criminal actors could steal the “unsealing keys,” as any holder would become a prime target for hackers.
After completing the discussion of the “k out of n” key escrow concept, the workshop turned to the question of whether a key escrow system could be made more secure by storing the encryption key on the device itself. The merit of this approach is that it would reduce security risks by requiring that any party seeking to exploit the backdoor have possession of the device in question, in addition to obtaining the unsealing keys from the escrow authority. Bankston agreed that this approach could reduce some of the security risk, but remained concerned that it would not address the human rights threats that backdoors raise, citing as an example the possibility that state actors like China—the same state actors most likely to compromise a key escrow facility—also have the capability to seize very large numbers of devices.
Landau also argued that inserting backdoors in device hardware would be a dangerous approach. She argued that mobile devices, and smartphones in particular, are increasingly becoming authenticators: tools that we use to prove we are who we say we are when accessing highly sensitive information, such as banking, medical, or email accounts, or when serving as a second factor of authentication. This means that if a hacker steals a device and accesses the encryption key stored on it, they could not only break into the device itself, but could also use it to gain access to sensitive information about all aspects of a person’s life or business.
The Costs of an Encryption Backdoor
The assessment of the technical feasibility of various approaches to encryption backdoors led to a discussion of the costs of implementing any of the proposed solutions. Blaze issued a stark warning. He told the attendees in no uncertain terms that many of the security problems we face today stem from the Crypto Wars of the 1990s, which distracted researchers and engineers from building stronger systems and prohibited widespread use of certain strong cryptographic algorithms, which were classified as munitions and subject to severe export restrictions until 2000.
Our world is exponentially more connected today than it was in the ‘90s, and will become even more so with the rise of the Internet of Things. Blaze and other technologists were adamant that the security costs of imposing a backdoor requirement now would be enormous, not only because of the inherent risks that such a system could be exploited, but also because of the opportunity cost associated with devoting resources to developing and maintaining a system that intentionally weakens security rather than strengthens it.
Bankston also raised serious concerns, echoed by several other participants, about the negative economic impacts of such a mandate. Consumers increasingly expect and demand strong encryption as a means of securing themselves against growing cyber threats. And while large companies might be able to bear the burden of a backdoor mandate (though, as the technical discussion sessions made clear, it would be an extremely heavy burden), smaller companies would either fail due to the cost of implementation or would choose to forgo providing security through encryption altogether. This would inevitably result in a loss of market competitiveness and put their users at increased risk.
Bankston put it best when he “suggested that the United States can either invest hundreds of millions of dollars to update law enforcement’s investigative capabilities for the 21st century or the economy can face a loss of billions of dollars if exceptional access is mandated for U.S. products.”
If Backdoors Aren’t Possible or Practical, What is to be Done?
Bankston’s comments raise the question of what exactly “21st century capabilities” entail. There was broad agreement that the FBI must be provisioned with adequate staff and resources to meet its technological needs. Right now, the FBI only has 39 staff that carry out activities to respond to the investigative challenges posed by encryption and anonymization technologies, only 11 of whom are agents, and $31 million in funding for those activities (which may increase to $38 million but with no increase in staff).
Additionally, FBI staff need better technical training so that avoidable mistakes, like the one made with the San Bernardino shooter’s iPhone, do not happen. Lastly, several of the attendees argued that those capabilities should include law enforcement hacking for investigative purposes. Government hacking into devices and networks raised concerns and questions among some attendees, including what requirements the FBI would be under to disclose discovered or purchased zero-day vulnerabilities, and whether it is feasible for state and local law enforcement to acquire the expertise and tools necessary to conduct such so-called “lawful hacking.” Bankston also made clear that any such approach would necessitate “a substantial policy and legal debate...to define responsible, reasonable, and constitutional government hacking.”
This workshop reinforced that there are no easy answers to the dilemma that the FBI and state and local law enforcement claim they face. The one answer that comes across clearly in the report is what OTI has been saying since this debate reignited over two years ago: encryption backdoors were a bad idea in the 1990s, and they are an even worse idea today. Instead of continuing down this road to nowhere, the time has come (and gone and come again) to stop talking about whether a secure backdoor can exist, and start talking about what is both possible and practical: how to ensure that law enforcement at all levels of government can evolve to meet the demands of a fast-changing world, where new technologies bring endless new opportunities, and at times, some new obstacles.