Encryption Workarounds Paper Shows Why 'Going Dark' Is Not A Problem, And In Fact Is As Old As Humanity Itself
from the you-don't-know-what-I-know dept
It was October 2014 when FBI Director James Comey made his famous claim that things were "going dark" in the world of law enforcement because of the increasing use of encryption. Since then, Techdirt has had dozens of posts on the topic, many of them reporting on further dire warnings that the very fabric of civilization was under threat thanks to what was claimed to be a frightening new ability to keep things secret. Many others pointed out that the resulting calls for backdoors to encryption systems were a stunningly foolish idea that only people unable to understand the underlying technology could make.
One Techdirt post on the topic mentioned a great paper with the title "Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications," which ran through all the problems with the backdoor idea. It was written by many of the top experts in this field, including Bruce Schneier. Schneier has just published another paper, co-authored with Orin Kerr, a professor at George Washington University Law School, that looks at the other side of things -- how to circumvent encryption:
The widespread use of encryption has triggered a new step in many criminal investigations: the encryption workaround. We define an encryption workaround as any lawful government effort to reveal an unencrypted version of a target's data that has been concealed by encryption. This essay provides an overview of encryption workarounds.
The various possibilities are largely self-explanatory:
We classify six kinds of workarounds: find the key, guess the key, compel the key, exploit a flaw in the encryption software, access plaintext while the device is in use, and locate another plaintext copy. For each approach, we consider the practical, technological, and legal hurdles raised by its use.
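The second of those categories, "guess the key," can be made concrete with a toy sketch. The scheme below is entirely hypothetical (a SHA-256-derived XOR keystream protected by a 4-digit PIN -- not real cryptography, and not anything from the paper itself); it just illustrates why a key drawn from a small space can be recovered by exhaustive search when the investigator knows, or can guess, a fragment of the expected plaintext:

```python
import hashlib

def derive_key(pin, length):
    """Stretch a PIN into a keystream of the required length (toy scheme)."""
    key = b""
    counter = 0
    while len(key) < length:
        key += hashlib.sha256(f"{pin}:{counter}".encode()).digest()
        counter += 1
    return key[:length]

def xor_crypt(data, pin):
    """XOR data with the PIN-derived keystream; applying it twice decrypts."""
    key = derive_key(pin, len(data))
    return bytes(d ^ k for d, k in zip(data, key))

def guess_the_key(ciphertext, known_prefix):
    """Try every 4-digit PIN, accepting the one whose decryption
    begins with an expected plaintext fragment (a 'crib')."""
    for n in range(10000):
        pin = f"{n:04d}"
        if xor_crypt(ciphertext, pin).startswith(known_prefix):
            return pin
    return None

secret = xor_crypt(b"Meet at the usual place", "4821")
print(guess_the_key(secret, b"Meet "))  # prints 4821
```

A 4-digit PIN falls in milliseconds; this is why real systems use long keys and deliberately slow key-derivation functions, and why "guess the key" is, as the authors note, probabilistic -- it works only when the key space is small enough to search.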
What's interesting is not so much what the workarounds are, as the fact that there are a number of them, and that they can all work in the right circumstances. This gives the lie to the idea that we are entering a terrible new era where things are "going dark," and it is simply impossible to obtain important information. But as the authors point out:
there is no magic way for the government to get around encryption. The nature of the problem is one of probabilities rather than certainty. Different approaches will work more or less often in different kinds of cases.
Schneier and Kerr go on to draw an analogy:
When the police have a suspect and want a confession, the law gives the police a set of tools they may use in an effort to persuade the suspect to confess. None of the interrogation methods work every time. In some cases, no matter what the government does, suspects will confess. In other cases, no matter what the government does, suspects will assert their rights and refuse to speak. The government must work with the inherently probabilistic nature of obtaining confessions. Similarly, the government must work with the inherently probabilistic nature of encryption workarounds.
That analogy reveals something profound: the supposedly new problem of "going dark" -- of not being able to find out information -- has existed for as long as humans have been around. After all, there is no way -- yet, at least -- of accessing information held in a person's mind unless some kind of interrogation technique is used to extract it. And as the analogy shows, that is exactly like needing to find an encryption workaround when information is held on a digital device. It may be possible, or it may not; the only difference between the problems faced by those demanding answers thousands of years ago and today is that some of the required information may be held outside the mind, in encrypted digital form. Asking for guaranteed backdoors to that digital data is as unreasonable as demanding a foolproof method to extract information from any person's mind. We accept that the latter may not be possible, so why not accept that the former may not be feasible either?