Nine Epic Failures Of Regulating Cryptography

from the and-yet-they-try-again dept

They can promise strong encryption. They just need to figure out how they can provide us plain text. – FBI General Counsel Valerie Caproni, September 27, 2010

[W]e’re in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge’s authority where we can get there if somebody is planning a crime. – FBI Director Louis Freeh, May 11, 1995

Here we go again.  Apple has enabled (and Google has long announced it will enable) basic encryption on mobile devices. And predictably, law enforcement has responded with howls of alarm.

We’ve seen this movie before.  Below is a slightly adapted blog post from one we posted in 2010, the last time the FBI was seriously hinting that it was going to try to mandate that all communications systems be easily wiretappable by mandating “back doors” into any encryption systems.  We marshaled eight “epic failures” of regulating crypto at that time, all of which are still salient today.  And in honor of the current debate, we’ve added a ninth: 

. . .

If the government’s howls of protest at the idea that people will be using encryption sound familiar, it’s because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans’ privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it’s now rising from the grave, bringing the same disastrous flaws with it.

For those who weren’t following digital civil liberties issues in 1995, or for those who have forgotten, here’s a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago:

  1. It will create security risks. Don’t take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it’s hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: “Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access.” It doesn’t end there. Bellovin notes:

    Complexity in the protocols isn’t the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called ‘lawful intercept’ mechanisms in the switch – that is, the features designed to permit the police to wiretap calls easily – were abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister’s. This attack would not have been possible if the vendor hadn’t written the lawful intercept code.

    More recently, as security researcher Susan Landau explains, “an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements – a system already in use by major carriers – had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications.”

    The same is true for Google, which had its “compliance” technologies hacked by China.

    This isn’t just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products ? the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether?

  2. It won’t stop the bad guys. Users who want strong encryption will be able to get it ? from Germany, Finland, Israel, and many other places in the world where it’s offered for sale and for free. In 1996, the National Research Council did a study called “Cryptography’s Role in Securing the Information Society,” nicknamed CRISIS. Here’s what they said:

    Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. – CRISIS Report at 303

    None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996. So unless the government wants to mandate that you are forbidden to run anything that is not U.S. government approved on your devices, they won’t stop bad guys from getting access to strong encryption.

  3. It will harm innovation. In order to ensure that no “untappable” technology exists, we’ll likely see a technology mandate and a draconian regulatory framework. The implications of this for America’s leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he’d had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.
  4. It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we’re just handing business over to foreign companies who don’t have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it’s not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They’d have to be tappable, too.
  5. It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there’s no real question about who will foot the bill: the providers will pass those costs onto their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
  6. It will be unconstitutional. Of course, we wouldn’t be EFF if we didn’t point out the myriad constitutional problems. The details of how a cryptography regulation or mandate will be unconstitutional may vary, but there are serious problems with nearly every iteration of a “no encryption allowed” proposal that we’ve seen so far. Some likely problems:
    • The First Amendment would likely be violated by a ban on all fully encrypted speech.
    • The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
    • The Fourth Amendment would not allow requiring disclosure of a key to the backdoor into our houses so the government can read our “papers” in advance of a showing of probable cause, and our digital communications shouldn’t be treated any differently.
    • The Fifth Amendment would be implicated by required disclosure of private papers and the forced utterance of incriminating testimony.
    • Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
  7. It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government (at least for the FBI in domestic investigations — the NSA is another matter as we now all know). Yet the extra tax dollars needed to create a huge regulatory infrastructure staffed with government bureaucrats who can enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act “in the clear” by not using encryption readily available from a German or Israeli company or for free online.
  8. The government hasn’t shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn’t prevent investigators from obtaining the communications they were after. This truth was made manifest in a recent Washington Post article written by an ex-FBI agent. While he came up with a scary kidnapping story to start his screed, device encryption simply had nothing to do with the investigation.  The case involved an ordinary wiretap. In 2010, the New York Times reported that the government officials pushing for this have only come up with a few examples (and it’s not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI’s PR campaign, but we’ll be watching closely to see if underneath all the scary hype there’s actually a real problem demanding this expensive, intrusive solution.
  9. Mobile devices are just catching up with laptops and other devices.  Disk encryption just isn’t that new. Laptops and desktop computers have long had disk encryption features that the manufacturers have absolutely no way to unlock. Even for simple screen locks with a user password, the device maker or software developer doesn’t automatically know your password or have a way to bypass it or unlock the screen remotely. Although many law enforcement folks don’t really like disk encryption on laptops and have never really liked it, and we understand that some lobbied against it in private, we haven’t typically heard them suggest in public that it was somehow improper for these vendors not to have a backdoor to their security measures. That makes us think that the difference here is really just that some law enforcement folks think that phones are just too popular and too useful to have strong security.  But strong security is something we all should have.  The idea that basic data security is just a niche product and that ordinary people don’t deserve it is, frankly, insulting.  Ordinary people deserve security just as much as elite hackers, sophisticated criminals, cops and government agents, all of whom have ready access to locks for their data.
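The CRISIS report’s pre-encryption point (item 2 above) is easy to demonstrate: if users encrypt with their own key before the data ever reaches an escrowed system, the escrow key unlocks only another layer of ciphertext. Here is a toy sketch in Python; the SHA-256 counter-mode keystream stands in for a real cipher, and all keys and messages are invented for illustration:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256 counter-mode keystream (toy cipher, not for production).
    XOR with a deterministic keystream means the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

message = b"meet at the usual place"
inner_key = b"user-chosen secret"    # known only to the user
escrow_key = b"government-held key"  # the key an escrow scheme could surrender

# The user pre-encrypts; then the escrowed system encrypts again.
pre_encrypted = keystream_xor(inner_key, message)
on_the_wire = keystream_xor(escrow_key, pre_encrypted)

# Authorities holding the escrow key can strip only the outer layer...
recovered_outer = keystream_xor(escrow_key, on_the_wire)
assert recovered_outer == pre_encrypted  # ...which is still ciphertext
assert recovered_outer != message

# Only the user's own key recovers the plaintext.
assert keystream_xor(inner_key, recovered_outer) == message
```

The escrow mandate buys the government nothing against anyone willing to add one extra layer of their own.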

The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don’t. Indeed, Bellovin argues: “Time has also shown that the government has almost always managed to go around encryption.” (One circumvention that’s worked before: keyloggers.) But if the FBI’s burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:

It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.

The mere fact that law enforcement’s job may become a bit more difficult is not a sufficient reason for undermining the privacy and security of hundreds of millions of innocent people around the world who will be helped by mobile disk encryption.  Or as Chief Justice John Roberts recently observed in another case rejecting law enforcement’s broad demands for access to the information available on our mobile phones: “Privacy comes at a cost.”

Reposted from the Electronic Frontier Foundation’s Deeplinks Blog


Comments on “Nine Epic Failures Of Regulating Cryptography”

jameshogg says:

Alan Turing did not crack the Enigma code by politely asking the Nazis to install backdoors.

What makes the state think that just because it can force tech companies to install sabotaged encryption protocols, that therefore really nasty people – for example ISIS – will somehow NOT go out of their way to research what the REAL, unsabotaged encryption is like and put it into practice?

Something like this has already happened:

The point is that we cannot “uninvent” the mathematics no matter how hard we try. And we would not want to stop such mathematics from being discovered by scientists either as that would do nothing to stop the scientists of our enemies from conducting their own research.

Islamofascists are racist and totalitarian, but one thing they are not is stupid. Sam Harris sums this up in one terrifying sentence: “It is actually possible to be so well educated that you can build a nuclear bomb and still believe that you are going to get the 72 virgins.”

unlikely-to-be-anonymous coward says:

If they outlaw encryption California should seriously consider passing a law legalizing encryption and criminalizing back doors

Encryption is no different than speaking a language that only certain people can understand. The government cannot regulate speaking, so therefore it cannot regulate encryption. I think it is very basic. We have a right to speech, in whichever language we please, through whatever means possible. Also, in Meyer v. Nebraska SCOTUS also held that the banning of foreign language instruction violated the 14th Amendment.

There is no congressional authority to regulate the semiconductor industry nor speech on the internet! There is no way this should be legal. If the federal government does not uphold the courts we have a duty to change the government.

Plus, Silicon Valley is one of the only real industries making things better and growing good jobs in this country at a time when Wall Street and the government are stealing from the middle class.

Anonymous Coward says:

Re: If they outlaw encryption California should seriously consider passing a law legalizing encryption and criminalizing back doors

Which nation do you hail from?

Do me a favor and understand this right now.

The government can damn well do what ever the fuck it pleases, let me introduce a few thousand years of human history to prove that.

The only way a Government is stopped is with blood, also proven by history!

The founders of the USA fully understood that this nation would become corrupt again and that blood would be required to keep its liberty, blood very few of your fair-weather fellow citizens would be willing to sacrifice and subsequently turn against you for sacrificing to boot!

You should do it like me, kick back and lube up for the slide right into fucking hell.

Anonymous Coward says:

Re: Re: If they outlaw encryption California should seriously consider passing a law legalizing encryption and criminalizing back doors

The only way a Government is stopped is with blood, also proven by history!

This is at least as much an oversimplification of things as pacifists claiming that only nonviolence works. In order to make your claim you’d have to ignore a number of incidents of nonviolent resistance playing a large role in getting rid of oppressive regimes.

John Fenderson (profile) says:

Re: Re: If they outlaw encryption California should seriously consider passing a law legalizing encryption and criminalizing back doors

“You should do it like me, kick back and lube up for the slide right into fucking hell.”

This is called giving up and abdicating your responsibilities. Perhaps you’re right, that in the end only violent revolution will fix these problems. However, we are still a long way from being there. To give up now and just wait for things to deteriorate to the point where blood is running in the streets isn’t just wrongheaded, it’s downright immoral. Better to fight as hard as you can to fix things before they go that far.

Also, remember that revolutions only rarely produce governments that are better than the one that was overthrown.

Anonymous Coward says:

Another problem with officially mandating backdoors or key escrow is that the US is not the only government claiming a law enforcement need.

Other nations will not allow Apple or Google to sell its products locally unless they get the same law enforcement access.

A company making any product sold internationally can’t pick and choose between good and bad governments and still expect to be allowed to market and sell its products locally.

So the choice for American companies is either no backdoor for anyone or a backdoor open for any government.

If they try to sneak through a product with a hidden US exclusive backdoor, it’s sooner or later going to be discovered.

That One Guy (profile) says:

Re: If You Don’t Believe Lawful Intercept Backdoors Can Be Abused ...

I’m pretty sure the only people who don’t realize that ‘lawful intercept backdoors’ can be abused are either the completely clueless (‘The security vulnerabilities are there for lawful reasons, that means it’s impossible for them to be used by anyone else, right?’), or the incredibly arrogant (‘This security vulnerability that we insisted on is totally hidden, it would be impossible for anyone but us to find and use it’).

Or the third, much worse alternative: Those insisting on the vulnerabilities know they can and will be found and used by others, and they simply don’t care as long as the vulnerabilities make their job easier in the short term.

That One Guy (profile) says:

Re: Re:

Not hardly, here’s how it works:

If cybercrime goes down, it’s only due to their actions and eagle-eyed watchfulness. Since decreasing funding or manpower would decrease these, obviously for the good of everyone their budget needs to stay just as it is, if not increase.

If cybercrime goes up, obviously it’s because they don’t have enough people and funds, so both of those need to be increased to ‘match the growing threat’.

In neither case are funds decreased, they are only ever increased.

Anonymous Coward says:


“Users who want strong encryption will be able to get it — from Germany, Finland, Israel, and many other places in the world where it’s offered for sale and for free.”

Sheesh! It ain’t like this stuff is so wackily hard to produce that you have to buy it from foreign countries. I wrote an implementation of Triple DES in 2005 just to secure some data for server storage. I pulled the algorithm and the full specification for the test suite from the NIST (U.S. National Institute of Standards and Technology) site. All of the “practically unbreakable” encryption schemes are well-known, widely available algorithms that any programmer with enough sense to pound sand can implement.
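The commenter’s point is easy to illustrate: published ciphers really are short. As a sketch, here is textbook RC4 — a well-known, long-published stream cipher, badly broken by modern standards and shown only for its brevity, never for actual use:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Textbook RC4. Insecure today; shown only to illustrate how little
    code a published cipher specification requires."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), XORed with the data
    i = j = 0
    out = bytearray()
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# Published test vector: RC4("Key", "Plaintext") -> bbf316e8d940af0ad3
assert rc4(b"Key", b"Plaintext").hex() == "bbf316e8d940af0ad3"
# XOR with the keystream means the same function decrypts.
assert rc4(b"Key", rc4(b"Key", b"attack at dawn")) == b"attack at dawn"
```

Roughly twenty lines, straight from the published description. Banning distribution of something this small is a fool’s errand.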

Dave says:


Here in the UK, we don’t seem to be able to gather much information about how much the government (in the shape of GCHQ) is also going down the path of our USA colleagues until it filters back from across the pond. Wouldn’t mind betting they’re up to no good as well. What about other countries? Surely if the States are screwing around with people’s privacy, it’s a pretty good bet that other governments are also up to the same trick.

madasahatter (profile) says:

Plain text

At some point the encrypted file must be converted into plain text for the user to read. This would require the authorities to get a warrant for the user’s device in order to read the text. Also, if the devices can be connected to a printer then the plain text can be printed.

What the various reichscommissars are complaining about is they can serve a warrant on Apple, etc. for the user’s data and be able to read the files without the user being aware.

Uriel-238 (profile) says:

To repeat a common Techdirt adage,

Yeah, a law enforcement investigator who can’t do his job without encryption backdoors is law enforcement investigator who can’t do his job.

As with all constitutional protections, they are non-negotiable; the point at which they become negotiable is a point of government failure.

Even warrants are of dubious necessity: our lives leave so many tracks in the public that we should be able to identify law-breakers by never crossing into private information. Law enforcement is more interested in putting bodies in jail than in seeing justice done.

That One Guy (profile) says:

Re: Re:

Potentially infinite, there is no limit to how long a judge can throw someone in jail for contempt of court, because in theory the person could ‘release’ themselves by complying with the judge’s order, so according to the ‘justice’ system the only thing keeping the accused behind bars is their own refusal to comply.

From wikipedia:
‘The civil sanction for contempt (which is typically incarceration in the custody of the sheriff or similar court officer) is limited in its imposition for so long as the disobedience to the court’s order continues: once the party complies with the court’s order, the sanction is lifted. The imposed party is said to “hold the keys” to his or her own cell, thus conventional due process is not required.’

Anonymous Coward says:

Re: Re: Re:

“Potentially infinite, there is no limit to how long a judge can throw someone in jail for contempt of court”

It appears plausible-deniability crypto is where the future is heading then. dm-crypt is able to accomplish plausible deniability in ‘plain’ encryption mode.

There’s no way to tell if there’s encrypted data on a disk, because there’s no LUKS metadata header when dm-crypt is used in plain mode. The disk doesn’t even need to be partitioned or formatted at all for plain mode dm-crypt to work.

The dm-crypt LUKS metadata header can also be hidden inside a plain mode dm-crypt volume. This is known as ‘stacked encryption’, but it doubles the encrypt/decrypt processing overhead.

Another way to attain plausible deniability with dm-crypt, is to store the LUKS metadata header on a separate disk from the encrypted data. This way there’s no LUKS metadata header on the data disk, which would be a dead giveaway there’s encrypted data on the disk.

The only thing to be careful about is system logs. System logs will show when a dm-crypt volume is opened. System logs are about the only thing that will break dm-crypt’s plausible deniability in plain mode.

There’s two easy ways to avoid this. Either boot directly into the plain mode dm-crypt volume before system logging starts. Or open the plain mode dm-crypt volume using a bootable LiveCD where the logs are wiped from the LiveCD after every system shutdown.
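The “no metadata header” observation above can be sketched as a toy detector: a LUKS volume opens with a recognizable magic (the bytes `LUKS\xba\xbe`), while a plain-mode dm-crypt volume is indistinguishable from random data. Ordinary files stand in for block devices here, and the file names are invented for illustration:

```python
import os

LUKS_MAGIC = b"LUKS\xba\xbe"  # magic bytes that open a LUKS header

def looks_like_luks(path: str) -> bool:
    """Return True if the file (or device) begins with the LUKS magic."""
    with open(path, "rb") as f:
        return f.read(len(LUKS_MAGIC)) == LUKS_MAGIC

# Simulate both cases with ordinary files (a real check would read /dev/sdX).
with open("luks_volume.img", "wb") as f:
    f.write(LUKS_MAGIC + os.urandom(1024))       # LUKS: the header gives it away
with open("plain_volume.img", "wb") as f:
    f.write(os.urandom(1024 + len(LUKS_MAGIC)))  # plain mode: just noise

assert looks_like_luks("luks_volume.img")
assert not looks_like_luks("plain_volume.img")
```

This is exactly why hiding the LUKS header (or skipping LUKS entirely) matters for deniability: with no magic bytes, nothing on the disk proves encrypted data exists.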

I’m glad Android uses dm-crypt as its cryptography library. 🙂 Not that I’d be crazy enough to put anything private on my smartphone. It doesn’t matter how good the encryption is, because while your phone’s turned on, everything’s decrypted. I’m sure there are remote backdoors in all that closed-source software that powers smartphones.

Why else would Rep. Mike Rogers freak out about China selling their electronic devices in the US? He knows how the backdoor game works.

Anonymous Coward says:

Re: Re: Re:

And that’s a bullshit justification on the level of “why did you make me hit you”.

If the imposed party really did “hold the keys” to his or her own cell, he/she could leave at any moment, without any other requirement. But that’s not true; the “key” is given only in exchange for complying with the order. The imposed party does not have the “key” at all until he/she does whatever the judge wants.

Once you peel away the victim blaming, you can see what jailing someone for contempt of court really is: punishment for noncompliance with a judge’s order. In a fairer legal system, it should be subject to the same limitations as any other punishment a judge can legally give. Which means a limited time in jail.

Austin (profile) says:

The funny part is...

…that none of this is relevant.

That is, none of what Apple or Google is doing is any change to mobile data traffic. They’re (finally) encrypting the contents of the device, but still not the traffic.

And most of the “encrypted” data on the device is only really “encrypted” in the sense that if you try to dump the contents of the memory on the device, it’s encrypted.

But if you guess the 4-digit numeric passcode that 90%+ of users use to “secure” their phone? Wide open, encryption irrelevant.

So the phone still isn’t really encrypted. The traffic isn’t encrypted, and the device itself isn’t really either, it has a lock you can retry infinite times with just 4 digits and only 10 potential characters per digit.

And this more-or-less lack of any real security improvement? Yep, that’s what the FBI is shitting its pants over.

Mark Murphy (profile) says:

Re: The funny part is...

none of what Apple or Google is doing is any change to mobile data traffic

Correct. That’s mostly up to app developers. Use of SSL and other encrypted communication protocols appear to be on the rise post-Snowden. I and others are working to help app developers do a better job here.

But if you guess the 4-digit numeric passcode that 90%+ of users use to “secure” their phone? Wide open, encryption irrelevant.

The encryption isn’t “irrelevant”, but it is easily bypassed. This is not significantly different than anything else in encryption: your security is only as good as your key. More stuff could be done here, at the device level (e.g., separate keys for the full-disk encryption from what users use to unlock their powered-on phones) and at the app level (e.g., separate encrypted containers for highly-sensitive data).
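The “separate keys” idea described above is usually implemented with key wrapping: a random disk-encryption key is itself encrypted under a key derived from the user’s PIN, so the PIN can change without re-encrypting the whole disk. A minimal stdlib sketch — XOR stands in for a real AES key wrap, and the PINs and iteration count are illustrative:

```python
import hashlib
import hmac
import os

def kek_from_pin(pin: str, salt: bytes) -> bytes:
    """Derive a 32-byte key-encrypting key from the PIN (slow on purpose)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

salt = os.urandom(16)
disk_key = os.urandom(32)  # the random key that actually encrypts the disk

# Only the *wrapped* disk key is stored (XOR stands in for AES key wrap).
wrapped = bytes(a ^ b for a, b in zip(disk_key, kek_from_pin("1234", salt)))

# Unlocking with the right PIN recovers the disk key...
unwrapped = bytes(a ^ b for a, b in zip(wrapped, kek_from_pin("1234", salt)))
assert hmac.compare_digest(unwrapped, disk_key)

# ...while a wrong PIN yields garbage, and changing the PIN only
# re-wraps these 32 bytes instead of re-encrypting the whole disk.
bad = bytes(a ^ b for a, b in zip(wrapped, kek_from_pin("4321", salt)))
assert bad != disk_key
```

The design choice is that the PIN never directly encrypts data; it only guards a small wrapped secret, which is also what makes hardware-enforced retry limits on unwrapping possible.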

Though I’d be curious if anyone knows of a study that would back up the “90%+” claim. I suspect that a lot of users do use weak PINs, but a survey might be illuminating.

So the phone still isn’t really encrypted

Yes, it is. Your statement is akin to claiming that a locked door is not really locked, given the existence of lock picks and battering rams. Just because a security mechanism is ineffective against talented, determined attackers does not mean that it does not exist.

it has a lock you can retry infinite times

That too is something that could be improved, at least at the lockscreen level, whereby delays are introduced with every failure. If, for example, you added one second per PIN failure (no delay on first try, 1 second delay on second try, 2 second delay on third try, etc.), there would be a 50% chance of getting a 4-digit PIN right in several months of continuous attempts, but it could take as long as a couple of years if the attacker is extremely unlucky. I haven’t been trying to brute-force devices, so I don’t know how much of this is in place now (though Android historically lacked it). Which is why we have $5 wrenches and more sophisticated attacks (e.g., brute-forcing the disk encryption directly rather than trying to manually guess it at the lockscreen).
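The back-off arithmetic above holds up: with an extra second of delay per prior failure, trying k PINs costs roughly 0 + 1 + … + (k−1) = k(k−1)/2 seconds, quadratic in k:

```python
def seconds_to_try(k: int) -> int:
    """Total wait to attempt k PINs when the n-th failure adds an n-second delay:
    0 + 1 + ... + (k-1) = k*(k-1)/2 seconds."""
    return k * (k - 1) // 2

DAY = 86_400
average = seconds_to_try(5_000)   # the 50%-chance point for a 4-digit PIN
worst = seconds_to_try(10_000)    # exhausting the whole 10^4 keyspace

assert round(average / DAY) == 145            # ~145 days: "several months"
assert round(worst / (365 * DAY), 1) == 1.6   # ~1.6 years: "a couple of years"
```

A linear per-failure delay is enough to turn a seconds-long exhaustive search into a multi-month one, which is why attackers prefer to bypass the lockscreen and attack the disk encryption directly.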

I do agree with your overall assessment that this is a lot of sturm und drang over an incremental improvement in security.

Anonymous Coward says:

Fifth Amendment

I’m hearing you’re not allowed to plead the fifth against self incrimination if a judge orders a defendant to disclose their encryption password.

I wonder how many years in prison contempt of court carries. 

Actually you are allowed to plead the Fifth Amendment in such a case.

It will depend on whether the government can prove that you know the password and its willingness to offer immunity.

All the cases wherein the government successfully got a court to compel decryption concerned stupid defendants admitting that they knew the password.

Whatever (profile) says:

I find it interesting that all of the support for encryption sounds about the same as the excuses why people need to own sub-machine guns and conceal carry weapons all the time.

It all just requires that you ignore the bad uses entirely.

On that basis, both patents and copyright are great, because if you can ignore that level of abuse of encryption, you can ignore what few abuses happen in the patent and copyright world.

That One Guy (profile) says:

Re: Re:

Tell me, do you automatically assume criminal or nefarious intent of people who have locks on their doors? Because that’s what this is, putting a ‘lock’ on a device that contains a ton of very private information.

So unless you’re going to argue that those with locks on their house and car should be treated with suspicion because they ‘clearly have something to hide’, then you’ve really got no ground to say the same regarding encryption.

Whatever (profile) says:

Re: Re: Re:

I think the reverse: locks on your house and your car are there because OTHERS have bad intentions.

You also miss the point: I understand the reasons why they want to encrypt, in the same manner that I understand people who live in the US and want to conceal carry. However, I am also bright enough to understand that the “cure” may also be part of the “cause”.

Things aren’t black and white, trying to make them so and put someone into a position of taking an absolute position isn’t helping!

That One Guy (profile) says:

Re: Re: Re: Re:

I think the reverse: locks on your house and your car are there because OTHERS have bad intentions.

‘Bad intentions’ that the police, and various government agencies have shown to possess when it comes to digital files.

However, I am also bright enough to understand that the “cure” may also be part of the “cause”.

The increased focus on tightening up security, and improving and rolling out more comprehensive encryption is almost entirely because the government, and the police, have shown that without the tighter security they cannot be trusted not to go browsing.

When you’ve got the NSA scooping up as much as it can, pretty much simply because they can, people are going to want to try and, if not stop that, at least make the NSA work for their data.

Likewise, when the police are confiscating phones and digging through them without a warrant to the extent that the matter ends up before the Supreme Court (which thankfully ruled against the police and their warrantless searches), it’s not surprising that even people who don’t think they’ve done anything wrong would want to make it so people, the police included, can’t just go on digging into their personal files at their whim (read: without a warrant).

The ‘cause’ is those in power abusing their power, and showing that they cannot be trusted; that is why encryption and heightened personal security are getting so much attention these days.

Just Another Anonymous Troll says:

Re: Re: Re: Re:

People carry concealed weapons because there are people with bad intentions. If concealed weapons are outlawed, then the people with bad intentions will have guns and you won’t. You will be unable to defend yourself and they will steal all your stuff if you’re lucky.
Encryption is the same. If you outlaw encryption, only outlaws will encrypt data. And then teh l33t h4x0rz will steal your data, and the U.S. government can too, without a warrant.

John Fenderson (profile) says:

Re: Re: Re: Re:

“locks on your house and your car are there because OTHERS have bad intentions.”

Precisely. Encryption is exactly like physical locks in this respect. It exists and is used because OTHERS have bad intentions. Yes, crypto is sometimes used to hide illegal actions, but that’s not the main use or benefit. The exact same thing is true of physical locks, as well.

To argue that people shouldn’t have crypto, or that the government should have a back door into it, is precisely the same thing as saying that people shouldn’t have locks on their doors, or that the government should keep a copy of everyone’s door keys.

Anonymous Coward says:

Five years for refusal to decrypt?

What if you don’t have the password? Do they have to prove you do?

In the UK, under RIPA § 49, the prosecution must prove beyond a reasonable doubt that you knowingly failed to disclose a key in your possession.
If the government can’t prove that you are in possession of a key, and that there actually is data encrypted by you, you are not in violation of the law.

All the cases under RIPA § 49 that have resulted in jail are factually similar to those falling under the “foregone conclusion” exception to the Fifth Amendment.

If you admit, whether to the prosecution, to the police, in a conversation with a family member or cellmate, or in a diary seized by the police, that you know how to decrypt a particular data block, you are toast.

Anonymous Coward says:

Re: easy solution

if you don’t hand over the encryption password on suspicion of a crime you get 5 years prison…

if found innocent… they must destroy all copies and prove so.

You forget that the protection against self-incrimination applies equally to the guilty and the innocent.

Why should even a guilty person be compelled to help his own prosecution?

The Fifth Amendment does not say that “no innocent person shall be compelled to testify against himself.”

It’s couched in absolute terms and has been held to protect the innocent and guilty alike.

Anonymous Coward says:


Routing around mandatory key disclosure laws is not difficult.

All these laws presuppose that there is a single individual to whom the order can be directed.

If x, y and z share a computer, and the government can’t prove who is responsible for encrypting a file, the government must either jail all or none.

If we change the number of possible subjects from 3 to 100, it quickly becomes impractical to investigate who is the actual owner of the data.

Anonymous Coward says:

Cloud encryption and the Fifth Amendment

Compelling the unlocking of a single-user computer or phone is often not an issue under the foregone conclusion test, if the government can prove the identity of the owner, that the owner is the sole user, and that the owner knows the passcode.

However, if the unlocked phone is used as a storage device and contains, say, a headerless file of random-looking data, the government can’t compel the owner to explain whether it’s encrypted, or which software was used.

So if one uses two layers of encryption and obfuscation, with the outer layer easily discoverable by the government, one can be compelled to unlock the most obvious layer.

However, if the inner layer is not easily discoverable, or if the government can’t prove what it is, the really incriminating evidence may be hidden inside the inner layer, and the owner can’t be forced to reveal it.

So if the government lawfully seizes your locked phone and proves that you are the actual owner and sole user, you can likely be compelled, under threat of contempt, to unlock it, even if it contains clearly incriminating evidence.

But unlocking the phone may just reveal a further layer of data.

If the government is unable to prove what it is, or if there is a further layer of obfuscation, you can still plead the Fifth when asked whether you have hidden something or how the data has been encrypted.
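(To illustrate the point about headerless data: a minimal one-time-pad sketch in Python, a toy example and not any real disk-encryption scheme, shows why a ciphertext with no header or magic bytes is indistinguishable from random data without the key.)

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the key. With a random, single-use key of the
    # same length, this is a one-time pad: the output has no header,
    # magic bytes, or structure, so without the key it cannot be told
    # apart from the output of os.urandom().
    return bytes(d ^ k for d, k in zip(data, key))

secret = b"meet at midnight"
key = os.urandom(len(secret))   # random key, same length as the message
blob = xor_cipher(secret, key)  # headerless blob of apparent noise

# The same XOR operation decrypts when the key is known.
assert xor_cipher(blob, key) == secret
```

A prosecutor looking at `blob` alone has no way to prove it is ciphertext rather than random filler, which is exactly the evidentiary gap the comment above describes.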

Anonymous Coward says:


Could Mark Zuckerberg have built Facebook in his dorm room if he’d had to build in surveillance capabilities before launch in order to avoid government fines?

Mark Fuckerberg didn’t need to “build in surveillance capabilities” to Facecrook, as Facecrook itself is — along with Go-Ogle, Twatter and the other PRISM-participants — a web-wide, corporate-operated surveillance/intelligence-gathering platform.

In my opinion, anyone who trusts any corporation (no matter how “good” their propaganda (i.e., advertising)) is either an ignoramus, a fool, or someone with a financial interest in said corporation.

ryuugami says:

Re: Huh?

In my opinion, anyone who trusts any corporation (no matter how “good” their propaganda (i.e., advertising)) is either an ignoramus, a fool, or someone with a financial interest in said corporation.

Or, more likely, they just. Don’t. Care.
Which also means it’s usually not a matter of trust. I would have no problem lending a dollar to someone I just met. A thousand dollars, not so much.

I’d say step one is not “don’t trust ’em, they’re crooks”, but basic education of the “this is why you shouldn’t give your personal data to strangers” type.

Yeah, this somewhat falls under ‘ignoramus’ in your post, but you can’t expect everyone to keep up with the tech and the privacy implications thereof.

John Fenderson (profile) says:

Re: Re: Huh?

“you can’t expect everyone to keep up with the tech and the privacy implications thereof”

Well, you really can. The basic rule is very simple: if you’re providing data to someone else — especially through an intermediate party — you should not expect that information to remain private.

Whether or not that matters depends, as you point out, on the sensitivity of the information. I make this computation in my mind every single day: is giving up this piece of information worth whatever I’m getting in exchange for it? Often the answer is “no”. Sometimes, it’s “yes”. It all depends.

Just Another Anonymous Troll says:

“it’s because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans’ privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it’s now rising from the grave, bringing the same disastrous flaws with it.”
What would a zombie want with legislative bodies or law enforcement? This analogy makes no sense.
