Cybersecurity Official Believes Encryption Can Be Backdoored Safely; Can't Think Of Single Expert Who Agrees With Him

from the broken-encryption-isn't-broken-said-no-one-ever dept

The government continues to look for ways to route around Apple and Google’s phone encryption. The plans range from legislated backdoors to a mythical “golden key” to split-key escrow, where the user holds one key and the government shares the other with device makers.

None of these are solutions. And there’s no consensus that this is a problem in search of one. Law enforcement and intelligence agencies will still find ways to get what they want from these phones, but it may involve more legwork/paperwork and the development of new tools and exploits. Without a doubt, encryption will not leave law enforcement unable to pursue investigations. Cellphones are a relatively recent development in the lifespan of law enforcement and no crime prior to the rise of cellphone usage went uninvestigated because suspects weren’t walking around with the entirety of their lives in their pockets.

But still the government continues to believe there’s some way to undermine this encryption in a way that won’t allow criminals to exploit it. This belief is based on nothing tangible. One can only imagine how many deafeningly silent beats passed between question and answer during White House cybersecurity policy coordinator Michael Daniel’s conversation with reporters following the recent RSA conference.

In a meeting with a handful of reporters, Daniel was asked whether or not he could name a respected technology figure who believed it possible to have strong encryption that could be circumvented by just one party’s legal authority.

“I don’t have any off the top of my head,” Daniel said…

And he never will. No one who knows anything about encryption will ever say it’s possible to create a “good guys only” backdoor. Or front door. Or whatever analogy government officials choose to deploy when arguing for the “right” to access anyone’s device with minimum effort.

But that’s not the end of Daniel’s embarrassing response. He went on to disingenuously toss this back at “Silicon Valley” with a back-handed compliment insinuating that if these companies don’t solve this “problem” for the government, they’re either stupid or evil.

[Daniel] added that if any place could come up with an answer, it would be the “enormously creative” Silicon Valley.

The government believes there’s a solution out there — some magical alignment of hashes that would keep malicious hackers out and let the government in. It certainly can’t figure out this conundrum, so it’s going to keep insinuating that tech companies already know how to solve the problem but they hate children/law enforcement/America so much they won’t even consider meeting the government halfway.

But the tech companies know — as do security experts — that there’s no “halfway.” You can have encryption that works and keeps everyone locked out or you can have the government’s “encryption,” which is spelled exactly the same but has extremely leaky quote marks constantly appended, and which lets everyone in the same “door,” no matter who they are or what their intent is.



38 Comments
Anonymous Coward says:

It is dead simple: if anybody other than the senders and receivers can decrypt the message, the cipher system is broken.
Further, the more people who have access to the keys, the more likely they are to leak. And if it is known that the NSA, FBI, and/or other agencies hold keys capable of breaking a cipher system, then concerted efforts will be made by various parties, using every tactic available, to get hold of those keys. Given the number of employees in those agencies who will have access, the keys will leak in minutes. Well, maybe minutes is hyperbole, since it will take longer to distribute the keys, but it will only take days.

Gonzoid says:

So if/when they outlaw actual encryption, will computer geeks (who have not already moved to some flavor of Linux w/full disk encryption) …finally move to open-source encryption for all of their devices?

And as they do, and become – what…felons, for illegal encryption? – will they become targets for SWAT raids?

And when they do, to the extent that they might also be very left-leaning and anti 2nd Amendment, will they finally realize that the 2nd Amendment isn’t about hunting or target shooting, and finally acquire the means to defend themselves from the stormtroopers who have taken over the country?

Something to think about…or not. head back in sand, probably.

KW England says:

We don't have to re-discover why key escrow doesn't work

In 1993 the NSA proposed the Clipper chip which included a hardware backdoor and key escrow. By 1996 the idea was defunct. Bruce Schneier and others wrote a report on key escrow in 1997 to shoot the idea down again. (https://www.schneier.com/paper-key-escrow.html).

We were here 20+ years ago. It is important to remember that.

Machin Shin (profile) says:

The one thing they constantly seem to ignore is that if you put a back door in, no matter how complex the key is or how many parts it is split into, you have created the ‘holy grail’ that every hacker in the world is going to try to find.

I don’t know about anyone else, but when it is a situation of ______ organization or government VS the world’s hacking community, I’m betting on the hackers…. every single time.

Adam (profile) says:

Once again...

…ahem.

Backdoored encryption programs will not be used by smart people who wish to hide data. This will create an underground and/or overseas market for encryption applications that can be downloaded to/from anywhere, and that’s what the “bad guys” will use. The government will have keys to the front door of every person who is either not trying to hide from the government or too stupid to use the underground tools.

So while they can open your phone at any time, actual national security threats will still force them to jump through the same hoops as now.

Roger Strong (profile) says:

How *Many* Governments and Agencies?

The NSA, DOJ, FBI, TSA, DEA and other agencies would demand access within the US alone.

If US agencies have them, you can count on similar agencies demanding the same access in every other country where Apple and Google’s phones are sold.

In any split-key system where the government agency and the device maker must combine their keys for access, it’s absolutely inevitable that a government agency will simply demand the device maker’s keys.
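To see why that demand is fatal, here is a minimal sketch of a two-party XOR key split (purely illustrative, in Python with the standard library; not a description of any real proposal). Neither share alone reveals anything about the key, but whoever collects both shares holds the key outright:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; both are needed to recover it."""
    share_a = secrets.token_bytes(len(key))               # e.g. held by the agency
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g. held by the device maker
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares into the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)
a, b = split_key(key)
assert combine(a, b) == key   # both shares together recover the key
assert a != key and b != key  # neither share alone is the key
# ...but an agency that simply demands share_b from the maker now holds both.
```

The math offers no protection against the legal demand: the scheme’s whole security assumption is that the two holders never pool their shares.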

Zarquan (profile) says:

Re: How *Many* Governments and Agencies?

Once the US tries it, all the other governments will want the same level of access to devices used in their territory. The UK government is already proposing a similar scheme.

However, Angela Merkel is unlikely to be happy using a device that has US or UK government decryption keys built in. So will the manufacturers have to install different decryption keys depending on where a device is being used? US keys on US devices, German keys on German devices, Brazilian keys on Brazilian devices.

International travel suddenly becomes very complicated.

If Michael Daniel visits France, will he be happy to comply with their laws and install the French government decryption keys on his Kindle? What happens when he returns home? Will he want to be able to remove the French decryption keys once he is safely back in the USA?

If Dilma Rousseff visits the USA, will she be required to install the US government decryption keys on her iPad? Can she remove them again once she has left? Will she ever be able to trust the device again, or do we all just throw everything away and buy new devices every time we cross a border?

If the keys can be added and removed, who is authorised to modify the keys, and how? Who checks that a device has all the right decryption keys installed on it?

Perhaps this is a job for United Nations Cyber Law Enforcement?

Anonymous Coward says:

The government was supposed to get a warrant before accessing people’s Facebook and Google account information. So the government wrote a law (Section 215) saying these were all ‘business records’ that didn’t require a warrant to access.

If the government gets front-door / side-door / back-door / open-window access to people’s encryption, they’ll just write another law stating they no longer need a warrant to access this information.

So where does that leave us? Totalitarianism, that’s where.

rapnel (profile) says:

safe

These are our effects. Without a warrant we are not obligated to permit entry or viewing. If a warrant were issued we would then be compelled to provide appropriate access. In the absence of a valid warrant you are, and should be, SOL.

Encryption is our digital safe, into which we are permitted to place anything that will fit. Having a safe place for our effects is unalienable; it is illegal for any government to require our combinations.

Darren says:

So how does the government go about making these shared key schemes mandatory? Bernstein v. United States established that source code was an expression covered under the 1st Amendment.

At least on the Android side of the world, there are numerous forks of Android that would almost certainly choose not to comply, and given that the government has its hands tied on regulating source code, there would be nothing they could do to stop it.

So unless I’m missing a point here, their quest to make sure nobody can have full device encryption that they do not have a means of decrypting has already reached a dead end.

It seems like they are just hoping that Apple, Google, etc. just voluntarily go along with this scheme and that users without technical knowledge of implementations would naively just go along with whatever their phone came with.

Rekrul says:

Re: Re:

So how does the government go about making these shared key schemes mandatory? Bernstein v. United States established that source code was an expression covered under the 1st Amendment.

Speech is covered under the 1st Amendment, but they’ve put limits on that. Obscenity, “hate” speech, encouraging a crime, etc.

If they can place limits on actual speech, how hard will it be for them to place limits on computer code?

ECA (profile) says:

I've tried

I've tried to explain encryption to people in terms of wireless/BT.
Encryption is a good/bad thing.
Poor passwords and encryption, poorly implemented, are a BAD thing and do not work.
Fair passwords and encryption only delay what will/can happen.
Good passwords and encryption take TIME to break.

Thinking your wireless/BT headphones are encrypted? That's a dream. There may be a thin layer of encryption, but it's very weak.

Encryption slows things down. It's like having ZIP files: you have to open each file to use it or see a picture. (ZIP is weak protection.)
Never think that wireless is protected; there are HOLES in how to listen in on it.
Encrypting a system is/can be a good thing. But backdoors are built into many products, called a reset button. Without that button, if you have a problem, it's hard to fix many things. The problem with that tends to be that EVERYONE knows it.
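The point that good passwords only buy time can be made concrete with back-of-envelope arithmetic. The sketch below is illustrative; the guess rate is an assumed attacker capability, not a measured figure:

```python
# Back-of-envelope: how long does exhaustive search take at an assumed
# guess rate? 10 billion guesses per second is a hypothetical attacker
# capability chosen for illustration, not a benchmark.
GUESSES_PER_SEC = 10**10

def years_to_search(alphabet_size: int, length: int) -> float:
    """Years to try every password of the given length and alphabet."""
    keyspace = alphabet_size ** length
    return keyspace / GUESSES_PER_SEC / (3600 * 24 * 365)

print(f"8 lowercase letters:      {years_to_search(26, 8):.8f} years")
print(f"12 chars, 95-symbol set:  {years_to_search(95, 12):,.0f} years")
```

Eight lowercase letters fall in seconds at that rate; a longer password over a richer alphabet pushes the search into geological time. Either way, the defense is only ever measured in time.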

Anonymous Coward says:

Enormously creative

[Daniel] added that if any place could come up with an answer, it would be the “enormously creative” Silicon Valley.

Oh, they are going to come up with an answer. Several of them, in fact. But they are not the answers this guy would like to see.

“Silicon Valley” is working on making sure transmissions can’t be decrypted after the fact (PFS). “Silicon Valley” is working on making it harder to spoof a valid certificate (CT, HPKP, DANE). “Silicon Valley” is working on ways to keep the user’s data safe even if the service provider is compromised. And so on.

The problem for this guy is not “Silicon Valley”‘s creativity. The problem for this guy is that their objectives are irreconcilable with his.
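The forward secrecy (PFS) the commenter mentions rests on ephemeral key exchange: both sides derive a shared secret from per-session keys and then discard them. A toy finite-field Diffie-Hellman sketch in Python, with deliberately undersized parameters for illustration only:

```python
import secrets

# Toy Diffie-Hellman group. M127 is a Mersenne prime; it is far too
# small for real use. Real TLS uses standardized groups or elliptic curves.
P = 2**127 - 1
G = 3

def ephemeral_keypair() -> tuple[int, int]:
    """Generate a fresh (private, public) pair for one session."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# Each side generates a fresh ephemeral keypair per session...
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()

# ...exchanges only the public halves, and derives the same shared secret.
assert pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P)
# Discarding a_priv and b_priv after the session means a later compromise
# of long-term keys cannot decrypt recorded traffic: forward secrecy.
```

This is precisely the property that makes retroactive wiretapping of recorded traffic useless, which is why it sits poorly with the official quoted above.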

sigalrm (profile) says:

“So how does the government go about making these shared key schemes mandatory? Bernstein v. United States established that source code was an expression covered under the 1st Amendment.”

The US Government can’t (legally) regulate the source code. So what? They don’t have to. They can regulate access to public utilities.

Reclassify the internet as a public utility (for bonus points, subsidize access to ensure no one is left out based on ability to pay) and then specify the technical requirements for connecting to it. Make one of those requirements “responds appropriately to key escrow validation query” or something similar and they’re set. No valid response? No network access for you, and the technical data about the system gets logged for investigation.

Mobile providers are already regulated this way, so no issue there – they just need to add back-end hooks to make sure the OS is “government approved”.

The technical capabilities already exist to do this at medium to very large scale, but they might require some tweaking to scale appropriately to, say, Cox Communications or Verizon Internet. Google “posture validation” and “network admission control”. For a fair number of these networks, the code is already in place, and just needs to be licensed and configured.

And yes, posture validation systems – as with any security-related system – can be bypassed. Which is why the technical controls would/will be backed with administrative controls (make it a felony to bypass “any technical control intended to regulate access to a public utility”) and aggressive prosecution of anyone caught attempting it. Oh, and the CFAA still applies.

It might take a decade or so to accomplish, but it’s certainly doable. And frankly, you don’t even need 100% coverage. Just get the percentage of covered devices high enough that it’s possible to evaluate the outliers and you’re “close enough.”

Anonymous Coward says:

> Mobile providers are already regulated this way, so no issue there – they just need to add back-end hooks to make sure the OS is “government approved”.

This merely pushes the issue back one level. It is perfectly possible to store encrypted files on an encrypted file system. There is no requirement that the two encryption schemes share a common origin, scheme, or code base. You likely do this every day without realizing it: what do you think audio codecs are, or image/file compression?

If the government does mandate broken encryption on a device, you can bet that anyone wanting to keep their files secret will just put another private layer on.

… or you could just go the route England did: “unencrypt this for us or go to jail”.
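The “another private layer” idea sketches easily: wrap one cipher inside another with independent keys, and a party holding only the outer (hypothetically escrowed) key still faces ciphertext. The XOR stream cipher below is a toy for illustration, not a vetted construction; use a real AEAD in practice:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher with a SHA-256 counter-mode keystream.
    XOR is its own inverse, so the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, out))

secret_msg = b"meet at the usual place"
inner_key = secrets.token_bytes(32)  # private, never escrowed
outer_key = secrets.token_bytes(32)  # stand-in for the escrowed layer

wrapped = keystream_xor(outer_key, keystream_xor(inner_key, secret_msg))

# A party holding only the escrowed outer key peels one layer...
peeled = keystream_xor(outer_key, wrapped)
assert peeled != secret_msg                           # ...and still sees ciphertext
assert keystream_xor(inner_key, peeled) == secret_msg  # only the inner key finishes the job
```

Nothing about the layers needs to share an origin or code base, which is the commenter’s point: mandating a broken outer layer just moves the real secret one level down.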

sigalrm (profile) says:

Re: Re:

“This merely pushes the issue back one level. It is perfectly possible to store encrypted files on an encrypted file system. There is no requirement that the two encryption schemes share a common origin, scheme, or code base. You likely do this every day without realizing it: what do you think audio codecs are, or image/file compression?”

Pushing the issue back one level would be regarded as a significant win by the folks proposing this, as it dramatically reduces the number of people out there capable of working around the technical control. As to the other point above, as you say, there’s no requirement, per se, for any common format or code base, but realistically, if you want to communicate effectively, you need some sort of a common system, and whether or not they realize it, most people aren’t sufficiently competent to roll their own. This leads, inevitably, to common systems, format, code, and ciphers.

“If the government does mandate broken encryption on a device, you can bet that anyone wanting to keep their files secret will just put another private layer on.”

Given de-facto control of an OS, there’s very little that can be done on a system that you can’t also control.

Also, onto your final point: not all problems can be solved with technology, which is why you back up the technology with:

… or you could just go the route England did: “unencrypt this for us or go to jail”.

It’s not “or”, it’s “and”. Possible financial and reputational ruin, coupled with the possibility of jail time, is a fairly hardcore administrative control.

Never underestimate the effectiveness of a public execution (literal or figurative). Consider the hardcore penalties sought by prosecutors under, e.g., the CFAA: think Aaron Swartz, or Deric Lostutter (whose hacking under the alias KYAnonymous brought about two rape convictions, and who is now facing more prison time than the rapists because of it). Yes, prosecutors will put the person away for a long time, but that’s arguably a secondary goal. The primary goal – and we hear it stated over and over by prosecutors, county sheriffs, police captains, etc. – is deterring other people from undertaking similar actions.

Anonymous Coward says:

TD has likely been NSL’d over reporting on this subject (I don’t know why else it’d be so prominently ignored so often), but cell phones give the phone company authority over the device, and the various three-letter tyrants have authority over the phone companies. No cellular device will ever be secure, regardless of encryption. See: baseband co-processor.

tqk (profile) says:

This is his/their problem, not ours or anybody else's.

I’m sorry (well, not really) they have a problem with this, but it’s one of their making, not ours. We want to have secure communications channels. Apple and Google enabling secure communication by default is a great thing, and they should go piss up a rope if they disagree. If that’s a problem for them, if they insist on having this power, it’s up to them to find out how. It’s not our responsibility to just hand over the keys to our kingdoms.

All this really is about is they’re fighting a stupid drug war (prohibition, yet again). I don’t care that they want to do that and wish they’d just stop. If they insist on continuing that silly thing, it’s all up to them to find ways to do it. I feel no obligation whatever to compromise my security just to help them carry on as usual in their tilting at windmills.
