DEA Accused Of Leaking Misleading Info Falsely Implying That It Can't Read Apple iMessages

from the that's-not-the-truth dept

So this is interesting. Yesterday, CNET had a story revealing a “leaked” Drug Enforcement Administration (DEA) memo suggesting that messages sent via Apple’s own iMessage system were untappable and were “frustrating” law enforcement. Here’s a snippet from that article:

Encryption used in Apple’s iMessage chat service has stymied attempts by federal drug enforcement agents to eavesdrop on suspects’ conversations, an internal government document reveals.

An internal Drug Enforcement Administration document seen by CNET discusses a February 2013 criminal investigation and warns that because of the use of encryption, “it is impossible to intercept iMessages between two Apple devices” even with a court order approved by a federal judge.

CNET posted an image of the letter:

In reading over this, however, a number of people quickly called bullshit. While Apple boasts of “end-to-end encryption” it’s pretty clear that Apple itself holds the key — because if you boot up a brand new iOS device, you automatically get access to your old messages. That means that (a) Apple is storing those messages in the cloud and (b) it can decrypt them if it needs to. As Julian Sanchez discusses in trying to get to the bottom of this, the memo really only suggests that law enforcement can’t get those messages by going to the mobile operators. It says nothing about the ability to get those same messages by going to Apple directly. And, in fact, in many ways iMessages may be even more prone to surveillance, since SMS messages are only stored on mobile operators’ servers for a brief time, whereas iMessages appear to be stored by Apple indefinitely.

That leads Sanchez to wonder if there might be some sort of ulterior motive behind the “leaking” of this document, done in a way to falsely imply that iMessages are actually impervious to government snooping. He comes up with two plausible theories: (1) that this is part of the feds’ longstanding effort to convince lawmakers to make it mandatory that all communications systems have backdoors for wiretapping and (2) that it’s an attempt to convince criminals that iMessages are safe, so they start using them falsely believing their messages are protected.

Which brings us to the question of why, exactly, this sensitive law enforcement document leaked to a news outlet in the first place. It would be very strange, after all, for a cop to deliberately pass along information that could help drug dealers shield their communications from police. One reason might be to create support for the Justice Department’s longstanding campaign for legislation to require Internet providers to create backdoors ensuring police can read encrypted communications—even though in this case, the backdoor would appear to already exist.

The CNET article itself discusses this so-called “Going Dark” initiative. But another possible motive is to spread the very false impression that the article creates: that iMessages are somehow more difficult, if not impossible, for law enforcement to intercept. Criminals might then switch to using the iMessage service, which is no more immune to interception in reality, and actually provides police with far more useful data than traditional text messages can. If that’s what happened here, you have to admire the leaker’s ingenuity—but I’m inclined to think people are entitled to accurate information about the real level of security their communications enjoy.

While both scenarios are plausible, both seem fairly cynical as well. I’d like to think that law enforcement is above attempting such tricks, but unfortunately that might just be naive these days.

Companies: apple


Comments on “DEA Accused Of Leaking Misleading Info Falsely Implying That It Can't Read Apple iMessages”

Ninja (profile) says:

Smart criminals will encrypt communications with their own keys, independent of the service. And maybe even add VPNs and other anonymizing tools such as Tor. All this surveillance BS? It’s designed to catch petty criminals and spy on innocent citizens, especially those that disagree with Big Brother. I’d say it’s akin to DRM: it affects only paying customers (law-abiding citizens).

weneedhelp (profile) says:

Just caught this

“real level of security their communication enjoy” – That’s easy… NONE. Don’t expect any privacy anymore. Our laws are interpreted in twisted ways that we don’t even know about. I wouldn’t doubt that the Constitution as we know it has been “interpreted” down to nothing more than a fancy old-timey document. Awwww, lookie here how cute: right to privacy, fair trials… awww, weren’t they so cute back in the day.
T E R R O R I S M ! ! ! B O O! ! !

Nathan F (profile) says:

We will see how true it is in a few days. If the feds suddenly start screaming and hollering about spying, aiding terrorists, and how they should throw the book at someone (but not CNET! they are, after all, a ‘serious’ news agency), then yes, it was a real internal memo. If it was a deliberate leak by higher-ups to try and hoodwink criminals, then we will never hear another peep about this, since they don’t want the possibility of debunking information getting brought up.

Ben S (profile) says:

Re: Re:

Gnu Privacy Guard (GnuPG) works for your emails, but the receiving user will need to have it as well. Some Linux distros have it installed by default. GnuPG for Windows is available for those using Windows computers (and even comes with a GUI, unlike mine). Chances are it’s also available for Mac.

Useful tool. It works by creating 2 keys, a public one and a private one. The public key can only encrypt the email/document/file; it can’t reverse the encryption. The private key, though, is the one you must protect and keep hidden: that key is the one that undoes the encryption. Keep a backup, but keep it locked up (for example, on a USB drive inside a lockbox) so prying eyes can’t get their hands on it.

Using GPG is not too hard. Step 1, create your public and private keys. Step 2, share the public key with anyone at all who would be interested in sending you encrypted emails. Step 3, obtain public keys from anyone you would want to send email to. Now you’re ready to use it.

Any emails you receive, you run through your personal private key, and it decrypts. Any emails you send, run the text through the receiving user’s public key first to encrypt it, then send it out.
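The public/private split Ben S describes can be illustrated with textbook RSA. This is a toy sketch with tiny primes for clarity; real GnuPG keys are thousands of bits and use proper padding, so treat this strictly as a demonstration of the concept, not anything secure:

```python
# Toy RSA: the public key (e, n) can only encrypt; the private
# exponent d is needed to decrypt. Illustration only -- these
# primes are far too small to offer any security.

p, q = 61, 53            # two secret primes
n = p * q                # modulus, shared by both keys (3233)
phi = (p - 1) * (q - 1)  # totient (3120)
e = 17                   # public exponent: anyone may encrypt with (e, n)
d = pow(e, -1, phi)      # private exponent: only the holder can decrypt

def encrypt(m: int) -> int:
    """Anyone holding the public key (e, n) can run this."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the private-key holder (d, n) can run this."""
    return pow(c, d, n)

msg = 65                  # a message, encoded as a number smaller than n
cipher = encrypt(msg)
assert decrypt(cipher) == msg   # round-trips with the private key
assert cipher != msg            # the public key alone can't undo it
```

This is exactly why step 2 of the workflow above is safe: handing out the public key lets people encrypt mail to you, but gives them no way to read anyone else’s.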

MD says:

Apples and Pomegranates

I think the memo covers a different issue.
If the police are tapping SMS, they may think they’ve got everything when they don’t. There are two separate streams: Apple’s iMessage and SMS.

So the “oops, we didn’t get it all” memo is probably meant to warn tech-challenged law school grads and police that they are not covering ALL the bases by only tapping the phone.

Then there’s the question of decoding and reading: tapped cellular streams can be read in real time based on an ongoing warrant, but reading between the lines, the iPhone iMessage content needs a warrant after the fact and Apple’s assistance to decode. (Do you think Apple would give out the key, or simply process the decoding for the law?)

So it seems to have a double purpose: they want to warn the police and attorneys that they are not getting the whole picture with a simple wiretap. And if it leaks and gives crooks a misleading sense of security, so they are more open in their iMessages, bonus!

John Fenderson (profile) says:

Re: Re: Re:

You’re mostly right, but the way law enforcement agencies would do it is to get a court order that makes Apple decrypt the data for them. Apple holds the keys and can do so quite easily.

Just for completeness, how long it takes to break encryption depends entirely on how many resources you’re willing to throw at the problem. No encryption method available for use today will hold up for very long against a concerted, well-financed effort to break it.

You would, of course, have to be a very special person to warrant that kind of effort, so as a practical matter this doesn’t really mean much.

A says:

Re: Re: Re: Re:

Maybe they’re really complaining that it’s impossible to do secretly without a warrant. I’m not sure how quickly Verizon hands over info on its customers, but I can see Apple being a little more discreet in protecting its customers’ privacy. The way the telecoms rushed in with FISA for immunity after providing warrantless, unconstitutional spying makes me think they’re a little quicker to give out customer info.

Silverguy says:

Re: Re:

No. iMessages are encrypted with a key that only your phone, your recipient’s phone, and Apple have access to. This key is never transmitted through the cell tower; it is stored on the phone, and you need it to read the message. Thus, to read the messages you need physical access to a phone that was a party to the conversation, or Apple’s help.

Dr Duck (profile) says:

How does (a) imply (b)?

“That means that (a) Apple is storing those messages in the cloud and (b) it can decrypt them if it needs to.”

Sure, I can see (a), but how does that imply (b)? Could not messages be stored in the cloud and passed to you fully encrypted? Why would Apple have any more ability to decrypt them just because they’re storing them?

Derp says:

The theory is weak

I think the theory in this followup article is flawed.

I believe it would be quite easy for iMessages to be stored in their encrypted form and recovered on a different device, based simply on the Apple login credentials. Naturally, all Apple would have is a hash of the credentials, not the cleartext. The decryption key for a user’s iMessages could also be stored by Apple in an encrypted form whose decryption key is something Apple does not have: the cleartext of the login password. Upon successful login to Apple, the encrypted iMessage key (which is all Apple has) is passed to the device, which then decrypts it with the cleartext of the Apple login password (which never leaves the device)… and then decrypts the iMessage contents.

This would allow for recovery of iMessages when your device is replaced, but Apple would not be able to decrypt them.

This would also mean password changes don’t require re-encrypting all the iMessages, just the single key. But it doesn’t allow for password resets. I don’t know if iMessages survive a password reset, but even that may be doable without Apple being able to decrypt iMessages.
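Derp’s scheme can be sketched in a few lines: derive a wrapping key from the login password with a KDF, wrap a random message key under it, and let the server store only the password hash plus the wrapped key. This is a hypothetical sketch, not Apple’s actual (unpublished) design, and the XOR wrap stands in for real authenticated encryption:

```python
import hashlib
import os
import secrets

def wrap(message_key: bytes, password: str, salt: bytes) -> bytes:
    """Encrypt the message key under a key derived from the password.
    XOR against a PBKDF2-derived pad stands in for real authenticated
    encryption -- illustration only."""
    pad = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                              100_000, dklen=len(message_key))
    return bytes(a ^ b for a, b in zip(message_key, pad))

unwrap = wrap  # XOR is its own inverse

# The device generates a random key that encrypts the actual messages.
message_key = secrets.token_bytes(32)
salt = os.urandom(16)
password = "correct horse battery staple"

# The server stores only a hash of the password (for login) and the
# wrapped key; neither reveals message_key without the cleartext password.
stored_password_hash = hashlib.sha256(password.encode()).hexdigest()
stored_wrapped_key = wrap(message_key, password, salt)

# A new device that knows the password can recover the message key...
assert unwrap(stored_wrapped_key, password, salt) == message_key
# ...but the wrong password yields garbage, not the key.
assert unwrap(stored_wrapped_key, "wrong password", salt) != message_key
```

Note how this also matches the password-change point: changing the password means re-wrapping one 32-byte key, not re-encrypting every stored message.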

Angry Voter says:

Worse than you know.

Considering I can read iMessages, I’m sure the government can too.

There are also kits for law enforcement that plug in and copy every block (even deleted ones) off an iPhone without unlocking it.

Apple sold out long ago. Remember when they insisted there was no TPM chip in their machines, but the Hackintosh people found the code support and then others found the chip on the motherboard?

Now it’s integrated in the CPU.

You can boot a turned-off stock Dell or HP and remote-control it: copy the drive, flash the BIOS, etc.

Ever put an Amiga and sniffer tools with a snooper hub on a PC or Apple network? Secret packets are sent that are hidden from the OS. Identifying serial number for every machine – just like cell phones.

a says:

Re: now I call BS

I doubt you can read an encrypted iPhone, or you would have mentioned how to get around the hardware-accelerated AES when copying “block by block”. It’s definitely an obstacle worth mentioning: the key is burned into the device, so only that device can read the raw bits unencrypted once you turn encryption on…

tqk says:

I’d like to think that law enforcement is above attempting such tricks, but unfortunately that might just be naive these days.

Yes, you are naive. There’s nothing wrong with the time-honoured police practice of lying to prospective perps. It doesn’t hurt anyone as long as it isn’t offered as evidence in court. They do it all the time to elicit information. Sometimes suspects need to be threatened to cough up the truth. I see nothing wrong with that, as long as it’s the truth they’re after and it doesn’t descend into physical torture.
