An open-source firmware for hard disks may not be as simple as that. I've heard - 2nd hand, but from a source I put a reasonable amount of trust in - that at least one of the vendors listed has set its hard drives up to require signed firmware; the disk won't accept anything else. If you can't sign the code with a key the disk will accept, your open-source project won't gain traction.
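The accept/reject gate such a drive would apply can be sketched as follows. This is purely illustrative: real vendors use asymmetric signatures (the drive holds only the vendor's public key), but an HMAC over a shared secret keeps the sketch runnable with the standard library. Every name here (`VENDOR_KEY`, `drive_accepts`, etc.) is hypothetical.

```python
import hmac
import hashlib

# Stand-in for the vendor's signing key. In a real drive this would be an
# asymmetric keypair: private half at the vendor, public half burned into
# the controller, so possessing the drive doesn't let you sign firmware.
VENDOR_KEY = b"vendor-secret-signing-key"

def sign_image(image: bytes) -> bytes:
    """What the vendor's build pipeline does to an official firmware image."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def drive_accepts(image: bytes, signature: bytes) -> bool:
    """What the controller does before flashing: verify or refuse."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"\x90" * 64                 # vendor-built firmware image
assert drive_accepts(official, sign_image(official))

community_build = b"\xde\xad\xbe\xef" * 16   # open-source firmware, unsigned
assert not drive_accepts(community_build, b"\x00" * 32)
```

The point of the sketch is the asymmetry of the situation: an open-source project can produce a perfectly good image, but without the vendor's signing key the controller will never flash it.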
Also: it would be short-sighted to assume the scope of the actions here is limited to hard drives. Yes, this set of recently released documents is HDD-specific. Yes, HDDs make an excellent target for this attack vector, for a variety of reasons - not least that, being hard disks, storage space presumably isn't an issue, so you wouldn't be severely constrained on the size of the malware you were shipping. But hard disks aren't the only built-in peripherals that allow for field-upgradeable firmware. Video cards, motherboards, CPUs - almost all of them have some amount of field-writable, onboard storage coupled with the firmware that allows them to operate. In fact, while they'd be harder targets, they might well be more valuable.
After all: You can remove a potentially compromised HDD from a system entirely, and run it off of live media on thumbdrive/cd/dvd/etc. Most people would have a very hard time running that same live media system w/o a video card. Or a motherboard.
Can't wait to see someone try this defense in court and then lose terribly when it doesn't work.
That's exactly what would happen. Although for a politician or investigator, it wouldn't have to get to court - just to the press.
Our society's built-in skepticism and inclination to pre-judge guilt based on the news media is exactly why this would be such a nasty lever, were it to be used. People claim "it wasn't me" so frequently that no one pays attention when it might actually be true.
(Please note, I'm not saying this has actually happened. I have no idea if it has or not. But assuming the NSA has its fingers into everything as deeply as has been reported - there's nothing that can really prevent it.)
They have it because the threat model when the spec was developed excluded (accidentally or intentionally) "TLAs grabbing all the keys".
The current crypto key generation model saves time and costs associated with key generation at the time of deployment, and frankly, is probably a large part of why deployment is so smooth (I can go to my cell phone carrier today, ask for a SIM card, and get one, pretty much no questions asked).
(And, by the way, does anyone know if the SIM's pre-printed ID is also the key? From what I've seen, the crypto algorithms are clearly symmetric, so there's no reason the SIM ID couldn't be the actual crypto key.)
There's nothing Gemalto _can_ do about it that would be meaningful. The specification was designed more to ensure that unauthorized handsets couldn't use the network than to prevent mass surveillance from an organization with access to all of their keying material.
"Oh, hey, sorry about the compromised crypto keys on that first SIM, here's a free replacement. We know that _these_ crypto keys are secure because, well, Um...."
Shouldn't there be some prosecutor out there working on a CFAA case against them?
Almost everyone is focusing on the NSA's ability to "get any data they want", but if the NSA and other TLAs are as deeply embedded into computer networks as they're rumored to be, then they have, or can get, read-write access to damn near anything they want. You have to assume they can plant evidence as trivially as they can retrieve it.
Unfortunately, if we've crossed the Rubicon, you can be certain that any prosecutors, judges, politicians, etc., who might initially push back against the NSA and other assorted three-letter agencies might quickly find themselves convinced to look the other way, lest they end up out of a job or in prison.
In all fairness, talking with the network administrators is a step I take only when literally everything else I've tried has failed.
Indeed. And in a large campus environment (of which MIT is one) the network administrators are often intentionally heavily shielded from the general user base, much less the guest user base. This is to allow them to remain productive.
It's easy to say "I'd just get a hold of a network admin for a guest network in a large environment." It's another thing to actually do it. Want to know how hard it is to reach someone who knows what they're doing on a guest network? Go to a largish venue with one. Say, an MLB stadium, or an NHL arena. A big convention center - during a convention - might work. If you've got a college campus with open wifi nearby, use that. Pretend you're having trouble getting online, and try to figure out how to get a hold of tech support - much less a network admin - for the venue's guest network. The results are probably going to be enlightening.
Well, that's just the prison piece. Never mind the potential for financially ruinous fines (as if the legal costs weren't substantial punishment in and of themselves) and the ramifications post-conviction. A person with a felony conviction may lose the right to vote and may also be barred from serving on a jury. Certain professional licenses may become off-limits, and convicted felons may find it difficult to obtain jobs and housing. By contrast, those with a misdemeanor conviction will not face such serious consequences.
It's the same reason why the government fought so hard to keep the Torture Report secret, because releasing such information would create a tidal wave of terrorism sweeping across the country.
Never confuse the stated reason with the actual reason (and, when it comes to the torture report, you'd be naive to assume there's only one reason). It's just as likely that the report contains an innocuous reference to another incident/program/memo/etc in those however many thousand pages that would make the torture report yesterday's headline.
Even if a Harris insider with sufficient access to internal documentation to make a difference were to suddenly grow a conscience and leak the documents, you can be certain that that insider is going to be far more acutely aware than most of just how much they have to lose in doing so, and just how difficult it would be for them to remain anonymous.
Most people don't have the ability to rabbit to a safe haven like Edward Snowden did. This hypothetical Harris insider has to assume that, in the best case, they'll end up in a similar legal situation to that of Chelsea Manning.
Also, consider: the Stingrays are the tech that's understood to exist. Your hypothetical well-placed insider is quite likely to know about the next-gen, in-use tech that hasn't leaked yet, and what its capabilities are. Factor that into the above, and the potential for leaks further diminishes.
It probably would make a difference in the election.
Of course, it would also undoubtedly be either an outright lie, or summarily forgotten once the individual was in office. Snowden would have to be quite the idiot to put his faith in any such statement.
With a company that big ($6.2bn in assets), multi-million dollar signing authorities aren't that uncommon, especially at the executive level.
I'd be willing to bet that the largest dollar amount wired ($9.4 million) was calibrated to be just under the company's single-signature signing authority.
Absolutely reeks of inside job, although frankly I doubt the controller was in on it. If he was smart enough to put the job together, he'd (presumably) be smart enough to transfer and hide the money in a way that didn't paint a target on his back.
No, you can't fix social engineering entirely, but I'll tell you what: give me a $17.2 million budget (which, by the way, is about 0.28% of their reported assets, if I'm not mistaken) to spend on opsec training across their 800 employees, and I'm pretty sure I could put together a training program that would have a measurable impact on the organization's exposure to it.
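The 0.28% figure checks out as a quick back-of-the-envelope calculation against the $6.2bn asset figure quoted above:

```python
assets = 6.2e9    # reported company assets, from the figure quoted above
budget = 17.2e6   # total amount lost / hypothetical training budget

pct_of_assets = budget / assets * 100
print(f"{pct_of_assets:.2f}% of assets")   # -> 0.28% of assets

per_employee = budget / 800
print(f"${per_employee:,.0f} per employee")   # -> $21,500 per employee
```

That per-employee number also puts the "rounding error" framing in perspective: it's an enormous training budget by any normal standard, which is rather the commenter's point.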
$17.2 million for a company that size just barely makes it over the rounding error threshold...
However, due to "network" requirements, a unique, per-device super-opt-out cookie will be required for all users opting out, and will be added to your network traffic to ensure all advertisers know that you've opted out.