You'd Think The FBI Would Be More Sensitive To Protecting Encrypted Communications Now That We Know The Russians Cracked The FBI's Comms

from the guys,-encryption-matters dept

On Monday, Yahoo News had a bit of a new bombshell in revealing that the closures of various Russian compounds in the US, along with the expulsion of a bunch of Russian diplomats — which many assumed had to do with alleged election interference — may have actually been a lot more about the Russians breaching a key FBI encrypted communications system.

American officials discovered that the Russians had dramatically improved their ability to decrypt certain types of secure communications and had successfully tracked devices used by elite FBI surveillance teams. Officials also feared that the Russians may have devised other ways to monitor U.S. intelligence communications, including hacking into computers not connected to the internet. Senior FBI and CIA officials briefed congressional leaders on these issues as part of a wide-ranging examination on Capitol Hill of U.S. counterintelligence vulnerabilities.

These compromises, the full gravity of which became clear to U.S. officials in 2012, gave Russian spies in American cities including Washington, New York and San Francisco key insights into the location of undercover FBI surveillance teams, and likely the actual substance of FBI communications, according to former officials. They provided the Russians opportunities to potentially shake off FBI surveillance and communicate with sensitive human sources, check on remote recording devices and even gather intelligence on their FBI pursuers, the former officials said.

That all seems like a fairly big deal. And it specifically targeted the FBI’s encrypted radio and phone communications systems:

That effort compromised the encrypted radio systems used by the FBI’s mobile surveillance teams, which track the movements of Russian spies on American soil, according to more than half a dozen former senior intelligence and national security officials. Around the same time, Russian spies also compromised the FBI teams’ backup communications systems: cellphones outfitted with “push-to-talk” walkie-talkie capabilities. “This was something we took extremely seriously,” said a former senior counterintelligence official.

The Russian operation went beyond tracking the communications devices used by FBI surveillance teams, according to four former senior officials. Working out of secret “listening posts” housed in Russian diplomatic and other government-controlled facilities, the Russians were able to intercept, record and eventually crack the codes to FBI radio communications.

While this is all interesting in the “understanding what the latest spy v. spy fight is about” sense, it’s even more incredible in the context of the FBI still fighting to this day to weaken encryption for everyone else. The FBI, under both James Comey and Christopher Wray, has spent years trashing the idea that encrypted communications are important and repeatedly asking the tech industry to insert deliberate vulnerabilities in order to allow US officials to have easier access to encrypted communications. The pushback on this, over and over, is that any such system for “lawful access” will inevitably lead to much greater risk of others being able to hack in as well.

Given that, you’d think that the FBI would be especially sensitive to this risk, now that we know the Russians appear to have cracked at least two of the FBI’s encrypted communications systems. Indeed, back in 2015, we highlighted how the FBI used to recommend that citizens use encryption to protect their mobile phones, but they had quietly removed that recommendation right around the time Comey started playing up the “going dark” nonsense.

Of course, it’s possible that the folks dealing with the Russians cracking FBI encrypted comms are separate from the people freaking out about consumer use of encryption, but the leadership (i.e., Comey and Wray) certainly had to understand both sides of this. This leaves me a bit perplexed. Were Comey and Wray so completely clueless that they didn’t think these two situations had anything to do with one another? Or does it mean that they thought “hey, if we had our comms exposed, so should everyone else?” Or do they just not care?


Comments on “You'd Think The FBI Would Be More Sensitive To Protecting Encrypted Communications Now That We Know The Russians Cracked The FBI's Comms”

43 Comments
Gary (profile) says:

No matter

No matter how bad it looks to the citizens, the intel community has it in their brains that their cracks/hacks/moles are far better than the other team’s. Therefore it’s in their best interest to keep the bugs coming, because our side will exploit them faster than the other side.

It makes perfect sense. If you have an ego the size of a gas giant.


https://en.wikipedia.org/wiki/Common_law

Anonymous Coward says:

Were Comey and Wray so completely clueless that they didn’t think these two situations had anything to do with one another?

No, they believe that the US government should be able to keep secrets, and that the same government should be able to read everybody else’s communications, be that their citizens, foreigners, or foreign governments. They will provide special secure phones to government officials, although they will manage the keys so that they can read all government communications.

The development of open source encryption will only be carried out in more friendly countries, like it was back in the days of the first crypto wars.

Anonymous Coward says:

Re: Re:

No, they believe that the US government should be able to keep secrets, and that the same government should be able to read everybody else’s communications

And that came back to bite them with the FREAK attack of 2015, a direct result of them forcing "export-grade" crypto into the standards during that last crypto war. "Sites affected by the vulnerability included the US federal government websites fbi.gov, whitehouse.gov and nsa.gov."

Similarly, see selective availability of GPS, by which they tried to ensure they’d have better positioning than their enemies. "During the 1990–91 Gulf War, the shortage of military GPS units caused many troops and their families to buy readily available civilian units. Selective Availability significantly impeded the U.S. military’s own battlefield use of these GPS, so the military made the decision to turn it off for the duration of the war."

So, Comey et al. really do look clueless; or ignorant of history, anyway.

The development of open source encryption will only be carried out in more friendly countries, like it was back in the days of the first crypto wars.

Which raises the obvious question, who’ll be developing the "military-grade" American crypto? The answer, of course, is the lowest bidder from America—the country who’ll no longer have any expertise in this area.

urza9814 (profile) says:

They'd be exempt, of course

They don’t care. Why would they care? Any crypto regulation will grant the government itself an exemption — whether the law directly states that or not, they would never get prosecuted for any violations. So strong crypto allowed by law gives them precisely nothing, while making strong crypto illegal gives them a huge power advantage. Why would they oppose that?

Anonymous Coward says:

Re: They'd be exempt, of course

They don’t care. Why would they care? Any crypto regulation will grant the government itself an exemption

See, for example, Trump’s phone. The people in charge like to pretend they’re separate from the civilian world, immune to the flaws of its products, but they’re not. They use a lot of normal commercial products like Windows, OpenSSL, Android, and if those are insecure they’re going to have trouble.

urza9814 (profile) says:

Re: Re: They'd be exempt, of course

There are two separate issues here. The first is the security or flaws of the software itself — zero day exploits in Windows, Heartbleed, that sort of thing. But I don’t think any of that is relevant to the discussion of cryptography algorithms. Even if they put in some ridiculous key escrow system, those flaws would still have to be patched.

What the government wants are just weaker encryption protocols. And virtually any software that uses encryption is already designed to use multiple different protocols. So you have a special government patch that re-enables the existing secure algorithms or adds some new ones, and those are only available on government devices. Shouldn’t be THAT hard to do. Sure, that patch might leak, but anyone using it would automatically be a criminal and could be arrested just for that. They get your data, or they get you in a cage; either way they win.
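A minimal sketch of that point, using only Python's standard ssl module; the cipher strings below are purely illustrative, and whether the weakened policy is even accepted depends on the local OpenSSL build:

    # Most TLS software already ships with many algorithms and a policy knob
    # that selects which ones are allowed; "weak by default, strong via a
    # special patch" is just a change to that policy.
    import ssl

    ctx = ssl.create_default_context()

    # Hypothetical "weakened" policy: allow older, weaker cipher suites.
    # Modern OpenSSL builds may reject this string outright.
    try:
        ctx.set_ciphers("DEFAULT:@SECLEVEL=0")
    except ssl.SSLError:
        pass

    # The "government patch" equivalent: switch back to strong suites only.
    ctx.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2

    print([c["name"] for c in ctx.get_ciphers()][:5])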

Long-term it might cause a loss of security knowledge (or maybe not — there’s still plenty of other security bugs to chase)…but not nearly fast enough to get the law actually repealed. Most of the people pushing for such laws would probably be retired by the time that became a problem for them. And there’s always the possibility that they could keep that knowledge internally — they’ve done it before. We already have evidence that the NSA was ten years ahead of academic crypto at one point in time, so clearly they have (or had) some ability to keep decent internal talent for things like that.

Anonymous Coward says:

Re: Re: Re: They'd be exempt, of course

We already have evidence that the NSA was ten years ahead of academic crypto at one point in time,

That was at a time, or just after, when encryption was largely limited to governments and big corporations, with implementations being largely special hardware. Academic study of crypto was largely theoretical. Then the Web took off, and there was a demand for crypto, and computers powerful enough to run it.

Anonymous Coward says:

Re: Re: Re: They'd be exempt, of course

The first is the security or flaws of the software itself — zero day exploits in Windows, Heartbleed, that sort of thing. But I don’t think any of that is relevant to the discussion of cryptography algorithms.

That’s correct, but even ignoring implementation flaws, we can’t discuss cryptographic algorithms in isolation. The logistical details are security-relevant.

So you have a special government patch that re-enables the existing secure algorithms or adds some new ones, and those are only available on government devices. Shouldn’t be THAT hard to do.

Do you realize how close that is to the reality of the 1990s? Netscape and Internet Explorer had 40-bit cryptography for most of the world, and 128-bit for Americans and Canadians. That arrangement ended in 1996, and nsa.gov was found vulnerable to a 40-bit-downgrade attack 19 years later. They’re supposed to be the experts. (OK, it’s their public website and not their top-secret internal network. Still, by then there were no 40-bit clients to intentionally support.)

Wikipedia claims, with poor sourcing, that "Acquiring the ‘U.S. domestic’ version turned out to be sufficient hassle that most computer users, even in the U.S., ended up with the ‘International’ version […]. A similar situation occurred with Lotus Notes for the same reasons."

Note also that this hypothetical "government patch" is going to have to be available to all sorts of entities outside the federal government. Power plant operators, banks, water filtration, local police departments—failures here can destabilize a country. A secret shared between thousands of entities, each with many people, won’t remain secret for long. But sure, it’s easy enough to create.

urza9814 (profile) says:

Re: Re: Re:2 They'd be exempt, of course

"A secret shared between thousands of entities, each with many people, won’t remain secret for long. But sure, it’s easy enough to create."

Why would it need to stay secret? Is all information about the existing secure algorithms going to be scrubbed from the face of the earth somehow once they pass such a law? It doesn’t matter that people can still find and use strong crypto if the law makes that alone a reason to arrest them.

Anonymous Coward says:

Re: Re: Re:3 They'd be exempt, of course

Encryption was illegal in France until about 2000, but people were using it. A law like that isn’t practical, especially if people have an easy workaround like applying a patch. It’s not even obvious that the use of strong encryption could be easily detected. People eventually figured out how to tamper with the Clipper Chip’s LEAF data so that their traffic would look normal but wouldn’t allow law enforcement access. The NSA experts who designed this backdoor apparently didn’t notice the problem.

Anonymous Coward says:

Re: Re:

"said a former senior intelligence official." isn’t good enough.

But it’s good enough for Fox News, Breitbart, Drudge, Project Veritas, and every single other conservative news source? Please. Anonymous sources are a foundational aspect of investigative journalism and have been used for DECADES.

Unless you can provide evidence that what they reported on is objectively false, there is no reason that "said a former senior intelligence official" should not be taken seriously.

just written another conspiracy theory like the THREE YEARS OF THEM that all turned out to be NOTHING

Such as? Got the spine to back that statement up?

Why are you so ticked off about this? This is encryption, not your normal article that you go off on. Whether Yahoo’s story is true or not doesn’t change the fact that the FBI needs strong encryption to protect against this type of thing. And that same encryption protects consumers from hackers and Nigerian princes who want your money. Are you saying you want to be hacked?

Also, weren’t you leaving?

That One Guy (profile) says:

'We deserve utter opacity, the public, complete transparency'

Were Comey and Wray so completely clueless that they didn’t think these two situations had anything to do with one another? Or does it mean that they thought "hey, if we had our comms exposed, so should everyone else?" Or do they just not care?

I’m going to go with option D, a slight modification of the first choice: The encryption they would use would not be the broken encryption they want the public to use, and as such they likely consider the two to be completely unrelated, not because they are clueless so much as maliciously indifferent.

The mindset that seems to be in play is that the public has no right to privacy if the government wants to sniff around, but the same is most certainly not true in reverse. That would make it entirely consistent for them to continue to push for broken encryption for the public, as that makes it easier to snoop around on a whim, while at the same time pushing for stronger protection for their own communications, as that keeps other people from snooping through their stuff.

Anonymous Coward says:

Re: Re:

I’m confused by the academic stance on encryption, maybe one of you experts could enlighten me: As I understand it, the only good encryption is encryption that is entirely exposed, that is, every detail of the technique is in the public domain and subject to review by every potential adversary. Not only that, but it is also subject to SPECIFIC MODIFICATION by GOVERNMENT AGENCIES. Wow.

Simultaneously, the common view of the academic community is that any encryption technique that is NOT exposed, but instead kept private, is weak and should not be used because it is NOT available to every potential adversary. Publicly available techniques: GOOD. Secret techniques: BAD. That’s the summary if you read the experts (even here).

Isn’t this entirely ass backwards? If I have a private encryption method, and I don’t tell you what it is, isn’t it HARDER for an adversary to decrypt it?

I would also note that the US government does NOT expose the details of their most secure encryption systems. And YET, ALL of the encryption GENIUSES in the ACADEMIC community (even here) want every detail EXPOSED.

Weird or what?

Anonymous Coward says:

Re: Re: Re:

You should learn what "security through obscurity" means and why it’s a bad idea.

http://www.pearsonitcertification.com/articles/article.aspx?p=2218577&seqNum=7

In short, security through obscurity may work temporarily, but it runs into two serious problems. (1) Not nearly as much testing is done to find weaknesses and vulnerabilities, so it’s less secure that way and (2) should the obscurity disappear (meaning someone figures out a way in) you’re totally toast. One or both of those may be the reasons for the above story.

On the other hand, by being open and open sourcing things, you’re right that it gives attackers some details to explore, BUT, the point is that the cryptography gets much more widely tested and searched for holes, meaning that it’s likely to be much more secure. As for it being "exposed," the whole point of smart cryptography is that even when exposed you can’t crack it — so that entire assumption in your point is incorrect.

Also, for what it’s worth, it’s not the "academic stance" on encryption that you’re misunderstanding. It’s the widely accepted commercial and government stance on encryption. The government frequently relies on the same open cryptographic code that you incorrectly suggest they don’t use.

Anonymous Coward says:

Re: Re: Re: Re:

Interesting. A lot of sideways reasoning and hints and guesses about what might be good for CONSUMER encryption, not SPY encryption, which is the subject of this article.

So, what you’re saying is obscurity "may work temporarily", by which you mean during my lifetime, that should be good enough. Testing? That doesn’t mean much either, because if the government DID have a way to crack a public scheme, they sure as HELL wouldn’t tell you. Regarding being toast, how many articles have you read about people’s "encrypted" information being exposed? Answer: A LOT. It’s doubtful that there is ANY public scheme that has not already been cracked, that seems much more likely than someone cracked a private scheme between two committed spies.

"Gives attackers SOME details to explore"? How about just admitting public schemes give ALL THE DETAILS, that’s my point. You say "cryptography gets much more widely tested", but by WHO EXACTLY? Honest brokers who make a living so doing, or government sponsored academics that have a REASON to peddle a bullshit argument like "Open Source encryption is secure". What a crock.

The most secure standards used by the US government are NOT open source. Why is that, do you suppose? Maybe they’re NOT STUPID? Could that be it?

By the way, are you Chinese? I understand the Chinese have a vested interest in Open Source, from a lot of angles (such as them being too stupid to invent anything so they prefer we give them stuff for free). Do you take money from the Chinese? Who pays you to promote your misleading and obviously biased and unreliable opinion?

PaulT (profile) says:

Re: Re: Re:2 Re:

"So, what you’re saying is obscurity "may work temporarily", by which you mean during my lifetime, that should be good enough"

Wow, that’s naive. Obscurity works until someone spots it, in which case it doesn’t work at all. When will that be? You don’t know, could be after you die, could be in 30 minutes.

Are you willing to bet everything on the idea that it will be the former and not the latter?

"Regarding being toast, how many articles have you read about people’s "encrypted" information being exposed? Answer: A LOT."

Yes, and that’s just the ones that are publicised to get the public to take action. Can you imagine how many are not being made public?

"The most secure standards used by the US government are NOT open source"

I bet you can’t name them or describe their origins, so I bet this is at least partially false.

"By the way, are you Chinese?"

It’s always about xenophobia with you, isn’t it? Can’t you take actual criticism from your own countrymen, so you have to lie about who they are?

Canuck says:

Re: Re: Re:2 Re:

My god, you’re quite the retarded troll. The U.S. government makes extensive use of AES256, for example, and it’s completely "open source". Surely you can find its FIPS standard document, well, no, you can’t be arsed, can you?

Non peer reviewed encryption has always been garbage. WiFi encryption is as good an example as any.

Scary Devil Monastery (profile) says:

Re: Re: Re:2 Re:

"So, what you’re saying is obscurity "may work temporarily", by which you mean during my lifetime, that should be good enough."

Nope. "obscurity" only works until revealed. In other words, one single successful hack later Russia will be reading every last confidential US communication without the US ever being able to discover that this is the case. In fact the US will keep assuming their algorithm still works.

"The most secure standards used by the US government are NOT open source. Why is that, do you suppose? Maybe they’re NOT STUPID? Could that be it?"

Or you are wrong. The standard approved official encryption algorithm used by the US is AES. An open-source standard. And one the NSA uses as default. In fact, hilariously the NSA were the ones who advocated the open-source solution. Because they found EVERY proprietary encryption algorithm to be breakable.

"By the way, are you Chinese? I understand the Chinese have a vested interest in Open Source, from a lot of angles (such as them being to stupid to invent anything so they prefer we give them stuff for free)."

So if someone advocates open source one has to be Chinese? That means some 90% of US software companies today, as well as the frigging NSA, are Chinese, then?

I have a better answer for you. You are an idiot.

For more than one reason.

Anonymous Coward says:

Re: Re: Re:2 Re:

Interesting. A lot of sideways reasoning and hints and guesses about what might be good for CONSUMER encryption, not SPY encryption, which is the subject of this article.

Consumer encryption is spy encryption and vice versa. What’s good for one is good for the other. There is no distinguishing between the two. If there is, then that means that one is inherently weaker than the other. And when you are talking about keeping things secure digitally, with encryption, weaker is never good.

So, what you’re saying is obscurity "may work temporarily", by which you mean during my lifetime, that should be good enough.

No, that’s not what he’s saying at all. He’s saying "temporarily" as in only until it gets discovered. At that point it’s worthless. It could be your lifetime, or it could be a day, or an hour.

Testing? That doesn’t mean much either, because if the government DID have a way to crack a public scheme, they sure as HELL wouldn’t tell you.

Perhaps not, but the more people who can get their hands on it and test it, the higher the chance of somebody finding the same way of cracking it that the government did and reporting it so it can get fixed. This is called "quality assurance testing". You may have heard of it. There are many people whose whole job is to try out new encryption standards, find the bugs, and report them so they can be fixed.

Regarding being toast, how many articles have you read about people’s "encrypted" information being exposed? Answer: A LOT.

Yes, and almost all of those instances have occurred because either the information was never encrypted in the first place or it was implemented poorly and a vulnerability was exploited. That doesn’t mean public encryption is bad. It means companies and governments suck at implementing proper security.

It’s doubtful that there is ANY public scheme that has not already been cracked,

AES 256 has not been cracked.

that seems much more likely than someone cracked a private scheme between two committed spies.

If those two spies used poorly implemented or weak encryption, then no, I would find that much more likely. Which is exactly what happened here.

"Gives attackers SOME details to explore"? How about just admitting public schemes give ALL THE DETAILS, that’s my point.

At the time of encrypting something, the encryption key is randomly generated. Even if you know how the encryption works, if you can’t figure out what that key is, you’re screwed and the data is secure. Knowing HOW an encryption scheme generates that key does not give you the key itself, since it’s more or less random. (In public-key schemes like RSA, the keys are also built from extremely large prime numbers, which are notoriously difficult to factor past a certain size.)
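A minimal sketch of that point, assuming the third-party Python "cryptography" package is installed: the AES-GCM algorithm is completely public, yet the only secret an attacker actually needs, the random 256-bit key, never appears in the ciphertext.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # one of 2**256 possible keys
    nonce = os.urandom(12)                     # unique per message, not secret
    aesgcm = AESGCM(key)

    ciphertext = aesgcm.encrypt(nonce, b"meet at the usual spot", None)

    # Knowing the algorithm, the nonce, and the ciphertext is not enough;
    # without the key, decryption simply fails.
    assert aesgcm.decrypt(nonce, ciphertext, None) == b"meet at the usual spot"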

You say "cryptography gets much more widely tested", but by WHO EXACTLY?

Anyone. That’s the point.

Honest brokers who make a living so doing, or government sponsored academics that have a REASON to peddle a bullshit argument like "Open Source encryption is secure". What a crock.

Dude, you really don’t understand how any of this works. The people who develop encryption don’t just blindly make changes to the encryption method if someone says they found a bug. The person submitting the bug has to explain exactly how to replicate the vulnerability and show proof that it makes it weaker. The developers then take that information and test to see if, in fact, they can decrypt encrypted data using that vulnerability. If they can, then they implement a fix (which is also not determined by the bug reporter), and then it gets tested again to verify that it fixed the vulnerability and didn’t introduce any others. Then it gets released publicly so that it can be more widely verified. I’m not sure how you think this works, but testing by more people is always a good idea. Why do you think Microsoft encourages people to test every new release before they make it official?

The most secure standards used by the US government are NOT open source.

They are. In some cases they even state this as a requirement for new hardware and software. Do you have evidence to the contrary? Please provide it.

Why is that, do you suppose? Maybe they’re NOT STUPID? Could that be it?

Or maybe you are spouting off a bunch of nonsense because you have no idea what you are talking about.

By the way, are you Chinese? I understand the Chinese have a vested interest in Open Source

No, they don’t. And the reason why is that they can’t control it or who gets access to it. Only closed source, state-approved stuff is what they want their people to use. I guarantee the Chinese government itself is using open source encryption, though.

from a lot of angles (such as them being too stupid to invent anything so they prefer we give them stuff for free).

And here we go, cue the bigotry. Chinese people are not stupid. Do you want me to list all the things they have invented over the years?

Do you take money from the Chinese? Who pays you to promote your misleading and obviously biased and unreliable opinion?

Right back at you. I mean, you don’t even know how encryption works. The entire rest of your comment is just nonsense because of that. Obviously you’re either a shill, a troll, or a bigoted moron who doesn’t understand what you’re talking about. Take your pick.

Scary Devil Monastery (profile) says:

Re: Re: Re:

"Isn’t this entirely ass backwards? If I have a private encryption method, and I don’t tell you what it is, isn’t it HARDER for an adversary to decrypt it?"

No, not really.

An encryption algorithm is basically math. If all you’ve got working on a formula is a small set of experts from one given nation you will be left in the dust.

When the same algorithm has been worked on and criticized by everyone in the world with an interest in trying to find a weakness in it, then that algorithm becomes functional.

That’s why the NSA, for instance, mainly uses open source encryption algorithms: once you have a fully vetted encryption algorithm to use as the standard, the remainder of the security comes down to key management. This is where the weakest link is usually found – in how easy it is to read or guess the decryption keys in transmission.
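As a rough sketch of that key-management point (again assuming the third-party Python "cryptography" package), a Diffie-Hellman-style exchange such as X25519 means the session key is never transmitted at all; each side derives it locally and only the public halves cross the wire:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Only alice_priv.public_key() and bob_priv.public_key() would be sent.
    shared_a = alice_priv.exchange(bob_priv.public_key())
    shared_b = bob_priv.exchange(alice_priv.public_key())
    assert shared_a == shared_b

    # Derive the actual session key locally from the shared secret.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"demo").derive(shared_a)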

Scary Devil Monastery (profile) says:

Re: Re: Re:

"I would also note that the US government does NOT expose the details of their most secure encryption systems."

Actually it does. The standard US encryption algorithm for Top Secret and above is AES-256, a public standard that the NSA itself approved for classified use.

Fully open source, in other words.

In fact, the NSA was heavily pressured to use a proprietary standard of both encryption algorithm and software platform. They refused, because in their own words, they couldn’t find a single proprietary standard which wasn’t easily breakable.

Hence why they launched a public contest to determine which open standard should be adopted to encrypt US Top Secret material.

"I’m confused by the academic stance on encryption…"

No, you are confused about education in general. Not that this is exactly new with you, Baghdad Bob.

Anonymous Coward says:

Re: Re: Re:

As I understand it, the only good encryption is encryption that is entirely exposed, that is, every detail of the technique is in the public domain and subject to review by every potential adversary.

Correct. The reason is that A) it gets reviewed by a lot of security and code experts who can spot and report potential vulnerabilities and B) it allows people who want to use it the opportunity to vet it to make sure it is secure enough for their needs.

Not only that, but it is also subject to SPECIFIC MODIFICATION by GOVERNMENT AGENCIES. Wow.

Sort of, but not in the way that you think. The government can’t mandate that it be weakened; they have input into it, but the developers don’t necessarily have to follow that input, especially if it would weaken the strength of the encryption.

Simultaneously, the common view of the academic community is that any encryption technique that is NOT exposed, but instead kept private, is weak and should not be used because it is NOT available to every potential adversary.

Incorrect. It’s weak and should not be used because it’s not widely available to be tested and examined for vulnerabilities. It would be like being forced to eat a brownie from a guy off the street who swears he "used only FDA approved ingredients" but won’t tell you which ingredients he used if you ask him. Just because an adversary can see how the encryption works, doesn’t mean they can crack it if it’s strong enough and had all the vulnerabilities fixed. No vulnerabilities means no way in.

Publicly available techniques: GOOD. Secret techniques: BAD. That’s the summary if you read the experts (even here).

Yes, that is correct.

Isn’t this entirely ass backwards?

Not at all.

If I have a private encryption method, and I don’t tell you what it is, isn’t it HARDER for an adversary to decrypt it?

Only if your private method is of sufficient strength and contains no vulnerabilities. If your encryption method is your birthdate, well, even if you don’t tell people how it works, that will be cracked in seconds.
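A toy sketch of why that birthdate "method" falls over (the key-derivation scheme here is invented purely for illustration): the whole key space is every plausible date, roughly 36,500 guesses, which a laptop exhausts in well under a second.

    import hashlib
    from datetime import date, timedelta

    def key_from_birthdate(d: date) -> bytes:
        # A made-up "private" scheme: hash the birthdate into a key.
        return hashlib.sha256(d.isoformat().encode()).digest()

    secret_key = key_from_birthdate(date(1987, 6, 15))

    # The attacker doesn't need to know the date, just the tiny key space.
    start = date(1920, 1, 1)
    for offset in range((date(2020, 1, 1) - start).days):
        guess = start + timedelta(days=offset)
        if key_from_birthdate(guess) == secret_key:
            print("key recovered:", guess)
            break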

I would also note that the US government does NOT expose the details of their most secure encryption systems.

No, but it’s no secret what encryption they use, AES256. What they protect with that is irrelevant.

And YET, ALL of the encryption GENIUSES in the ACADEMIC community (even here) want every detail EXPOSED.

Yes. Knowing how encryption works does not mean you can actually crack it. I recommend looking into how encryption works. That should give you an idea of why even if you know how it works, that doesn’t mean it’s crackable.

Weird or what?

Neither. This is 100% logical. What would be weird is if you were correct: then it would make absolutely no sense that everyone in the world is using AES256. Since it’s open source, it should have been immediately cracked by children, yet it’s not.

Anonymous Coward says:

Holy cr*p! that is...not surprising

As a person in a related field, I tend to look at security with a grain of knowledge and realize just how much about the subject I don’t know. It is easy to feel overwhelmed when the supposed experts of the field get their security cracked.
But then I remember the past incidents regarding big companies and the government and realize that the keys were probably left in a database openly facing the internet with a login of admin and Password1.

Scary Devil Monastery (profile) says:

Re: Holy cr*p! that is...not surprising

"But then I remember the past incidents regarding big companies and the government and realize that the keys were probably left in a database openly facing the internet with a login of admin and Password1."

Somehow it’s always the case that the "crack" turns out to be the result of an idiot who left the passkey written down on the computer case or used one of the top ten contenders of a dictionary attack for a password.
