This is what Google does. They're like the corporate version of that guy who has a dozen half-finished projects in his garage. They start something and do well with it until it gets too tough/boring and then they quietly abandon it. I fully expect that to happen to Gmail some day.
Looks like both headers and body are actually signed with this. But this does bring up another good question. We have no idea how they got a hold of these emails. For all we know, they could've had root access to the server. If they did, and managed to get the server's private key for DKIM, would they be able to modify the emails after the fact and then just re-sign them? If yes, then validating with DKIM doesn't do much for verification. Not saying this is what's happening, just an interesting question.
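The trust model here is easy to sketch. Real DKIM signs selected headers plus a hash of the body with RSA or Ed25519, and verifiers fetch the public key from DNS; the toy below uses stdlib HMAC as a simplified stand-in for the signature primitive, just to show why possession of the signing key defeats after-the-fact verification:

```python
import hashlib
import hmac

# Simplified stand-in for DKIM signing. Real DKIM uses asymmetric
# signatures (RSA/Ed25519) over canonicalized headers and a body hash;
# HMAC here only illustrates the trust model, not the actual crypto.

def sign(message: bytes, private_key: bytes) -> bytes:
    return hmac.new(private_key, message, hashlib.sha256).digest()

def verify(message: bytes, signature: bytes, key: bytes) -> bool:
    return hmac.compare_digest(sign(message, key), signature)

key = b"server-private-key"
original = b"From: alice@example.com\r\n\r\nOriginal body"
sig = sign(original, key)
assert verify(original, sig, key)        # legit mail validates

tampered = b"From: alice@example.com\r\n\r\nForged body"
assert not verify(tampered, sig, key)    # tampering breaks the old signature

# But an attacker holding the private key can just re-sign:
forged_sig = sign(tampered, key)
assert verify(tampered, forged_sig, key)  # and verification passes again
```

So yes: with the private key, modify-and-re-sign produces mail that validates cleanly. DKIM proves the signer held the key at signing time, nothing more.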
There's a reason it's a fallacy. It doesn't say on any level that things are related.
I'll give you an example:

Penguins are black and white.
Zebras are black and white.
Therefore: Penguins are Zebras.
Clearly, that's wrong, and that's why the association fallacy is so bad. It can be used to relate any two completely unrelated things by simply taking a single quality that the two things happen to share and plugging it into that formula. It's the same way that you could say:
Theft = Bad
Murder = Bad
So: Theft = Murder
Yeah, let's leave association fallacy out of this.
Which is why the lawyer said that. If you automatically assume every case of imitation is infringing, and never investigate that assumption further before filing, then every filing is "in good faith" because you believed it was infringing.
It's wrongheaded for the NSA not to disclose vulnerabilities it finds. Even if their only job was "keeping Uncle Sam secure, not Wal-Mart," which would be a really stupid objective, keeping vulnerabilities secret in security products would mean the government itself is more vulnerable. Kind of stupid all around, if you ask me.
There's a reference for you. Basically, it's Portuguese barbecue, which is kind of big in Brazil. It would be like suing for the use of BBQ here in the US. It's not a tagline, it's simply telling you what it is. There's no way this lawsuit is going to stand.
You make a good point. I hadn't even thought about that aspect of what the FBI was asking for. I would like, however, to argue for my use of the word backdoor. To my way of thinking, a backdoor is a vulnerability that the creator knows about and can exploit for their own reasons. Would you agree with this definition? If so, I think this firmware upgrade process fits that definition.
But that's just the thing. Apple isn't crippling the encryption here. It isn't installing a backdoor. The backdoor is already there. That is the real story here. Everybody is concentrating on the FBI angle and completely ignoring the fact that Apple already has the ability to do what they want to your phone, passcode be damned. And now that we know about this ability, you can bet the legality is just an afterthought. The mere knowledge of it is enough that somebody (NSA) is already working on a way to exploit it.
I don't want to think about it. I would hope that Apple has the clamps on their little backdoor, but that seems too much to hope for, and now that it's been talked about, several organizations, including the NSA, are already working on exploiting it. Hopefully Apple does the smart thing and closes it in later iPhones.
I'm terrified to admit this, but I think what the FBI is asking for is surprisingly restrained and limited. Asking them to remove the passcode limits so they can more efficiently brute-force the thing is almost admirable compared to what they've been asking for. At least they're going to do some of the work themselves.
That being said, it terrifies me that Apple can do this at all. Note that I didn't say they were willing to do this, only that they can. This means that Apple isn't building a backdoor so much as they already have one that they will use to accomplish this. If you can perform firmware/OS updates that remove security features from a supposedly uncrackable device, that's a backdoor. It already exists and Apple just tipped their hand.
Let's make one thing perfectly clear. The FBI already had the means to crack this iPhone. All this backdoor does is make it slightly easier to do. There are software and hardware tools out there that can crack a 4-6 digit PIN, even with the lockouts/erase enabled. It just takes longer. That's really what this is about. The FBI didn't want to take the amount of time it would take to brute force the PIN without Apple's help, so they used the courts to force Apple to backdoor the lockout/secure erase functions, shaving quite a bit of time off the brute force attempt. So, while this is terrible, it's not quite as bad as it seems.
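The back-of-the-envelope math bears this out. Assume roughly 80 ms per guess (a figure in the ballpark Apple has cited for its hardware-enforced key derivation; treat it as an assumption here). With the lockout and erase disabled, worst-case time is just attempts times delay:

```python
# Rough arithmetic on brute-forcing a numeric passcode once the
# lockout/erase limits are gone. PER_GUESS_S is an assumed figure
# for the hardware-enforced key-derivation time per attempt.

PER_GUESS_S = 0.08  # assumed ~80 ms per guess

for digits in (4, 6):
    attempts = 10 ** digits
    hours = attempts * PER_GUESS_S / 3600
    print(f"{digits}-digit PIN: {attempts:,} attempts, "
          f"worst case ~{hours:.1f} h with limits disabled")
```

Under that assumption a 4-digit PIN falls in minutes and a 6-digit PIN in about a day; it's the escalating lockout delays and the 10-try erase that turn "about a day" into "not worth attempting," which is exactly why those were the features the FBI wanted stripped.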