You make a good point at #2, which is why a lot of canaries require action to stay valid: either (as in Reddit's case) by existing only within the annual filing (which gives it a lifetime of about a year), or by being a digitally signed message that clearly states it expires in, for example, 90 days.
This puts it on slightly firmer footing with regard to the theory that the government cannot *compel* false speech, it can only compel inaction (e.g., silence). Although there is plenty of room to test the theory of how well a digital signature can be equated to attestation.
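To illustrate the expiry mechanism, here is a minimal Python sketch. Real canaries use asymmetric signatures (e.g., PGP); the HMAC key below is a hypothetical stand-in, and the only point is that the expiry date lives *inside* the signed payload, so compelled silence kills the canary on its own:

```python
import hmac, hashlib, json
from datetime import datetime, timedelta, timezone

SECRET = b"demo-signing-key"  # stand-in; a real canary uses an asymmetric (PGP) key

def sign_canary(statement: str, valid_days: int = 90) -> dict:
    """Build a canary whose validity window is part of the signed payload."""
    expires = (datetime.now(timezone.utc) + timedelta(days=valid_days)).isoformat()
    payload = json.dumps({"statement": statement, "expires": expires})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_canary(canary: dict) -> bool:
    """Valid only if the signature checks out AND the expiry has not passed."""
    expected = hmac.new(SECRET, canary["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, canary["sig"]):
        return False
    expires = datetime.fromisoformat(json.loads(canary["payload"])["expires"])
    return datetime.now(timezone.utc) < expires

c = sign_canary("As of this date we have received no national security letters.")
print(verify_canary(c))  # True while inside the 90-day window
```

Because no one can forge a fresh signature without the key, the government's only lever is to forbid re-signing, which is compelled *inaction*, not compelled speech.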
Re: Who needs rights when you've got 'National Security: Because terrorists!'
That's exactly why Australia outlawed warrant canaries. It might be argued that the US effectively already has: since you can only report in bands of 1000, and the lowest band is 0-999, you can't just say "I have never...". That was also one of the theories on why Apple pulled theirs (it might still hold water; I can't find any updated references on the matter).
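As a tiny illustration of why the bands neuter a canary (the band width matches the reporting rules described above; the function name is mine, not from any statute):

```python
def reporting_band(count: int, width: int = 1000) -> str:
    """Map an exact request count onto the disclosure band it falls in."""
    lo = (count // width) * width
    return f"{lo}-{lo + width - 1}"

# Zero requests and 999 requests are reported identically, so the
# statement "we have received none" can never be made truthfully.
print(reporting_band(0))    # "0-999"
print(reporting_band(999))  # "0-999"
```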
But the fact that the Reddit CEO answered "I was counseled that I should not comment on that" means they were probably served. If they hadn't been served, they could simply point to this argument in support of "canaries are not legally certain enough to risk in court".
You are implying that the two are somehow exclusive, as if setting an impossible goal means that not achieving it is not a failure.
It's both. The reason it's a failure is because it's impossible. Unfortunately, beyond simple statistics, this is not a mathematical conundrum; the only way we know it's impossible is by looking at the results of every attempt to achieve it. Eventually, we can hope, everyone will conclude that it is indeed impossible; until then, all we can do is point out the failures and make reasoned arguments that the only outcome of the current course of action is failure.
> > While today’s data is encouraging, the challenges facing us are significant. The consumption of music is skyrocketing, but revenues for creators have not kept pace.
There you have it. They want you to pay every time you hear a piece of music. If the technology existed that would let them constantly monitor your brain and extract a fee from your account every time your auditory cortex registered a covered work, they would fight tirelessly to subject every person to it.
> Being forced to hand over the keys so FBI-OS can sign its code with them is equivalent to being forced to endorse FBI-OS. That would be compelled speech....
That's arguable. It remains to be seen whether a court would see it that way. Remember Ladar Levison. The fact that HTTPS connections are designed to guarantee authenticity, and require digital signatures to attest to that fact, might be enough to equate a digital signature with consent or endorsement. If the court buys that equivalency, and accepts the precedent as valid, then compelled speech is a done deal.
If we transfer from cyberspace to meatspace, the government could compel you to sign a false confession. Hell, they may as well be allowed to compel you to endorse or vote for someone of their choosing. Truly terrifying.
On the other hand, if the courts block this type of compelled speech, I can see the FBI demanding that all future systems that accept software updates must not refuse an unsigned software update, or that they accept one signed by a different key. Imagine all future TPM chips must have a government key baked in.
On a third hand, the court may not make that leap to equivalency, in which case we (the People) could lose this battle. To combat that legal attack in the future, the tech community could add a cert flag that makes the endorsement explicit. Kinda like how Linux kernel authors signal which license a symbol is usable under (EXPORT_SYMBOL vs EXPORT_SYMBOL_GPL). These methods have not been tested in court, however, and they may be shot down.
> This argument is particularly maddening: basically continuing the ridiculous line of thinking that protecting user privacy is some sort of deliberate marketing strategy against the government
At this point, the two are kinda synonymous; protecting my privacy necessarily pits all involved against the government, and it's entirely due to tactics like this.
> But again, Apple has always been willing to respond to legitimate government requests for information that it has access to. That's why that same chart shows that it complied with 81% of US requests as well. But that says absolutely nothing about the requirement to build a special system to hack in and access data that it does not currently have access to.
The tech companies are getting raped here, and the perpetrator is claiming to the judge that it's not rape 'cause "they gave me a handjob before; consent was already given!".
I hesitate a bit to draw a comparison to rape here. But seriously. We, The People, are getting fucked.
You know what, he's got a point. Think about that next time the words "slippery slope fallacy" spring to mind when you see an argument about government overreach. Slippery slope? Yeah. Fallacy? Not when the government greases the skids and hops behind the wheel....
Oooohhhhh, that makes it real easy then. Just EO-up a new branch of the military and presto! No need to worry about those pesky laws and acts that were only created to protect terrorists anyway. Why didn't I think of that?
So far they haven't been ordered; the court has given Apple 5 days to respond as to why compliance would be "too burdensome". I will wait until I hear the court's reasoning on the eventual order (or lack thereof) before I start blaming the court system in this case.
You raise a lot of good points, but trusting the people in government - and even the government itself - not to abuse this in the future is foolish, and has already been proven unsound.
I am *almost* hoping Apple loses this battle; it would force Apple to commit to the next step: wiping the key on firmware update. I think it would be possible. Perhaps, if the correct PIN is entered, the decrypted data key could be re-encrypted with a temporary ephemeral key and held in the main CPU's registers so it can be fed back to the firmware after the flash. This would be extremely complicated, very brittle, and therefore difficult to get right.
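A toy sketch of that handoff, in Python. Everything here is illustrative: the function names are mine, the one-time-pad wrap stands in for real re-encryption, and none of this reflects how Apple's hardware actually works. The point is only the sequence: wrap the key under an ephemeral secret, wipe the long-term copy, reflash, then unwrap.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def wrap_for_reflash(data_key: bytes) -> tuple[bytes, bytes]:
    """Wrap the decrypted data key under a one-time ephemeral key that
    would live only in CPU registers, never in flash."""
    ephemeral = secrets.token_bytes(len(data_key))
    return xor_bytes(data_key, ephemeral), ephemeral

def restore_after_reflash(wrapped: bytes, ephemeral: bytes) -> bytes:
    """After the new firmware is flashed, unwrap and hand the key back."""
    return xor_bytes(wrapped, ephemeral)

# Simulated flow: PIN accepted -> key wrapped -> long-term copy wiped
# (the reflash happens here) -> key restored from the ephemeral wrap.
data_key = secrets.token_bytes(32)
wrapped, ephemeral = wrap_for_reflash(data_key)
restored = restore_after_reflash(wrapped, ephemeral)
print(restored == data_key)  # True: the key survives the simulated flash
```

The brittleness the comment mentions is visible even in the toy: lose power mid-flash and both the ephemeral key and the wrapped copy are gone, so the data is unrecoverable by design.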
I worry, though, that in losing this battle there might be some overreaching order by the government prohibiting Apple (and any other US-based company) from progressing in security. That would be the final blow. So I am still also kinda hoping they win, and we get a good precedent about how much a court can compel action. I doubt we would get a good precedent out of this, though.
Apple owns the code, the government owns the phone; the dead guy was just the user. (I need to rewatch Cory Doctorow's "The Coming Civil War over General-Purpose Computing". He raised a lot of these types of questions.) I think if the court rules against Apple, it will probably be on these grounds, and by virtue of the necessity of the method of execution, compelled action will be deemed justifiable.

That would open up so many avenues for government overreach, it's frightening. Warrant canaries could be made useless - the government could simply compel you to sign the "everything is still OK" message. Journalists could be compelled to promulgate government propaganda to hide illegal activities. Corporations could be compelled to install spyware on their machines to spy on their employees. A pharmacist could be compelled to supply the government with the drugs necessary to execute convicts. And of course any tech company could be compelled to sign and/or install a backdoor into the software or devices they produce and/or package.

Hopefully, some of these things will be explicitly deemed out of bounds. But how much will fall in bounds from the collateral damage? Because if compelled action is deemed justifiable across the board - or even to a significant extent - then this experiment in self-governance is over. Once you combine compelled action with our existing framework of secret courts and gag orders, there is no way for the government to regain the trust and legitimacy necessary to govern by the will of the people. It will have been irrevocably shattered. Compelled inaction has done enough damage already.
I am kinda glad this is happening, though; this is one of the key debates that has needed to happen for a while now. Unfortunately, it's coming up during the election circus show, so none of the talking heads with the most focus - the candidates - can be trusted to speak the truth on this. I am extremely disappointed in Sanders, but not terribly surprised. I don't recall which issue it was, but he has been silent on it.
I would love to see that. I've seen a few presentations (at DEFCON and the like) about defeating shoddy TPM chips. It's quite fascinating. The TrustZone is essentially a TPM chip, so I'd imagine the same sorts of hardware tamper resistance have been built into it (which is why you can't just pop the cover off and hook up a JTAG).
I don't know whether their TrustZone is breakable, but the consensus is that if you physically tamper with it to "read" the bits directly, the data retrieval is less than 100%. And of course the process physically destroys the chip. No second chances. The TrustZone is also part of the main CPU die, so that may or may not complicate matters a bit.
But as far as software / firmware goes, yeah, the TrustZone is the weak link. This whole article talks about how we were all mistaken in one assumption: common wisdom would say that the TrustZone should wipe the data encryption key if told to reflash itself. But it doesn't. Normal operation wouldn't be affected, because the flash rewriter could feed the data encryption key back to it.