Not so fast. It's like this. You have the right to blow the whistle to the public on crimes committed by the government, and you should take responsibility for the repercussions of that by spending 9 months in solitary confinement before being placed in prison for 35 years.
At least I *think* that's what aglynn is trying to say...
Hello Cory Doctorow. Once again we have an excellent example of how people really don't understand modern technology. I don't have a phone in my pocket; I have a general-purpose computer that happens to have a phone application running on it. Does this mean that encrypted-by-default laptops would be illegal? How about desktops? What if your company issues you one? What if you take a pocket-sized supercomputer and plug it into a full-size keyboard and screen? Car GPS computers? Car radios? Your home cable modem?
We are increasingly surrounded by more and more of these "other such devices"; pretty soon that's all we will have. Arguing that law enforcement has a right to snoop in every piece of technology I own without a warrant is abhorrent.
Re: Re: This is why we should Encrypt All The Things
This fight is arguably easier, though. It is hard for many people to justify sharing copies of movies that they didn't pay for; it is not nearly as hard to say "I encrypt all my emails just because." Sure, it might be weird, but it's not immoral by almost anyone's definition. One has a moral stigma whether or not everyone does it, like masturbation; the other doesn't.
I smell another wordplay buried in this line of thought. On the one hand we have "expect" as the assumed state of affairs (e.g.: I expect that it will be cold this winter - I consider it likely). On the other we have the definition that the government is using for "expect" as in what is demanded or required (e.g.: I expect this to be taken seriously - I demand no less).
So we can both be right at the same time. I can expect privacy, and I cannot expect privacy.
I reject your connotations, and substitute my own. I demand my privacy; unfortunately I assume that my government is violating it.
> Instead of making the content available via multiple platforms to inspire competition and better services they chose to block the content from people who are doing it right....
It comes down to a fear that the new tech will cannibalize the existing revenue stream because it provides an alternative. Which is true; it would, in the short term. But even if it would generate massively more revenue in the long run, that does nothing to pad this quarter's earnings statement. I'm pretty sure Netflix wasn't profitable on day 1. So to satisfy the need to always grow profits in the short term, they will only look at things that immediately add more money to their pockets without shrinking revenue anywhere else (similar short-term thinking was demonstrated by the Verizon FiOS buildout and stall circa 2008).
a rep from the MPAA.... said [to me]: "When you buy a movie to watch in your living room, we're only selling you the right to see it in your living room. Sending the same show upstairs to watch in your bedroom has value, and if it has value, we should be able to charge money for it."
... to say we have no expectation of privacy in the business records created by our phone usage is to say we have no expectation of privacy in the billing records created by our family doctor visits, or the administrative records from our library patronage.
I've pointed this out before: we can do better with the encryption. We can use Shamir's Secret Sharing Scheme to split the encryption key into 3 parts and require any 2 to decrypt. Distribute the shares to a judge, the police department head, and a third party (the ACLU or something similar), encrypting each share with its recipient's public key (so the recipient has to use a personally assigned private key to decrypt it), and require any 2 of the shares to reconstruct the video encryption key. Now illegal or unethical review of the video requires the collusion of two people, not just one loose cannon.
The only technological weak spot then is preventing the video encryption service from leaking the key before it is wiped (a unique key per video). One solution is to use public key cryptography so the video encryption service only ever holds half the key; the other half is distributed beforehand. The problem with that is you cannot use a separate key for each video, so one order to unlock one video exposes a key that can then be used on any other video.
But either solution is better than nothing, and definitely much better than entrusting the key(s) to any one entity.
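To make the 2-of-3 split concrete, here is a minimal sketch of Shamir's scheme over a prime field. This is illustrative only: a real deployment would use a vetted library, a prime larger than the key space, and would add the per-recipient public-key encryption of each share described above. The custodian names are taken from the example above.

```python
import random

# Field for the arithmetic; shares and secret live in GF(PRIME).
# (Illustrative choice - real systems pick a prime larger than any key.)
PRIME = 2**127 - 1  # a Mersenne prime

def split_secret(secret, n=3, k=2):
    """Split `secret` into n shares, any k of which recover it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x=0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)           # the per-video encryption key
judge, chief, aclu = split_secret(key)  # one share per custodian
assert recover_secret([judge, aclu]) == key   # any two recover it
assert recover_secret([chief, judge]) == key
```

A single share reveals nothing about the key, which is exactly the "requires collusion of two people" property: the polynomial is degree 1, so one point leaves the constant term completely undetermined.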
You do raise a good point about giving the police a tool to review and correct mistakes, but they already have other ways to do this, and I'm not sure how serious they are about using videos to train.
I agree with the proposal in part, but I think the conflation of "bad" with "insecure" is a step backwards. There should be 4 levels:
1) Insecure: exactly what it says on the tin. There are no security measures; e.g.: served over HTTP, or the certificate is invalid. A site should be presented this way if it uses a bogus or known broken "encryption" method.
2) Encrypted (and only encrypted): Separating encryption from authentication would let small webmasters deploy self-signed certificates for many things that don't really matter. This is currently displayed as "bad", but it really shouldn't be in all cases. The fact that self-signed certs are presented to the user as an error is the biggest hindrance to HTTPS everywhere, and training users to ignore this error is training them to ignore MITM attempts.
3) Encrypted, Authenticated by Third Party: The site has a valid cert that is signed by one of the ~200 CAs (or a chain of them). This is the level sites are currently pressured to achieve. Throw in EV certs here too.
4) Gold (for lack of a better term): This cannot be automatically chosen by the vendor except in very specific circumstances. This is basically user-controlled cert pinning. The user has decided that they trust communication with this site if the communication is signed with this specific cert (or any cert signed by a specific CA). If the cert ever changes, the warning should be gigantic.
An education campaign focusing on "the gold standard" would be necessary so that users know to mark their banking sites. And so that they understand that just because something is "encrypted" doesn't make it secure; it would be easy to draw up illustrations of having a secret conversation with the bad guy. "Sure, your conversation is encrypted, but you could be having an encrypted conversation with your stalker. Encryption is not enough for security, you also need *trust*."
We absolutely must move away from the disaster that is the third-party CA "trust us, because someone paid someone who paid someone who paid us money" system if there is any hope for real security; so all of this must be controllable by the user. If a user wants to blacklist a specific encryption algorithm, then any site using that algorithm should display as "insecure". A user should be able to promote a site/cert from "encrypted only" to "gold", and to "untrust" any site/cert combo. Sure, some parts of this are already possible, but the UI is a fucking disaster, and the UI *must* improve. Certs, PGP, and other PKI implementations already give us all the necessary tools for strong personal crypto; it's the UIs around those tools that let us down.
And one final thing: browsers should try HTTPS before HTTP by default.
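The four levels above amount to a simple classification over what the browser already knows about a connection. Here is a sketch of that mapping; the `ConnectionState` fields are hypothetical names invented for illustration, not any real browser API:

```python
from dataclasses import dataclass

@dataclass
class ConnectionState:
    """Hypothetical summary of what a browser knows about a connection."""
    encrypted: bool      # TLS was negotiated at all
    cipher_broken: bool  # bogus or known-broken "encryption" method
    ca_validated: bool   # cert chains to a trusted third-party CA
    user_pinned: bool    # user explicitly pinned this cert/CA
    pin_mismatch: bool   # a user-pinned cert changed unexpectedly

def security_level(c: ConnectionState) -> int:
    """Map a connection to the four proposed levels (1=Insecure .. 4=Gold)."""
    if not c.encrypted or c.cipher_broken or c.pin_mismatch:
        return 1  # Insecure: plaintext, broken crypto, or a violated pin
    if c.user_pinned:
        return 4  # Gold: user-controlled pinning, highest trust
    if c.ca_validated:
        return 3  # Encrypted and third-party authenticated
    return 2      # Encrypted only (e.g. an unpinned self-signed cert)

# A self-signed cert the user has not pinned lands at level 2:
print(security_level(ConnectionState(True, False, False, False, False)))  # 2
```

Note the ordering: a pin mismatch outranks everything else and drops the site to level 1, which is the "gigantic warning" case; a valid CA chain is irrelevant once the user has pinned, since the user's decision is the stronger signal.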
Does this mean that Microsoft now has to comply with the US DOJ's request (complete with a gag order) for some citizen's emails stored on servers in Ireland (where it is widely suspected that the person is an Irish citizen)?
I really would hope that Google is above that nonsense, but even if not, I don't think they can. As stated, Google wasn't even making money from its Spanish news aggregation, so there would be no lost profits.
But I don't think it's either. I think we will see more conflicts between these two interests in the future: the right to privacy and the right to a transparent government. In a lot of cases it's probably pretty clear cut, as the interests will have little to do with each other, or one will clearly trump the other; but there will be some surprising overlaps like this one, where the two sharply conflict.
I think this was a good outcome, though arguably not the best. Instead of steamrolling either concern (and both concerns are legitimate in this context, I think), they worked with the concerned citizen to continue the bodycam rollout and address the privacy impact.
Now I am anxious to see whether this 20-something programmer living with the parents has the technical chops to help the department and the public.