Proper channels typically means "up and out" -- the problem being that "up" often means reporting to the problematic party in the first place. Not sure what a proper "out" channel would be, but I wonder if giving government contractors a way to bring cases directly, discreetly, and pseudonymously to the judiciary would work.
Because not all laws are written this way -- many laws still include concepts such as "reasonableness" or "substantial evidence", which permit a fair degree of judicial discretion.
Judges also decide what to do if you have two laws that are otherwise clear but contradict each other when presented with a particular test case (that wasn't anticipated at the time the law was drafted or "encoded").
Re: Re: Re: Re: Re: Who is really the zombie here?
Because natural language processing isn't quite there yet and the law isn't (yet) written in a machine-readable format (although there are attempts to do this -- take a look at https://github.com/mpoulshock/hammurabi).
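To see why only part of the law is encodable, here's a toy sketch (my own illustration, not taken from the hammurabi project -- all names are hypothetical): a bright-line rule reduces to a simple predicate, while an open-textured standard like "reasonableness" can only be flagged for human judgment.

```python
# Hypothetical illustration of encoding law as code.

def can_form_contract(age: int) -> bool:
    # Bright-line rule ("a person under 18 may not form a binding
    # contract"): maps cleanly to a boolean predicate.
    return age >= 18

def is_reasonable(conduct: str) -> bool:
    # Open-textured standard: there is no mechanical test, so an
    # encoder can only defer to judicial discretion.
    raise NotImplementedError("requires judicial discretion: " + conduct)

print(can_form_contract(17))  # False
print(can_form_contract(30))  # True
```

The first function is the easy case projects like hammurabi target; the second is the kind of provision that keeps judges (and the discretion mentioned above) in the loop.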
What part of this is outright cronyism vs. regulatory capture? I know plenty of people who could build healthcare.gov quickly, reliably, and cheaply, but I've also seen plenty of government contracts. Those things can be monstrous, and there are plenty of qualified individuals unable to work on healthcare.gov solely because they couldn't (or didn't want to) comply with all of the government's rules. Imagine if you had to do a cost-benefit analysis or choose the lowest bidder on every sub-component of your system. Ugh.
One thing that gets tossed around a bit here but is missing from the privacy discussion: Information wants to be free. Why doesn't that apply to information we want kept secret from the NSA?
We usually use that phrase in the context of paywalls or DRM. But it's absolutely relevant here as well. Even if we didn't explicitly bargain for the NSA to see our private information (much as content holders don't bargain for their content to be shared outside of the original licensee), anything we put on the Internet can and will make its way out to them if they truly want to see it -- if not by the NSA, then surely by a foreign government which owes us even less accountability than the NSA (if such a thing is possible).
That doesn't mean we have to condone domestic spying, much as we can recognize piracy happens without condoning it. But it does suggest that attempts to keep information private are a temporary stop-gap at best.
I'd argue that a better place to draw the line is not "what does the government know?" but rather "what can the government do with what it knows?". It's hard to control the flow of information, but it is (somewhat) easier to recognize and prevent certain conduct. I'm not sure what those conduct-based lines would be, but the DEA's prosecution of drug offenses based on NSA intel definitely falls on the wrong side. Privacy is a part of civilized society -- but not all aspects of civilization can be legislated. IMHO, our efforts are probably better directed at identifying specific harmful acts we want the government to refrain from, rather than a blanket ban on domestic surveillance.
The problem with the trade-off analysis is that it trivializes the issue of privacy somewhat. If we're OK with sharing private information as part of an exchange for services, but disapprove of that information being acquired without our consent, that implies that what the NSA is doing is the equivalent of taxation without representation (or inadequate representation).
That's an important issue for sure -- one important enough to have started the American Revolution -- but I don't think that's the harm people are thinking of when the NSA spies on them. For example, the CIA spends all sorts of taxpayer money on secret gadgets, many of which probably have questionable benefits for national security. But that doesn't invite the same type of outrage that Snowden's revelations did.
This actually makes sense if you accept the basic premise of the NSA's argument re privacy -- there is nothing wrong with collecting information so long as we don't act on it in an inappropriate manner. By way of analogy, Google's collection of Wi-Fi data via StreetView was improper, but ultimately harmless since Google deleted the data it collected without sharing it or doing anything with it. The fact that it's happening 1000s of times is meaningless if you consider each violation unimportant (1000 times nothing is still nothing).
The more damning argument, IMHO, is the revelation that the NSA data is, in fact, not merely being improperly collected but improperly used against U.S. citizens. Specifically, there is no reason for NSA data to be shared with the IRS or the DEA, no matter how broad a definition of national security you throw out there. Full stop. But it is. And that's wrong even under the NSA's rules.
This reminds me of early Chinese filters where they would ban references to dates (June 4 -- the date of the Tiananmen Square protests), which only served to make otherwise non-subversive Chinese citizens curious why the censors were flagging invitations to get coffee on June 4.
The analogy to computer security isn't applicable here. Transparency works well for "defensive" security because everyone with an interest in maintaining that security can find and fix exploits.
The NSA's job is not (purely) defense. It is offense. Its objective is to exploit holes in the security of its targets to collect signal intelligence. Revealing those exploits ahead of time would be counter-productive.
That said, where the NSA is involved in less offensively oriented activities, it has been surprisingly open. See, e.g., the open source Accumulo database.
I'm not Mike -- but I'd hazard a guess that he'd be okay with trademark law if it got rid of dilution and returned to its likelihood-of-confusion origins, although it becomes more of a consumer protection law than an IP law at that point.