"This merely pushes the issue back one level. It is perfectly possible to store encrypted files on an encrypted file system. There is no requirement that the two encryption schemes share a common origin, scheme, or code base. You likely do this every day without realizing it: what do you think audio codecs are, or image/file compression?"
Pushing the issue back one level would be regarded as a significant win by the folks proposing this, as it dramatically reduces the number of people out there capable of working around the technical control. As to the other point above, as you say, there's no requirement, per se, for any common format or code base, but realistically, if you want to communicate effectively, you need some sort of common system, and whether or not they realize it, most people aren't sufficiently competent to roll their own. This leads, inevitably, to common systems, formats, code bases, and ciphers.
"If the government does mandate broken encryption on a device, you can bet that anyone wanting to keep their files secret will just put another private layer on."
Given de facto control of an OS, there's very little that can be done on that system that you can't also control.
Also, as to your final point: not all problems can be solved with technology, which is why you back up the technology with:
... or you could just go the route England did: "unencrypt this for us or go to jail".
It's not "or", it's "and". Possible financial and reputational ruin, coupled with the possibility of jail time, is a fairly hardcore administrative control.
Never underestimate the effectiveness of a public execution (literal or figurative). Consider the hardcore penalties sought by prosecutors under, e.g., the CFAA - think Aaron Swartz, or Deric Lostutter, whose hacking under the alias KYAnonymous helped bring about two rape convictions, and who is now facing more prison time than the rapists because of it. Yes, prosecutors will put the person away for a long time, but that's arguably a secondary goal. The primary goal - and we hear it stated over and over by prosecutors, county sheriffs, police captains, etc. - is deterring other people from undertaking similar actions.
"So how does the government go about making these shared key schemes mandatory? Bernstein v. United States established that source code was an expression covered under the 1st Amendment."
The US Government can't (legally) regulate the source code. So what? They don't have to. They can regulate access to public utilities.
Reclassify the internet as a public utility (for bonus points, subsidize access to it to ensure no one is left out based on their ability to afford it), and then specify the technical requirements for connecting to it. Make one of those technical requirements "responds appropriately to key escrow validation query" or something similar, and they're set. No valid response? No network access for you, and the technical data about the system gets logged for investigation.
Mobile providers are already regulated this way, so no issue there - they just need to add back-end hooks to make sure the OS is "government approved".
The technical capabilities already exist to do this at medium to very large scale, but they might require some tweaking to scale appropriately to, say, Cox Communications or Verizon Internet. Google "posture validation" and "network admission control". For a fair number of these networks, the code is already in place, and just needs to be licensed and configured.
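To make the "posture validation" idea concrete, here's a minimal sketch of the kind of gatekeeping exchange being described - a network access server quizzes the endpoint before granting a session. All the names and attributes here are illustrative assumptions, not any vendor's actual API:

```python
# Hypothetical posture-validation check: the access server demands that the
# endpoint attest to certain properties before it's allowed on the network.
# The required attributes below (signed OS, escrow key present) are invented
# for illustration.

REQUIRED_CHECKS = {"os_signed": True, "escrow_key_present": True}

def validate_posture(report: dict) -> bool:
    """Pass only if every required attribute in the device's report checks out."""
    return all(report.get(key) == value for key, value in REQUIRED_CHECKS.items())

def admission_decision(report: dict) -> str:
    if validate_posture(report):
        return "ACCESS_GRANTED"
    # Failed posture: deny, and (in the scheme described above) log the
    # device details for investigation.
    return "ACCESS_DENIED"

print(admission_decision({"os_signed": True, "escrow_key_present": True}))
print(admission_decision({"os_signed": True, "escrow_key_present": False}))
```

Real NAC products do roughly this at the 802.1X/RADIUS layer; the point is that the decision logic is trivial once the endpoint can be compelled to answer.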
And yes, posture validation systems - as with any security-related system - can be bypassed. Which is why the technical controls would/will be backed with administrative controls (make it a felony to bypass "any technical control intended to regulate access to a public utility") and aggressive prosecution of anyone caught attempting to do so. Oh, and the CFAA still applies.
It might take a decade or so to accomplish, but it's certainly doable. And frankly, you don't even need 100% coverage. Just get the percentage of covered devices high enough that it's possible to evaluate the outliers, and you're "close enough".
Actually, I think it's a little more nuanced than that:
It's (apparently) ok to call someone scum. It only turns defamatory when you preface it with an absolute, like "total scum", or "complete scum", thereby omitting the possibility that the defamed individual might be a quasi, hybrid, or otherwise partial scum. A possible example might be an incompetent scum.
By this logic, it's not defamatory because he only stated that Greenfield is a member of an illegal gang, not a "complete" or "total" member of an illegal gang....
Then that's on the teacher, and punishable as per their employment agreement. You know, the one where they agree to follow district policy, and then make a conscious decision not to?
It would - at the very least - partially insulate the school district from liability in this case. Which doesn't seem like a lot until your lazy teacher brings the federal government down on the school and the district.
If we're going to wield the "What part of illegal don't you understand" hammer, then here's a random thought:
The school district needs to be investigated for probable FERPA violations by the Department of Education's Family Policy Compliance Office. Because based on what's in the article, they clearly have problems controlling access to the computer systems containing student records.
Oh, and the head of their IT department needs to be fired for gross negligence - for allowing the system to be configured to use such a weak password in the first place. Because in Active Directory, at least, you have to specifically enable use of such crappy passwords.
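For reference, Windows' "password must meet complexity requirements" policy (on by default in a domain) roughly demands a minimum length, no account name embedded in the password, and at least three of four character classes. Here's a rough Python approximation of that rule - the exact Windows behavior has more wrinkles, so treat this as a sketch of the idea, not a faithful reimplementation:

```python
def meets_ad_style_complexity(password: str, account_name: str,
                              min_length: int = 7) -> bool:
    """Rough approximation of the Windows 'complexity requirements' rule:
    minimum length, must not contain the account name, and must draw from
    at least 3 of 4 character classes."""
    if len(password) < min_length:
        return False
    if account_name and account_name.lower() in password.lower():
        return False
    classes = [
        any(c.isupper() for c in password),   # uppercase letters
        any(c.islower() for c in password),   # lowercase letters
        any(c.isdigit() for c in password),   # digits
        any(not c.isalnum() for c in password),  # symbols
    ]
    return sum(classes) >= 3

print(meets_ad_style_complexity("1234", "admin"))        # a 4-digit PIN fails
print(meets_ad_style_complexity("Tr0ub4dor!", "admin"))  # passes
```

The point being: a password weak enough for a student to guess only gets onto a domain because someone deliberately turned this check off.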
This is, ethically speaking, a highly complex subject. And anyone with a black and white answer in either direction should have their motivations heavily scrutinized.
Imagine, for a moment, that a highly reliable test (say, 95%) were available for an incurable condition - say, ALS.
Would you want to know, with a high degree of certainty, that you would get it?
Tweak that: Now, instead of ALS, it's melanoma.
Tweak that: Now, it's a condition that can only be passed to a child if both parents have a particular recessive gene. And one of your parents doesn't have it.
Tweak that: Now, one of your parents - whom you love dearly - isn't your actual parent, and you live in a culture where adultery is punishable by death.
Does your answer change, based on the scenario?
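One wrinkle worth spelling out: "95% reliable" means much less than it sounds for a rare condition, because of base rates. A quick Bayes' rule calculation (the prevalence figures below are purely illustrative):

```python
def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """P(you actually have the condition | the test says you do)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Rare condition (illustrative prevalence of 5 in 100,000):
print(f"{positive_predictive_value(0.95, 0.95, 0.00005):.1%}")
# -> well under 1%: nearly every positive is a false alarm

# Common condition (illustrative prevalence of 10%):
print(f"{positive_predictive_value(0.95, 0.95, 0.10):.1%}")
# -> roughly two-thirds of positives are real
```

So even before the ethical questions, the "high degree of certainty" in the scenario depends heavily on how rare the condition is.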
At a small scale, the implications are enormous. At a larger scale, the implications to society could be staggering.
Some people would absolutely want to know. Others absolutely wouldn't. Some might want their primary care physician to be informed, but wouldn't want to know themselves. The reactions would run the full gamut of possible answers.
I think that, from both practical and ethical perspectives, all-or-nothing isn't going to be a viable option here. Some sort of opt-in/opt-out system should be set up.
Goofy systems like the Electoral College aside, I believe that in the general case politicians get elected because a simple majority (50% + 1) of eligible voters who actually vote want them elected.
The will of the non-voter is entirely irrelevant, as is the cause of their non-voting status. Additionally, from a practical perspective, a non-voting constituent isn't a constituent. Voting constituents on the losing side of the election are also irrelevant to politicians.
With the low voter turnouts we've seen over the last decade-plus, it's easy for politicians to know whom to try to please - and it only seems to be, on average, 10-15% of the registered voters in any given district.
Cache locally until cell service is restored. Done.
Also, it would be fairly trivial to compare the GPS location of the vehicle with the cellular coverage map of the provider. If they're using Verizon, and over the course of 8 hours you don't see any cellular coverage while the GPS shows you're in downtown LA, you're busted. Oh, and now you've clearly intentionally interfered with the proper function of their vehicle. You might want to check the fine print for penalties for that.
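The cross-check itself is a few lines of code. A hypothetical sketch - the coverage lookup, the bounding box, and the data shapes are all invented for illustration:

```python
def covered(lat: float, lon: float) -> bool:
    """Stand-in for a real carrier coverage-map lookup.
    Here: pretend a bounding box around downtown LA is covered."""
    return 33.9 <= lat <= 34.2 and -118.4 <= lon <= -118.1

def hours_silent_while_covered(gps_fixes, checkins) -> float:
    """gps_fixes: list of (hour, lat, lon) positions from the vehicle's GPS log.
    checkins: set of hours during which the device actually phoned home.
    Returns how many hours the vehicle sat in coverage without reporting."""
    return float(sum(1 for hour, lat, lon in gps_fixes
                     if covered(lat, lon) and hour not in checkins))

# 8 hours parked in downtown LA, zero cellular check-ins -> busted:
fixes = [(h, 34.05, -118.25) for h in range(8)]
print(hours_silent_while_covered(fixes, checkins=set()))  # 8.0
```

Anything above some small threshold (to allow for dead spots the map overstates) flags the vehicle for a closer look.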