They're not trying to get a backdoor into the phone, they're trying to get a backdoor into the law. The phone is only valuable because it was owned by bad people and is encrypted. They can hold the phone up and threaten people with it: "What if this phone has more bad people's information?"
It's a great way to get people to accept a tiny change in what we accept as reality. It's not breaking the encryption, it's not directly affecting your phone, so it shouldn't be a problem.
But what about the next phone? This security flaw might be fixed, but that's not going to stop the court from ordering Apple to find another security flaw, and another, and another. Each one pushing just a little harder, stretching what we'll accept just a little more until there are no more security flaws.
When that happens it's not a large step at all to order Apple to start including these flaws. It's still not breaking encryption, still not directly affecting your phone. The software must be signed using Apple's secure key, so your phone's still safe.
And thus what we're willing to accept is stretched even further.
What about the next step after that? All these security flaws still don't address the primary issue: the encryption. Keep a good password on your encryption and you won't ever have a problem with these minor changes. So how long until the FBI or whoever comes across a phone that is properly encrypted but could have been used to prevent another 9/11?
Our acceptance has already been stretched to accept security flaws in our phones, why not weaken encryption so the government can brute force the phone in a few days instead of centuries? Still not directly affecting your phone. Any normal person won't have access to the information required to use the weakness. Even if they did, they don't have access to the hardware the government has and wouldn't be able to crack the encryption in any reasonable amount of time.
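The centuries-versus-days gap isn't rhetorical exaggeration; it falls straight out of the arithmetic. A minimal sketch, where the 10^12 guesses-per-second rate is an assumed figure for illustration, not any real agency's capability:

```python
# Back-of-the-envelope: how weakening key length changes brute-force time.
# The guesses-per-second rate is an assumed figure, not a measured one.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def brute_force_years(key_bits, guesses_per_second=1e12):
    """Average time to find a key (half the keyspace), in years."""
    return (2 ** key_bits / 2) / guesses_per_second / SECONDS_PER_YEAR

print(f"128-bit key: {brute_force_years(128):.2e} years")
print(f" 56-bit key: {brute_force_years(56) * 365.25:.1f} days")
```

Shave a 128-bit key down to an effective 56 bits and the job goes from billions of times the age of the universe to well under a day on the same hardware. That's the whole point of a deliberate weakness: it only has to move the decimal point.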
But days of delay can kill.
I could keep hammering this home, but to make a long story short: while the boiling frog story might be inaccurate, the meaning behind it is very real.
And it doesn't take an intentional plan to kill off privacy. If what we're willing to accept can change, so can what the government is willing to accept.
I don't know how iPhones work, but if they're anything like my Android phone, no, it's not possible to do that. The USB port is cut off from storage until the phone is unlocked. The host OS in memory might be accessible through the port, but the internal storage itself isn't.
"If smartphones are beyond the reach of law enforcement, crimes will go unsolved, criminals will not be held accountable, victims will not receive justice and our ability to protect our children and community will be significantly compromised,"
There's your problem. Victims don't receive justice, the accused receives justice. To insist otherwise is not justice, it's vengeance. Once we get into vengeance territory, protecting the innocent goes out the window. It becomes about punishing the guilty no matter who else gets hurt along the way. "Casualties of war" as they will call it.
As Whatever so very unintentionally points out, those who are in favor of mass violations of human rights are using words and definitions to distract everyone from the actions that are taking place.
The question should not be "Is what we're doing best described as mass surveillance or bulk collection?" The real question is "Should the mass collection of everyone's data be legal?" And the answer to that is quite clearly "Oh fuck no."
So this is why the FCC hasn't already jumped down T-Mobile's throat. They're treading a fine line. Push a little too hard in favor of the citizens and the House will come down on them like a ton of bricks.
Your button may have been on the home page, but it's still a convoluted process, as it changes from user to user. I followed the instructions on the T-Mobile help site and still had to go digging to find the setting, because the instructions were wrong.
Not all that long ago my mom offered to get me a cell phone on her family plan. She would pay, I'd never have to worry about it. I said I'd rather stick with T-Mobile than go to Verizon. Think about that, I currently pay $80 a month and chose to keep that rather than getting a free Verizon phone.
Now I'm rethinking the offer.
But my big question is this. If I'm paying $80 a month just so I don't have to worry about the data caps (I spent half an hour on the phone with them making damn sure of that), why the hell did I just have to turn off Binge On? I refused their $35 "unlimited" plan specifically so I don't have to worry about "network optimization."
You're not crazy, but a lot of your points have already been dealt with in the physical market.
No, you wouldn't be able to sell outside of your chosen walled garden. It's a game on the Steam platform, why would you expect to sell it to a different platform? Same as selling a used Xbox One game. Why would you expect to sell it to a PS4 user? Different companies will set up their own markets (just like how there are plenty of other markets to sell Steam games), but it'll still be just Steam keys.
The price is, yes, going to be linked to the price of the full game. Just like the current used market. Ain't no one going to buy a used game that costs more than a new one, and the seller is always going to want as much money as possible.
Steam credit is possible as a payment method. GameStop only pays in store credit. But a system for PayPal could easily be set up.
CDs, DVDs, and Blu-rays (while all have DRM) are easy to copy. What's to prevent someone from buying a CD, copying it, and selling it? In all practical sense, nothing but the law. The same would apply with digital goods.
You do make one very good point: why would anyone buy a new copy if a "used" copy is available? The used copy is identical since the source files are the same. This is something we're going to have to figure out sooner or later as more and more of the world becomes digital.
Self driving cars are not going to be out there without a licensed driver until it can be proven that they will be able to handle themselves in all situations that are likely to happen on the road. And when the unlikely happens, the fallback will be exactly the same as it is for humans, pull off to the right (or left) side of the road, stop, and call for help.
Figuring out what the likely situations are and accounting for them is exactly what Google and others have been doing for the past several million miles.
Then the erratic humans will not be driving with the flow of traffic, now will they?
If people are driving 10 mph above the posted speed limit because that's the flow of traffic, then when the flow of traffic drops to the speed limit, those same people will drive the speed limit.
That's what it comes down to, isn't it. Person A is right and sets a line in the sand. Person B is very wrong and sets their line in the sand. If person A steps over their line, they step into the wrong. But person B wants to compromise. Just step a little into the wrong, just a toe, I promise I won't pull you further in.
One would think we would have learned better by now.
Didn't know that, but it helps my point, not yours.
Why does Apple have end-to-end encryption for their chat service? Think about that for a second: why would they put that much effort into creating it? Is it to help the criminals stay under the radar? Or maybe because Apple knows that keeping everything in a central repository is a stupid idea.
Your compromise will end up like the six-strikes compromise the ISPs put in place. Utterly worthless, yet still being ratcheted up. ISPs should have stood their ground, and Google and Apple should as well.
You're confusing two different things. You're talking about local encryption and communication encryption at the same time and getting confused.
Google's chat encryption is not end-to-end, it's from your PC to the central server and from the other PC to the central server. The government doesn't need to crack encryption to get that information.
Google chat and Apple chat are not secure systems, we all know this.
Local encryption is something else entirely. If I encrypt a file on my phone, say a password list, there is no central server between me and the file. I expect that file to be secure. At least as secure as the software used to encrypt it, not some unrelated, uninterested third party. I expect my communication with my bank to be as secure as the bank, not some unrelated, uninterested third party. Google should not have access to this information.
The government doesn't want access to Google chat, they want access to everything encrypted. Your compromise will never be enough for them because they already have it.