There are few burdens on those receiving notices beyond "take it down".
The entire point of these discussions is that this is a massive burden, one that oversteps the First Amendment and the restrictions on prior restraint. There can be no useful discussion with you until you're willing to address the burden that DMCA takedowns result in for content creators - as opposed to copyright holders or (large) service providers, who I agree seem to have a generally cruisy time under the DMCA.
"So, if the DMCA didn't exist, I doubt we'd see any more lawsuits against users than we do now. We'd only see a lot more lawsuits against service providers."
Incorrect. Without DMCA, service providers would not accept anonymous postings, and if sued, they would quickly give up the user information and involve them in any lawsuit. "Anonymous User sourced content" would not be a valid business model, and with end users properly identified, it's very likely that infringement would drop dramatically because few would want to take the risks.
Without DMCA, wouldn't service providers still be covered by section 230?
Look at the Automattic numbers. 29% have technical errors. 10% are invalid. 61% are valid and are applied. So the fair use cases fit inside the 10%, I am assuming, as they didn't specifically break it out. When you get rid of the 29% with technical errors, the numbers for "processed" DMCAs are 85% valid, so the number of fair use cases rejected is relatively small.
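As a quick sanity check on that "85% valid" figure (a minimal sketch using only the 29/10/61 split quoted above):

```python
# Automattic's reported split of DMCA notices, per the figures above
technical_error = 29  # % rejected for technical errors
invalid = 10          # % processed but found invalid
valid = 61            # % processed and applied

processed = valid + invalid               # notices that clear the technical check
valid_share = 100 * valid / processed     # share of processed notices accepted
print(f"{valid_share:.1f}% of processed notices were valid")
```

That comes out to just under 86%, which is where the "85% valid" claim comes from.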
The fact that only 15% of technically correct DMCA notices are obviously invalid (in Automattic's words, "clearly false or mistaken") doesn't mean that the other 85% are correct and valid. Other situations that aren't enumerated could fit into that other 85%:
In an effort to reduce the percentage of threads that are grandfathered by Whatever (and so get some real discussion going)...
An international corporation submitted DMCA notices seeking removal of images of company documents posted by a whistleblower.
Personally, I find that this situation provides an excellent example of why copyright should require mandatory registration (amongst the other reasons). If owning a copyright required registration, then I seriously doubt that the DMCA would ever be available in this kind of situation, and prosecution would (correctly) have to be pursued as a trade secrets violation.
"I respect your right to be a troll, even if I don't agree with you."
Respect. You're doing it wrong.
"When you start to pay attention, you start to understand a couple of weeks work for a few engineers is pretty much what would be needed to apply ANY patch."
Wasn't that couple of weeks' work for a few engineers just to get it to the testing phase, and if any issues are discovered it will take longer? And there is absolutely no way that a "man-month" or more of development is anything close to a simple change, no matter how much you want to think it is (despite your admitting to having no domain knowledge).
"As for the caliber of engineer required, considering this isn't "write an OS" but rather "remove or disable a 10 counter" it's likely that the work could be done by a junior - or someone out of the country for that matter. It's not the highest of high end jobs."
"It's not something Apple would want to do, but to comply with the court order, they may have to be dicks with their employees to save their own butts."
Isn't that kind of the whole thrust of this article? The government wants to force Apple to write the code the government demands, at any cost? As has been mentioned in other posts, Apple can't just shut down to avoid the court order like Lavabit, giving the government (effective) eminent domain over Apple.
Consider if Apple was headquartered in Canada - this would be prime grounds for an ISDS dispute as the government directly meddling in the affairs of the company, forcing the company to take steps that will directly and negatively impact the company's revenue. Why is it suddenly ok just because Apple is headquartered in the US?
They are human too and they, like you, will make mistakes or overstep the law from time to time, and likely pay a bigger price than you ever will for it.
Yeah... but no. A law enforcement officer is much less likely to pay ANY price for overstepping the law than some other person, even if they do so regularly and maliciously.
I don't want to give the feds MORE rights, but I also don't want to give the people a way to avoid what has been fair game for the feds for 300 plus years. It's a key point of discussion, you know, balance.
And what exactly has been fair game for the feds for 300 plus years? Detailed location data for a person over days and months? A list of everyone that a person has contacted over days and months? Every little note and shopping list that a person has written?
Encrypting a phone's contents is just like writing notes in a cipher. The police are welcome to try to break it, but there's no guarantee of success. Just like it has been since writing existed.
The balance you say you're asking for is for digital devices to be restricted so they CANNOT be as secure as physical objects... and you've yet to discuss that with anyone, other than saying the same thing over and over and over and over, without acknowledging that anyone has even tried to join the "discussion".
No, it's part of a BAD set of security practices to make up for an incredibly weak link in the chain. Rather than fixing the weak link, they put a band aid over it and give every hacker in the universe a target. They basically shine a very bright light on the weakest part and say "don't come here".
You realise that the weak link is the user, don't you? And the user doesn't want to be fixed.
after of course you learn that rights aren't just yours, they are common to all of us.
Same goes for security. Perhaps you are happy barely using your phone, but you seem overly cavalier in saying that everyone else is being stupid and weak for using their phone in ways that are convenient to them, to do the things they want to do, via the tools that have been marketed to them.
Saying that security should only exist for the skilled is just poor form.
No, I am arguing that your personal data on a digital device should not have MORE protection than a piece of paper, a safety deposit box, or a locked safe. You seem intent on creating a special "it's digital so it's always out of reach" exemption that flies in the face of more than 300 years of US court rulings on privacy, warrants, and legal searches.
And yet a piece of paper and a locked safe CAN be designed so they are "always out of reach" of the government. So can a whisper in the night. As has been pointed out elsewhere, police investigations have always had to deal with information which is unobtainable. Why should digital security be legislated to be weaker than physical security?
"Poor passcodes are indeed a real issue, and Apple has taken some very good steps to mitigate the issue and try to protect people from themselves. The FBI wants to roll back these steps and reduce the common person's security - you know, that guy that doesn't want a pin code longer than 4 digits (and secretly hates even that)."
Not really true. Nobody is asking Apple to roll out the small change (disabling the 10-tries limit) to the general public. The guy with the short passcode will be in the same position tomorrow as he is today, protected not by a fancy one-way security chip or long-key encryption, but rather by a simple "counts to 10" blocker. That is a real issue and one I expect Apple to address in the future.
I'm not sure how to say this so you'll get it. Everyone else has tried, and you seem to be wilfully misinterpreting everything to see it your way. Not just getting it and disagreeing, but actively dancing around the edges and ignoring the facts.
Fact: 4-digit pin codes are weak security, and Apple protects the user by extra security measures such as slowing down attempts, and locking after 10 failed tries. Fact: Apple phones currently have a security vulnerability - you can update the security configuration without requiring the user to authenticate.
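Those two protections can be sketched together like so (a toy model; the delay schedule below is illustrative, not Apple's actual values):

```python
# Toy model of escalating-delay rate limiting plus a 10-try lockout.
# The specific delay values here are illustrative, not Apple's real schedule.
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # seconds of forced wait
MAX_TRIES = 10

def delay_before_next_attempt(failed_attempts):
    """Seconds to wait before the next passcode attempt, or None if locked out."""
    if failed_attempts >= MAX_TRIES:
        return None  # the "counts to 10" blocker: no further attempts allowed
    return DELAYS[failed_attempts]
```

Even a fast attacker hits an hour of forced waiting per guess after the seventh failure, and the tenth failure ends the game entirely - which is exactly why the FBI wants both mechanisms disabled.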
Ok. Say that Apple agrees to create FBiOS and help the FBI hack this one phone. Let's even say that the impossible happens that FBiOS never leaks, or that if it does leak then the signature requirements to install it on a phone are actually sufficient to stop bad guys from accessing the contents of stolen phones.
The entire international hacking community now knows that there is a proof-of-concept working exploit on all current Apple phones. You can absolutely guarantee that teams will be working around the clock to try to replicate the same, or even improve it - who's to say that some group might not find an even better attack, where parts of the key can be leaked from the underlying security chip so they can attack pin codes of any length?
It's ok though, Apple knows about the vulnerability, and the simple fix is to prevent updating the security firmware without authenticating the user. FBiOS becomes useless, and DoJ once again starts stockpiling encrypted phones that it needs help unlocking. What do you think happens next?
Understanding that Apple is trying to paint the issue as something different is to understand why the FBI's request and the court's order aren't far off the mark.
The FBI's request was full of lies and misdirections, and as mentioned in Apple's reply they deliberately misquoted the courts in United States v. Halstead to make the implication that they wanted. If it were a reasonable request, why can't they find an argument that rests on the truth?
Can you point to anywhere where Apple has lied in this case? Drawing slurs from the media circus surrounding the case and using them to attack Apple's behaviour within the case isn't very sportsmanlike.
If I had a wall safe, they could break in, and it would be legal. I don't think an electronic device should have more protection.
If you had a wall safe that they couldn't break into (without destroying the contents), then it's the same thing as a phone that they can't break into.
What's the problem?
Apple's issue is that pincodes are 4 to 6 digits, which is well within the range of a brute force. All the talk of security chip this and encrypted that means nothing if the user passcode is pretty much "1234". That's the real issue.
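A back-of-the-envelope sketch of why that's true (the ~80 ms per guess is an assumption about the hardware-enforced cost of each key-derivation attempt, not a measured figure):

```python
# How long an unthrottled brute force of a numeric PIN takes,
# assuming ~80 ms of hardware-enforced work per guess.
ATTEMPT_SECONDS = 0.08  # assumed cost per attempt

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_case_hours = keyspace * ATTEMPT_SECONDS / 3600
    print(f"{digits}-digit PIN: {keyspace:,} codes, "
          f"~{worst_case_hours:.1f} h to exhaust")
```

Absent the retry limits, a 4-digit space falls in minutes and even 6 digits fall within a day - which is exactly why the "counts to 10" blocker is doing all the heavy lifting.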
Poor passcodes are indeed a real issue, and Apple has taken some very good steps to mitigate the issue and try to protect people from themselves. The FBI wants to roll back these steps and reduce the common person's security - you know, that guy that doesn't want a pin code longer than 4 digits (and secretly hates even that).
And yet, despite (correctly) pointing to this as a keystone issue... why do you continue to defend what the FBI is trying to do here?
> Don't believe the hype. Apple has gone over the top on this and they are lying to you outright.
That may be true, and yet it doesn't necessarily follow that the FBI should be granted its request. Don't forget that the FBI is lying outright in this affair as well, find your own path.
> The FBI has not asked Apple to roll out a new OS for everyone.
Correct. The FBI is asking Apple to craft a custom OS with reduced security, and a mechanism by which they can install that OS, bypassing any security measures, onto a targeted phone.
> They haven't told them to put a back door on every phone.
Incorrect-ish. They are asking for every phone, now and in the future, to be vulnerable to this custom OS.
> They are asking for a single device to be made more accessible.
Correct-ish, but not the whole story.
The FBI is asking for the entire ecosystem of iPhone devices, now and in the future, to be vulnerable to a process by which critical security mechanisms on the device can be circumvented.
Sure, the FBI is asking for this for just one phone (twice), but the rest of the law enforcement community is lined up to make this request for just one phone any number of times.
And the odds of the process itself not becoming available to malicious actors (not counting the FBI)? That number decreases to zero over time. I'm not sure how much time that is, but it's probably measured in years not decades.
> The latest overhype from apple is (and I am not kidding) that the FBI wants them to turn on cameras and microphones so they can film and listen to you. Really.
I was of the understanding that this has been possible in a variety of phones for years now, and not just by law enforcement. Regardless, the nature of the request being discussed is bad enough, and this request can be discussed separately if it turns out to be the case.
> Apple has pretty much turned to a turd on this one. They are outrightly being dishonest. Fuck Apple, seriously (and I don't say that often).
Sure. But don't let that set a terrible precedent that will fuck the rest of us, seriously.