Execs that advocate for automatic systems that can intelligently and consistently identify infringing content really aren't thinking it through. If things like ContentID aren't adequately doing the job (not that they care about the unacceptably high false positive rate), then extra smarts that add better judgement essentially put them out of a job.
The ability to identify copied work is pretty much the same as the ability to identify original work - isn't that the entire point of the music industry?
It would certainly be up to the courts to interpret (potentially badly), but the one ultimately in control of real encryption is the person with the password. Since compelling people to divulge passwords has generally been found to be unconstitutional, I don't know if this section really accomplishes anything other than more theater and potential litigation ammunition.
Moosehead Brewery will inevitably challenge the use of the pheromone-laden liquid that hunters sprinkle to attract prey. Moose piss is quite expensive, but as Moosehead will show in filings demonstrating potential confusion in the marketplace, their cheap brew is substantially similar!
It's scary that the justification for escalating to more intrusive methods is 'we didn't find out what we wanted to know'. In a sane world, there would be an element of proportionality that weighed the importance of the expected findings against the invasion of privacy.
I'm not sure how that would work. Maybe a neutral third party would _judge_ what sort of action the situation would _warrant_.
If your device pings the access point to see if there's a network, that's enough to establish your location even without establishing a connection. By default most wifi enabled devices like to introduce themselves to every router within shouting distance.
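To see why a probe alone gives you away: an 802.11 probe request carries the sender's MAC address in the clear, often along with the name of a network the device is looking for. Here's a minimal sketch of parsing one from raw bytes (no radiotap header); the frame, MAC, and SSID below are made up for illustration.

```python
def parse_probe_request(frame: bytes):
    """Extract the sender MAC and probed SSID from a raw 802.11
    probe-request frame. Returns (mac, ssid) or None if the frame
    isn't a probe request."""
    # Frame control: type 0 (management), subtype 4 (probe request) -> 0x40
    if len(frame) < 24 or frame[0] != 0x40:
        return None
    # Management header layout: fc(2) dur(2) addr1(6) addr2(6) addr3(6) seq(2)
    mac = ":".join(f"{b:02x}" for b in frame[10:16])  # addr2 = transmitter
    # Tagged parameters start at offset 24; element ID 0 is the SSID
    body = frame[24:]
    while len(body) >= 2:
        eid, elen = body[0], body[1]
        if eid == 0:  # empty SSID = broadcast probe ("anyone out there?")
            return mac, body[2:2 + elen].decode(errors="replace")
        body = body[2 + elen:]
    return mac, None

# A synthetic probe request from 02:00:00:aa:bb:cc asking for "CoffeeShopWiFi"
frame = (
    bytes([0x40, 0x00, 0x00, 0x00])          # frame control + duration
    + b"\xff" * 6                             # addr1: broadcast destination
    + bytes.fromhex("020000aabbcc")           # addr2: the device's own MAC
    + b"\xff" * 6                             # addr3: broadcast BSSID
    + b"\x00\x00"                             # sequence control
    + bytes([0x00, 14]) + b"CoffeeShopWiFi"   # SSID information element
)
print(parse_probe_request(frame))  # ('02:00:00:aa:bb:cc', 'CoffeeShopWiFi')
```

Note the router never has to answer: the device announced its hardware address (and a hint of where it's been) just by asking.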
Body cameras are far from being universally adopted and in many cases the cameras are imposed on problem departments as a PR move. Cameras adopted as accountability theater sure can't be expected to halt a worsening departmental culture.
Censorship progression: A user anonymously posts a copied article and down goes the site. A user maliciously posts that article in order to take down the site. User-derived content goes away because of the liability.
Someone claims copyright over part or all of a legitimate article, taking down the site while the claim is adjudicated.
Finally, news and commentary posted without attribution out of fear of reprisals is attacked as copyright infringement and nobody can safely lodge a counterclaim of authorship.
So, the possibility of an insecurity that the government can exploit means that a suspected individual has no defense, since they left themselves open - but an insecure chain of evidence doesn't make the government's case suspect, since we're expected to assume their insecurities weren't exploited? There's some cognitive dissonance in that.
Sure some crimes are so heinous that they require more aggressive investigation, but aggressive used to mean greater manpower and resources, not breaking the rules.
If the ruling stands in any form (or even if publicity rights end up being exempted from the safe harbor), there goes a big chunk of Facebook's business model. Copyright takedown notices are ever so slightly easier to adjudicate since there are at least some guidelines about what a copyright is and who owns it. Publicity rights can be stretched to cover anything bearing the picture or name of ANYONE.
By not insisting they know the details of the crack, they were willing to risk the destruction or modification of this all-important evidence. Even if they got useful information, it wouldn't have been usable from a law enforcement perspective: whether or not they had to turn over the phone to unlock it, the chain of evidence would be tainted.
That pretty starkly illustrates their motives in wanting the phone unlocked in the first place. They probably had to pay more to NOT find out the details of the vulnerability since just revealing its existence would lower its market value, and it's likely that they reflexively asked for plenty of safeguards like exclusive ongoing access and complete secrecy.
As law enforcement and intelligence agencies monitor connections between suspects to establish cause for investigation (and membership on no-fly lists, detention, even assassination), does that mean that access to communications content will lead to exonerations and a greater evidence threshold for government sanctions against individuals? If you're the perfectly innocent cousin of a terrorist suspect and at most you've discussed lasagna recipes, you'd be off the hook, no?
I don't know whether Netflix incurs greater costs when customers view more titles or use more data. If watching 5 episodes using the same amount of data as one HD play means that Netflix is on the hook for additional royalties, then throttling may be more altruistic than it looks at first blush.
The old model: these guys are lone wolves, isolated from society. Watch out for loners and antisocial people. Be afraid.
The new model: these guys are organized and positively chatty with fellow travelers over encrypted communications. They're all around you and can be anyone (especially ethnic and religious minorities). Be afraid.
I don't know if European courts work like those in the US with respect to the ability of minors to enter into contracts... but I suspect it's at least as limited. So there's really no way for parents to avoid potential liability, since children couldn't even legally give permission. Worse, since most social media platforms require parental authorization (theoretically at least), that shifts the liability for anything kids post to their parents as well.
I'm waiting for the previously heavily redacted document to leak that features a postscript from senior policymakers requesting that tech companies provide the algorithm to generate a 1980s Kelly LeBrock
As bad as her argument is, I'm stuck at the idea of having an Amazon Echo in the bedroom. What does she do when Amazon helpfully ships a stack of bibles or perhaps copies of the George Burns/John Denver classic 'Oh God!' after a particularly vocal performance in bed?
The intelligence community wants backdoors to encryption not to gather more information but to narrow down the enormous feed they are already taking in. They're essentially admitting defeat in sorting through the haystack and assuming that the smaller amount of encrypted communication would be more manageable and still contain actionable intelligence.
That's their wish: reduce the workload. Kind of petty, really, considering the widespread harm that enacting even the half-assed proposals currently on the table would cause.