The ruling explicitly excludes forums, social media, and most comment sites.
Secondly, nothing was banned. The ECtHR didn't say that sites were liable for comments. They said that it wasn't a disproportionate interference with a site's freedom of expression in this specific case for this specific site to be liable for these specific comments to the extent they were found liable.
They still relied on a lot of the Estonian Supreme Court's findings of law, including that the EU's limitations on liability didn't apply (if they had applied, it would be a different story).
This ruling isn't the end of the world. If it had gone the other way it would have been a great boost to Internet comments etc., but all this ruling does is maintain the status quo, giving national governments the option, in extreme circumstances, of imposing liability on sites for user comments.
The actual text of the ruling is here, for anyone interested.
From what I can tell this only covered sharing between the NSA and GCHQ of stuff from Prism or Upstream. Because the US Government has admitted that these programmes exist, Liberty etc. were able to bring a case in reference to them - unlike the UK's programmes (e.g. Tempora).
GCHQ admitted to having the legal power to access information collected through Prism/Upstream, and it was this power for which the limits on it weren't sufficiently clear. It took this case for them to admit what the limits were and what the legal position was - although they maintain that they haven't ever actually collected data from these programmes.
This is interesting, as it means that GCHQ had powers that were unlawful (even if they never used them), despite repeated reassurances from all over that they didn't. And it was only because of the information provided by Edward Snowden that these powers have now become lawful.
By seeking to reveal unlawful surveillance and data sharing, Snowden has managed to legalise some of it.
"Surely he CAN take a better photo of a monkey than a monkey can, can't he?"

Probably, but the value in the photo is the idea that it was taken by the monkey. There are any number of good photos of monkeys, but there aren't as many supposedly taken by a monkey. That's what made it famous.
It may depend on the jurisdiction but in some places (e.g. the UK afaik) there isn't really such a thing as the "public domain" in copyright law. It is a term used to describe things that are no longer covered by copyright.
Put simply, copyright property rights are a creation of the law, so they only exist if the law says that they exist. If there is no copyright there isn't anything for anyone - the public, the author, whoever - to own.
"The image is registered with the USCO and is a part of a registered Image Rights under Guernsey Ordinance 2012."

This stood out for me, as this is one of the first uses I've seen of Guernsey's Image or Personality Rights. A couple of years ago Guernsey (a small island off the coast of Normandy, population around 65,000) made a big deal about being the first place in the world to have a specific "intellectual property right" covering images and personality.
For anyone interested, the full judgment can be found here. As with most legal issues, the situation is slightly more complicated than it may appear.
The right that the child is relying on (technically it is the child, not the ex-wife) is a tort that involves intentionally causing someone psychiatric harm. Intentionally causing someone physical harm has been unlawful for a long time; the Victorian-era case referred to made psychiatric harm actionable as well (and has been followed and applied since then).
As this was a pre-trial injunction the Court had limited evidence to go on, and had to decide whether the child was likely to succeed at trial - and it found he was, given evidence that the child was likely to read the material (the book is dedicated to him, contains parts addressed to him, is being serialised in a national newspaper, will probably be online, will be referred to in Wikipedia articles etc.), the material was likely to cause the child harm (not the stuff about sexual abuse specifically, but a load of stuff about self-harm), and that the father knew this (there was a clause in the parents' divorce about avoiding harm by disclosing information).
It's a messy case and situation, but the English legal position is generally to stop publication if there's a good chance it will be halted after a trial. The only thing the father (and publisher) lose is time and money - and the child has agreed to compensate them for any financial losses.
Firstly, I'm not sure if CoLP is actually trying to build a case against the DNS provider (although they probably think they could get an extradition request if needed), but more that they think they are being helpful - that DNS providers like EasyDNS don't want to host these sorts of sites and that CoLP is doing them a favour by politely letting them know their customers are evil criminals.
I imagine CoLP are pretty firm in their belief that the people who run the sites are "evil criminal scum" and therefore no one would want to do business with them.
The other possibility relates to the inclusion of the Serious Crime Act in that list of scary laws. I'm not sure I've seen that one included before, but it covers things like "encouraging or assisting an offence believing it will be committed" - which requires that belief, and the friendly CoLP email may go some way to demonstrating that EasyDNS knew offences might happen.
Again, assuming any offences are actually occurring. So far Fact Ltd is something like 1 for 4 in prosecutions against website operators.
The UK Government isn't censoring the story part of this (and it is just the Met police claiming that watching the video is illegal; I'm not sure anyone should trust them to determine what the law is) - footage from this video was all over the front pages today, and it was the top story for most of the newspapers.
Although that could just be a case of one law for the newspapers, one law for everyone else...
The articles aren't going anywhere - the search engines are only required to disconnect an article from whatever search term counts as a person's personal information. In theory, anything that is of interest to a historian shouldn't be de-linked.
Also, can we stop blaming the CJEU for this? The ruling is perfectly well-reasoned and it is a little difficult to imagine them ruling the other way without ignoring the law. The problem (to the extent that there is one) is with the underlying law (from the 90s) and how search engines either weren't thought of when it was drafted, or how modern search engines never thought they would have to comply with it when they set up.
There are a lot of terrorism-related offences in the UK. Also quite a few other offences that the police might think covers this. But I would go with ss1-2 Terrorism Act 2006; "Encouragement of terrorism" and "Dissemination of terrorist publications." I don't know if these offences have been tested in cases involving "non-terrorists" just downloading videos, but I imagine that won't stop the police from arresting people if they want to.
Section 3 also includes a notice and takedown procedure for "unlawfully terrorism-related" material.
The UK could do with removing some of its terrorism laws.
I didn't want to go into details because I admit the arguments are fairly subjective: that English and Welsh defamation law was never as bad as it was made out to be, that it didn't really put the burden of proof on a defendant any more than any other civil law does (just a very large financial burden on both sides), and that the changes brought in by the Defamation Act 2013 - like this website operators one - were pretty minor.
The big changes were the introduction of a single publication rule, and a presumption of not having a jury trial. The rest was mostly codifying the existing law (just to make things a little more confusing for defamation lawyers for a few years) and adding defences covering very narrow and rare situations - like this website operators one.
For anyone interested, the regulations the quoted article refers to are here, and there is some guidance from the Government here. I seem to remember picking up on this whole "attacking anonymity" thing back in 2012 when the Bill was being debated, pointing out how silly it is.
The entire section is pretty silly as well - the circumstances when it applies are pretty narrow (there are all sorts of other situations when a website operator would be immune), and the way the regulations are drafted there may be situations where an operator could remove a comment, but in order to comply with the section they would have to inform the claimant that they hadn't. The regulations were really badly drafted (with only closed, private consultation).
That said, as far as I know very few website operators knew or cared about them - most major sites have some sort of take-down system already, and defamation claims are so rare that it isn't worth the effort of setting up the automated systems required.
"the UK has had terribly draconian defamation laws, that more or less put the burden on the accused to prove what they said wasn't defamation. This was incredibly plaintiff friendly and antagonistic to free speech. The situation was so bad that a whole campaign was mounted to finally update the UK's defamation laws, resulting in a big change that went into effect last year"

I may be biased, but I think that almost every statement here is arguably false. But that's another story.
"Mosley's really big win over the original newspaper, News of the World, was mostly over the fact that they called it a "Nazi sex party" and he insists that the party wasn't Nazi-themed."

If anything it was the other way around; the newspaper's only real defence was that it was Nazi-themed, and therefore in the public interest to report on. The court found (based on the evidence of Mosley and others involved) that it wasn't really anything to do with the Nazis, and thus there was no public interest in reporting the story (never mind running it with pictures and videos).
The Court considered this argument and rejected it on the basis that much of the Internet runs on search engines.
In the original Spanish case the information was on an official government (or government-required?) website. But it was one data entry in thousands (if not millions), and no one would be able to find it unless they happened to go to that page. But because the page was indexed by Google, anyone putting the applicant's name into Google would find the page straight away.
Search engines make finding obscure bits of information (and connecting them up with other data - such as a person's name) really easy; that's their point. But it also means they are particularly important when it comes to data protection.
"the sites that actually contain this "privacy-invading" data ... are apparently immune from the very same existing laws"
Nope. The sites have to follow the law as well. The difference is that in some cases the sites' processing of the data (it is about processing, not containing - search engines do process personal data) may fall within an exception to the rules, which may not apply to the search engine.
But going after Google - in a case where they've provided a handy form - is far easier.
"Bing was not a party to the proceedings. Can you show the mechanism? ... I also don't understand what statute could possibly be interpreted this way,... I can't see it applying to Bing or Yahoo..."

There's a thing called the Data Protection Directive, which requires all EU Governments to introduce a law implementing its provisions across their country - you can read more about the Data Protection Directive, along with some stuff about this new ruling, here.
It applies to any search engine.
The CJEU ruling says that search engines process data, so have to comply with EU data protection rules.
The specific ruling was in reference to a case against Google, which is why the press have focused on them, but it covers any search engine.
CJEU rulings are references; the domestic court asks the CJEU some questions as to interpreting EU law. So while Google was one of the parties to the case, their ruling is about the law, specifically that search engines process personal data, so have to abide by the Data Protection rules.
All search engines are covered by the ruling. But we're only hearing about Google because... well, a cynic would say because what's happened is all PR, with no substance.
To correct you a bit, the BBC articles weren't removed from Google search; they were only removed when connected with the name of the person who had complained (which we think was one of the commenters). If Google did remove the article completely they went way beyond what the law requires of them.
Secondly, it wasn't a UK court ruling, but an EU one; and depending on how you define censorship, it wasn't so much pro-censorship as pro-privacy. Although all the court really did was say that search engines weren't immune from the existing laws.
All the ECtHR did (or can do) was say that it wasn't a disproportionate interference in this specific case for this site to be found liable for these comments in the way it was. And it came to this conclusion based on a whole host of specific factors - including some assumed from the Estonian Supreme Court's decision.