Dangerous: European Courts Considering Requiring Search Engine Filters Over Embarrassing Content
from the bad-bad-ideas dept
You can understand why he'd be upset about such private actions becoming public, but once they're public, then what? Most people would accept that the information is out there, move on with life, and let people gradually stop caring. But not Max Mosley. He seems to have dedicated his life to forcing everyone to take overt actions to make sure that rich and famous people, such as himself, can never be embarrassed again. First, he argued that newspapers should be required to alert famous people before writing about them, allowing those famous people to then use the courts to block any stories they dislike. Thankfully, the European Court of Human Rights rejected this request.
That did not stop Mosley, however. He used the recent "Leveson Inquiry" (a response to the News of the World phone hacking scandal) to push for new rules requiring search engines to block the photos from ever being found online. And thus began phase two of Mosley's response to the article: a campaign against search engines, built on the belief that if he could somehow force search engines to ignore the photos from the original story, the world might forget about it. This is despite the fact that, in the Leveson hearing, Mosley admitted he was warned that taking the issue to trial in the first place would renew interest in it, including putting the private information into official public court documents:
I mean, when I had my first meeting with counsel, they explained to me very carefully that.... By taking the matter to court, the entire private information which I was complaining about would be rehearsed again in public, with all the press there, with the benefit of absolute privilege for anything that was said, and that at the end of all of that, no judge could remove the private information from the public mind. Indeed, by going to court, I was augmenting the degree to which the public were aware of it.

And, yet, Mosley still believes that it's possible to erase such things from the public mind, and the way to do that, obviously, is to filter Google. Thus he began both a legal and publicity campaign arguing that Google must magically filter out the content in question. Asked by Leveson how many sites his lawyers have "been able to shut down," he responds by blaming Google:
It's in the hundreds. My lawyers would probably produce an exact figure. One of the difficulties is that Google have these automatic search machines so if somebody puts something up somewhere, if you Google my name, it will appear. We've been saying to Google, you shouldn't do this, this material is illegal, these pictures have been ruled illegal in the English High Court. They say we're not obliged to police the web and we don't want to police the web, so we have brought proceedings against them in France and Germany where the jurisprudence is favourable. We're also considering bringing proceedings against them in California.

But the fundamental point is that Google could stop this material appearing, but they don't, or they won't as a matter of principle. My position is that if the search engines -- if somebody were to stop the search engines producing the material, the actual sites don't really matter because without a search engine, nobody will find it, it would be just a few friends of the person who posts it. The really dangerous thing are the search engines.

And thus began his legal campaign for mandatory search engine filters to block out content that he doesn't like. Yes, one country, the UK, has ruled that the use of those photos in a newspaper story represented a violation of his privacy, but the photos themselves are out there, and in other parts of the world, we have a belief in the freedom of the press. And in discussing the legality of showing the images, it seems that there is a strong journalistic reason to include at least some examples of them. Gawker, for example, reported on the case and quite reasonably included some of the images.
Furthermore, asking search engines (or anyone, really) to create specific filters to pre-block such content raises all sorts of concerns and consequences. Not only would it do little to hide the actual imagery or make people forget the story in the first place, but it sets a horrifying precedent, allowing people to seek to censor legitimate free expression all for the sake of trying to avoid embarrassment.
For example, if the French or German courts decide to force Google to censor access to the images above, then Google wouldn't just be forced to block and censor the images directly, but various stories that include the images too, such as the Gawker stories above. And those stories aren't about the initial "sex party," but rather the legal issues that were raised after the fact. Trying to silence discussion of the legal issues, such as in this article, starts to go deep into very concerning territory when we're talking about the freedom of the press. You can argue that the original article broke some UK rules, but many of the followup articles are important discussions on a topic of public interest, which news organizations need to be free to pursue.
And where do you stop with such filters? If he actually did get filters required on search engines, then, as with other injunctions on speech, you can imagine discussion and links quickly moving to social networks. So then what? Mosley goes back to court seeking mandatory filters on social networks like Facebook and Twitter? Anyone who links to or posts the images he doesn't like gets blocked? Add to this other rich and famous people demanding similar filtering of stories, images and videos that they, too, find embarrassing, and you're talking about a complete logistical nightmare of censorship.
In addition, such filters raise their own data privacy concerns, as they require extensive monitoring of users' communications, rather than mere indexing of information. In fact, the European Court of Justice has already ruled that forcing social networks or search engines to set up automatic filters to catch "illegal" content is a violation of existing EU law, both because it demands too much of companies' "freedom to conduct business" and because it leads to the blocking of perfectly legal communications. In one case, involving a court that had ordered a filter for the social network Netlog, the EU Court of Justice said the unintended consequences were too great:
Accordingly, such an injunction would result in a serious infringement of Netlog’s freedom to conduct its business since it would require Netlog to install a complicated, costly, permanent computer system at its own expense.

Moreover, the effects of that injunction would not be limited to Netlog, as the filtering system may also infringe the fundamental rights of its service users - namely their right to protection of their personal data and their freedom to receive or impart information - which are rights safeguarded by the Charter of Fundamental Rights of the European Union. First, the injunction would involve the identification, systematic analysis and processing of information connected with the profiles created on the social network, that information being protected personal data because, in principle, it allows those users to be identified. Second, that injunction could potentially undermine freedom of information, since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications.

Given that, you would have hoped that the courts in France and Germany would have already rejected these lawsuits, and told Mosley that his comments to the Leveson Inquiry remain true: the more he brings this up in court, the more attention he, himself, calls to the story. Perhaps the best thing to do is to let it go, rather than trying to impose a massive, wasteful, unworkable filtering system that would do little to stop people from knowing the story or seeing the pictures, but would have dangerous unintended consequences for free expression and privacy.