NY Judge Laments The Lack Of A 'Right To Be Forgotten'; Suggests New Laws Fix That
from the the-first-amendment-would-be-a-problem dept
A NY state judge, Milton Tingling, has apparently decided that Europe’s troubling right to be forgotten concept should be imported into the US (possible registration/paywall). The case he was dealing with — the rather impressively and vaguely named “Anonymous v. Anonymous Jane Does” — touches on an issue we’ve discussed for many years: what happens when defamatory content is posted to a site, the site is protected by Section 230 of the CDA, and the original posters cannot be found?
In those cases, as we’ve noted, there may be no effective remedy for the defamatory speech. The site cannot be forced to take it down, because if they’re just a platform, they have no liability for someone else’s speech. And since the person or people who are responsible can’t be found, not much can be done. That appears to be the situation in this case:
Claiming they were prostitutes, anonymous commentators with the handles “JennaVixen,” “Emma NYC Escort” and “Anonymous,” posted opinions about the plaintiff’s sexual habits.
One of the commentators said he or she also was “an ex-employee owed money who is suing [the plaintiff].”
The plaintiff sued in 2013, claiming the comments were defamatory per se. He said he never engaged in the sexual activity described, nor was he an employer who failed to pay employees.
Since the commenters were anonymous and there was no way to track them down, the judge initially allowed them to be “served” by posting the summons on the same site where the comments were made, Dirtyphonebook.com. Not surprisingly, posting the summons on the site didn’t make the commenters show up in court (whether or not they even saw it). Thus, the plaintiff won a default judgment. But, again, there was no specific remedy that could follow — which the judge appears to understand. Troubled by that gap, he uses the opportunity to muse on importing the “right to be forgotten” for such cases:
Though it was not within the court’s authority to create laws, Tingling said he could offer suggestions to the Legislature.
One thought, he said, was to take up a rule akin to the “right to be forgotten” in the European Union Court’s 2014 case, Google Spain v. Agencia Espanola de Proteccion de Datos, Case C-131/12.
There, the court said individuals had the right under certain circumstances to request that search engines remove links to personal information that was inaccurate, inadequate, irrelevant or excessive for data processing purposes.
“Thus, the right to be forgotten offers greater protections than §230 of the [Communications Decency Act],” said Tingling.
Unfortunately, this is a dangerous approach to the situation and likely goes too far. First, even the lawyer for the plaintiff notes that this ruling gave his client “exactly what we needed” — a court ruling that the content is defamatory. It is highly likely that a copy of that ruling is being sent off to various search engines, who may choose to remove the content based on it. There is no need for a “right to be forgotten,” which would be much broader and carry many unintended consequences, potentially harming the First Amendment. Nor is there any need to break the important protections provided by Section 230.
Instead, the judge has ruled that the content is defamatory, and the plaintiff and his lawyer can use that ruling to ask various sites to take action in response — and it is likely that many will.
Of course, even that situation can lead to suppression of speech. Note that the case described above is not all that different from one we recently described involving Roca Labs suing anonymous commenters, where it seems fairly clear that Roca isn’t looking to identify the commenters, but rather to get a default judgment in order to pressure Google into removing those negative reviews. This is why a legal change, à la the one suggested by Judge Tingling, would be so problematic. As it stands now, companies like Google can look at incoming requests showing legal rulings on defamatory content and decide for themselves whether or not the content should be removed from their index. Changing the law would open up another avenue for censorship through questionable means.