New Hampshire Supreme Court Issues Very Weird Ruling Regarding Section 230
from the but-that-makes-no-sense dept
In New Hampshire, Facebook has been dealing with a pro se lawsuit from the operator of a cafe whose Instagram account was deleted for some sort of terms of service violation (it is never made clear what the violation was, and that seems to be part of the complaint). The Teatotaller cafe in Somersworth, New Hampshire, apparently had, and then lost, an Instagram account. The cafe’s owner, Emmett Soldati, first went to small claims court, arguing that this violated his “contract” with Instagram and cost his cafe revenue. There are all sorts of problems with that, starting with the fact that Instagram’s terms of service, like those of every such site, say they can remove you for basically any reason, and specifically say:
And then there’s the Section 230 issue. Section 230 should have wiped the case out nice and quick, as it has in every other case involving a social media account owner getting annoyed at being moderated. And, indeed, it appears that the local court in Dover tossed the case on 230 grounds. Soldati appealed, and somewhat bizarrely, the New Hampshire Supreme Court has overturned that ruling and sent it back to the lower court. That doesn’t mean that Facebook will definitely lose, but the ruling is quite remarkable, and an extreme outlier compared to basically every other Section 230 case. It almost reads as if the judges wanted this particular outcome, and then twisted everything they could think of to get there.
To be clear, the judges who heard the case are clearly well informed on Section 230, as they cite many of the key cases in the ruling. It lays out a “three-pronged” test for protection under Section 230(c)(1) (the famed “26 words” which say a website can’t be held liable for the actions of its users). The website has to be an interactive computer service — which Facebook clearly is. The plaintiff has to be an information content provider — which Teatotaller clearly is. That leaves the last prong: does the lawsuit seek to hold Facebook liable as a publisher or speaker?
Let’s take a little journey first. One of the things that often confuses people about Section 230 is the interplay between (c)(1) and (c)(2) of the law. (c)(1) is the websites not liable for their users’ content part, and (c)(2) is the no liability for any good faith moderation decisions part. But here’s the weird thing: in over two decades of litigating Section 230, nearly every time moderation decisions are litigated, the website is considered protected under (c)(1) for those moderation decisions. This used to strike me as weird, because you have (c)(2) sitting right there saying no liability for moderation. But, as many lawyers have explained it, it kinda makes sense. (c)(1)’s language is just cleaner, and courts have reasonably interpreted things to say that holding a company liable for its moderation choices is the same thing as holding it liable as the “publisher.”
So, in this case (as in many such cases), Facebook didn’t even raise the (c)(2) issue, and stuck with (c)(1), assuming that, as in every other case, that would suffice. Except… this time it didn’t. Or at least not yet. But the reason it didn’t… is… weird. It basically misinterprets one old Section 230 case in the 9th Circuit, the somewhat infamous Barnes v. Yahoo case. That was the case where the court said that Yahoo lost its Section 230 protections because Barnes had called up Yahoo and the employee she spoke to promised her that she would “take care of” the issue Barnes was complaining about. The court there said that, thanks to “promissory estoppel,” this promise overrode the Section 230 protections. In short: when the company employee promised to do something, they were forming a new contract.
Barnes is one of the cases most frequently cited by people trying to get around Section 230, and it almost never works, because companies know better than to make promises like the one made in Barnes. Except here, the judges say that the terms of service themselves may be that promise, and thus the terms of service can be read as overruling Section 230:
This is not a total win for Teatotaller, as the court basically says there isn’t enough information to know whether the claims are based on promises within the terms of service, or if it’s based on Facebook’s decision to remove the account (in which case, Facebook would be protected by 230). And thus, it remands the case to try to sort that out:
Thus, because it is not clear on the face of Teatotaller’s complaint and objection whether prong two of the CDA immunity test is met, we conclude that the trial court erred by dismissing Teatotaller’s breach of contract claim on such grounds. See Pirozzi, 913 F. Supp. 2d at 849. We simply cannot determine based upon the pleadings at this stage in the proceeding whether Facebook is immune from liability under section 230(c)(1) of the CDA on Teatotaller’s breach of contract claim. See id. For all of the above reasons, therefore, although Teatotaller’s breach of contract claim may ultimately fail, either on the merits or under the CDA, we hold that dismissal of the claim is not warranted at this time.
So, there are still big reasons why this case against Facebook is likely to fail. On remand, the court may recognize that the issue is just straight up moderation and dismiss again on 230 grounds. Or, it may say that it’s based on the terms of service and yet still decide that nothing Facebook did violated those terms. Facebook is thus likely to prevail in the long run.
But… this ruling opens up a huge potential hole in Section 230 (in New Hampshire, at least), saying that what you put into your terms of service could, in some cases, overrule Section 230 — meaning you could be forced to defend whether or not your moderation decisions somehow violated your own terms.
That sound you hear is very, very expensive lawyers now combing through terms of service on every dang platform out there to figure out (1) how to shore them up to avoid this problem as much as possible, or (2) how to start filing a bunch of sketchy lawsuits in New Hampshire to exploit this new loophole.
Meanwhile, Soldati seems to be celebrating a bit prematurely:
“I think it’s kind of incredible,” said Soldati, who represented himself as a pro se litigant. “I think this is a very powerful message that if you feel a tech company has trampled or abused your rights and you don’t feel anyone is listening … you can seek justice and it will matter.”
That’s… not quite the issue at hand. Your rights weren’t trampled. Your account was shut down. That’s all. But in fighting this case, a very dangerous hole may now have been punched into Section 230, at least in New Hampshire, and it could create a ton of nuisance litigation. And that even puts business owners like Soldati at risk. Section 230 protects him with respect to the comments people make on his (new) Instagram account. But if he promises something… he may wipe out those protections.