Appeals Court Actually Explores 'Good Faith' Issue In A Section 230 Case (Spoiler Alert: It Still Protects Moderation Choices)
from the because-of-course-it-does dept
Over the last couple of years of a near constant flow of mis- and disinformation about Section 230 of the Communications Decency Act, one element that has popped up a lot (including in our comments), especially among angry Trumpists, is the claim that because Section (c)(2)(A) of the law has a “good faith” qualifier, websites that moderate need to show they did so in “good faith.” Many seem to (falsely) assume that this is a big gotcha, and that they can get past the 230 immunity barrier by litigating over whether or not a particular moderation choice was made in “good faith.” However, as we’ve explained, only one small part of the law — (c)(2)(A) — mentions “good faith.” It’s this part:
No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected
However, notably, Section (c)(1) — the famous “26 words” — makes no mention of “good faith” and establishes, more broadly, that no website or user of a website should be held liable for someone else’s content. Over the past 25 years, nearly all of the court decisions regarding content moderation have relied on Section (c)(1)’s broad immunity, noting that even moderation (take down) decisions, if litigated, would require holding a website/user liable for the speech of a third party. Therefore, the examination of “good faith” almost never comes up. Separately, there’s a fairly strong argument that courts determining whether or not a moderation decision was done in “good faith” would represent courts interfering with an editorial decision making process — and thus would raise serious 1st Amendment questions.
There is one significant court case that did look at (c)(2) — the case in which the 9th Circuit found that Malwarebytes’ decision to label Enigma Software’s product as malware was not protected by 230. In that case, the 9th Circuit said that because it’s possible Malwarebytes called Enigma’s software malware for anti-competitive reasons, doing so would not count as “good faith,” and thus wouldn’t be protected by 230. Unfortunately, the Supreme Court refused to hear this issue, but that case still lives on and the question could be revisited (and it may eventually be shown that Malwarebytes also has a 1st Amendment right to express its opinion — even about a competitor).
A few weeks ago there was another case that got a (c)(2) review, over in the 2nd Circuit. That case, Domen v. Vimeo, involved a more typical content moderation type decision. Eric Goldman summed up the story behind the case in his own post about the ruling:
Vimeo is a video hosting service. Domen is a “former homosexual.” He posted videos to Vimeo that allegedly violated Vimeo’s policy against “the promotion of sexual orientation change efforts” (SOCE). Vimeo notified Domen of the violation and gave him 24 hours to remove the videos or Vimeo would take action. Domen didn’t remove the videos, so Vimeo subsequently deleted Domen’s account. Domen sued Vimeo for violations of California’s Unruh Act, New York’s Sexual Orientation Non-Discrimination Act, and the California Constitution. The lower court dismissed all of the claims.
Domen appealed, and the 2nd Circuit affirmed the dismissal, focusing on Section 230(c)(2)(A). It is not clear why it didn’t just use (c)(1) like nearly every other similar case (the opinion notes that the lower court relied on both sections, and that Vimeo asked the court to rule on (c)(1) grounds as well, but it does the (c)(2) analysis anyway). Either way, this gave an appeals court (and a prominent one, the 2nd Circuit) an opportunity to explore the whole “good faith” question. The court notes, correctly, that 230 is much in the news these days, but says this one is a fairly easy call: Vimeo has every right to enforce its own terms of service in this manner. In summarizing its decision, the court writes:
However, Appellants’ conclusory allegations of bad faith do not survive the pleadings stage, especially when examined in the context of Section 230(c)(2). Section 230(c)(2) does not require interactive service providers to use a particular method of content restriction, nor does it mandate perfect enforcement of a platform’s content policies. Indeed, the fundamental purpose of Section 230(c)(2) is to provide platforms like Vimeo with the discretion to identify and remove what they consider objectionable content from their platforms without incurring liability for each decision. Therefore, we AFFIRM the judgment of the district court.
This is good framing. It recognizes that plaintiffs can’t just yell “bad faith!” even if they can show inconsistent moderation practices. In going into detail, the court says that (c)(2) is also a very broad immunity, giving websites the power to set their own rules and policies for what will be removed. Specifically, they say it grants “significant subjective discretion.”
A broad provision, subsection (c)(2) immunizes interactive computer service providers from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” 47 U.S.C. § 230(c)(2). Notably, the provision explicitly provides protection for restricting access to content that providers “consider . . . objectionable,” even if the material would otherwise be constitutionally protected, granting significant subjective discretion…. Therefore, Vimeo is statutorily entitled to consider SOCE content objectionable and may restrict access to that content as it sees fit.
The court also rejects the plaintiff’s argument that Vimeo could have just deleted the individual videos that it claims violated its policies, rather than shutting down his entire account. But, as the court notes, nothing in 230 requires the use of a scalpel when moderating:
Moreover, the statute does not require providers to use any particular form of restriction. Although Appellants take issue with Vimeo’s deletion of Church United’s entire account as opposed to deleting only those videos promoting SOCE, nothing within the statute or related case law suggests that this took Vimeo’s actions outside of the scope of subsection (c)(2) immunity. Indeed, Vimeo warned Church United that removal of the entire account was exactly what might happen if they ignored the warning. Church United received the warning and did not take the videos down or otherwise allay Vimeo’s concerns. Vimeo was entitled to enforce its internal content policy regarding SOCE and delete Church United’s account without incurring liability.
How about the “good faith” requirement? The court says you have to show more than just that Vimeo treated this plaintiff’s videos differently than some others on the platform. It points to the 9th Circuit’s Malwarebytes decision (and the Zango case that the 9th Circuit heavily cited in Malwarebytes), and says that even if it followed the same reasoning as that case, this is obviously a very different situation. The only reason that case got over the “good faith” hurdle was that the conduct was deemed possibly anti-competitive. Here? Uh, no, this is standard everyday content moderation:
We also agree with the district court that Appellants’ allegations that Vimeo acted in bad faith are too conclusory to survive a motion to dismiss under Rule 12(b)(6). Appellants’ bases for arguing that Vimeo acted in bad faith are not commensurate with how courts interpret bad faith in this context. Appellants’ cited cases do not satisfy their position. In Zango, Inc. v. Kaspersky Lab, Inc., the Ninth Circuit considered whether the defendant’s software — a filter blocking potentially malicious software from users’ computers — qualified for Section 230 immunity in the same manner as platforms like YouTube or Facebook…. The Ninth Circuit held that it did…. In Enigma Software Group USA, LLC v. Malwarebytes, Inc., the Ninth Circuit limited the scope of Zango, clarifying that Section 230 “immunity . . . does not extend to anticompetitive conduct.”…. There, the court reinstated the plaintiff’s Lanham Act claim, which alleged that the defendant’s firewall program improperly filtered out the plaintiff’s rival firewall program, even though the plaintiff’s program posed no actual security threat to users’ computers…. The plaintiff alleged that the defendant made “false and misleading statements to deceive consumers into choosing [the defendant’s] security software over [the plaintiff’s].” … Vimeo’s deletion of Appellants’ account was not anti-competitive conduct or self-serving behavior in the name of content regulation. Instead, it was a straightforward consequence of Vimeo’s content policies, which Vimeo communicated to Church United prior to deleting its account.
And, no, just because Domen found other videos on Vimeo that might have also violated its policies, that doesn’t mean that he was treated in bad faith. That’s not how any of this works.
Appellants argue that bad faith is apparent from the fact that other videos relating to homosexuality exist on Vimeo’s website. In support of this, Appellants point to titles of videos that allegedly remain on Vimeo’s website: “Gay to Straight,” “Homosexuality is NOT ALLOWED in the QURAN,” “The Gay Dad,” and “Happy Pride! LGBTQ Pride Month 2016.”… However, the mere fact that Appellants’ account was deleted while other videos and accounts discussing sexual orientation remain available does not mean that Vimeo’s actions were not taken in good faith. It is unclear from only the titles that these videos or their creators promoted SOCE. Moreover, one purpose of Section 230 is to provide interactive computer services with immunity for removing “some — but not all — offensive material from their websites.” Bennett v. Google, LLC, 882 F.3d 1163, 1166 (D.C. Cir. 2018). Given the massive amount of user-generated content available on interactive platforms, imperfect exercise of content-policing discretion does not, without more, suggest that enforcement of content policies was not done in good faith. See Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997) (explaining that “[t]he amount of information communicated via interactive computer services is . . . staggering” and that Congress passed Section 230 expressly to “remove disincentives for the development and utilization of blocking and filtering technologies”)….
Appellants chose to ignore Vimeo’s notice of their violation of Vimeo’s content policy, and, as a result, Vimeo deleted their account. By suing Vimeo for this, Appellants run headfirst into the CDA’s immunity provision, which “allows computer service providers to establish standards of decency without risking liability for doing so.”
That seems pretty nice and clear. As Goldman wrote in his analysis of this ruling:
In the short run, Internet services have a lot to celebrate about this ruling. First, the court revitalizes Section 230(c)(2)(A) as a tool in the defense toolkit, which increases the odds of a successful defense. Second, the court accepts that content moderation will never be perfect, so plaintiffs aren’t going to win simply by pointing out examples of imperfect content moderation. Third, the court grants Section 230(c)(2)(A) on a motion to dismiss, emphasizing that it’s an immunity and not just a safe harbor. This ruling isn’t novel, but a clean and decisive statement from the Second Circuit about Section 230(c)(2)(A) applicability to motions to dismiss will surely encourage future courts to do the same. Fourth, though not explicitly addressed, the court held that Section 230(c)(2)(A) preempted claims that the services had violated anti-discrimination laws — a critical issue given that majority communities are weaponizing anti-discrimination laws to perpetuate their majority status.
All very nice! However, Goldman also warns that all of this good stuff may be wiped away soon via the various bills to reform or repeal Section 230. He also notes that some of the statements in the opinion could be twisted in a problematic way. For example (as seen in the quotes above), the court repeatedly mentions that Vimeo gave Domen multiple warnings and even told him specifically which policy he was violating. This might lead some to falsely believe that moderation without those factors is outside the bounds of (c)(2)(A). Goldman also fears that this will lead to more litigation exploring the boundaries of (c)(2)(A) and the definition of “good faith” moderation choices — all of which could have been avoided if the court had just followed the path of many others and dismissed on (c)(1) grounds.
On the whole, though, it still seems like a good general ruling, and it might put to rest some of the myths and nonsense going around claiming that a bunch of moderation decisions are not made in “good faith” and therefore do not deserve protection.