Appeals Court Actually Explores 'Good Faith' Issue In A Section 230 Case (Spoiler Alert: It Still Protects Moderation Choices)

from the because-of-course-it-does dept

Over the last couple of years of a near-constant flow of mis- and disinformation about Section 230 of the Communications Decency Act, one element that has popped up a lot (including in our comments), especially among angry Trumpists, is the claim that because Section 230(c)(2)(A) has a “good faith” qualifier, websites that moderate must show they did so in “good faith.” Many seem to (falsely) assume that this is a big gotcha, and that they can get past the 230 immunity barrier by litigating over whether or not a particular moderation choice was made in “good faith.” However, as we’ve explained, only one small part of the law — (c)(2)(A) — mentions “good faith.” It’s this part:

No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

However, notably, Section (c)(1) — the famous “26 words” — makes no mention of “good faith” and establishes, more broadly, that no website or user of a website can be held liable for someone else’s content. Over the past 25 years, nearly all of the court decisions regarding content moderation have relied on Section (c)(1)’s broad immunity, noting that even moderation (takedown) decisions, if litigated, would require holding a website/user liable for the speech of a third party. Therefore, the examination of “good faith” almost never comes up. Separately, there’s a fairly strong argument that courts determining whether or not a moderation decision was made in “good faith” would amount to courts interfering with an editorial decision-making process — and thus would raise serious 1st Amendment questions.

There is one significant court case that did look at (c)(2): the case finding that Malwarebytes’ decision to label Enigma Software’s product as malware was not protected by 230. In that case, the 9th Circuit said that because it’s possible Malwarebytes called Enigma malware for anti-competitive reasons, that might not count as “good faith,” and thus it wouldn’t be protected by 230. Unfortunately, the Supreme Court refused to hear that issue, but the case still lives on and the question could yet be revisited (and it may eventually be established that Malwarebytes also has a 1st Amendment right to express its opinion — even about a competitor).

A few weeks ago there was another case that got a (c)(2) review, over in the 2nd Circuit. That case, Domen v. Vimeo, involved a more typical content moderation decision. Eric Goldman summed up the story behind the case in his own post about the ruling:

Vimeo is a video hosting service. Domen is a “former homosexual.” He posted videos to Vimeo that allegedly violated Vimeo’s policy against “the promotion of sexual orientation change efforts” (SOCE). Vimeo notified Domen of the violation and gave him 24 hours to remove the videos or Vimeo would take action. Domen didn’t remove the videos, so Vimeo subsequently deleted Domen’s account. Domen sued Vimeo for violations of California’s Unruh Act, New York’s Sexual Orientation Non-Discrimination Act, and the California Constitution. The lower court dismissed all of the claims.

Domen appealed, and the 2nd Circuit affirmed the dismissal, focusing on Section 230(c)(2)(A). It is not clear why it didn’t just use (c)(1) like every other similar case (it mentions that the lower court relied on both sections, and that Vimeo asked the court to rule on (c)(1) grounds as well, but it does the (c)(2) analysis anyway). Either way, this opens up an opportunity for an appeals court (and a prominent one, like the 2nd Circuit) to explore the whole “good faith” question. The court notes, correctly, that 230 is much in the news these days, but says this one is a fairly easy call: Vimeo has every right to enforce its own terms of service in this manner. In summarizing its decision, the court notes:

However, Appellants’ conclusory allegations of bad faith do not survive the pleadings stage, especially when examined in the context of Section 230(c)(2). Section 230(c)(2) does not require interactive service providers to use a particular method of content restriction, nor does it mandate perfect enforcement of a platform’s content policies. Indeed, the fundamental purpose of Section 230(c)(2) is to provide platforms like Vimeo with the discretion to identify and remove what they consider objectionable content from their platforms without incurring liability for each decision. Therefore, we AFFIRM the judgment of the district court.

This is good framing. It recognizes that plaintiffs can’t just yell “bad faith!”, even if they can show inconsistent moderation practices. Going into detail, the court says that (c)(2) is also a very broad immunity, giving websites the power to set their own rules and policies for what will be removed. Specifically, it says the provision grants “significant subjective discretion.”

A broad provision, subsection (c)(2) immunizes interactive computer service providers from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” 47 U.S.C. § 230(c)(2). Notably, the provision explicitly provides protection for restricting access to content that providers “consider[] . . . objectionable,” even if the material would otherwise be constitutionally protected, granting significant subjective discretion…. Therefore, Vimeo is statutorily entitled to consider SOCE content objectionable and may restrict access to that content as it sees fit.

The court also rejects the plaintiff’s argument that Vimeo could have just deleted the individual videos that it claims violated its policies, rather than shutting down his entire account. But, as the court notes, nothing in 230 requires the use of a scalpel when moderating:

Moreover, the statute does not require providers to use any particular form of restriction. Although Appellants take issue with Vimeo’s deletion of Church United’s entire account as opposed to deleting only those videos promoting SOCE, nothing within the statute or related case law suggests that this took Vimeo’s actions outside of the scope of subsection (c)(2) immunity. Indeed, Vimeo warned Church United that removal of the entire account was exactly what might happen if they ignored the warning. Church United received the warning and did not take the videos down or otherwise allay Vimeo’s concerns. Vimeo was entitled to enforce its internal content policy regarding SOCE and delete Church United’s account without incurring liability.

How about the “good faith” requirement? The court says you have to show more than just that Vimeo treated this plaintiff’s videos differently than some others on the platform. It points to the 9th Circuit’s Malwarebytes decision (and the Zango case that the 9th Circuit heavily cited in Malwarebytes), and says that even if it followed the same reasoning as that case, this is obviously a very different situation. The only reason that case got over the “good faith” hurdle was that the conduct was deemed possibly anti-competitive. Here? Uh, no, this is standard everyday content moderation:

We also agree with the district court that Appellants’ allegations that Vimeo acted in bad faith are too conclusory to survive a motion to dismiss under Rule 12(b)(6). Appellants’ bases for arguing that Vimeo acted in bad faith are not commensurate with how courts interpret bad faith in this context. Appellants’ cited cases do not satisfy their position. In Zango, Inc. v. Kaspersky Lab, Inc., the Ninth Circuit considered whether the defendant’s software — a filter blocking potentially malicious software from users’ computers — qualified for Section 230 immunity in the same manner as platforms like YouTube or Facebook…. The Ninth Circuit held that it did…. In Enigma Software Group USA, LLC v. Malwarebytes, Inc., the Ninth Circuit limited the scope of Zango, clarifying that Section 230 “immunity . . . does not extend to anticompetitive conduct.”…. There, the court reinstated the plaintiff’s Lanham Act claim, which alleged that the defendant’s firewall program improperly filtered out the plaintiff’s rival firewall program, even though the plaintiff’s program posed no actual security threat to users’ computers…. The plaintiff alleged that the defendant made “false and misleading statements to deceive consumers into choosing [the defendant’s] security software over [the plaintiff’s].” … Vimeo’s deletion of Appellants’ account was not anti-competitive conduct or self-serving behavior in the name of content regulation. Instead, it was a straightforward consequence of Vimeo’s content policies, which Vimeo communicated to Church United prior to deleting its account.

And, no, just because Domen found other videos on Vimeo that might have also violated its policies, that doesn’t mean that he was treated in bad faith. That’s not how any of this works.

Appellants argue that bad faith is apparent from the fact that other videos relating to homosexuality exist on Vimeo’s website. In support of this, Appellants point to titles of videos that allegedly remain on Vimeo’s website: “Gay to Straight,” “Homosexuality is NOT ALLOWED in the QURAN,” “The Gay Dad,” and “Happy Pride! LGBTQ Pride Month 2016.”… However, the mere fact that Appellants’ account was deleted while other videos and accounts discussing sexual orientation remain available does not mean that Vimeo’s actions were not taken in good faith. It is unclear from only the titles that these videos or their creators promoted SOCE. Moreover, one purpose of Section 230 is to provide interactive computer services with immunity for removing “some — but not all — offensive material from their websites.” Bennett v. Google, LLC, 882 F.3d 1163, 1166 (D.C. Cir. 2018). Given the massive amount of user-generated content available on interactive platforms, imperfect exercise of content-policing discretion does not, without more, suggest that enforcement of content policies was not done in good faith. See Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997) (explaining that “[t]he amount of information communicated via interactive computer services is . . . staggering” and that Congress passed Section 230 expressly to “remove disincentives for the development and utilization of blocking and filtering technologies”)….

In summary:

Appellants chose to ignore Vimeo’s notice of their violation of Vimeo’s content policy, and, as a result, Vimeo deleted their account. By suing Vimeo for this, Appellants run headfirst into the CDA’s immunity provision, which “allows computer service providers to establish standards of decency without risking liability for doing so.”

That seems pretty nice and clear. As Goldman wrote in his analysis of this ruling:

In the short run, Internet services have a lot to celebrate about this ruling. First, the court revitalizes Section 230(c)(2)(A) as a tool in the defense toolkit, which increases the odds of a successful defense. Second, the court accepts that content moderation will never be perfect, so plaintiffs aren’t going to win simply by pointing out examples of imperfect content moderation. Third, the court grants Section 230(c)(2)(A) on a motion to dismiss, emphasizing that it’s an immunity and not just a safe harbor. This ruling isn’t novel, but a clean and decisive statement from the Second Circuit about Section 230(c)(2)(A) applicability to motions to dismiss will surely encourage future courts to do the same. Fourth, though not explicitly addressed, the court held that Section 230(c)(2)(A) preempted claims that the services had violated anti-discrimination laws — a critical issue given that majority communities are weaponizing anti-discrimination laws to perpetuate their majority status.

All very nice! However, Goldman also warns that all of this good stuff may soon be wiped away via the various bills to reform or repeal Section 230. He also notes that some of the statements in the opinion could be twisted in problematic ways. For example (as seen in the quotes above), the court repeatedly mentions that Vimeo gave Domen multiple warnings and even told him specifically which policy he was violating. This might lead some to falsely believe that moderation without those factors falls outside the bounds of (c)(2)(A). Goldman also fears that this ruling will lead to more litigation exploring the boundaries of (c)(2)(A) and the definition of “good faith” moderation choices — all of which could have been avoided if the court had just followed the path of many others and dismissed on (c)(1) grounds.

On the whole, though, it still seems like a good general ruling, and it might put to rest some of the myths and nonsense going around about how a bunch of moderation decisions are not made in “good faith” and therefore do not deserve protection.

Companies: vimeo

Comments on “Appeals Court Actually Explores 'Good Faith' Issue In A Section 230 Case (Spoiler Alert: It Still Protects Moderation Choices)”

45 Comments
This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

I'm not broken, I don't need to be fixed.

Poor discriminated against thing.

Given the huge number of high-profile ex-gays who end up being teh gay again, I am thinking perhaps it’s not a choice, like joining a bigoted hateful group calling themselves a church is, but just the way we are.

All this money wasted on a pointless lawsuit, something something help the poor & needy??

Good Faith… saying my skyfriend thinks you are broken & can change to make me feel better… seems like shitty faith to me.

But then I’m an immortal so I’m used to being the embodiment of evil from small minded folks, they judge me & boy howdy do they get pissed when I judge them right back.
None of the religions I founded ever abused kids & covered it up.

This comment has been flagged by the community. Click here to show it.

Koby (profile) says:

It is not clear why it didn’t just use (c)(1) like every other similar case

c(1) offers protection when someone is trying to hold the platform liable for the speech of someone else. Vimeo was not being sued for being a platform for someone else’s speech. Rather, they were accused of engaging in discriminatory moderation decisions. c(1) offers no protection in this case, so they instead needed c(2).

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

Odd, then, that the district court performed (c)(1) analysis, and judged that Vimeo WAS covered under (c)(1). (See Domen v. Vimeo, both section II(A) (pg 9) and the in-depth analysis at section II(B)(1)(a).)

The Court finds that Plaintiffs are seeking to hold Vimeo liable for actions it took as a “publisher,” and therefore that Vimeo is entitled to immunity under Section 230(c)(1) of the CDA.

The appeals court, otoh, merely said, "Immune under (c)(2). We’re not going to analyze on (c)(1)."

This comment has been deemed insightful by the community.
Mike Masnick (profile) says:

Re: Re:

c(1) offers protection when someone is trying to hold the platform liable for the speech of someone else. Vimeo was not being sued for being a platform for someone else’s speech. Rather, they were accused of engaging in discriminatory moderation decisions. c(1) offers no protection in this case, so they instead needed c(2).

As stated above — and explained to you directly many times before — the courts have recognized that holding a website liable for what content they took down is also holding them liable for third party content — in this case, the removal of 3rd party content. And thus, pretty much every court has ruled that content removals are also protected by (c)(1). As stated.

This comment has been flagged by the community. Click here to show it.

Koby (profile) says:

Re: Re: Re:

the courts have recognized that holding a website liable for what content they took down is also holding them liable for third party content

The plaintiff is the first party, the defendant is the second party. No third party is involved. If courts are finding otherwise, then they’ve artificially created quite an immunity loophole that allows discrimination by any platform, simply by claiming "moderation decision".

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Re: Re: Re:

You don’t get to magically change responsibility because you as the plaintiff decided to name a 3rd party bystander as a defendant.

In the real world everyone else operates in, the person who claims to have been offended is the first party, the person who offended them is the second party. The person on whose property they happened to be standing when they did so is a 3rd party and should not be held responsible for the actions of other people.

This is very simple, and it’s always very telling that you have to invent new rules in order to claim that liability should lie somewhere other than where the law and basic logic say they should.

This comment has been deemed insightful by the community.
R.H. (profile) says:

Re: Re: Re: Re:

As far as section 230 is concerned, the phrases "first-party" and "third-party" don’t actually exist. They’re simply used to make it easier to understand who is responsible for what speech. Here’s the text of (c)(1) again:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." – 47 USC § 230 (c)(1)

This simply means that no one is responsible for the speech of anyone else on the internet, and since no service can be considered a publisher of content that they allowed to be posted on their site, there aren’t any penalties for removing said content.

So, to speak on your conclusion, yes any (non-governmental) platform can discriminate against whatever speech they like. If some white supremacist site blocks speech calling for racial equality, they can do that. If a left-leaning blog wants to ban conservatives, they can do that too. This is a feature not a bug. The goal was to allow different spaces for different points of view to flourish on the internet.

This comment has been flagged by the community. Click here to show it.

Koby (profile) says:

Re: Re: Re:2 Re:

As far as section 230 is concerned, the phrases "first-party" and "third-party" don’t actually exist.

Correct. But, by extension, Mike’s 3rd party argument no longer holds water either. So we return to the plain reading of c(1), which is that it offers no protection if a 3rd party is not involved.

since no service can be considered a publisher of content that they allowed to be posted on their site, there aren’t any penalties for removing said content.

You have listed a c(2) protection, not a c(1), which means that it comes with conditions.

This comment has been deemed insightful by the community.
nasch (profile) says:

Re: Re: Re:3 Re:

Koby, you’re misinterpreting c(1) directly below a comment that quoted it. Nobody is going to fall for your obvious lie. And even if you weren’t lying about that, you’re talking about a court case that clearly said the good faith requirement in c(2) is no problem here. Vimeo easily met the conditions for c(2) protection because "good faith" doesn’t mean "doesn’t hurt Koby’s feelings".

Scary Devil Monastery (profile) says:

Re: Re: Re:5 Re:

"Leave it to a troll like Koby to try to gaslight about content that’s still perfectly visible on the very same page as his lies."

He more or less has to do that, to be fair. His end goal every time freedom of speech is discussed is to find a way to twist it so it means bar and platform owners aren’t allowed to evict his friends from Stormfront for heiling and screaming the N-word all over the dance floor or forum.

And that means he has no choice but to lie, gaslight, spin every fact into the absolute opposite, because every time truth, logic and common sense get a say, his arguments get sunk.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Funny how everyone who disagrees with you is a “Trumpist.” I mean gosh, you’d think that adults could disagree without it being a binary, partisan issue. I wonder if G. Greenwald is also a “Trumpist,” considering he disagrees with you on most issues. Oh, that’s right, he’s on the far left. Do better, Techdirt; stop being a part of the political divide and report the facts in a non-partisan way. I’m hardly a Trump supporter, but it still gets tiring to see your transparent agenda stamped on every article.

This comment has been deemed insightful by the community.
Mike Masnick (profile) says:

Re: Re:

Funny how everyone who disagrees with you is a “Trumpist.”

I have never said any such thing, nor do I believe any such thing. In the article above, I observed — accurately — that Trumpists seemed more focused on the "good faith" line, which they misread and misunderstand.

I am sorry that being accurate upsets your delicate snowflake sensibilities, but perhaps you’d prefer a site that only caters to your whining victimhood.

This comment has been deemed insightful by the community.
Anonymous Coward says:

A good point. The co-authors of this section crafted it well.

Of course, in the end it all flows from First Amendment rights. But Section 230 makes it a lot cheaper for individuals and small organizations to defend their First Amendment rights against rich vexatious litigants.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Your cheering lasts until you turn gloomy again when you get to THIS:

Goldman also fears that this will lead to more litigation exploring the boundaries of (c)(2)(A) and the definition of "good faith" moderation choices

Of course you and Goldman prefer to just dodge the question because it’s inevitably a LOSS if good evidence can be found:

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Where's YOUR "good faith", Maz? WORST time ever to

ONLY your BAD FAITH in not keeping to the "form contract" of an open and honest "free speech" forum causes the hiding here, Maz. You set the "form contract", wrote the code, and set the policy; it’s not "the community" with a "voting system", it’s YOU censoring. — IF I could get you into court, there’s plenty of evidence of YOUR BAD FAITH.

This comment has been flagged by the community. Click here to show it.

Buck 'Eyes Forward' Liddell says:

Had to go "AC" to get in. Worst blocking, ever.

I explored TOR addresses, cookies, and just simply repeating, without result.

It’s exactly as though Maz doesn’t want any dissent here, ’cause his notions are so fragile.

Like the court case: Maz just DODGES away from key topic to an aspect that he likes better; as Goldman says.

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Malwarebytes surprising

"no, Section 230 does not protect you from being sued for defrauding investors"

It doesn’t. Section 230 does not protect you from liability for anything you did yourself, only from what other people did.

So, if you defraud investors, you are liable. If someone else pastes a poster to a wall you own that’s intended to defraud investors, you go after the person who pasted the poster, not the owner of the wall. This is accepted logic everywhere else in the world, but in the US for some reason they needed section 230 to remind everyone that this is how it should work.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Malwarebytes surprising

This is accepted logic everywhere else in the world, but in the US for some reason they needed section 230 to remind everyone that this is how it should work.

It’s actually even crazier than that as outside of greedy schmucks engaging in Steve Dallas lawsuits even in the US it’s understood that you go after the speaker, not the owner of the property they spoke on, everywhere but online. 230 merely makes clear that the same rules that are widely accepted to apply offline also apply online, because people are stupid and that needed to be explicitly spelled out.

Anonymous Coward says:

Re: Re: Malwarebytes surprising

I phrased that a bit poorly – I meant "you cannot just claim a crime falls under moderation to claim protection". It would be pretty damn hard to commit crimes in moderation, but I am sure some sort of conspiracy could manage it theoretically (even if it would be nigh-unenforceable), as furtherance of a conspiracy makes otherwise legal actions not so.

There isn’t a law against leaving a passcode to your warehouse on a post-it note on the outer door, nor against filing an insurance claim after your warehouse was burgled, but agreeing to do so to assist a burglar for money and then claiming insurance would be a criminal conspiracy and insurance fraud.

I was thinking "use selective editing of a forum to do a pump and dump in a way clearly beyond just personally buying in on the hype" as an example but didn’t elaborate upon it properly.

This comment has been flagged by the community. Click here to show it.

Major CH Utney of the Bengal Spicers says:

Follow up, an actual view of "good faith" that MM hates:

On March 25, 2021, defendant Felipe Garcia filed a notice with the court indicating that he would like to rely on a so-called "advice-of-counsel" defense, noting that he had obtained ‘expert’ advice at some point and had therefore acted in good faith.

https://torrentfreak.com/two-las-vegas-men-plead-guilty-in-u-s-criminal-streaming-piracy-case-191214/

This comment has been flagged by the community. Click here to show it.

Major CH Utney of the Bengal Spicers says:

Re: Follow up, an actual view of "good faith" that MM

WAIT, WHAT? Masnick says that "good faith" doesn’t matter in court! Judges don’t even consider it! — NO, that’s Masnick LYING. He HATES the "in good faith" requirement of CDA S230, so much that in trying to win an argument with me, Maz not only re-worded statute to his liking (link below), but in follow up admitted that he had manually removed the characters, and also explicitly stated that "good faith" is not even considered by lawyers!

Mike Masnick (profile) says:

Re: Re: Follow up, an actual view of "good faith" that

Uh, no dude. The case you’re linking to is not a 230 case. The "good faith" being relied on there is something totally different.

I never said "good faith" never matters. I said it never matters in the content moderation cases you insist it does matter in.

My goodness are you ever stupid. You can’t even keep straight what law we’re talking about.

This comment has been flagged by the community. Click here to show it.

Major CH Utney of the Bengal Spicers says:

Re: Follow up, an actual view of "good faith" that MM

Yet this lying little twerp presents himself as a legal expert.

Here again is where Masnick literally removed the very characters of "in good faith" while pretending to quote black-letter law of the statute:

https://www.techdirt.com/articles/20190201/00025041506/us-newspapers-now-salivating-over-bringing-google-snippet-tax-stateside.shtml#c530
