Dear Supreme Court: Judicial Curtailing Of Section 230 Will Make The Internet Worse

from the do-or-die-moment-for-the-Internet dept

Every amicus brief the Copia Institute has filed has been important. But the brief filed today is one where all the marbles are at stake. Up before the Supreme Court is Gonzalez v. Google, a case that puts Section 230 squarely in the sights of the Court, including its justices who have previously expressed serious misunderstandings about the operation and merit of the law.

As we wrote in this brief, the Internet depends on Section 230 remaining the intentionally broad law it was drafted to be, applying to all sorts of platforms and services that make the Internet work. On this brief the Copia Institute was joined by Engine Advocacy, speaking on behalf of the startup community, which depends on Section 230 to build companies able to provide online services, and Chris Riley, an individual person running a Mastodon server who most definitely needs Section 230 to make it possible for him to provide that Twitter alternative to other people. There seems to be this pervasive misconception that the Internet begins and ends with the platforms and services provided by “big tech” companies like Google. In reality, the provision of platform services is a profoundly human endeavor that needs protecting in order to be sustained, and we wrote this brief to highlight how personal Section 230’s protection really is.

Because ultimately, without Section 230, every provider would be in jeopardy every time they helped facilitate online speech and every time they moderated it, even though both activities are what the Internet-using public needs platforms and services to do, even though they are what Congress intended to encourage platforms and services to do, and even though the First Amendment gives them the right to do them. Section 230 is what makes it possible at a practical level for them to do both, by taking away the risk of liability arising from how they do it.

This case risks curtailing that critical statutory protection by inventing the notion, pressed by the plaintiffs, that if a platform uses an algorithmic tool to serve curated content, it somehow amounts to having created that content, which would put the activity beyond the protection of Section 230, since the law applies only when platforms intermediate content created by others, not content they create themselves. But this argument reflects a dubious read of the statute, and one that would largely obviate Section 230’s protection altogether by allowing liability to accrue as a result of some quality in the content created by another, which is exactly what Section 230 is designed to forestall. As we explained to the Court in detail, the idea that algorithmic serving of third-party content could somehow void a platform’s Section 230 protection is an argument that has been cogently rejected by the Second Circuit and should similarly be rejected here.

Oral argument is scheduled for February 21. While it is possible that the Supreme Court could take on board all the arguments brought by Google and the constellation of amici supporting its position, and then articulate a clear defense of Section 230 that platform operators could take back to any other court questioning their statutory protection, it would be a good result if the Supreme Court simply rejected this particular theory pressing for artificial limits on Section 230 that are not in the statute or supported by the facially obvious policy values Section 230 was supposed to advance. Just so long as the Internet and the platforms that make it up can live on to fight another day, we can call it a win. Because a decision in favor of the plaintiffs curtailing Section 230 would be an enormous loss to anyone depending on the Internet to provide them any sort of benefit. Or, in other words, everyone.

Companies: copia institute, engine, google


Comments on “Dear Supreme Court: Judicial Curtailing Of Section 230 Will Make The Internet Worse”

85 Comments
Anonymous Coward says:

Re:

Forcing the CIA to help Google et al moderate by informing said entities about terror organizations, inclusive of the CIA-backed ones, and those terror orgs’ ad campaigns would be a start.

How would anyone be able to moderate if they don’t know which shell corp belongs to which terror org and thinktank/trust fund at least?

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re:

Whatever it is, they’re pretty much guaranteed to be wrong and disappointed unless they want one of the following:

1) No ability to post third-party content at all.

2) Extremely heavy moderation that blocks and takes down anything that even might be problematic and that will make the current moderation efforts of even highly restrictive platforms look downright generous in comparison.

3) No moderation at all, other than the blatantly illegal content that platforms are required to take down (CSAM and the like), resulting in a complete free-for-all where any legitimate content is buried under an avalanche of trolling and spam.

If those gunning for 230 for whatever reason don’t want one or more of the above then their best bet is to hope they fail in the courts yet again.

This comment has been deemed insightful by the community.
TKnarr (profile) says:

Re: Re: #3

No moderation at all including for blatantly illegal content, because the platforms will cease looking at content at all unless informed of illegal content by a third party. Recall that the court decision that caused Section 230 to be written was that, because the defendant scanned content for material that violated their ToS, the defendant could be presumed to know about all content and could be held liable for anything they missed. The same people pushing for Section 230 to be modified/removed will absolutely push for that interpretation to become the law, because it gives them yet another tool to force platforms to only carry what they approve of.

That’s something I think should be highlighted to the Court. The government can’t outlaw any but the absolute worst content because of that pesky First Amendment thing. If platforms are also afraid of blocking offensive/dangerous content because any mistake opens them to massive liability, well, we know how that goes because we can see what happens on 8chan and Parler and other sites that attempt to not moderate that sort of content.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:

That’s… disturbingly possible now that you bring it up.

If site owners face liability for even scanning the content on their platform, then (assuming they don’t block it entirely) ‘don’t take anything down unless it’s been specifically pointed out’ would seem to be the safer option, since if they took one thing down on their own and missed a second, they’d be worse off than if they hadn’t looked for either until notified.

Anonymous Coward says:

Re: Re: Re:2

Wasn’t this the norm (both not scanning and straight up deleting offensive posts, sometimes together, usually not) BEFORE S230?

I seem to remember that there were 2 court cases dealing with moderation…

And remember, a repeal of S230 also makes finding the originators of illegal content harder. Because with the automated deletion of offensive content comes the charge of “obstruction of justice”, which also includes… DESTRUCTION OF EVIDENCE.

This comment has been deemed insightful by the community.
TKnarr (profile) says:

Re: Re: Re:3

No, it wasn’t the norm. Section 230 was introduced in response to the court case Stratton Oakmont, Inc. v. Prodigy Services Co. (https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prodigy_Services_Co.) where the court ruled Prodigy was liable because they moderated content. The case never went to appeal, and Section 230 was introduced to make sure that outcome never became the norm.

Amusingly, conservatives supported S230 at the time because it would immunize ISPs from liability for removing pornographic content.

Rocky says:

Re: Re: Re:

No moderation at all including for blatantly illegal content, because the platforms will cease looking at content at all unless informed of illegal content by a third party.

And that will make any kind of automatic detection of copyrighted material on a site a catch-22, since without section 230 you either get the moderation liability or you get the hosting-copyrighted-material liability. Some will of course point out that removing copyrighted material shouldn’t confer the moderation liability, but here we quickly descend into the weird world of fair use, false positives, memes and false DMCA requests: who is liable for what if content is removed because of those things? Because as usual, if someone feels wronged and sue-happy, they’ll sue the site first even though the site has been forced to remove material.

It would also entirely scupper Youtube among other services, because they have to scan every video. All those rules that YT has set up for what is considered acceptable content, poof! And that will of course mean that most advertisers won’t be buying ads on YT. If YT only sticks to removing clearly illegal content, they might survive until some parent discovers that their children are watching videos chock-full of things they find offensive but not illegal.

I’m surprised that we haven’t heard anything from the big ad businesses, because they will lose huge amounts of business and money without 230, since no site can guarantee what content will be next to any ad if they go the no-moderation route, and if they go the moderate-heavily route the ad views and clickthroughs will tank due to fewer users and less engagement.

Anonymous Coward says:

Re: Re:

2) Extremely heavy moderation that blocks and takes down anything that even might be problematic and that will make the current moderation efforts of even highly restrictive platforms look downright generous in comparison.

As we know from this site: moderation at scale is impossible

Thus, one (the only?) way for a site to effectively moderate content post-230 is to block everything and only let through user contributions that are manually approved.

This will, of course, severely limit the amount of user-supplied content. Places like TikTok and YouTube would become dried-up ghost towns, with no hope of new creators getting anything posted or finding an audience.
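A minimal sketch of that approve-first model the comment describes (class and method names here are illustrative assumptions, not any real platform’s code):

```python
# Hypothetical sketch of "block everything, manually approve" moderation:
# nothing is published until a human explicitly approves it.

class PreApprovalQueue:
    def __init__(self):
        self.pending = []    # submissions awaiting human review
        self.published = []  # the only content visitors ever see

    def submit(self, post):
        # Submission alone never makes anything visible.
        self.pending.append(post)

    def approve(self, post):
        # A human reviewer moves a post from pending to published.
        if post in self.pending:
            self.pending.remove(post)
            self.published.append(post)

q = PreApprovalQueue()
q.submit("harmless comment")
q.submit("borderline comment")
q.approve("harmless comment")
print(q.published)  # ['harmless comment']
print(q.pending)    # ['borderline comment']
```

The bottleneck is obvious: every post costs human review time before it can appear, which is exactly why high-volume sites couldn’t operate this way.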

Anonymous Coward says:

Re: Re: Re:

You’re right that Section 230 doesn’t cover copyright infringement, but there’s a thread of logic among copyright maximalists that any law that protects 3rd party hosts or sites is considered favorable to copyright infringement, because they’re not being held liable for enabling the actions of a few.

It’s why Richard Bennett (no apparent relation to the new Bennett) made claims on here and Ars Technica that ISPs throttling users was a good thing, because nobody other than pirates would want to use that much bandwidth or have strong and consistent Internet upload/download speeds. Same thing when net neutrality was being talked about.

Which is the reason why John Smith kept harping on the idea that websites have to preemptively prevent “defamation” by taking things down when asked without question. It’s not about reputation, it’s a Trojan horse to the RIAA’s end goal.

Frank says:

I think you’re concentrating too much on the section 230 aspect vs the verbiage in the filing. It’s states that they blame the recommendation system of google, not any form of moderation. If tech companies chose to remove the algorithm and moved to a purely chronological or metric based feed (ie highest NUMBER likes or engagements). This would severely limit their ad serving capabilities without being able to guarantee being served next to engaging content and shift power to individual creators, both of which I think would be positive outcomes

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Hope you like sports and celebrity gossip, you'll be seeing a lot

If tech companies chose to remove the algorithm and moved to a purely chronological or metric based feed (ie highest NUMBER likes or engagements). This would severely limit their ad serving capabilities without being able to guarantee being served next to engaging content and shift power to individual creators, both of which I think would be positive outcomes

The currently dominant creators, who would as a result get the largest chunk of likes and engagements, perhaps. But good luck if you’re a creator who isn’t them, or a user who isn’t interested in what they’re posting, because that’s all you’d be getting if pure numbers are all a platform is allowed to base things on.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

It’s states that they blame the recommendation system of google, not any form of moderation.

Hate to break it to you, but recommending things IS moderation. It’s just the “inclusive” rather than the “removal” form of moderation. +1 instead of -1.
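That “+1 instead of -1” point can be made concrete with a minimal sketch (the function and field names are illustrative assumptions, not any platform’s real API): removal and recommendation are the same scoring operation with different signs.

```python
# Hypothetical sketch: "removal" and "recommendation" are one operation,
# a moderation score applied to third-party content; only the sign differs.

def rank_feed(posts, scores):
    """Order posts by moderation score; a score of -1 hides a post entirely."""
    visible = [p for p in posts if scores.get(p, 0) > -1]
    # Boosted (+1) items surface first: that's "recommendation".
    return sorted(visible, key=lambda p: scores.get(p, 0), reverse=True)

posts = ["cat video", "spam", "news clip"]
scores = {"spam": -1, "cat video": +1}  # -1 = remove, +1 = recommend
print(rank_feed(posts, scores))  # ['cat video', 'news clip']
```

Seen this way, a recommendation algorithm is just the moderation function evaluated over everything that wasn’t removed.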

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

If tech companies chose to remove the algorithm and moved to a purely chronological or metric based feed (ie highest NUMBER likes or engagements).

Having burst one of your illusions, let’s go for a second.

Q) … just what do you imagine an algorithm is?
A) a method or set of rules for doing a thing.

“purely chronological” and “metric based feed” are simply alternate algorithms for doing what the current algorithms do.

The sites who use algorithms more complex than what you have suggested do so because they found the results of your example algorithms to be unsatisfactory – to their business needs, to the customer’s desires.
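The point that these are all just algorithms can be sketched in a few lines (field names are illustrative assumptions, not any real platform schema): a “chronological feed” and a “most-liked feed” are the same code with different sort keys.

```python
# Hypothetical sketch: "purely chronological" and "highest likes" feeds are
# themselves algorithms, just different sort keys over the same posts.

posts = [
    {"id": 1, "ts": 100, "likes": 5},
    {"id": 2, "ts": 300, "likes": 1},
    {"id": 3, "ts": 200, "likes": 9},
]

def chronological(posts):
    # Newest first: sort by timestamp, descending.
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

def most_liked(posts):
    # "Metric based": sort by like count, descending.
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

print([p["id"] for p in chronological(posts)])  # [2, 3, 1]
print([p["id"] for p in most_liked(posts)])     # [3, 1, 2]
```

Swapping in an engagement-weighted key would give the “complex” feeds people object to; the structure of the code doesn’t change at all.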

I leave you with a useful quote:

For every complex problem there is an answer that is clear, simple, and wrong.

— H. L. Mencken

brianary (profile) says:

Re: Re: because

The sites who use algorithms more complex than what you have suggested do so because they found the results of your example algorithms to be unsatisfactory – to their business needs, to the customer’s desires.

This appears to be a fallacy of division. I have no doubt that complex algorithms are more satisfactory to the business needs of maximizing engagement (read enragement). Immediately linking that with customer desires is trying to pull a fast one, rhetorically speaking.

I’m quite happy with my purely chronological Mastodon feed, thanks.

This comment has been deemed insightful by the community.
Cat_Daddy (profile) says:

Being honest here…

I know that I tend to resort to humor, but this case legit terrifies me. It’s the damned uncertainty of it all. Will the Supreme Court respect the defense for Section 230? Will they even care? Will they even understand? Will their biases get in the way of perjury and precedent for this case? I don’t know. Outside of Clarence Thomas, we know nothing about what the other eight justices will think. And that’s scary.

I know that we’ll have to keep trying regardless of fear. Because if you give up, you’ll lose; if you try, you might win. But this case in particular just fills me with a sense of deep dread. But as I tell myself, the internet never truly ends, it simply changes. For the better or even worse.

This comment has been flagged by the community.

Chozen (profile) says:

Re:

“I know that I tend to resort to humor, but this case legit terrifies me. ”

It should. BigTech should never have attempted to claim section 230 to avoid 18 U.S. Code § 2333.

The entire case was a conservative trap from the beginning. Once more, we told you exactly what we were doing years ago. But people like Mike and Cathy are so fucking stupid no one hit the brakes.

Google should have just argued 18 U.S. Code § 2333 on its merits and, if they lost, paid damages. The case was brought with the intent to get google to claim section 230. It’s the perfect case to bring the two wings of the conservative court together. The Roberts neocon wing will defend big corporations, but not to the death. Allowing google to use section 230 to avoid an anti-terrorism statute is a line too far for the neocon wing.

Yes it was a trap and Mike and the rest of the BigTech idiots fell right into it.

Anonymous Coward says:

Re: Re:

There is one way of keeping terrorist propaganda off the Internet, and that is to make the legal liabilities so severe that all published content has to be first approved by a gatekeeper. However, that will not stop the circulation of that material, as even the old USSR could not stop the circulation of paper copies of banned books, and their efforts included armed guards watching photocopiers.

You would kill the Internet, but that just makes it harder to keep track of what people are thinking, as there will be little in the way of a highway on which to monitor what they are saying.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:

God do I have to explain it like a child.

There are 6 conservative justices on the court. There are the populist conservatives led by Thomas and there are the neocon conservatives led by Roberts. To get to 5 you need a wedge issue that unites both sides. On the neocon side, it’s all corporations and military. So you need something that pits those two issues against each other. The anti-terrorism statute 18 U.S. Code § 2333 is that wedge. Roberts cannot allow section 230 to be used to avoid an anti-terrorism statute. As such section 230 must be gutted.

Stephen T. Stone (profile) says:

Re: Re: Re:2

I hope you realize that once 230 is gutted as you so desperately want to see happen, your speech will be far more heavily moderated on every website you go to⁠—and that’s if you’re even allowed to post at all, given how many outlets would likely shutter their ability to accept third-party content of any kind to avoid the newfound legal liability placed upon moderation.

Andy J says:

USA ≠ Internet

Look, I appreciate that ARPA created ARPANET which begat the Internet, and that this is a case before the US Supreme Court, but can people stop saying things like “A decision that curtails the applicability of 47 U.S.C. §230 (“Section 230”), the statute at the heart of this case, stands to affect the entire Internet.” [first paragraph of the amicus brief]. A large part of the Internet is entirely unaffected by §230 whether it’s there or not, simply because the users, site operators and servers are not within the jurisdiction of the US courts, hence §230 does not apply to them. This is not to support those who wish to remove or water down §230, just that a sense of perspective is required. In fact removing §230 would send a bad message to authoritarian regimes.
The EU has for many years had parallel provisions in Article 15 of the E-Commerce Directive, and in contrast to this enlightened position, some other nations outside Europe and the USA take completely the opposite view (e.g. India) when it comes to the liability of site owners for content posted on their sites.
Whatever else one may think about the Supreme Court justices, they are not stupid people and they can spot hyperbole better than most.

This comment has been deemed insightful by the community.
Wyrm (profile) says:

One-way internet

My interpretation is that these politicians think that the internet should be more like something they are familiar with. Namely TV and newspapers.

A few large corporations as gatekeepers, one-way communications to the public and politicians’ lies going unchallenged. Like the good old days of TV and newspapers.

Any “plebs” speaking against power should be fully liable for anything they say, preferably when interpreted out of context and in the worst way possible. When allowed to speak at all.
Meanwhile, “elite” should be held responsible only when their speech is taken in context and in the best light possible. When challenged at all.

So, we should assume that any negative outcome from abolishing or restricting section 230 would not be an accident, but the intended outcome. In particular if platforms end up choosing or being forced into the extreme outcome of losing user-generated content.
And the current composition of SCOTUS is not encouraging at all. The majority views seem to be even more backwards than the era of TV.

Anonymous Coward says:

The search engines already index inferior data, so that spew-topia of a “worse Internet” is just some tabloid trash about nothing.

It only confirms that they have a meaningless existence and serve no purpose and are a complete waste of time to mix with.

Everybody knows that an ultraprocessed food diet and a hyperprocessed data diet are a trajectory to oblivion. Don’t write. Forget too. They get to keep their stupid prizes and their ambiguous existence.

This comment has been flagged by the community.

Chozen (profile) says:

I fucking love it. Robert Barnes wrote this legal theory years ago about how 18 U.S. Code § 2333 was the Achilles heel of section 230 and BigTech.

Hilarious, we told you exactly what the trap was and you stupid fuck sticks still fell right into it.

Hey Cathy, you stupid fuck. Why didn’t any of you pretend legal experts tell Google not to fucking claim section 230 immunity to avoid 18 U.S. Code § 2333? Just argue the case on its merits, win or lose. If you lose, just pay damages.

But Nooooooo … you fucking egomaniacs had to try and section 230 your way out of a fucking anti-terrorism statute.

Now you have done the one thing you couldn’t do. You united the populist conservative and Neocon judges of a 6-3 conservative court against section 230. The Neocon judges cannot allow BigTech to section 230 their way out of an anti-terrorism statute you stupid fucking morons. They would have bent over backwards to defend big corporations but not over this. This is the one thing you couldn’t do. Using section 230 to get your way out of an anti-terrorism/national defense statute was the fucking line and you long jumped right over it you stupid mother fuckers. lol

Its so funny because you were so fucking stupid!

You could have just argued the case on its merits and paid damages if you lost. But no! You are so fucking tone deaf and so fucking stupid you had to claim section 230.

This comment has been flagged by the community.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:3

Yes, Dish owns spectrum which it uses

  • If nobody else can use Dish Network’s spectrum, then effectively Dish owns the spectrum. That it is a lease from the feds is irrelevant in terms of who can use that spectrum.
  • Owning and leasing are MUCH MUCH MUCH MUCH MUCH closer together than not knowing what a public house is.
  • No matter what you say, you’re still the idiot that didn’t know the difference between a public house and public housing.
  • You’re also wrong about DirecTV and OAN.

Just keep coming back and I’ll keep owning you….

every. single. time.

This comment has been flagged by the community.

This comment has been flagged by the community.

Anonymous Coward says:

Re:

Why aren’t you held responsible for crime planned in public places like cafes and bars etc. that you visit. What aren’t the owner? In general people are not held responsible for what other do on their property.

What you want Google to do is impossible: police the Internet as they index it.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re:

“Why aren’t you held responsible for crime planned in public places like cafes and bars etc. that you visit. What aren’t the owner? In general people are not held responsible for what other do on their property.”

That’s why you argue the case on the merits. If you lose its because you were grossly negligent and you pay damages. That’s the end of it.

BigTech was so fucking stupid, and lazy, that they tried to section 230 their way out of an anti-terrorism statute and now all of section 230 stands to get gutted. It was a trap and those stupid bastards walked right into it.

Stephen T. Stone (profile) says:

Re: Re: Re:

If you lose its because you were grossly negligent and you pay damages.

Let’s examine that frame of thought.

In response to the notion that a business owner shouldn’t be liable for crimes planned on their property without their knowledge, you’re saying they should face⁠—at minimum⁠—a civil trial to determine whether they were liable for those crimes. A ruling that the business owner is liable would make all business owners responsible for any and all criminal third-party speech on their properties⁠—even if they didn’t know about that speech at all until they were sued. And in this case, we’re talking about a meatspace business such as a bar.

Now apply that same logic to Twitter: A ruling that Twitter is liable for “terrorism” related to speech that Twitter’s owners/admins may not have known about would destroy any such protections for all other websites that accept third-party speech. Most of them⁠—notably, the ones that wouldn’t qualify as “Big Tech”⁠—would likely shut down to avoid accepting such speech and risking legal liability for it. The major players would then be entrenched because they have the money to pay legal teams to deal with lawsuits.

I get that you have a hateboner for “Big Tech”. But that’s no reason to destroy the rest of the Internet while also keeping “Big Tech” intact and turning whatever’s left into a one-way broadcast medium. That way lies madness.

This comment has been flagged by the community.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Are you too fucking stupid to understand what is going on here you stupid piece of shit.

And you wonder why all your posts get flagged on sight by everyone else. Other than “say things you don’t agree with”, what the fuck did I⁠—or anyone else here⁠—ever actually do to you on an interpersonal level that would make you so perpetually angry?

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:4

There is no “constitutional law” at odds here at all. As for federal law, there is a conflict between section 230 and 18 U.S. Code § 2333. Something has to give. One of the reasons google got into this problem is section 230. They knew they had a terrorism problem but did nothing about it because they have section 230 immunity. Without section 230, google might have taken reasonable efforts to combat terrorism on their platforms, which would have sufficed in an 18 U.S. Code § 2333 case. With section 230 they were grossly negligent because they believed they had immunity.

Stephen T. Stone (profile) says:

Re: Re: Re:5

230 doesn’t “immunize” companies against knowingly hosting illegal content. Google having specific knowledge of specific illegal content on its servers and refusing to do anything about it would nullify any attempt to claim 230’s protections. The question, then, would be threefold:

  1. Did Google have specific knowledge of specific illegal content on its servers…
  2. …did it knowingly and openly refuse to do anything about that content…
  3. …and, most importantly, was/is the content actually illegal under U.S. law at the time of Google’s refusal to take action?

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:6

“230 doesn’t “immunize” companies against knowingly hosting illegal content. Google having specific knowledge of specific illegal content on its servers and refusing to do anything about it would nullify any attempt to claim 230’s protections. The question, then, would be threefold:”

ROFLMAO

Section 230 no matter how written has been interpreted by the 9th circuit to be an almost completely unqualified immunity. This is because most 9th circuit judges are bribed the same way foreign countries bribe our leaders … through their children and families.

Stephen T. Stone (profile) says:

Re: Re: Re:7

Section 230 no matter how written has been interpreted by the 9th circuit to be an almost completely unqualified immunity.

[ahems in citation]

Emphasis is mine:

Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

— 47 U.S.C. § 230(e)(1)

230 does not immunize any interactive service provider against actual criminal activity. You may want to make an effort to understand the law as it was written before you comment on it again.

This comment has been flagged by the community.

Stephen T. Stone (profile) says:

Re: Re: Re:9

Okay. So what?

You’re still asking for the liability for the speech of third parties⁠—in this context, terrorists⁠—to be placed upon the shoulders of the platforms that those third parties use. And even if you disagree with that idea, chances are good that you’d still look for some other entity upon which you can place that burden. Should the web hosting providers who support that platform have that liability thrust upon them? How about the company that makes the computers/mobile devices used by those terrorists? If those devices arrive in those territories by way of an international shipping company, should that company also face the exact same liability?

The whole premise of putting this liability on a social media company rests upon the idea that someone other than the terrorists should be held responsible⁠—criminally or otherwise⁠—for the speech of those terrorists. It’s no better than saying a company that sells megaphones should be held responsible for the Westboro Baptist Church using one of those megaphones to blare “God hates fags” on a street corner.

That isn’t to say that the role those social media platforms play in the spread of the speech of terrorists should go unquestioned or unexamined. We can and should look into that. But placing most or all of the responsibility for that speech on the platforms rather than the people who express it in the first place is a fool’s errand⁠—even if only because the consequences for success will be much worse than you think.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:10

“You’re still asking for the liability for the speech of third parties⁠—in this context, terrorists⁠—to be placed upon the shoulders of the platforms that those third parties use.”

If the platform is grossly negligent then absolutely yes. That is a question to be decided in court.

It can easily be argued that the primary reason google was so negligent was because they believed that they were immune.

That is a fundamental problem with priori granting of immunity. It makes the receiver of that immunity believe that they are untouchable and therefore act negligently and recklessly.

But like I said, I could really care less. What matters is that google took the bait and tried to section 230 their way out of an anti-terrorism statute. That puts them at odds with the Roberts wing of the court. It was the Roberts wing, the “corporations and military can do no wrong” wing of the court, that was keeping section 230 alive.

Stephen T. Stone (profile) says:

Re: Re: Re:11

If the platform is grossly negligent then absolutely yes. That is a question to be decided in court.

And if it’s decided in a way you don’t like, will you accept that ruling as binding legal precedent (pending appeals), or will you declare that it’s a liberal plot to encourage terrorism or some other inane conspiracy theory-ish bullshit? I mean, I didn’t like the SCOTUS ruling in Dobbs, but I still accept that the ruling is binding legal precedent.

It can easily be argued that the primary reason Google was so negligent was that they believed they were immune.

Sure, making that argument is easy. But proving it is not the same as making it⁠—especially in a court of law.

That is a fundamental problem with a priori granting of immunity. It makes the receiver of that immunity believe that they are untouchable and therefore act negligently and recklessly.

230 doesn’t technically grant immunity before anyone even files suit. Courts use 230 to determine whether its immunity from legal liability applies to a given legal case. To wit: the YOLO case in a different article posted today. That case wasn’t dismissed seconds after it was filed because of 230. The court used 230 to help it consider whether the claims could withstand a dismissal motion; in that case, the court said “yes, all of this bullshit is barred by 230” in legalese and dismissed the case with prejudice.

Yes, believing you have immunity can cause reckless behavior. Backpage learned that the hard way. But even if you don’t believe you have that immunity until a court says otherwise, someone may still try to use the legal system to fuck you over with a frivolous lawsuit. 230 exists not to let tech companies commit crimes, but to short-circuit frivolous lawsuits and give those companies the breathing room they need to moderate their services as they see fit. That goes just as much for small Masto instances⁠—including the right-wing shitpit instances!⁠—as it does for a giant like Twitter.

I could really care less

But you don’t. You care a great deal about 230, what it does, and who it protects…

It was the Roberts Wing … that was keeping section 230 alive.

…because the moment 230 is altered or altogether deleted, your Internet experience will fundamentally change forever. You think the consequences will be “Big Tech goes out of business”, and you’re free to believe that lie. But the consequences of gutting 230 will be more along the lines of “Big Tech gets even more powerful as everyone else shuts off their speech platforms to stay out of legal jeopardy”. This site’s comments sections, every small U.S.-based Masto instance, anything that isn’t prepared to handle multi-million-dollar lawsuits⁠—all of it goes away as if Thanos snapped it into oblivion. And that means all your outlets for trolling, especially this one, are going to disappear faster than you can think of another shitty rape joke. You will be left with nothing to do but enjoy the Broadcast Internet, the One-Way Internet, the Capitalist Internet⁠—and you, being an attention whore, will not like that one damn bit.

But hey, keep on dreaming that gutting 230 will “save the Internet”. Maybe you’ll also solve the American gun violence epidemic while you’re praying to God for your dreams to come true. But don’t go whining to people when that monkey paw you’re wishing on gives you what you want and what you deserve. After all, you can’t shake the devil’s hand and say you’re only kidding.

Anonymous Coward says:

Re: Re: Re:12

Maybe you’ll also solve the American gun violence epidemic while you’re praying to God for your dreams to come true.

We all know he’ll gladly contribute to the former. When the chips are down, cunts like Chozen and Matty Bennett are the first to go for their 2nd Amendment dildos and vibrators. He’ll certainly keep doing the latter, praying to Trump to do something for him despite the track record of nothingburgers from Trump’s four years in office⁠—and that was the nicest thing Trump did. Trump not doing any damage was the bar that he set for a best case scenario. This is the guy that Chozen is looking to for a savior, despite the Republicans making it manifestly clear that they are fully intent on making things shittier for racial and sexual minorities like Chozen.

But I guess Chozen’s got all the femboy bussy he can grab so what does he care about survival instinct?

This comment has been flagged by the community.
