Joe Biden Wastes A Huge Opportunity To Support Free Speech; Still Wants To 'Revoke' Section 230

from the dude,-seriously? dept

Joe Biden had a golden opportunity to actually look Presidential and stand up for free speech and the 1st Amendment at a moment when our current President is seeking to undermine both with an Executive order designed to intimidate social media companies into hosting speech they’d rather not host, and to scare others off from fact-checking his lies. And he blew it. He doubled down on the ridiculous claim that we should “revoke” Section 230.

A spokesperson for the campaign told The Verge Friday that the former vice president maintains his position that the law should be revoked and that he would seek to propose legislation that would hold social media companies accountable for knowingly platforming falsehoods.

In other words, he wants to go even further than Trump and literally wipe out free speech online. Of course, the problem with that “proposed legislation” is that it’s clearly unconstitutional and the man who wishes to be President still thinks that Section 230 does what the 1st Amendment actually does. He’s simply wrong in claiming that taking away 230 will magically make Facebook liable for spreading false info.

Indeed, as we’ve pointed out multiple times, the claim that taking away 230 will make Facebook liable for false info is itself false info. But Biden (and Facebook and anyone else) are protected in repeating that false info because of the 1st Amendment.

Biden’s ongoing attacks on free speech are truly unfortunate — especially given that Trump’s silly Executive order basically put the issue on a tee for Biden to respond to. And instead, we get a plan to go even further than Trump in trying to harm the internet that enables the speech of so many people.



Comments on “Joe Biden Wastes A Huge Opportunity To Support Free Speech; Still Wants To 'Revoke' Section 230”


This comment has been flagged by the community.

Koby (profile) says:

In other words, he wants to go even further than Trump and literally wipe out free speech online.

While I don’t support a complete revocation, everyone should know that free speech online would not be wiped out by revoking 230. Online platforms would simply be unable to themselves moderate an interactive computer service without assuming liability. Thus, platforms would need to allow all speech by default, and then put the moderation tools into the hands of the users themselves. Let the people decide what to ban, and what not to ban.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: "free speech online would not be wiped out by revoking 230"

Well, the last time 230 was dented by FOSTA we lost a ton of porn. That sounds a lot like the chilling of free speech on the internet.

Certainly. The sites that lost wanted to continue moderating. This then forced them to moderate and censor. But what happens when you cannot moderate at all without liability, a la Stratton Oakmont v. Prodigy?

Still, it does make quite a mess, which is why I don’t support revocation. I’m just saying that a new paradigm will emerge if it does, which, although much different, will eventually return to a free speech system.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:3 Re:

[Fair warning for anyone not Koby, skip this one if you’re squeamish, it gets nasty.]

Do a search on TD for ‘Facebook’ and read up on the article talking about Facebook moderators coming down with PTSD thanks to what they have to wade through.

In fact, let me save you the effort:

In September 2018, former Facebook moderator Selena Scola sued Facebook, alleging that she developed PTSD after being placed in a role that required her to regularly view photos and images of rape, murder, and suicide.

And from a 2019 article on the same subject:

Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried. I should quit, he thought to himself, but I know there’s people at the site that need me. He ultimately stayed for a little over a year.

That is what you would inflict on the public, from children to adults, having everyone wading through that, simply because you don’t like the idea of a platform engaging in moderation. If that still sounds like a good idea to you, see my comment below for a way to prove how dedicated you are to what you would foist on others.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:4 Re:

That is what you would inflict on the public, from children to adults, having everyone wading through that, simply because you don’t like the idea of a platform engaging in moderation.

Not if you gave individual users moderation tools. I’m not in favor of "no moderation". Primarily, I’m opposed to politically biased mass moderation by corporations. Providing the moderation tools to users would prevent that type of undesired content, while also satisfying my desire to eliminate censorship of legitimate free speech.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:5

Not if you gave individual users moderation tools.

You’re still saying that individual users should have to do the kind of work that sends Facebook moderators into therapy. Who the fuck would ever use a service where they would have to be personally responsible for moderating videos of people killing animals because the service owners/operators said “fuck it, we’re not getting sued”?

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:6 Re:

You’re still saying that individual users should have to do the kind of work that sends Facebook moderators into therapy.

No, I would not say that. Individual moderation tools would allow people to take advantage of one of the things that computers are great at: copying data. Most users would undoubtedly copy their moderation settings from others with more of a proclivity for the task, and whom they also trust.
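[Editor’s aside, purely illustrative: the “copy someone else’s moderation settings” idea being described can be sketched in a few lines. Everything below is hypothetical; no platform in this discussion actually exposes such a tool, and all names are made up.]

```python
# Hypothetical sketch of "copyable moderation settings" -- illustration only,
# not a real feature of any platform discussed in this thread.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class ModerationSettings:
    blocked_users: set[str] = field(default_factory=set)
    blocked_terms: set[str] = field(default_factory=set)

    def copy_from(self, other: ModerationSettings) -> None:
        # "Copying data is something computers are good at":
        # inherit another user's block decisions wholesale.
        self.blocked_users |= other.blocked_users
        self.blocked_terms |= other.blocked_terms

    def allows(self, author: str, text: str) -> bool:
        # Client-side check run against each post before it is displayed.
        if author in self.blocked_users:
            return False
        lowered = text.lower()
        return not any(term in lowered for term in self.blocked_terms)


# A new user adopts a "trusted" user's settings instead of building their own.
trusted = ModerationSettings(blocked_users={"spammer42"}, blocked_terms={"gore"})
mine = ModerationSettings()
mine.copy_from(trusted)
print(mine.allows("spammer42", "hello"))            # False: author is blocked
print(mine.allows("someone_else", "a gore video"))  # False: term is blocked
print(mine.allows("someone_else", "cat photos"))    # True
```

[Note that the sketch leaves open the point the replies below make: someone still has to see a piece of content before it can be added to anyone’s settings.]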

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:7

No, I would not say that.

And yet, when you advocate for “users should be moderating”, you’re saying exactly that. You’re calling for the work of moderators to be done by people who aren’t prepared for what that work entails. And even if you’re all “oh but I’m only talking about them moderating their own experience”, you’re still expecting them to be able to moderate everything about their experience — including the need to see what horrible content is being sent their way before figuring out if they need to moderate it.

Your “solution” would take the already-thorny issue of “content moderation at a large scale fucks people up in the long term” and place that burden on average everyday people who can’t control what other people post. Your “solution” is no solution at all.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:8 Re:

You’re calling for the work of moderators to be done by people who aren’t prepared for what that work entails.

I find this to be quite an elitist view, that only some kind of trained professional moderator should be permitted to moderate, and therefore the only way moderation can be achieved is at the behest of a corporation. On the basis that most moderators between 1995 and 2015 were perfectly okay despite zero training, I reject your claim that, because some moderators in recent history were exposed to bad stuff online and had bad experiences, user moderation ought never be attempted.

But no. I am also not calling for this work to be done by users, on the basis that I do not support the revocation of section 230, as I originally mentioned. Instead, I am saying that if revocation were to occur, then there would still be the capability for free speech platforms to remain.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:9

I find this to be quite an elitist view, that only some kind of trained professional moderator should be permitted to moderate

The people who moderate Facebook are still people — but they’re paid and prepared to do the job of content moderation. You’re asking people unprepared for that job to do that job…and without recompense, no less. If a paid moderator and a regular jackoff both get PTSD from moderating content, the person not employed in that job won’t have any treatment resources provided to them by the service that made them moderate content.

the only way moderation can be achieved is at the behest of a corporation

It isn’t the only way, but until you can come up with a perfect filter (you can’t), moderation of large-scale services carried out with the resources of a corporation is still the only way that will do the least psychological damage to the userbase of a given service.

On the basis that most moderators between 1995 and 2015 were perfectly okay despite zero training

They willingly volunteered for those roles. And you can’t know, with the absolute unyielding certainty of God, that all of the people who moderated content in that 20-year span are “perfectly okay”.

because some moderators in recent history were exposed to bad stuff online and had bad experiences, user moderation ought never be attempted

User moderation should be limited to client-side filters. Period. No user should ever be forced to moderate content — especially the kind of content that most often gets moderated — for “the greater good of the service”, which is what your “solution” is suggesting should happen. An average, everyday, unpaid Black user of Twitter should not have to face a daily barrage of racial slurs and White supremacist propaganda only so they can “moderate” it for “the greater good”.

I am also not calling for this work to be done by users

That is exactly what you are calling for regardless of whether you like that assertion.

I am saying that if revocation were to occur, then there would still be the capability for free speech platforms to remain.

Go to 4chan. Stay exclusively on that site for a week. Then have the motherfucking brass balls to tell me you want every platform to be like that.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:7 Prognosis: Papercut. Cure: Amputation of the limb

Two things: first, as many articles on TD have made clear, filters have any number of problems, from false positives and false negatives to people finding new and inventive ways to bypass them; and second, for that filter to work someone still has to view the content in question. All of it. Because if someone posts something that’s objectionable but not on the filter yet, guess what, it’s on the platform for anyone to see, and they either have to moderate it themselves or wait for the ‘trusted moderator that’s not the platform’ to do it for them.

Your ‘cure’ is vastly worse than the claimed disease, and funnily enough aimed at a problem that doesn’t exist and would still be perfectly acceptable even if it did.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:7 Re:

Most people do not have the computing power available to automate the moderation, and your approach would make that problem worse by requiring everybody to have all the data needed to moderate their own feeds. Automation works best when advantage can be taken of scale, especially as the social media sites only need to make a moderation decision on a posting once to filter it out for all their users. What you are requiring is that people have a significant amount of storage and a parallel processing array, so that the filter can operate in a reasonable time.

Hint: it’s not the number of posts in a user’s feed that matters, but the number of items each post has to be compared against, even when those comparisons are of hash values. Users cannot do individually what the social media sites do with a data centre, once for all users.

Loss of 230 cannot be compensated for by user moderation, as the problem is not what they want to look at, but rather the databases needed to implement the filtering, and the processing power needed to filter out what they want to avoid.
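[Editor’s aside, purely illustrative: the scale argument above can be sketched in a few lines. The names and data here are made up, and real matching systems often use perceptual rather than exact hashes, but the shape of the cost is the same.]

```python
# Hypothetical sketch of hash-based filtering -- an illustration of the scale
# argument above, not a description of any real platform's pipeline.
import hashlib

# The expensive part: someone has to review content once, decide it is bad,
# and add its hash to a (potentially enormous) shared database.
known_bad_hashes = {
    hashlib.sha256(b"already-reviewed abusive image bytes").hexdigest(),
}


def is_known_bad(content: bytes) -> bool:
    # The cheap part: an O(1) lookup per post. A platform does the review and
    # the database maintenance once for all users; per-user filtering would
    # mean every user holding and updating their own copy of that database.
    return hashlib.sha256(content).hexdigest() in known_bad_hashes


print(is_known_bad(b"already-reviewed abusive image bytes"))  # True
print(is_known_bad(b"a brand-new post nobody has reviewed"))  # False
```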

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:5 Re:

Not if you gave individual users moderation tools

Yes, even then, as to moderate something you first have to see it. By the time you’ve clicked ‘hide this post’ or whatever option is there you have already seen/read whatever the content was, so yes, you are saying that people should be forced to see that.

Primarily, I’m opposed to politically biased mass moderation by corporations.

And for that you would inflict horrors untold, simply because you don’t like the fact that platforms are allowed to have biases. Even if I believed that such ‘politically biased mass moderation’ actually existed (and so far the evidence has been utter garbage, if not non-existent), your ‘solution’ is worse by far.

If the poor persecuted ‘conservatives’ don’t like it that others are allowed to show them the door for their ‘conservative views’, they can make use of that handy dandy ‘free market’ they are supposedly so enamored with and create their own platform where every political view is allowed and it’s entirely on the users to moderate the content as they see fit. If their positions are really so great and desired by the public then it should have no problem knocking the likes of Facebook and Twitter off their thrones, showing them just how much people really wanted an open platform where everything is welcome and, if you don’t like it, it’s on you to hide it.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:6 Re:

Yes, even then, as to moderate something you first have to see it.

User based moderation tools would allow people to copy their moderation settings from someone else. And copying data is something that computers are pretty good at. I anticipate that over 99% of users would choose to copy from the moderation settings of others, so the vast majority of people would not see such material.

And for that you would inflict horrors untold, simply because you don’t like the fact that platforms are allowed to have biases.

I would not, because, as I mentioned above, I do not support a revocation of 230. I’m just saying that a revocation of 230 would still result in free speech platforms.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:7

User based moderation tools would allow people to copy their moderation settings from someone else.

But someone still has to see the content before they can moderate the content. You’re still asking unpaid labor to do a job given to paid labor — a job, might I reiterate, that has literally sent people into therapy.

I’m just saying that a revocation of 230 would still result in free speech platforms.

Stay on 4chan for a week and see how much you like that “free speech platform”, then get back to me with why you think every platform should be like 4chan or its even-less-moderated brethren.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:7 Re:

User based moderation tools would allow people to copy their moderation settings from someone else.

Which still requires that someone view the content in question to add it to the settings, so as Stephen pointed out above all you’re doing is shifting the burden of wading through the horrors from people who are paid to do it to some poor schmucks.

I anticipate that over 99% of users would choose to copy from the moderation settings of others, thereby the vast majority of people would not see such material.

Sucks to be the 1% who have all that dumped on them, but more importantly people already do that, it’s called the platform’s moderation and rules, where people use a platform accepting that it will decide what content is and is not allowed on it, and if they don’t like that they can go elsewhere.

I’m just saying that a revocation of 230 would still result in free speech platforms.

We already have ‘free speech platforms’, what you are objecting to is the fact that free speech isn’t shorthand for ‘consequence-free speech’ and that private platforms have the free speech rights to tell you ‘not on our platform’.

That said if you really are a big fan of ‘free speech platforms’ I would point you to my comment below, titled ‘Put up or shut up’, which lists a way you can demonstrate just how dedicated you are to the idea.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:8 Re:

I would point you to my comment below, titled ‘Put up or shut up’, which lists a way you can demonstrate just how dedicated you are to the idea.

I read that post again, and considered what I would get out of it if I did.

"Do this at least once a day(say at the first opportunity you have for free time) so you can get a good feel for the kind of content you are trying to foist on others and then come back with the same argument and I might take you seriously,"

I do not want the ability to foist any content upon anyone else. I want people to have the ability to voluntarily hear whatever speech is offered, even if it is unpopular. I don’t care if some authority doesn’t want me to read The Anarchist Cookbook, or watch a video about someone growing weed in their own backyard. If I want to view it, then I ought to be able to view it.

When you say "not on this platform", you are almost always attempting to block people who want to hear a message from hearing it. Don’t you find it a worthy endeavor to do accommodate both sides, especially with the advent of technology? Allow those who want to hear the message to get it, and for those who don’t it would get blocked?

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:9 Re:

I want people to have the ability to voluntarily hear whatever speech is offered, even if it is unpopular.

They already have that ability should they choose, as the chans, Breitbart, Alex Jones etc. exist online. What you want is a guaranteed audience, but I think if you get your way, you will find that most people flee the ‘free speech’ sites.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:9

When you say "not on this platform", you are almost always attempting to block people who want to hear a message from hearing it.

Only on that platform. If Twitter bans the Anarchist Cookbook from Twitter, you can still go find it somewhere else. You’re implying that Twitter admins shouldn’t be allowed to ban speech they don’t want on Twitter, which is such a ludicrous argument to make that I’ll bring up a hypothetical to which I’ve yet to get a straight answer.

Assume you run a small Mastodon instance with a rule that says “you can’t post racial slurs”. One day, the government rules that you must either let users post racial slurs or immediately shut down the instance, and if you fail to do either in a given timeframe (let’s say 48 hours), the government will seize your instance. Which outcome would you choose: The one that alienates every user who expected your instance to be (mostly) free of racial slurs, the one that sends all your users packing to other instances without warning (and without access to their data from your instance), or the one that sees you getting your shit pushed in by the government? Keep in mind: There are no other options.

Don’t you find it a worthy endeavor to accommodate both sides, especially with the advent of technology?

One side preaches “Black Lives Matter”. The other side preaches “White Power”. Do you really believe Twitter should be forced to host both “sides” of that “debate”?

Allow those who want to hear the message to get it, and for those who don’t it would get blocked?

I dunno, do you find it a worthy endeavour to pass a law that forces services to associate with everyone and host all legally protected speech regardless of the feelings of the owners on such matters?

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:11

Yes.

Do you really believe Twitter should be forced to host propaganda for all of the following groups and causes, even if the owners of Twitter don’t want the platform associated with them and the users of Twitter don’t want to be subjected to such speech/people?

  • “conversion ‘therapy’ ”, which is literally the physical and psychological torture of gay people committed with the intent to make them heterosexual
  • anti-vaxxers, whose mis- and disinformation campaigns can cause (and already have caused) serious health issues amongst not only fellow anti-vaxxers but society in general
  • Holocaust deniers, who believe the Holocaust’s death toll is exaggerated by literally millions of deaths
  • Sandy Hook truthers, including Alex Jones, whose campaign of odious lies about the Sandy Hook massacre has sent into hiding several parents of children who died in that massacre
  • White supremacy, which preaches the inherent supreme nature of White people (and White men in particular) and shares many of the same ideals as…
  • Nazism, which is literally the Nazi platform as “popularized” by Adolf Hitler and the National Socialists that ruled Germany until Hitler’s cowardly suicide in 1945
  • NAMBLA, the group dedicated to normalizing child rape

Remember: Saying “no” to even a single group or cause means you’re for moderation of odious speech instead of “free speech everywhere”, so think hard before you decide on an answer.

Uriel-238 (profile) says:

Re: Re: Re:12 A thing about NAMBLA

The North American Man/Boy Love Association really did die in the 1970s. It still exists as a fictitious business entity, but it’s connected only to a phone number with a message machine (that I’d wager is a cold-call point for some kind of espionage scheme rather than a connection to actual NAMBLA). So no, NAMBLA has gone the way of Recovered Memory Therapy and Satanic Ritual Abuse.

The twenty-first century version of the subject is age of consent reform, most efforts of which are things like standardizing laws that allow for normal human development. Romeo & Juliet laws, for instance, have inconsistent age-difference thresholds from county to county and often don’t apply to LGBT relationships. Also, there are a considerable number of sixteen-year-olds who want to bang twenty-somethings.

PS: Considering we’ve had a dozen or so US media pundits on the right-wing side express skepticism about our 100,000+ death count from the current epidemic, it kinda sheds light on how Holocaust Denial might be a psychological phenomenon rather than a rational argument. Some people just can’t look at facts that fly in the face of their ideological identity.

This comment has been deemed insightful by the community.
Scary Devil Monastery (profile) says:

Re: Re: Re:11 Re:

"Yes. I may disagree with your viewpoint, but I will defend to the death your right to say it."

Laudable, if that was what you were actually doing.

You’re not.

You are defending to your death the abolition of private property. Twitter and facebook, in case you didn’t get the memo, are as privately owned as someone’s living room.

You are, effectively, arguing that the bar owner should no longer get to toss out the patrons who stand up on his property and take a leak on the bar.

Put in less descriptive terms you are arguing for the nationalization of private platforms.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: Re:9 Re:

I read that post again, and considered what I would get out of it if I did.

Funny that you are willing to weigh that choice when it comes to you but you’re fine taking it away from everyone else, willing to stay away from 4chan while advocating an idea that would remove that option from others.

As for what you could get out of it? Well, you’d be taken slightly more seriously, for one, as it would show that you are willing to endure what you are insisting that others deal with; but if you’re not willing, cowardice and hypocrisy it is, I guess.

I do not want the ability to foist any content upon anyone else.

A lie repeated often does not a truth make.

Yes, you are, and no amount of empty denials will make that lie a truth. When you insist that platforms no longer moderate but dump all that on the users you are insisting that the users, not the paid moderation team, wade through the filth. Even if it’s only a handful that will then create filters for others to use, you absolutely are insisting that users have to deal with that sort of content.

In addition you are also insisting that platforms have content they might object to foisted upon them, like it or not, with penalties should they dare try to refuse to host any of it that isn’t blatantly illegal (I’m guessing you’d make that exception anyway), so they would be forced to deal with content they objected to as well, making your assertion false twice over.

I don’t care if some authority doesn’t want me to read The Anarchist Cookbook, or watch a video about someone growing weed in their own backyard. If I want to view it, then I ought to be able to view it.

So find a platform that allows that content, you don’t get to expect to be taken seriously when you insist that others must host content they don’t want to simply because you want to see it.

When you say "not on this platform", you are almost always attempting to block people who want to hear a message from hearing it.

Only on that platform, but even then, too damn bad; my front yard may make a great meeting place for the local racist losers, but I’m not going to lose any sleep if I tell them to get the hell off and find their own place to wallow in their garbage, even if doing so means that fewer racist losers get to hear what was being said for whatever reason.

Don’t you find it a worthy endeavor to accommodate both sides, especially with the advent of technology?

When one side is advocating rights for all and the other is advocating against equal rights, no. Just because there are many sides and many opinions does not mean they are all equal and deserve equal treatment, and if a platform decides that they’d rather not play host to bigots they are under no obligation to do so, nor should they be.

Scary Devil Monastery (profile) says:

Re: Re: Re:9 Re:

"I do not want the ability to foist any content upon anyone else."

You are arguing for something even worse. You are saying that the owner of private property must allow anyone to come in and start shouting.

Imagine a world where you have a party in your home. One of the guests starts shouting and being generally unpleasant. According to you, you are not allowed to show him out or tell him to shut up.

Then, like any decent troll would, he camps out in your living room. Again, according to you, despite this being your own home you are not allowed to throw him out.

THAT is what you are arguing for.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:3

Give the users the tools to choose for themselves what gets banned, and what does not.

Yeah, and considering how often people manipulate such tools (e.g., reportbombings), and how often such manipulation comes from people acting in bad faith, how do you think that shit is gonna work out in the long run in making sure a site doesn’t become another 4chan? (Hint: It won’t work.)

And even if you give those tools to the users, the fact remains that the “objectionable” content would still be on the service. The owners/operators couldn’t touch that content in any way, even with users reporting it en masse, without risking a lawsuit for content that wasn’t moderated.

Sooner or later, your proposed solution will always result in a 4chan-esque service. Nobody but people who like that kind of chaotic bullshit would use that service — and they’ll make keeping such a service alive more of a headache than people whining after the service shuts down. You can’t win this fight with a “third option” because that option will always lead to one of the other two.

That One Guy (profile) says:

Re: Re: Re:4 Re:

Sooner or later, your proposed solution will always result in a 4chan-esque service.

Worse actually, as I understand it 4chan actually does have moderation of a sort while the idea they are proposing would be to dump all of that on the users, removing even that.

‘Worse than 4chan’ is hardly something to boast about, but it’s something that any site that allowed user submissions would be faced with if they got their way.

This comment has been deemed insightful by the community.
Uriel-238 (profile) says:

Re: Re: Re:5 Worse than 4chan

Child porn is prohibited and certain types of gore and porn are restricted to certain channels and usually restricted to threads that are announced to be for that purpose.

Also porn in general is restricted to adult-friendly channels.

So, without moderation, all the child porn, furry porn, gore, war footage, racism, terror incitement and so on that you could eat. More so, since some people enjoy the toddler pastime of grossing each other out.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

Thus, platforms would need to allow all speech by default,

And lose most of their users. If social media becomes unsuitable for keeping in touch with friends and family, many people will fall back to text and email, which is not such an issue with mobile devices.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Put up or shut up

Tell you what, I’ll give you the same offer that I proposed as a hypothetical a while back: If you think that sites shouldn’t moderate and instead should leave it up to users I want you to go spend a solid week(at least) wading through 4chan’s /b board. You don’t have to open every topic but you do have to go through the entire list of topics, carefully looking at every picture(no matter what it is) posted for a second or two, along with reading all the text, from the first comment thread in the board to the last.

Do this at least once a day(say at the first opportunity you have for free time) so you can get a good feel for the kind of content you are trying to foist on others and then come back with the same argument and I might take you seriously, because until then you are doing the equivalent of advocating for the destruction of a walkway keeping you and everyone around you from plunging straight into an open cesspit as you stand on it.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

free speech online would not be wiped out by revoking 230

You’re technically correct, but in this case, that isn’t the best kind of correct.

Sure, revoking 230 wouldn’t kill speech online entirely. But it would revoke the legal liability protections for moderation. Those protections let interactive computer services moderate third-party speech without a fear of lawsuits. Put that fear back into them, and one of two things will happen: They’ll stop moderating content altogether, or they’ll shut down the service.

The outcome of the first one can be seen already. We call it 4chan, 8chan, or any other service like it. The outcome of the second one will see numerous services like Twitter, but not nearly as big, shut down to avoid liability for third-party speech. If you don’t see those two outcomes as a blow to online speech, you’re not looking far enough past your own nose to see how what you want will affect everyone that isn’t you.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:

They’ll stop moderating content altogether, or they’ll shut down the service.

Apart from the little catch 22 called FOSTA, which requires moderation to avoid legal liability. So it looks like shutdown, especially as without 230, someone whose comments are moderated will be able to find something to sue over, if they get a little backing. They do not have to win, just be part of the thousands of paper cuts that will drive a company under.

This comment has been flagged by the community.

Koby (profile) says:

Re: Re: Re:

The outcome of the first one can be seen already. We call it 4chan, 8chan, or any other service like it.

I can’t say that I’m familiar with how the moderation works on either of these two platforms, since I have never perused them. So I could be wrong on this. However, it is my understanding that neither of them offers moderation tools to the individual users. The solution to a world where 230 is revoked is to provide moderator tools to each individual, to alleviate your concerns.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

"everyone should know that free speech online would not be wiped out by revoking 230"
everyone should know this?

"Online platforms would simply be unable to themselves moderate "
and this is a good thing somehow?

"Thus, platforms would need to allow all speech by default"(
Even that which advocates all sorts of things you do not like, and you are cool with that? ….. I doubt it. You sound like a person that wants a megaphone … the only megaphone.

"Let the people decide what to ban, and what not to ban."
Are you one of those who rail against the TD mod system?

Anonymous Coward says:

Re: Re: Re: Re:

I believe there is a bias against conservatives, or more accurately, against assholes who can’t keep their bigoted views to themselves. These are, of course, equal.

That bias is enabled by the 1st Amendment, not 230. 230 just clarifies that point and that platforms that engage in moderation are not suddenly responsible for anything they don’t moderate.

Anonymous Coward says:

Re: Re: Re:2 Re:

"I believe there is a bias against conservatives"

I believe there is bias against everybody all the time. It is not all the same all the time as each individual will have differing levels of bias in different categories. But let’s play like it is all the same thing and only one type of person is affected because they are more important – yeah?

This comment has been deemed insightful by the community.
cpt kangarooski says:

Re: Re:

User moderation doesn’t work. It exposes sites to total liability. We know because in the Stratton Oakmont case, Prodigy had a team of user moderators. The key language eviscerating your not-well-thought-out idea:

PRODIGY’s conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice.
Gaining those benefits by any means — employee moderators, software (also used by Prodigy), or user moderators — creates the legal risk.

This comment has been deemed insightful by the community.
Scary Devil Monastery (profile) says:

Re: Re:

"…everyone should know that free speech online would not be wiped out by revoking 230."

Still being disingenuous, Koby?

What would be wiped out is property rights. Because if 230 is revoked online then you are, essentially, giving up the ability to tell a patron on your property they are no longer welcome.

"Thus, platforms would need to allow all speech by default, and then put the moderation tools into the hands of the users themselves."

Great idea if we’re talking about a public platform.
But we aren’t. We are talking about a privately owned platform.

Last I checked both sides of the aisle heavily opposed the phenomenon of nationalization. What changed?

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

Online platforms would simply be unable to themselves moderate an interactive computer service without assuming liability.

Every single time I read something like this it triggers the same sentence in my head: Would you advocate for the same approach when it comes to email?

I’d challenge you to turn off any spam filtering options for a week to see what that looks like. Because spam filtering is certainly moderation.

Anonymous Coward says:

Another constitutional scholar presidential candidate.

Wonderful to know that neither major party presidential candidate in the 2020 election knows, understands, or supports the Constitution of the United States – other than lip service. It is… disappointing.

I suppose you get the government that you put the effort into making. As an American people, this is our effort and our result. Again, disappointing.

Anonymous Coward says:

230 Is Doomed

The sad part is that even though Trump’s EO is all theater, it has painted a target on 230’s back. It’s pretty much doomed, according to Eric Goldman, because remember: the EARN IT Act is still there waiting, and this will likely be the environment in which it’ll be passed.

So no matter who wins this election the internet is about to get a whole lot smaller.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: 230 Is Doomed

I disagree, but it is all conjecture at this point isn’t it.

Without 230, who in their right mind would allow comments on their assets?

Those who wanted this may be a bit disappointed when they can not find any blogs to spam.

Anonymous Coward says:

Re: Re: Re:2 230 Is Doomed

Thing is, it seems the bill may not come up for a vote or even pass for a while, since Congress is preoccupied with the coronavirus, so it’s not likely to pass before the election and this is unlikely to be the environment in which it’ll be passed.

Saying 230 is doomed is fearmongering at best, or trying to undermine groups like the EFF who are fighting to protect 230 at worst.

Anonymous Coward says:

Re: Re: Re:3 230 Is Doomed

Fearmongering may not be the best word to use, because it is a tough fight to protect it, but like I said, many are fighting to protect 230, so saying 230 is doomed and the internet is about to get a whole lot smaller is not going to help them win the fight.

Best thing to do is work to protect Section 230.

Also, here are all the actions, and it seems it may not be moving anytime soon: https://www.congress.gov/bill/116th-congress/senate-bill/3398/all-actions-without-amendments?KWICView=false

Anonymous Coward says:

Re: 230 Is Doomed

It’s not doomed, and I don’t think Eric Goldman is right; it seems the bill may not come up for a vote or even pass for a while, since Congress is preoccupied with the coronavirus, so it’s not likely to pass before the election and this is unlikely to be the environment in which it’ll be passed.

So NO, the internet is not about to get a whole lot smaller, because Joe Biden is likely to backtrack on this.

Upstream (profile) says:

I know this is repetitive, but . . .

this article cries out for it:

There is currently an alternative to the two existing government factions, and her name is Jo Jorgensen. A better option than either of the two same old, same old.

There is no need for me to speak against the two government candidates. They do that most effectively themselves, as TD consistently points out.

Anonymous Coward says:

Double indemnity

I’m surprised I haven’t seen anyone bring up one of the classic responses to unwanted liability – indemnity. If platforms could get sued over what their users post, one of the nastier things they could ask for would be indemnity for their legal expenses. I suspect a lot of people would shrug and agree, when that could be a huge bill. They’d have trouble collecting (the press would be pretty awful), but even if they don’t enforce it, I could see their lawyers wanting to put that in there if 230 goes away.

Anonymous Coward says:

Re: Double indemnity

I’m surprised I haven’t seen anyone bring up one of the classic responses to unwanted liability – indemnity.

Indemnity is only useful if you will get the money, otherwise you are just increasing your legal bills by having to pay lawyers to try and help you recover the indemnity.

Anonymous Coward says:

Re: Re:

Nope. For all his immature posturing, the president is the Executive branch, not the Legislative. He can pressure Congress to act but it’s still up to the representatives to make any changes. The president is effectively powerless wrt laws which is exactly what the framers of the Constitution had in mind.

ECA (profile) says:

Fact checking, fact checks?

Just my understanding of Facts?
How do you do it?

Opinions are opinions, and they will say it’s an opinion. Even if you can show that it matches NO factual data.
Might as well debate how many, and the names of all the gods.
His comments alone, even with many from others in Congress, demand that we fire all of them and find others with more REALITY/WORK HISTORY/LIFE SKILLS… anything except lawyers and judges that CAN’T get a job in the real world.

Petitions. Get them started.

Anonymous Coward says:

Take Techdirt as an example: I presume it blocks spam, or maybe people posting random links to dubious extreme websites, or people who post links to websites like The Pirate Bay.
Section 230 works: it allows users to post comments, and it protects the moderator or website owner from getting sued for user comments. If Section 230 is banned, then most websites will ban user comments and stop users posting links to articles on other websites. It will be a massive reduction in freedom of speech. A website is not a newspaper; it has almost unlimited space for articles, user comments and links.
If Biden wants to be elected he must provide an alternative to Trump’s attack on free speech and tech companies’ ability to moderate content. Asking for Section 230 to be banned is exactly the wrong thing to do at this time. Biden seems in general to be against tech companies, and he seems to have no one to give him advice on policy regarding how to handle tech companies and issues around moderating content on the web.

Anonymous Coward says:

Re: Re:

It will be a massive reduction on freedom of speech

And this is precisely why all this blustering over 230 will come to nothing or, if our government again passes an unconstitutional law, it will be torn down in the near future by the Supreme Court.

I’m not too worried about it.

bobob says:

Biden is probably the only one of the democratic candidates that could lose to trump. How he has managed to stay in office for so long, much less be the presumptive democratic candidate in the presidential election, is baffling. If he does manage to win the election, it won’t be because of anything he’s done to try to win it. If he wins, it will be because trump is so bad that a dumpster could win.
