Republicans Blame CDA 230 For Letting Platforms Censor Too Much; Democrats Blame CDA 230 For Platforms Not Censoring Enough

from the which-is-it? dept

It certainly appears that politicians on both sides of the political aisle have decided that if they can agree on one thing, it's that social media companies are bad, that they're bad because of Section 230, and that this needs to change. The problem, of course, is that beyond that point of agreement, they disagree entirely on the reasons why. On the Republican side, you have people like Rep. Louie Gohmert and Senator Ted Cruz, who are upset about platforms using Section 230's protections to moderate content that those platforms find objectionable. Cruz and Gohmert want to amend CDA 230 to say that's not allowed.

Meanwhile, on the Democratic side, we've seen Nancy Pelosi attack CDA 230, incorrectly saying that it's somehow a "gift" to the tech industry because it allows them not to moderate content. Pelosi's big complaint is that the platforms aren't censoring enough, and she blames 230 for that, while the Republicans are saying the platforms are censoring too much -- and incredibly, both are saying this is the fault of CDA 230.

Now another powerful Democrat, Rep. Frank Pallone, the chair of the House Energy and Commerce Committee (which has some level of "oversight" over the internet), has sided with Pelosi in attacking CDA 230, arguing that companies are using it "as a shield" to avoid removing things like the doctored video of Pelosi.

But, of course, the contrasting (and contradictory) positions of these grandstanding politicians on both sides of the aisle should -- by themselves -- demonstrate why mucking with Section 230 is so dangerous. The whole point and value of Section 230 is in how it crafted the incentive structure. Again, it's important to read both parts of subsection (c) of Section 230, because the two elements work together to deal with both of the issues described above.

(c) Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

It's these two elements together that make Section 230 so powerful. The first says that we don't blame the platform for the actions or content of its users. This should be fairly straightforward: it's about properly applying liability to the party who actually violated the law, rather than to the tools and services they used to violate it. Some people want to change this, but much of that push comes from lawyers who just want bigger pockets to sue. It involves what I've referred to as "Steve Dallas lawsuits," after the character in the classic comic strip Bloom County who explains why you should always focus on suing those with the deepest pockets, no matter how tangential they are to the actual lawbreaking.

But part (2) of the law is also important. It's the part that actually allows platforms to moderate. Section 230 was an explicit response to the ruling in Stratton Oakmont v. Prodigy, in which a NY state judge held that because Prodigy wanted to provide a "family friendly" service, and therefore moderated out content it found objectionable in support of that goal, it automatically became liable for any of the content that was left up. But, of course, that's crazy. The end result of such a rule would be that platforms either wouldn't do anything to moderate content -- meaning everything would be a total free-for-all, you couldn't have a "family friendly" forum at all, and everything would quickly fill up with spam, porn, harassment, and abuse -- or platforms would restrict almost everything, creating a totally anodyne and boring existence.

The genius of Section 230 is that it enabled a balance that allowed for experimentation, including the ability to experiment with different forms of moderation. Everyone focuses on Facebook, YouTube and Twitter -- which all take moderately different approaches -- but Section 230 is also what allowed for the radically different approaches taken by other sites, like Wikipedia and Reddit (and even us at Techdirt). These sites use very different approaches, some of which work better than others, and much of what works is community-dependent. It's that experimentation that is good.

But the very fact that both sides of the political aisle are attacking CDA 230 for completely opposite reasons really should highlight why messing with it would be such a disaster. If Congress moves the law in the direction that Gohmert and Cruz want, you'd likely get far fewer platforms; some would be overrun by messes, while others would be locked down and barely usable. If Congress moves the law in the direction that Pelosi and Pallone seem to want, you would end up with effectively the same result: much greater censorship, as companies try to avoid liability.

Neither solution is a good one, and neither would truly satisfy the critics in the first place. That's part of the reason this debate is so silly. Everyone's mad at these platforms for how they moderate, but what they're really mad at is humanity. Sometimes people say mean and awful things. Or they spread disinformation. Or defamation. Those are real concerns. But there need to be better ways of dealing with them than Congress stepping in (against the restriction put on it by the 1st Amendment) and declaring that internet platforms must either police humanity... or stop policing humanity altogether. Neither is a solution to the problems of humanity.

Filed Under: cda 230, censorship, content moderation, frank pallone, louie gohmert, nancy pelosi, section 230, ted cruz


Reader Comments


  1. Anonymous Anonymous Coward (profile), 10 Jun 2019 @ 7:53am

    They're both wrong

    Independents blame CDA 230 for causing Republicans and Democrats.


  2. Anonymous Anonymous Coward (profile), 10 Jun 2019 @ 8:01am

    Seriously though

    Wouldn't either move be a 1st Amendment violation? After all, Congress shall make no law is not just prominent, but the first words of the Amendment. Either proposal would be engaging in 'prohibiting the free exercise'. If platforms do it, it is not the government. If the government tells platforms what to do, it is government.

    "U.S. Constitution - Amendment 1"

    "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."


  3. Mason Wheeler (profile), 10 Jun 2019 @ 8:23am

    Either proposal would be engaging in 'prohibiting the free exercise'.

    Umm... huh? The only place that those words appear in the First Amendment, it's immediately followed by the word "thereof", making it clear that it refers to the thing that was discussed immediately prior, which is religion, not speech.

    (You're not wrong about this being a likely violation of the First Amendment; only about how it applies here.)


  4. Thad (profile), 10 Jun 2019 @ 9:33am

    Re: Seriously though

    I don't know what amending 230 would do, but repealing it entirely would simply move the status quo back to what it was before the CDA was passed -- viz, the Stratton Oakmont v. Prodigy decision that if platforms moderate content, they're liable for whatever content they don't remove.

    This would be disastrous, for reasons Techdirt has covered repeatedly and at length. There would be legal challenges, there would be lobbying, there would be an awful lot of frivolous suits, and most US sites would either shut down comments entirely or not moderate them at all (including spam filters).

    As for constitutional challenges? Maybe. The Prodigy case wasn't appealed to the Supreme Court, so there's always the possibility that a new challenge could make it up to SCOTUS and the precedent could be reversed. But that would take years.


  5. xebikr (profile), 10 Jun 2019 @ 9:59am

    Sounds like a win

    The fact that both parties are unhappy with it, for opposite reasons, makes it appear to be the rare good law, IMHO.


  6. AricTheRed, 10 Jun 2019 @ 10:04am

    TechDirt Et al., you owe me!

    1.) Read the title of article.

    2.) Face-Palmed so hard I lost a crown!

    3.) Mike, It is clearly your fault for publishing "Absolute Idiocy"

    4.) CDA 230 should not protect you from My Palm.


  7. Anonymous Coward, 10 Jun 2019 @ 10:06am

    And these clashing views from both parties will definitely slow down congress from messing with S230 in my opinion.


  8. Anonymous Coward, 10 Jun 2019 @ 10:11am

    Re: Seriously though

    The first amendment isn't an absolute right. If it were, then child porn wouldn't be illegal.


  9. This comment has been flagged by the community.
    The Real Dick Ottomy, 10 Jun 2019 @ 10:16am

    False Dichotomy. Too little moderation AND target conservatives

    This is Masnick's usual attempt to position Un-Constitutional Section 230 as "opposed by both, therefore must be good".

    But in fact, both complaints are true.

    Masnick also wants Section 230 to provide corporations with absolute immunity AND government-conferred authority to control all speech.

    That's just his wishes for corporations. It's not the law.

    "Good Samaritans" must be GOOD. Inarguable. It's right there, black letter law.

    Masnick's duplicity on this is shown by the fact that, when arguing with me, he simply DELETED the "in good faith" requirement! -- And then blows it off as not important:

    https://www.techdirt.com/articles/20190201/00025041506/us-newspapers-now-salivating-over-bringing-google-snippet-tax-stateside.shtml#c530

    Now, WHERE did Masnick get that exact text other than by himself manually deleting characters? -- Go ahead. Search teh internets with his precious Google to find that exact phrase. I'll wait. ... It appears nowhere else, which means that Masnick deliberately falsified the very law under discussion. Probably because trying to keep me from pointing out that for Section 230 to be valid defense of hosts, they must act "in good faith" (to The Public), NOT as partisans discriminating against those they believe are foes.


  10. Anonymous Coward, 10 Jun 2019 @ 10:19am

    Re: Re: Seriously though

    Child porn is illegal due to the harms it causes to a disadvantaged group, i.e. children. This isn't a free speech issue at all. Your comment is a nice strawman but it burned far too quickly to be viable.


  11. This comment has been flagged by the community.
    The Real Dick Ottomy, 10 Jun 2019 @ 10:22am

    Re: False Dichotomy. Too little moderation AND target conservatives

    Forgot to point out that mere statute CANNOT empower any entity to violate Constitutional Rights. Section 230 is therefore null and void. -- Yes, no matter how often used in cases to get immunity (usually rightly), it STILL cannot empower corporations on the "material is constitutionally protected" point.

    That's the actual crux of argument. Masnick tries to buttress the censorship with non-controversial parts. -- Because there's BIG money in being able to control all speech. If corporations are able to shunt opposition into tiny outlets, they automatically win.


  12. Anonymous Coward, 10 Jun 2019 @ 10:22am

    Re: False Dichotomy. Too little moderation AND target conservatives

    You're an idiot. Please stop talking and leave the grownups to their discussion.


  13. peter, 10 Jun 2019 @ 10:27am

    On balance?

    "Republicans Blame CDA 230 For Letting Platforms Censor Too Much; Democrats Blame CDA 230 For Platforms Not Censoring Enough"

    If both parties are annoyed, I would say CDA230 got the balance just about right.


  14. Thad (profile), 10 Jun 2019 @ 10:44am

    Re: Re: Seriously though


  15. Anonymous Coward, 10 Jun 2019 @ 10:45am

    Re: False Dichotomy. Too little moderation AND target conservati

    No 230, no comments sections, and no censorship either.


  16. Thad (profile), 10 Jun 2019 @ 10:47am

    Re: Sounds like a win

    This time.

    As Techdirt reminded us last week, sometimes when a solution makes every side equally unhappy that's just because it's a shitty solution.


  17. Anonymous Coward, 10 Jun 2019 @ 11:01am

    Also in 230:

    (3)
    The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.

    Was that a hint of the FCC's abolished Fairness Doctrine? While all the big platforms certainly started out with minimum censorship, the vice keeps tightening ever so slowly. Will Conservatives and other Wrongthinkers be boiled alive like the proverbial frog, or will an online equivalent of Fox News emerge that splits social media in much the same way as cable news, along political, cultural, and ideological lines?


  18. xebikr (profile), 10 Jun 2019 @ 11:11am

    Re: Re: Sounds like a win

    Yep. YouTube messed that up. But I'm much more okay with YouTube making bad decisions while trying to please everyone, than YouTube making the same (or opposite) decisions because the law tells them to.


  19. Anonymous Coward, 10 Jun 2019 @ 11:46am

    Re:

    Better question: Why should the law force websites to host certain kinds of speech?


  20. Thad (profile), 10 Jun 2019 @ 11:46am

    Re:

    Was that a hint of the FCC's abolished Fairness Doctrine?

    I read it as more of a riff on the "marketplace of ideas".

    Remember, Section 230 was a direct reaction to Stratton Oakmont v Prodigy, a decision which held that because Prodigy moderated content, it was legally liable for content it didn't remove.

    (I was a Prodigy kid. I can assure you that Prodigy moderated content aggressively.)

    230 was explicitly built on the premise that platforms can moderate content as they see fit.


  21. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 11:47am

    It's these two elements together that make Section 230 so powerful. The first says that we don't blame the platform for any of the actions/content posted by users.

    The harm inflicted by a platform in amplifying/spreading defamation is separate from the harm inflicted by the user. Every country in the world EXCEPT the US recognizes this, and even the US did with distributor liability.

    Section 230 allows people to weaponize search engines, and if IP addresses don't prove authorship, or people use a "burner" IP that can't be traced (or are judgment-proof or posting from another country), the target of defamation is defenseless.

    I'm sure if someone ever used an untraceable IP address to post reviews of pro-230 lawyers and claim that they sexually abused children or female clients, the lawyers would scream bloody murder and their pro-230 position might change. Of course I'm not recommending anyone DO this but instead just demonstrating the potential for harm.

    The other problem is that people believe what they read online, then repeat it in their own words without linking to the original post and that makes them a publisher and liable for being dumb enough to believe and repeat what they read. Sometimes the defamation is on a questionable site (like a white-supremacist or anti-Semitic site) so they can't quote it but by not quoting it they become liable.

    Someone who wanted to game the system could easily have defamation about themselves planted online, go around arguing with people, wait for the people they argue with to Google them, then let nature take its course and sue those people for libel once they repeat what they've read (they can plant the defamation on a site the pawn wouldn't want to link to for maximum impact).

    Employers who believe defamation and deny someone a job because of it should be sued into bankruptcy.

    Section 230 is fatally flawed.


  22. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 11:48am

    Re: Re:

    The presumption was that such moderation would be politically neutral, especially by a platform with global influence.


  23. Anonymous Coward, 10 Jun 2019 @ 11:50am

    Re: Re: Seriously though

    Notice-and-takedown would work just fine. It's what we've had offline for centuries.

    Libel laws were designed to replace DUELING.


  24. Anonymous Coward, 10 Jun 2019 @ 12:05pm

    Re: Re: Re: Seriously though

    How "fine" does it work for copyright? Because I've read lots of stories about false or faulty takedowns.


  25. Thad (profile), 10 Jun 2019 @ 12:05pm

    Re: Re: Re: Seriously though

    Did you quit posting as John Smith just to get around my filter? I know that's why Blue started changing his name.

    On the one hand, there's something weirdly flattering about that. On the other, that's some kinda creepy stalker shit, Johnny.

    But then, when has "that's some kinda creepy stalker shit" ever stopped you before?


  26. Anonymous Coward, 10 Jun 2019 @ 12:07pm

    Re: Re: Re:

    Why should a platform be forced by law to host any kind of content?


  27. Anonymous Coward, 10 Jun 2019 @ 12:09pm

    Re:

    Are you naturally this stupid, or did you intentionally give yourself brain damage?


  28. Anonymous Coward, 10 Jun 2019 @ 12:18pm

    Re: Re: Re:

    Have you considered that it is politically neutral and if there is any imbalance between bans of liberals and conservatives it is due to the greater tendency of one of those to violate terms of service or basic human dignity?


  29. Anonymous Coward, 10 Jun 2019 @ 12:19pm

    Re: Re:

    Jhon is naturally that stupid yet exacerbated the problem by headbutting moving cars as a child.


  30. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 12:45pm

    Re: Re: Re: Re:

    The terms of service are biased and subjectively enforced.

    The public tolerates this by NOT boycotting companies who sponsor it, however, so the marketplace has spoken.

    If the public demanded that USENET rules apply or they won't buy anything advertised on the site, that's what we'd have, or USENET itself would still be populated more heavily.


  31. Anonymous Coward, 10 Jun 2019 @ 12:45pm

    Re: Re:

    Those who debate with ad-hominems like you are using are the ones generally thought stupid.


  32. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 12:46pm

    Re: Re: Re: Re: Seriously though

    Wow, you're impressed with yourself!


  33. Anonymous Coward, 10 Jun 2019 @ 12:52pm

    I blame CDA 230 for walking down an alley drunk, and dressed like that, in that part of town.


  34. Anonymous Coward, 10 Jun 2019 @ 12:55pm

    Re: Re: Re:

    The presumption was that such moderation would be politically neutral

    Citation needed.

    However, should I be similarly questioned for a citation, I offer this transcript of the Congressional Record when the amendment was read and several speakers commented - and none of them spoke of a political neutrality requirement for the immunity conferred upon service providers. The relevant section starts with "amendment offered by mr. cox of California"


  35. Stephen T. Stone (profile), 10 Jun 2019 @ 1:07pm

    The terms of service are biased and subjectively enforced.

    So what?


  36. Stephen T. Stone (profile), 10 Jun 2019 @ 1:09pm

    And yet, here you are, expressing an idea about “weaponizing” defamation and Section 230 that has not, and will never, become a reality. Sounds like that “ad hom” has more truth to it than you care to admit.


  37. ECA (profile), 10 Jun 2019 @ 1:29pm

    DEAR INTERNET..

    Its TIMe to declare yourself an independent nation..
    Start making deals/trade agreements with every nation.
    Start Charging for access to YOUR GOODS..(Charge the ISP's for giving access to your customers)(Cable/sat does it, why not you)


  38. Gwiz (profile), 10 Jun 2019 @ 1:39pm

    Re:

    Section 230 allows people to weaponize search engines, and if IP addresses don't prove authorship, or people use a "burner" IP that can't be traced (or are judgment-proof or posting from another country), the target of defamation is defenseless.

    So what? The right of anonymous speech has existed since before the Constitution of the United States and has been consistently upheld by our courts as a First Amendment right. The one difference between the right to anonymity and other 1A rights is the fact that once you give it up (or it's taken from you) you can never reclaim it. Once again to paraphrase Blackstone: "I'd rather 100 defamation cases go unpunished than have one person's right to anonymity stripped from them."

    The other problem is that people believe what they read online...

    Yes, some people are gullible and stupid, but that doesn't mean I have to give up my rights because of their shortcomings.

    Section 230 is fatally flawed.

    No, it's not. It's because of Section 230 that we are able to have this discussion in this comment section in the first place because this comment section wouldn't exist without it.

    Personally, I made the decision to keep my personal/professional identity and my online identity separate back in the late 90's and have never regretted it.


  39. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 1:58pm

    Re: Re:

    Personally, I made the decision to keep my personal/professional identity and my online identity separate back in the late 90's and have never regretted it.

    You mean to ATTEMPT to keep them separate.


  40. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 1:59pm

    Re: Re:

    The other problem is that people believe what they read online...
    Yes, some people are gullible and stupid, but that doesn't mean I have to give up my rights because of their shortcomings.

    Exactly, and these people are sitting ducks for being manipulated into being sued by those who would use them as pawns.

    Some 4chan idiot wants to poison my coworker against me, the coworker grabs the bait, winds up sued, and then blames ME. Pathetic.


  41. Anonymous Coward, 10 Jun 2019 @ 2:07pm

    Re: Re: Re:

    If these gullible people are being duped into believing something untrue about you by some bad actor, then aren't they victims of the bad actor as well as you?

    Blaming you for suing them isn't that unreasonable - after all, you are choosing to sue them.


  42. Anonymous Coward, 10 Jun 2019 @ 2:08pm

    Re: Re: Re: Re:

    They mentioned "good faith" moderation.

    Either way, the law doesn't have to require neutrality for Congress to impose the condition on a law that circumvents two centuries of precedent in this country, and runs counter to all other countries.


  43. This comment has been flagged by the community.
    Anonymous Coward, 10 Jun 2019 @ 2:09pm

    Re:

    Bias in structure and enforcement means it's censorship.

    The public doesn't care though so it's really a nonstarter for those who would like more fairness. USENET still works for that diminishing crowd.


  44. Anonymous Coward, 10 Jun 2019 @ 2:10pm

    Re: Re: Re: Re:

    Why should a platform be forced by law to host any kind of content?

    If they are a state actor who should be treated as a common carrier, they should be.

    I support a law that treats federal contractors as common carriers, meaning if they want to censor, don't do it while playing with federal money.


  45. Glenn, 10 Jun 2019 @ 2:22pm

    Politicians apparently see the Constitution and the Bill of Rights as a hindrance to good government.


  46. That One Guy (profile), 10 Jun 2019 @ 2:45pm

    Re:

    Politicians and large sections of the government apparently see the Constitution and the Bill of Rights as a hindrance to good government.

    Fixed for unfortunate accuracy.


  47. That One Guy (profile), 10 Jun 2019 @ 2:59pm

    Be careful what you wish for...

    The 'funny' part of this is that if they do get their way neither side is going to be happy with the result.

    Reinstate a penalty for moderation and sites are likely to go one of two ways, either moderating nothing, which will anger the idiots who think they're not doing enough, or moderating heavily anything that even might be objectionable, angering the idiots who think that sites are already moderating too much.

    In their rush to throw tantrums because social media platforms aren't doing what they want them to, they've completely missed that, even if they win, social media platforms still won't do what they want them to.


  48. Anonymous Coward, 10 Jun 2019 @ 3:03pm

    Re: Re: Re: Re: Re:

    They mentioned "good faith" moderation.

    Yes, they did. But that's got nothing to do with political neutrality. The commenters were quite clear that they didn't want providers to do nothing, but they also didn't want them to face liability if they did something. There's also a few bits in there about how there really isn't a definition for what should or shouldn't be removed, and it's probably not a good idea for the government to create a definition.

    Now, if you want to argue that a particular platform's moderation is not being done in good faith, you could certainly do that - I've not heard of any CDA230-related cases that make that argument, so it might be an interesting angle to take but also would probably be much harder to convince a judge that it was actually happening.

    the law doesn't have to require neutrality for Congress to impose the condition

    The law (CDA230) doesn't have to require neutrality for Congress to impose the condition (a neutrality requirement) on a law (CDA 230)....?

    Um, what?

    circumvents two centuries of precedent in this country, and runs counter to all other countries.

    So? Just because it was done that way before, or is done that way elsewhere, doesn't mean it's the right way to do it.


  49. Stephen T. Stone (profile), 10 Jun 2019 @ 3:04pm

    these people are sitting ducks for being manipulated into being sued by those who would use them as pawns

    If you are so sure this plan of yours would work and is obvious to anyone with half a brain, please show us one instance of it working. At all. On any level.


  50. Stephen T. Stone (profile), 10 Jun 2019 @ 3:06pm

    the law doesn't have to require neutrality for Congress to impose the condition

    Any imposition of neutrality would constitute a breach of the First Amendment. Until the Supreme Court says otherwise, corporations have the right of association — and that includes the right to avoid association with certain people/kinds of speech. Imposing content neutrality on social interaction networks would violate that right.


  51. Anonymous Coward, 10 Jun 2019 @ 3:07pm

    Re: Re: Re: Seriously though

    It would work fine for scammers, dodgy businesses, etc., but not for the public, who benefit from bad reviews warning them of possible problems.


  52. Stephen T. Stone (profile), 10 Jun 2019 @ 3:08pm

    Bias in structure and enforcement means it's censorship.

    Ah, yes, the cries of “censorship”. How does YouTube booting some Nazi fuckboi prevent them from expressing themselves in any way, again?


  53. Stephen T. Stone (profile), 10 Jun 2019 @ 3:12pm

    If they are a state actor who should be treated as a common carrier, they should be.

    Please explain the reasoning behind your belief that any website that hosts user-generated content should be classified as a government-controlled “common carrier” and thus forced to carry any content it otherwise would not host.

    FurAffinity, a site wholly dedicated to UGC, chooses not to host certain kinds of artwork. For what reason should the government have a right to make FurAffinity do otherwise?

  54. identicon
    Anonymous Coward, 10 Jun 2019 @ 3:36pm

    Re:

    If "any website" is a state actor then it's hard to disagree with the prior poster. The government may not censor speech. FurAffinity is not a state actor and is exempt from that restriction.

  55. identicon
    Anonymous Coward, 10 Jun 2019 @ 3:40pm

    Re: Re:

    There is no law, not even 230, that requires fairness in protected speech. Moderation has been recognized as a form of speech. So long as that moderation isn't performed by a state actor there is no violation of any law, rule, restriction or anything else. The government can't even intervene here and kill 230, a rule protecting free speech, without running afoul of the constitution.

    The world is unfair. You'll get used to it eventually. This is particularly funny since it was right wingers who first called left-wingers "snowflakes" but look how the right wing melts when their "unique and beautiful" views aren't well accepted.

  56. identicon
    Anonymous Coward, 10 Jun 2019 @ 3:43pm

    Re: Re:

    Which platforms involved in this issue are state actors?

  57. identicon
    cpt kangarooski, 10 Jun 2019 @ 3:56pm

    I’m inclined to agree with the Democrats here, to an extent. While sites absolutely don’t have to delete content, and should not be compelled to, that doesn’t mean that sites shouldn’t want to. Certainly, were I running a site that allowed user content to be posted, I would be greatly concerned about not providing any assistance to harmful speech: not providing a platform for it or for the people who engage in it, not offering them connections to my users, and not doing business with businesses that tolerate it. Let malicious users go elsewhere to exercise their right of free speech.

  58. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 4:08pm

    Now explain how Twitter, Facebook, and YouTube do not have the same exemption.

  59. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:16pm

    Ebony and Ivory, e.g., Louis Gohmert and Nancy Pelosi

    When the keys themselves are out of tune, color matters not - no harmony is forthcoming.

  60. icon
    That One Guy (profile), 10 Jun 2019 @ 4:25pm

    Re:

    While sites absolutely don’t have to delete content, and should not be compelled to, that doesn’t mean that sites shouldn’t want to.

    If that was as far as it went, most people on TD would likely agree with you (with the discussion then shifting to what should be removed, how it would be done, how to minimize collateral damage...); it's when 'should' shifts to 'should be required to' that the problems and objections crop up.

  61. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:34pm

    Re: Re: Seriously though

    Do you go out of your way just to post things that are clearly ridiculous?

  62. icon
    That One Guy (profile), 10 Jun 2019 @ 4:34pm

    Money on the table

    If Techdirt were to sell replacement sarcasm detectors and padded headbands to reduce facepalm-related head trauma, it would quickly find itself absolutely swimming in cash, given how often readers of the site need both.

  63. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:35pm

    Re: Re: Re:

    None of them

  64. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 10 Jun 2019 @ 4:36pm

    Re:

    these people are sitting ducks for being manipulated into being sued by those who would use them as pawns
    If you are so sure this plan of yours would work and is obvious to anyone with half a brain, please show us one instance of it working. At all. On any level.

    You mean name names and put targets on people's backs. Not necessary.

    I did cite a case where "reiterating" content was a key element in proving one was a publisher rather than a distributor, and that should be sufficient.

    We know that if someone posts a defamatory statement without attribution, they are a publisher and not a distributor.

    We also know that there are people who will repeat what they find in Google without bothering to link to the original source, which makes them a publisher.

    One need not jump off a building to know that doing so is likely to cause death. The demand for specifics is therefore more indicative of a desire to target the people named.

  65. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:37pm

    Re: Re:

    Do you get upset when a restaurant tells you to leave because you have no shirt and no shoes?

  66. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 10 Jun 2019 @ 4:40pm

    Re: Re: Re: Re:

    If these gullible people are being duped into believing something untrue about you by some bad actor, then aren't they victims of the bad actor as well as you?

    Yes they are. People are predisposed to believe the worst about those with whom they disagree. Experienced internet users know how to manipulate this predisposition to turn these people into unwitting pawns.

    Blaming you for suing them isn't that unreasonable - after all, you are choosing to sue them.

    Yes, in that situation I would be choosing to defend my rights, and the lawsuit would be caused by the pawn's willingness to believe something defamatory about someone they don't like written by someone they never met. In fact, trying to warn them of this will often just empower them to make even more defamatory statements.

    Smart people won't fall into this trap, but not everyone is smart. The trap is not set by the plaintiff, who was simply targeted by those who didn't like him or her, but was set by instigators who cannot be located but who write very serious-sounding posts designed to induce third parties to grind their axe.

    Perhaps if enough people fall into this trap, or the wrong person does, it will be prevented, but we're not there yet. While it's not Section 230's "fault", the law definitely makes it possible, and without 230 it wouldn't happen because ISPs wouldn't let themselves be tricked, though I'd imagine if some admin didn't like a poster they might go out on a limb and get sued.

  67. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:41pm

    Re:

    And yet, here you are, expressing an idea about “weaponizing” defamation and Section 230 that has not, and will never, become a reality. Sounds like that “ad hom” has more truth to it than you care to admit.

    There are entire forums and websites devoted to weaponizing 230 by posting content for the explicit purpose of defaming people and having that defamation turn up when one's name is searched.

    The "don't date that guy" type of site is one (no, I've never been named on one).

  68. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:41pm

    Re:

    So you don't think Ripoff Report exploits Section 230 or relies on sites like Google to spread their words?

  69. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 4:42pm

    I did cite a case

    Bullshit. I haven’t seen you post one link to a court case where someone used (or attempted to use) your plan as you described it.

  70. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:43pm

    Re:

    Any imposition of neutrality would constitute a breach of the First Amendment.

    You mean the way we treat the phone company or the USPS violates the First Amendment? Is common carrier status unconstitutional now?

    Until the Supreme Court says otherwise, corporations have the right of association — and that includes the right to avoid association with certain people/kinds of speech. Imposing content neutrality on social interaction networks would violate that right.

    Congress has the right to pass a law which says that internet sites that host UGC and are federal contractors shall be treated as common carriers.

  71. identicon
    Anonymous Coward, 10 Jun 2019 @ 4:44pm

    Re: Re: Re: Re: Re: Re:

    The law (CDA230) doesn't have to require neutrality for Congress to impose the condition (a neutrality requirement) on a law (CDA 230)....?
    Um, what?

    Congress can decide to change 230 to require neutrality.

  72. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 4:46pm

    There are entire forums and websites devoted to weaponizing 230 by posting content for the explicit purpose of defaming people and having that defamation turn up when one's name is searched.

    The people being (allegedly) defamed can sue over the content. They can have it declared defamatory and ask for its removal via court order. The existence of CDA 230 doesn’t prevent either action.

  73. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 4:46pm

    So you don't think Ripoff Report exploits Section 230 or relies on sites like Google to spread their words?

    No more than you exploit CDA 230 to post your bullshit, but go off, I guess.

  74. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 4:49pm

    Then Congress can get its collective ass smacked down by the Supreme Court. Requiring a privately-owned platform to host content it would otherwise not host is a gross violation of the First Amendment. For what reason should YouTube be required by law to host, say, White supremacist propaganda?

  75. identicon
    Anonymous Coward, 10 Jun 2019 @ 5:51pm

    Re: Re: Re: Re: Seriously though

    And determining whether something is libelous is significantly more difficult than determining whether something is copyright infringement.

  76. identicon
    Anonymous Coward, 10 Jun 2019 @ 5:52pm

    Re: Re: Re: Re: Seriously though

    For someone who doesn't know either the writer of the allegedly defamatory post or its subject, determining whether something is defamatory is not going to be easy.

  77. identicon
    Anonymous Coward, 10 Jun 2019 @ 6:01pm

    Re: Re: Re: Re: Seriously though

    All we need is for Jhon to say "copyright terms prevent publishers from murdering authors" and I'll have finished my bingo sheet!

  78. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 7:09pm

    You say that like it’s a bad thing.

  79. identicon
    Anonymous Coward, 10 Jun 2019 @ 8:21pm

    Re: Re: Re: Re:

    None of them

    Right, so then:

    If they are a state actor who should be treated as a common carrier, they should be.

    But none of them are state actors, so why even bring it up?

  80. identicon
    Anonymous Coward, 10 Jun 2019 @ 8:23pm

    Re: Re:

    You mean name names and put targets on people's back. Not necessary.

    Yes, it's truly not. In such cases you can simply blank out the people's names and offer a court document instead.

    Nobody needs to know names. What needs to be known is that your interpretation of the law exists outside of your fevered, fanciful imagination.

  81. identicon
    Anonymous Coward, 10 Jun 2019 @ 8:24pm

    Re: Re:

    You seem to have a huge problem with websites that offer advice for people on how not to get scammed or raped. Now why is that?

  82. icon
    Stephen T. Stone (profile), 10 Jun 2019 @ 8:59pm

    the law definitely makes it possible

    Point to a single court case that proves someone did it. Otherwise, quit talking out of your ass.

  83. identicon
    Anonymous Coward, 10 Jun 2019 @ 11:22pm

    Re: Re:

    Congress also has the right to declare my cat the new state bird, but that doesn't make the idea any less stupid, either.

  84. icon
    PaulT (profile), 11 Jun 2019 @ 1:05am

    Re: Re: Re:

    "The presumption"

    Who presumed that? Can you link to the legally relevant discussion of that being the intent? Or are you just angry because popular platforms have decided they no longer wish to have Nazis on their property?

  85. icon
    PaulT (profile), 11 Jun 2019 @ 1:05am

    Re: Re: Re: Re: Re:

    "If they are a state actor who should be treated as a common carrier, they should be."

    Then why are you whining about private platforms who are no such thing?

  86. icon
    PaulT (profile), 11 Jun 2019 @ 1:07am

    Re: Re:

    "Bias in structure and enforcement means it's censorship."

    Private platforms can censor you as much as they want without violating your rights. Go use a competitor - of which there are many - rather than whine that your own actions got you banned from the most popular places normal people congregate.

  87. icon
    PaulT (profile), 11 Jun 2019 @ 1:20am

    Re: Re:

    "There are entire forums and websites devoted to weaponizing 230"

    There are entire forums devoted to the idea of zombie invasion, that doesn't mean that it has actually happened.

    Why are you always unable to give actual proof of your claims? If it doesn't happen with any kind of regularity, it doesn't justify stripping the rights of millions and the employment of thousands as you are demanding.

  88. icon
    PaulT (profile), 11 Jun 2019 @ 1:21am

    Re: Re:

    I think they exploit the idea that con artists and other criminals can't shut them down because they don't like being exposed.

    Why do you have a problem with that, I wonder?

  89. icon
    PaulT (profile), 11 Jun 2019 @ 1:25am

    Re: Re:

    "I did cite a case"

    Did you? Would you mind linking again, since you refuse to offer people a way to search your previous posts?

    "We know that if someone posts, without attribution, a defamatory statement that they are a publisher and not a distributor."

    Yes - but the person who posts that would be liable, not the platform they used to post it, and certainly not someone showing them where that platform is.

    "We also know that there are people who will repeat what they find in Google without bothering to link to the original source, which makes them publisher."

    No, the fact that stupid people exist does not change the nature of a business.

  90. identicon
    Anonymous Coward, 11 Jun 2019 @ 1:37am

    Re: Re: False Dichotomy. Too little moderation AND target conser

    John does not even know or believe what he talks about lol

  91. icon
    Scary Devil Monastery (profile), 11 Jun 2019 @ 5:29am

    Re: Re: Re: Re: Re: Seriously though

    "All we need is for Jhon to say "copyright terms prevent publishers from murdering authors" and I'll have finished my bingo sheet!"

    You must have been absent when he made that claim.

    Although I'm pretty sure that was the claim made by his nickname "Bobmail" at torrentfreak, quite some time back.

  92. icon
    Wendy Cockcroft (profile), 11 Jun 2019 @ 5:33am

    Re: Re:

    So you don't think Ripoff Report exploits Section 230 or relies on sites like Google to spread their words?

    As one who was defamed on ROR: no. It has the option to moderate or not. If they get sued, they will only remove the words deemed defamatory in a court of law; the rest of the negativity (the parts that are purely opinion) remains up. Section 230 isn't responsible for this; they are. Search engines don't spread anything; they simply index content. I blame no one but the troll who posted that content for what was posted there.

    In ROR's defense, they allowed me to post a rebuttal, so every time someone reads the troll post, they can read the rebuttal too.

  93. icon
    Scary Devil Monastery (profile), 11 Jun 2019 @ 5:37am

    Re: Re: Seriously though

    "The first amendment isn't an absolute right. If it were, then child porn wouldn't be illegal."

    The first amendment is certainly absolute until someone sees fit to rewrite it.

    There are plenty of exceptions to free speech, all of which have in common that it has taken one or more supreme court decisions to formulate them. Until such rulings exist, however, the first amendment is indeed an unassailable absolute right.

  94. icon
    Scary Devil Monastery (profile), 11 Jun 2019 @ 5:50am

    Re: False Dichotomy. Too little moderation AND target conservati

    "This is Masnick's usual attempt to position Un-Constitutional Section 230 as "opposed by both, therefore must be good"."

    By "un-constitutional" you mean as in "protects the constitutional rights of both comenters and platform owners"?

    Section 230 is nothing other than the online equivalent of the right a home owner has, in the real world, to decide for themselves whether a visitor gets to shout their opinions while standing in said home owner's living room.

    As usual, Baghdad Bob, you conflate the United States Constitution with the ruleset adopted by Borat the Dictator.

  95. icon
    PaulT (profile), 11 Jun 2019 @ 6:16am

    Re: Re: False Dichotomy. Too little moderation AND target conser

    As with many of his type - he confuses freedom of speech with freedom from consequences of that speech. He refuses to exercise the freedoms he has and demands that someone else make everyone else conform to what he wants. He's a child.

  96. icon
    Wendy Cockcroft (profile), 11 Jun 2019 @ 6:23am

    Re:

    What Stephen said. When the troll came after me, I was able to prove it was a troll post. I not only kept my job, I was promoted.

  97. icon
    Wendy Cockcroft (profile), 11 Jun 2019 @ 6:26am

    Re: Re:

    ^This. Moderation on a site with a large readership won't work well at scale: the admins would have to automate moderation via keywords, because the cost of hiring people to manually go through each reported post (assuming that's when the moderation kicks in) to see what is or isn't acceptable would be prohibitive.

  98. icon
    PaulT (profile), 11 Jun 2019 @ 6:41am

    Re: Re: Re:

    "assuming that's when the moderation kicks in'

    Well, that's the big problem here - it's not. They're not trying to hold platforms responsible for not dealing with reports properly. They're trying to hold them responsible for anything that ends up on the site. Which means that pretty much any site of any size would need to use some kind of automated filter - even if you can personally deal with the normal level of traffic you get, can you really deal with any potential spikes, or deal with it when you're asleep?

    Which means the end of most sources of user interaction. We'll be left with a few sites with deep enough pockets to deal with lawsuits (read: the already entrenched giants) and everybody else reduced to a broadcast model.
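    [Editor's note: the automated-filter problem described in the two comments above can be sketched concretely. Below is a minimal, purely illustrative Python example of naive keyword-based moderation; the blocklist terms and function name are hypothetical, and real platforms use far more sophisticated pipelines. It shows why crude keyword matching misfires at scale.]

    ```python
    # Hypothetical sketch of keyword-based auto-moderation.
    BLOCKLIST = {"spamword", "slur"}  # illustrative flagged terms

    def flag_post(text: str) -> bool:
        """Return True if any blocklisted term appears as a substring."""
        lowered = text.lower()
        return any(term in lowered for term in BLOCKLIST)

    # Naive substring matching flags innocent text (the classic
    # "Scunthorpe problem"): a word merely *containing* a banned
    # term trips the filter.
    assert flag_post("buy spamword now") is True
    assert flag_post("totally innocent post") is False
    assert flag_post("discussing slurry pumps") is True  # false positive
    ```

    The false positive in the last line is exactly the collateral damage the comments describe: automation is the only affordable option at scale, and automation cannot tell abuse from innocent speech.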

  99. identicon
    Anonymous Coward, 11 Jun 2019 @ 7:18am

    Re: Re: Re: Seriously though

    Notice-and-takedown would work just fine. It's what we've had offline for centuries.

    [citation needed]

    There's no notice-and-takedown for libel. Not without a court being involved.

  100. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 11 Jun 2019 @ 7:19am

    Re: Re: Re: Re: Seriously though

    You're talking about the guy whose response to Masnick upon Shiva Ayyadurai failing to destroy this site was this:

    "Your ugly POS wife is a better laugh. Your shit stain children even better. You backed down like the little pussy you are. The one who can't get top-shelf women."

    Stalker is Jhon Herrick Smith's middle-middle name!

  101. identicon
    Anonymous Coward, 11 Jun 2019 @ 7:22am

    Re: Re: Re: Re: Re:

    No rights for anyone because this clown's coworker might be stupid.

  102. identicon
    Anonymous Coward, 11 Jun 2019 @ 7:57am

    Re: Re:

    "Common carrier" status applies to point-to-point pipes, or their switched equivalents, not to something that's effectively a multicast-to-broadcast medium by default.

  103. icon
    Toom1275 (profile), 11 Jun 2019 @ 8:51am

    Re: Re: Re:

    I remember Jhon citing a case.

    I also remember that it didn't actually give any support to his arguments outside of his imagination.

  104. identicon
    cpt kangarooski, 11 Jun 2019 @ 9:25am

    Re: Re: Re:

    I suspect that the worst posters are comparatively few in number; a social graph is probably the way to go. Don’t just delete posts, delete posters.

    Given the financial resources of the major sites, I’d suggest coordinating with anti-hate groups (SPLC, ADL, etc.) to basically dox the people in question so that they can be excluded en masse, and infiltrate their private boards so that you can avoid having to always be reactive.

  105. icon
    PaulT (profile), 11 Jun 2019 @ 9:57am

    Re: Re: Re: Re:

    Oh, I certainly expect that to be the case. But I might as well give him an opportunity to present an alternative to support his cause. After all, there are apparently so many cases that entire industries need to be destroyed in order to protect the people affected, so there must be plenty of good examples to choose from!

  106. icon
    Thad (profile), 11 Jun 2019 @ 10:24am

    Re: Re: Re: Re:

    That's, uh, a little Orwellian.

  107. identicon
    cpt kangarooski, 11 Jun 2019 @ 11:07am

    Re: Re: Re: Re: Re:

    I don’t mind anonymous or pseudonymous speech, but if someone is abusive, and a platform claims to be serious about not allowing such things, trying nothing and being all out of ideas is not a great plan.

    If you’re serious about not providing support to neo-nazis or whomever, you’d better know who they are.

    That said, it shouldn’t be mandatory. But effectively shunning the dregs of society is not so far out there that good citizens should be unwilling to do it of their own free will.

  108. icon
    Stephen T. Stone (profile), 11 Jun 2019 @ 11:37am

    In fairness, I think Herrick is referring to sites like Kiwifarms, which took the concept of “atrocity tourism” sites such as Encyclopedia Dramatica and ran with it to its natural conclusion.

  109. icon
    Stephen T. Stone (profile), 11 Jun 2019 @ 11:50am

    I have no issue with racist assholes being outed and given the boot from a platform. But to effectively run a campaign of doxxing, possible harassment, and “social silencing” with the help of multiple outside groups would be a bit much, don’t you think?

  110. identicon
    Glenn Wright, 11 Jun 2019 @ 1:14pm

    Re: Seriously though

    Not necessarily. CDA 230 specifically protects companies from the consequences of speech that's not protected by the First Amendment, like threats and libel. For example, true threats are not protected by the First Amendment; if someone sends you a death threat on Facebook, CDA 230 makes it so you can sue the person who sent it, but you can't sue Facebook.

    That said, removing CDA 230 would seriously jeopardize the ability of social media platforms to exist, and I wouldn't be surprised if the Supreme Court stepped in and ruled that they still don't count as publishers.

  111. identicon
    cpt kangarooski, 11 Jun 2019 @ 2:42pm

    Re:

    What I suggest is that the platforms take advantage of section 230 to voluntarily effectively identify and boot such users from the platforms.

    They should not engage in harassment or publicly doxxing the users. But I am not averse to them comparing notes or seeking assistance from above-board groups who are apt to be better at connecting the dots and staying on top of trends, and who are, within certain boundaries that would need to be understood, unlikely to themselves be penetrated or corrupted.

    I admit, it is kind of like Red Channels except for assholes, and this gives me pause, but I think people can agree that this is a more serious problem that does not seem to have good solutions. The Hollywood Ten were not running people down with cars, shooting people, spreading communicable diseases because they refused to get vaccinated, etc.

    It’s not a panacea, and it shouldn’t be the only thing that is done, but I think platforms have a social, though not a legal responsibility to keep their platforms from being used maliciously and that they should do something effective to accomplish this.

    Given how easily any existing measures have been circumvented, it’s time to take it up a notch. But if you have a suggestion that goes beyond what’s being done now, please make it.

  112. icon
    Stephen T. Stone (profile), 11 Jun 2019 @ 4:51pm

    Re: Re:

    What I suggest is that the platforms take advantage of section 230 to voluntarily effectively identify and boot such users from the platforms.

    No, what you said was:

    Given the financial resources of the major sites, I’d suggest coordinating with anti-hate groups (SPLC, ADL, etc.) to basically dox the people in question so that they can be excluded en masse, and infiltrate their private boards so that you can avoid having to always be reactive.

    That isn’t “tak[ing] advantage of section 230”, that is outright authoritarian bullshit — and it is bullshit for which you openly and unapologetically advocate. I mean, have you thought through the consequences of Twitter, Google, and Facebook pooling together resources to effectively spy on the entire goddamned Internet so they can keep assholes off Twitter, Google, and Facebook?

  113. identicon
    cpt kangarooski, 11 Jun 2019 @ 6:14pm

    Re: Re: Re:

    I would imagine that it’s about as much spying as they do now in order to advertise to people. I am skeptical that people are good at maintaining totally separate identities online, and if the ad companies are as good as they’re made out to be, it only takes a little information to irreversibly connect a person’s commercial identity (for ordering things online) to their “anonymous” or “pseudonymous” posting identity as a troll, nazi, etc. So the information is likely already known to Google and almost certainly to a collaboration of Google, Facebook, and Amazon.

    Other than that people are already creeped out about it, is there a major consequence that isn’t already happening? If you’re worried about intelligence agencies doing the same thing or piggybacking, that ship has probably already sailed.

    Deplatforming of this nature should be done with a light touch, but at the end of the day it is relatively harmless. No one is kicking anyone off the net, no one is preventing assholes from making their own version of Google and Facebook with hookers and blackjack (like Conservapedia) and there’s probably few enough of them that a modicum of civility and reason could be restored by kicking out a small number of hard-core troublemakers.

    While I get that it is a distressing idea that we may have come to this point (and we certainly do not want to go further and let the government get involved in deplatforming people) there is a sickness and it’s not clearing up on its own. Some sort of affirmative treatment is called for before things get worse.

    What’s your suggestion for the malicious malaise afflicting society these days? Make popcorn? I’m still happy to hear about milder yet effective alternatives. And you didn’t actually say what harms you anticipate from my suggestion, either.

  114. icon
    Stephen T. Stone (profile), 11 Jun 2019 @ 6:32pm

    You are calling for the major tech companies to spy on the entire Internet, with the help of third-party companies, so they can effectively punish assholes if they post bullshit on a platform owned by a major tech company. Imagine if you could be banned from Twitter because of something you said here, or vice versa.

    If you see no issues with that proposition, I cannot help you.

  115. icon
    PaulT (profile), 11 Jun 2019 @ 11:58pm

    Re:

    I think the problem is a typical one when dealing with normal, decent people - they call for tools but do not consider the way the tools can be abused. It makes sense if people who think in a similar way are given those tools. Unfortunately, people in the real world will not always think that way.

    It is sadly better for society to put up with trolls, abuse, hatred, etc. than to face the alternative where good people are attacked with the tools we would use to stop that.

  116. icon
    Wendy Cockcroft (profile), 12 Jun 2019 @ 2:28am

    Re: Re: Re: Re:

    Which would reduce our ability to interact with each other just because some people can't behave themselves. Individuals are personally responsible for their own behaviour. It's ridiculous to hold a platform responsible for user behaviour.

  117. icon
    Wendy Cockcroft (profile), 12 Jun 2019 @ 2:31am

    Re: Re:

    Yes, but that's what mute and block buttons are for. I use them all the time when people annoy me. We don't have to put up with trolls, etc., at all.

    Honestly, it seems to me that refusing to engage with them is the better way. Too many people see a need to interact with them and have the last word. It's a stupid way to behave. Ignore, mute or block, and move on.


  118. Anonymous Coward, 12 Jun 2019 @ 2:34am

    Re: Re: Re: Re: Re: Seriously though

    Icing on the cake is his other post, saying that Masnick is "lying" when he doesn't know who Hamilton is.

    Always knew that MyNameHere and Hamilton were lovers, but it takes a special sort of screwed in the head to go yandere when someone else's crush ignores them...


  119. Anonymous Coward, 12 Jun 2019 @ 3:34am

    Re: Re: Re: Re:

    Are you confident that the ban line will be one you agree with, and that you will never step over it? Are you confident that the system will never make a mistake and associate you with somebody else's speech?


  120. Anonymous Coward, 12 Jun 2019 @ 11:32am

    Do you think these clashing views will hinder efforts to get rid of S230?


  121. Anonymous Coward, 12 Jun 2019 @ 8:21pm

    Re: Re: Re: Re: Re: Re: Seriously though

    Actually, I was around. It's why I'm waiting for it to inevitably show up as the cherry on top of the triple-decker shit sandwich he calls a cake.

