Another Day, Another Bad Bill To Reform Section 230 That Will Do More Harm Than Good

from the no-bad dept

Last fall, when it first came out that Senator Brian Schatz was working on a bill to reform Section 230 of the Communications Decency Act, I publicly raised questions about the rumors concerning the bill. Schatz insisted to me that his staff was good, and when I pointed out that it was easy to mess this up, he said I should wait until the bill was written before trashing it.

Well, now he's released the bill, and I am going to trash it. I will say that, unlike most other bills we've seen attacking Section 230, I think Schatz actually does mean well with this bill (entitled the "Platform Accountability and Consumer Transparency Act," or the "PACT Act," and co-authored with Senator John Thune). Most of the others are foolish Senators swinging wildly. Schatz's bill is just confused. It has multiple parts, but let's start with the dumbest: if you're an internet service provider, you not only need to publish an "acceptable use policy," you have to set up a call center with live human beings to respond to anyone who is upset about user moderation choices. Seriously.

subject to subsection (e), making available a live company representative to take user complaints through a toll-free telephone number during regular business hours for not fewer than 8 hours per day and 5 days per week;

While there is a small site exemption, at Techdirt we're right on the cusp of the definition of a small business (one million monthly unique visitors - and we have had many months over that, though sometimes we're just under it as well). There's no fucking way we can afford or staff a live call center to handle every troll who gets upset that users voted down his comment as trollish.

Again, I do think Schatz's intentions here are good -- they're just not based in the real world of anyone who's ever done any content moderation, ever. They're based in a fantasy world, which is not a good place from which to make policy. Yes, many people do get upset about the lack of transparency in content moderation decisions, but there are often reasons for that lack of transparency. If you detail exactly why a piece of content was blocked or taken down, you get people trying to (1) litigate the issue and (2) skirt the rules. For example, if someone gets kicked off a site for using a racist slur, and you have to explain why, you'll see them argue "that isn't racist," even though it's a judgment call. Or they'll try to say the same thing using a euphemism. Assuming that merely explaining exactly why content was removed will fix these problems is silly.

And, of course, for most sites the call volume would be overwhelming. I guess Schatz could rebrand this as a "jobs" bill, but I don't think that's his intention. During a livestream discussion put on by Yale where this bill was first discussed, Dave Willner (who was the original content policy person at Facebook) said that this requirement for a live call center to answer complaints was (a) not possible and (b) so nonsensical that it would be better to just hand out cash for people to burn for heat. Large websites make millions of content moderation decisions every day. Answering phone calls with live humans about all of them is simply not possible.

And that's not all that's problematic. The bill also creates a 24-hour notice-and-takedown system for "illegal content." It seems to be more or less modeled on copyright's frequently abused notice-and-takedown provisions, but with a 24-hour ticking time bomb. It has some similarities to the French hate speech law that was just tossed out as unconstitutional, with one key difference: here, one element of a notification of "illegal content" is a court ruling on the illegality.

Subject to subsection (e), if a provider of an interactive computer service receives notice of illegal content or illegal activity on the interactive computer service that substantially complies with the requirements under paragraph (3)(B)(ii) of section 230(c) of the Communications Act of 1934 (47 U.S.C. 230(c)), as added by section 6(a), the provider shall remove the content or stop the activity within 24 hours of receiving that notice, subject to reasonable exceptions based on concerns about the legitimacy of the notice.

The "notice requirements" then do include the following:

(I) A copy of the order of a Federal or State court under which the content or activity was determined to violate Federal law or State defamation law, and to the extent available, any references substantiating the validity of the order, such as the web addresses of public court docket information.

This is yet another one of those ideas that sounds good in theory, but runs into trouble in reality. After all, this was more or less the position that most large companies -- including both Google and Facebook -- took in the past. If you sent them a court ruling regarding defamation, they would take the content down. And it didn't take long for people to start to game that system. Indeed, we wrote a whole series of posts about "reputation management" firms that would file sketchy lawsuits.

The scam worked as follows: file a real lawsuit against a "John or Jane Doe" claiming defamation. Days later, have some random (possibly made-up) person "admit" to being the Doe in question, admit to the "defamation," and agree to a "settlement." Then get the court to issue an order on the "settled" case, with the person admitting to defamation. Then send that court order to Google and Facebook to get the content taken down. And this happened a lot! There were also cases of people forging fake court documents.

In other words, these all sound like good ideas in theory, until they reach the real world, where people game the system mercilessly. And putting a 24-hour ticking clock on that seems... dangerous.

Again, I understand the thinking behind this bill. But contrary to Schatz's promise of having his "good" staffers talk to lots of people who understand this stuff, it reads like it was written by someone who just came across the challenges of content moderation and has no understanding of the tradeoffs involved. This is, unfortunately, not a serious proposal. But seeing as it's bipartisan, and an attack on Section 230 at a time when everyone wants to attack Section 230, we need to take this silly proposal seriously.

Filed Under: appeals, brian schatz, call centers, censorship, john thune, notice-and-takedown, section 230, transparency


Reader Comments




  • That One Guy (profile), 25 Jun 2020 @ 8:10am

    Better than terrible is still bad

    While this may be a 'better' bill, it still seems like the sort of thing put together by someone who did absolutely zero research into the subject -- something you'd think would be among the first things to do before proposing legislation that stands to impact millions.

    Really, how hard would it have been to ask various platforms 'is this a reasonable thing to demand?', listen to the responses, and then tailor the bill accordingly, tossing it entirely if (as is currently the case) it isn't viable?

    Ignorance-based bills may be better than dishonesty-based ones, but that doesn't by any stretch make them good bills. Between demanding live support that won't work for either small or large platforms, and demanding insanely quick takedowns that will just result in massive amounts of legal content being removed, this is most certainly not a good bill.

  • aerinai (profile), 25 Jun 2020 @ 8:12am

    Call Center Requirement -- Infinite Complexity

    So, some places like TechDirt have a chat engine and moderation done 'in-house'. But other places outsource to platforms like Disqus. How would that work? Would the site using Disqus have to pay for the call center?

    There are complex business models out there, and these guys think writing 200 words on a piece of paper and calling it law 'magically' fixes everything. No research. No business impact analysis. This is lawmakers sitting in a room making shit up at random, expecting everyone else to just accept their simplistic view of the world as rational.

    They get upset when we criticize them, but they don't take the time to actually research the problem, talk to stakeholders, or even understand the basics of the business. Damn the consequences, THINK OF THE CHILDREN!

    • Anonymous Coward, 25 Jun 2020 @ 2:21pm

      Re: Call Center Requirement -- Infinite Complexity

      I demand a complaints department and review board to handle my issues with how your call center insufficiently responded to my obstreperous complaints about how your site moderated the way i shat on your lobby carpet.

  • Anonymous Anonymous Coward (profile), 25 Jun 2020 @ 8:36am

    Addressing the wrong code

    § 230 seems to be overly popular with our legislative drones. Why aren't they as concerned with the DMCA, which is much more problematic? It causes more speech restrictions than § 230 (in its present form) ever could (and, as pointed out elsewhere, might even be censorship, as it is government-mandated), as well as economic harm to many individuals rather than corporations. Is it that § 230 goes after evil corporations? I thought corporations, with their large sources of regulatory capture donations (a.k.a. bribes), were the friends of compliant congresscritters.

    • aerinai (profile), 25 Jun 2020 @ 8:51am

      Re: Addressing the wrong code

      Google, Facebook, and Twitter (these guys' targets) don't kiss the ring, nor pay enough to campaigns...

      A conspiracy theorist may hypothesize that they are doing this to get more campaign donations and once they have their pound of flesh will back off this rhetoric...

    • Anonymous Coward, 25 Jun 2020 @ 10:06am

      Re: Addressing the wrong code

      Why aren't they as concerned with the DMCA, which is much more problematic? That is causing more speech restrictions than § 230

      Because restricting and compelling speech is what they are after. Like all religions, and party politics is a form of religion, the priesthood decides what people can and cannot think and say.

    • PaulT (profile), 25 Jun 2020 @ 11:50pm

      Re: Addressing the wrong code

      "§ 230 seems to be overly popular with our legislative drones. Why aren't they as concerned with the DMCA, which is much more problematic?"

      Takedowns under the DMCA are done under the guise of "copyright," and since so many congressmen are bought off by the kinds of corporations who defend the more draconian version of copyright, it won't be challenged.

      Section 230 is used to enable takedowns by other kinds of corporations that haven't paid off the congressmen, and is in fact often correctly used by "big tech," the boogeyman that the corporations paying them use as an excuse for business model failures.

      They likely don't understand either law; they have just been bought to interpret them differently.

  • Anonymous Coward, 25 Jun 2020 @ 10:14am

    My grades for the proposals

    A - C : n/a

    C-: PACT Act: not terrible, but still misunderstood.

    D: Earn It Act: Poorly made and gives the Justice Department too much power over digital rights. It's a D because Blumenthal is open to constructive criticism and is willing to make changes so that encryption would be less impacted.

    D-: The Encryption Killer: Earn It but less earnest and transparent.

    F: Josh Hawley’s proposals: just every proposal made by Hawley.

  • Anonymous Coward, 25 Jun 2020 @ 10:17am

    The DMCA is OK, as in theory it protects content owned by big corporations, even if it's used by trolls to censor legal content, or by companies who want to claim other people's work for ad revenue. Even political debates have been taken down by DMCA notices.

    'Tis the season to go after big tech, even if it means pointless, stupid laws. Google and Facebook need to hire some lobbyists and make donations to those politicians who support free speech.

    This law will be used by trolls or extremists to harass websites or censor free speech. It could shut down some websites. It only took one lawsuit to shut down Gawker.

  • bobob, 25 Jun 2020 @ 10:25am

    Well, we're faced with a generation in which most older adults are too lazy to try to understand new technology, most younger people are satisfied believing technology is magic, and an uninformed government appeals to ignorance to win elections. All of the mistaken arguments about Section 230 seem like par for the course. People with actual technical knowledge and the ability to understand nuances are a minority.

  • crade (profile), 25 Jun 2020 @ 10:29am

    "The scam worked as follows: file a real lawsuit against a "John or Jane Doe" claiming defamation. Days later, have some random (possibly made up person) "admit" to being the Doe in question, admit to the "defamation" and agree to a "settlement." Then get the court to issue an order on the "settled" case with the person admitting to defamation. Then, send that court order to Google and Facebook to take down that content. And this happened a lot! There were also cases of people forging fake court documents."

    Yeah, but this should be highly illegal. To say that someone can take advantage by committing this sort of fraud isn't really a knock on the bill. It's more a knock on the enforcement of the laws already in place that are supposed to protect us against that behavior. It isn't necessarily a bad game just because people are cheating and no one is enforcing the rules.

    • Celyxise (profile), 25 Jun 2020 @ 11:09am

      Re:

      Just because people shouldn't abuse the system doesn't mean they won't. The real world is messy and laws should be crafted with that messy world in mind. The road to hell is paved with good intentions after all.

      • crade (profile), 25 Jun 2020 @ 2:33pm

        Re: Re:

        Sure, people won't always obey the law; that's the general idea behind law enforcement. But there isn't much point in creating a law if it isn't going to be enforced. I would think basically any bill is going to break down if you assume you are going to just let people defraud your court system.

        • Celyxise (profile), 26 Jun 2020 @ 1:46pm

          Re: Re: Re:

          You seem to be assuming that people always get caught when they game the system, which is ridiculous. How long was Prenda operating before it got sorted? A well crafted bill would weigh the consequences against its purpose. Better bills than this have had a ton of terrible unforeseen consequences crop up after the fact. This one has terrible consequences we already know about, and to suggest we just ignore those because people aren't supposed to abuse it is naive at best. If this bill were addressing some huge problem and was going to do a ton of good, then these consequences might be worth it, but that is not the case here.

          • Ehud Gavron (profile), 26 Jun 2020 @ 2:43pm

            Better crafted bills

            A well crafted bill would weigh the consequences against its purpose.

            There are two types of congress-critters, those who represent the people who elected them, and those who "know better" and end up representing the lobbyists. The latter get their bills passed.

            The second part of the formula is not just the greater good or the cost, but understanding the issue. Congress-critters (Ron Wyden excepted) have shown a Luddite desire to not be mired in an understanding of technology. /s

            So we have people who

            • don't want to know about the technology
            • will take lobbyist fodder word for word and put it in a bill
            • defend it as if they either understand it (narp) or wrote it (narp)
            • call their opponents leftists or rightists or antifa or whatever their word of the day is

            ...and you talk of a well-crafted bill.

            Well Celyxise, you have my vote. Please run for office and fix this s...tuff.

            E

          • crade (profile), 29 Jun 2020 @ 7:48am

            Re: Re: Re: Re:

            No, I'm not assuming people always get caught gaming the system; I'm saying that this isn't gaming the system. Gaming the system is when you abuse the bill using the law, not just plain go and commit fraud. This is like saying that a bill banning murder is bad because people might be able to rig the jury and use it to condemn their enemies. Yes, people might be able to rig the jury, and yes, that's a problem, and they might even get away with it, but you need to deal with that regardless of your bill. You can't fix people undermining your entire legal system with a new law.

            • crade (profile), 29 Jun 2020 @ 8:12am

              Re: Re: Re: Re: Re:

              That came out a bit wrong. You might be able to fix people undermining your entire legal system with a new law, but it's really a whole separate issue from what this bill is about.

              The problem of people defrauding the courts in this exact way is already happening today, as mentioned in this article, regardless of this bill, and it's happening under a whole slew of other laws completely unrelated to this bill as well. Yes, this bill doesn't fix that, but it doesn't cause it either.

              The issue here isn't that they aren't getting caught; they have been getting caught at least enough of the time. They know that they will just get a slap on the wrist, so they don't care.

  • Anonymous Coward, 25 Jun 2020 @ 11:14am

    Could be that they get paid heavily to 'ignore' the issues with the DMCA, since those often benefit the large corporations who buy our government (I mean 'lobby' our government... yeah, right. Why don't we have a 'truth in politics' moment and just call it what it is: a BRIBE, which should be illegal).

  • Koby (profile), 25 Jun 2020 @ 11:33am

    There's no fucking way we can afford or staff a live call center to handle every troll who gets upset that users voted down his comment as trollish.

    I have heard of a potential fix: exempting user-based moderation. Often, this has the added benefit of not making certain speech completely inaccessible. With one mouse click you can turn off user-based moderation and see what's being said anyhow. It's likely that a lot of truly objectionable content would quickly be flagged and fall into this category, cutting down costs considerably.

    • Stephen T. Stone (profile), 25 Jun 2020 @ 11:49am

      exempting user-based moderation

      What good will come from exempting people from liability for muting/blocking other users on a platform if these elected assholes wreck 230 for, y’know, the platform as a whole?

      • Koby (profile), 25 Jun 2020 @ 12:01pm

        Re:

        What good will come from exempting people from liability...

        First, if the platform is exempted, then the platform won't need to handle all moderation decisions with a call center; costs could be greatly reduced.

        Second, corporations would not be the primary deciders of what speech is allowed to be discussed.

        Third, many user-based moderation systems that I've seen currently in use allow anyone to un-hide the moderated text, if they desire. Since the speech isn't completely eliminated, total censorship does not occur.

        • Stephen T. Stone (profile), 25 Jun 2020 @ 12:23pm

          if the platform is exempted, then the platform won't need to handle all moderation

          Services like Twitter already have exemptions from legal liability for moderation decisions. Pretty much all of them offer some form of client-side moderation (e.g., blocking, muting). I’m still not getting why this should change.

          corporations would not be the primary deciders of what speech is allowed to be discussed

          For the most part, they’re not. A transphobic asshole who gets booted from Twitter for saying transphobic bullshit can go find a non-corporate service that accepts such bullshit (e.g., Gab) and say it there. Corporations only decide what speech is allowed on the services they own/operate; if you can’t deal with their rules, go start your own platform with blackjack and hookers.

          many user-based moderation systems that I've seen currently in use allow anyone to un-hide the moderated text, if they desire. Since the speech isn't completely eliminated, total censorship does not occur.

          Twitter doesn’t engage in “censorship” when it deletes a post. Users who have their posts deleted can repost the text of those posts on any other service that will host it. Censorship involves suppression of speech; moderation does not.

        • Anonymous Coward, 25 Jun 2020 @ 12:25pm

          Re: Re:

          User-based moderation is only suitable for adult-only websites, as it takes time for comments to be hidden, and they are still available for the inquisitive to unhide. Your wish to destroy any site's ability to moderate as it sees fit, which includes a good faith effort to remove or block content it thinks the majority of its users would find objectionable, would destroy the Internet's usefulness for many users.

          • Stephen T. Stone (profile), 25 Jun 2020 @ 12:28pm

            Your wish to destroy any site's ability to moderate as it sees fit, which includes a good faith effort to remove or block content it thinks the majority of its users would find objectionable, would destroy the Internet's usefulness for many users.

            Somehow, I think that’s the point.

          • Koby (profile), 25 Jun 2020 @ 1:07pm

            Re: Re: Re:

            Your wish to destroy any site's ability to moderate as it sees fit, which includes a good faith effort to remove or block content it thinks the majority of its users would find objectionable...

            I don't primarily object to sites doing moderation, especially since I am open to user-based moderation, as well as moderation against profanity. Rather, I object to corporations engaging in political bias and censorship, and then hiding behind the "objectional" argument (especially since most social media feeds are opt-in!).

            ...would destroy the Internet's usefulness for many users.

            That is hyperbolic. Seeing the occasional comment with which you disagree will not destroy the internet. It's a loss of credibility similar to that of those who claimed the internet would cease to function after June 11th, 2018.

            • Anonymous Coward, 25 Jun 2020 @ 1:22pm

              Re: Re: Re: Re:

              I object to corporations engaging in political bias

              Which just about every newspaper and news outlet does. Also, some political speech, such as racist, misogynist, or anti-LGBT speech, is objectionable, and many platforms rightly ban it.

            • Stephen T. Stone (profile), 25 Jun 2020 @ 3:21pm

              I object to corporations engaging in political bias and censorship, and then hiding behind the "objectional" argument

              Twitter is a corporate-owned service. White supremacist propaganda is legally protected speech that represents a political ideology.

              Should the law force Twitter into hosting White supremacist propaganda?

            • That One Guy (profile), 25 Jun 2020 @ 4:01pm

              'Curse those sock gremlins!'

              Rather, I object to corporations engaging in political bias and censorship, and then hiding behind the "objectional" argument (especially since most social media feeds are opt-in!).

              Which would be like me saying that my objection to social media is that using it causes gremlins to steal people's socks, and for the same reason: The 'harm' I'm describing hasn't been demonstrated to exist, beyond the occasional person losing socks and blaming gremlins.

              If you're going to claim 'political bias' and expect to be taken seriously, step one is demonstrating that said bias exists, and so far you've either utterly failed to do that or ignored any demands to do so.

            • Anonymous Coward, 25 Jun 2020 @ 4:19pm

              Re: Re: Re: Re:

              Another question: do you frequent Parler, Gab, and BitChute, platforms whose moderation policies seem like what you want? If not, why not?

            • PaulT (profile), 26 Jun 2020 @ 12:07am

              Re: Re: Re: Re:

              "Rather, I object to corporations engaging in political bias and censorship, and then hiding behind the "objectional" argument (especially since most social media feeds are opt-in!)."

              Then, stop whining and use a service that doesn't do that. I don't want your white supremacist friends infecting the feeds of my family just because you're too weak to vote with your wallet.

              The rest of us are not that weak, which is why the corporations do what they do - if your friends come back to infect these platforms, we will leave and take the ad dollars we generate with us.

            • Celyxise (profile), 26 Jun 2020 @ 2:08pm

              Re: Re: Re: Re:

              Seeing the occasional comment with which you disagree will not destroy the internet.

              This is the most dishonest line I've seen you write. You cannot, with your prolific commenting here, feign ignorance about what Section 230 enables on the internet. It is what allows competitors to Twitter, like Parler, to exist. Suits like Stratton Oakmont v. Prodigy demonstrate that. Just because you want to see a certain opinion on Twitter doesn't mean that others should lose their voice everywhere. And enough with this "censorship" routine; it's bullshit, and by now you know that.

              The bottom line is that if you ever get your wish regarding Section 230, you will quickly lose your ability to brag about it online.

        • PaulT (profile), 25 Jun 2020 @ 11:52pm

          Re: Re:

          "First, if the platform is exempted, then the platform won't need to handle all moderation decisions with a call center; costs could be greatly reduced."

          They're already exempted, so nothing needs to change.

          "Second, corporations would not be the primary deciders of what speech is allowed to be discussed."

          Unless you're a lazy moron with reprehensible views, they already aren't, so no need for anything to change.

          "Third, many user-based moderation systems that I've seen currently in use allow anyone to un-hide the moderated text, if they desire"

          So... nothing needs to change? Gotcha.

        • Rocky, 26 Jun 2020 @ 1:46am

          Re: Re:

          Second, corporations would not be the primary deciders of what speech is allowed to be discussed.

          When on their property, their rules. Your argument is dishonest because you can say whatever you want, just not when you use someone's private property to do it. Your argument would hold water if corporations could stop you from speaking up at all, but we all know that's not true.

          How about you give specific examples of conservative speech being silenced, because if you think 230 needs fixing you really should be able to provide data on how it's broken.

          • PaulT (profile), 26 Jun 2020 @ 2:22am

            Re: Re: Re:

            "When on their property, their rules"

            This is why they're so desperate to pretend that being a large popular service suddenly turns their property into public property. Without that, they have to start demanding that private owners have no right to control their own property, which is hilarious when you see that so many of them cry "socialism" about laws that benefit anyone else but them.

            "How about you give specific examples of conservative speech being silenced,"

            He's been asked this many times, and on the rare occasion he provides an answer it's usually along the lines of "that's how it looks to me". No facts, no data, just his sad little bubble of offensively ignorant people being told they need to go somewhere else to be like that.

          • Anonymous Coward, 26 Jun 2020 @ 7:20pm

            Re: Re: Re:

            An example of how it is broken is Facebook: their selection and promotion algorithms are effectively applying editorial discretion to promote particular posts and comments based on their content, albeit automatically. If they were printing on paper instead of producing web pages, they'd share liability for content just as a newspaper is liable for the letters it selects, whereas if they just posted messages from the feeds you subscribe to in some neutral order, or a user-selected order, they'd be more clearly a platform rather than a publisher and editor.

            Of course, you won't see the right complaining about that because they have been benefitting more from Facebook's selective promotion of material likely to get engagement than their opponents have.

  • ECA (profile), 25 Jun 2020 @ 1:16pm

    OM freeking G..

    Let's see..
    Call center?
    Anyone have his home, state, and federal phone numbers??
    Let's let him experience this..

    "receives notice of illegal content or illegal activity on the interactive computer service"

    A NOTICE... not proof? And your opinion is... not proof.

    "Federal law or State defamation law, and to the extent available, any references substantiating the validity of the order, such as the web addresses of public court docket information. "

    "Extent" is a very broad, catch-all word. It sets no limits on who can complain.
    A link to the complaint, in the public docket??

    SEND IT IN THE MAIL.. then I will have proof that you demand that I CENSOR my participants/consumers/players/those discussing the ramifications of a capitalist system gone amok.


  • Anonymous Coward, 25 Jun 2020 @ 2:18pm

    I was wondering what he could mean by "illegal content" (this phrase always gets me). I mean, literal cp/cam is probably pretty recognizable, and who wouldn't want that off their service ASAP? But what else? Oh... defamation, apparently. Defamation is now illegal. Not only can one be sued for it, but it's illegal. Criminal defamation, creeping in sideways. And you need to take down whatever I claim is defamation within 24 hours.

    Brian, your bill? Dumpster fire, dude.


  • Ehud Gavron (profile), 25 Jun 2020 @ 2:56pm

    Working Hours

    Well, MM, you've covered a lot of it, and the 24 people before me have added much. Having founded, owned, and run ISPs (note: not telecom companies), I will try to add my humble thoughts.

    • "Live answer" is expensive, so we use auto-attendants/interactive-voice-response (AA/IVR) and often accept messages for a callback.
    • Nobody works 8 hours a day, 5 days a week. Even if you ignore federal holidays, the 9-5 workday includes bathroom breaks, drink breaks, smoke breaks, lunch, etc. (We don't stagger employees' breaks because we REALLY WANT them to take breaks together and enjoy some offtime without the pressure of handing off tickets.)
    • There's no current law anywhere on the books, that I'm aware of, that tells any industry to "man the phones".

    If there were, it should be for Verizon Wireless (42-minute wait on Friday), Comcast (37-minute wait on Monday), and other companies that offer telecom services.

    Having an Internet connection is a commodity. It's simple.
    Setting up a website is trivial. Lots of resellers, lots of CMSs, etc.
    Moderating a website is a biatch, but MM... you've covered that.
    BUT HAVING TO MAINTAIN A PHONE LINE WITH A LIVE HUMAN is even worse than requiring moderation.

    Just Saying. For a friend.

    E


  • This comment has been flagged by the community.
    Anonymous Coward, 25 Jun 2020 @ 2:57pm

    Who ya gonna call ???
    BLM and Burn it all down .
    Start from scratch CHAZ all around
    Who ya gonna call ???
    BLM and Burn it all down .


  • Anonymous Coward, 26 Jun 2020 @ 12:35am

    Having actual experience moderating (over several years on public forums related to the Open Directory Project), I can say:

    (1) Moderators NEVER discuss moderation in public. What can you say? "There's a rule. You violated it. If you have to be told which rule, we think you didn't read the rules at all. And if you can't be more polite than you were in that post, we really don't have to talk to you about why we don't want to talk to you. We don't want to talk to you, because we think it's wasting our time and not helping us (or the other contributors) accomplish our goals. Stop wasting our time, OK, by which we mean, don't talk to us and we won't stop our ears."

    I really haven't followed the Twitter-haters: I don't care enough about Twitter to hate (or like) them. They don't help me accomplish my goals. HOWEVER, the people I have heard getting kicked off haven't impressed me as the sort who shouldn't have been banned. Alex Jones? Really. He's not a conservative (or of any political slant, and it wouldn't matter if he were), he's just a psychotic sociopath, or perhaps he just enjoys screaming like one online. In a well-mannered and public-spirited society, he'd be absent. And so naturally anyone concerned about the manners and spirit of their online community would be eager to have him absent.

    (2) Moderators discuss privately with other moderators, because we frequently didn't agree on how best to moderate specific posts (even though we were all in very close agreement on the purpose of the forum and the ODP). Moderators sometimes felt that they couldn't live with the consensus, and went elsewhere to make their ODP contributions. There are no easy answers; you do the best you can, and if you come to think that isn't good enough, you quit or close the forum.

    (3) Contributors who didn't like having their contributions removed could (a) repost, omitting the most likely offensive portion, (b) go elsewhere and post, or (c) repost offensively and be given an invitation to the offline world. What can you say constructively to people who really don't want to put up with your idiom? Scream in your padded cell, or say something people might want to hear.


