Free Speech

by Mike Masnick


Filed Under: content moderation

Companies: aclu, eff, facebook



Rights Groups Demand Facebook Set Up Real Due Process Around Content Moderation

from the seems-like-a-good-idea dept

For quite some time now, when discussing how the various giant platforms should manage the nearly impossible challenges of content moderation, one argument I've fallen back on again and again is that they need to provide real due process. This is because, while there are all sorts of concerns about content moderation, the number of false positives that lead to "good" content being taken down is staggering. Lots of people like to point and laugh at these mistakes, but any serious understanding of content moderation at scale has to recognize that when you need to process many, many thousands of requests per day, often involving complex or nuanced issues, many, many mistakes are going to be made. And thus, you need a clear and transparent process that enables review.

A bunch of public interest groups (including EFF) have now sent an open letter to Mark Zuckerberg, requesting that Facebook significantly change its content removal appeal process, to be much clearer and much more accountable. The request first covers how clear the notice should be concerning what content caused the restriction and why:

Notice: Clearly explain to users why their content has been restricted.

  • Notifications should include the specific clause from the Community Standards that the content was found to violate.

  • Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted, and should include information about how the content was detected, evaluated, and removed.

  • Individuals must have clear information about how to appeal the decision.

And then it goes into many more details on how an appeal should work, involving actual transparency, more detailed explanations, and knowledge that an appeal actually goes to someone who didn't make the initial decision:

Appeals: Provide users with a chance to appeal content moderation decisions.

  • The appeals mechanism should be easily accessible and easy to use.

  • Appeals should be subject to review by a person or panel of persons not involved in the initial decision.

  • Users must have the right to propose new evidence or material to be considered in the review.

  • Appeals should result in a prompt determination and reply to the user.

  • Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.

  • Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability.

Frankly, I think this is a great list, and am dismayed that the large platforms haven't implemented something like this already. For example, we recently wrote about Google deeming our blog post on the difficulty of content moderation to be "dangerous or derogatory." In that case, we initially got no further information other than that claim. And the appeals process was totally opaque. The first time we appealed, the ruling was overturned (again with no explanation) and a month later when that article got dinged again, the appeal was rejected.

After we published that article, an employee from the AdSense team eventually reached out to us to explain that it was "likely" that some of the comments on that article were what triggered the problems. After pointing out that there were well over 300 comments on the article, we were eventually pointed to one particular comment that used some slurs, though the comment used them to demonstrate the ridiculousness of automated filters, rather than as derogatory epithets.

However, as I noted in my response, my main complaint was not Google's silly setup, but the fact that it provided no actual guidance. We were not told that it was a comment that was to blame until after our published article resulted in someone higher up on the AdSense team reaching out. I pointed out that it seemed only reasonable that Google should share with us specifically what term it felt we had violated and which content was the problem so that we could then make an informed decision. Similarly, the appeals process was entirely opaque.

While the reasons that Google and Facebook have not yet created this kind of due process are obvious (it would be kinda costly, for one), it does seem like such a system will be increasingly important, and it's good to see these groups pushing Facebook on this in particular.

Of course, earlier this year, Zuckerberg had floated an idea of an independent (i.e. outside of Facebook) third party board that could handle these kinds of content moderation appeals, and... a bunch of people freaked out, falsely claiming that Zuckerberg wanted to create a special Facebook Supreme Court (even as he was actually advocating for having a body outside of Facebook reviewing Facebook's decisions).

No matter what, it would be good for the large platforms to start taking these issues seriously, not only for reasons of basic fairness and transparency, but because it would also help make the public more comfortable with how this process works. When it is, as currently construed, a giant black box, that leads to a lot more anger and conspiracy thinking over how content moderation actually works.

Update: It appears that shortly after this post went out, Zuckerberg told reporters that Facebook is now going ahead with creating an independent body to handle appeals. We'll have more on this once some details are available.


Reader Comments



  • Band N Boston, 15 Nov 2018 @ 10:46am (flagged by the community)

    Contradicting own statement Section 230 gives arbitrary power.

    "And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."

    https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml

    So where do you find these new users' "rights" in the FLAT unqualified statement you made there?

    • Anonymous Coward, 15 Nov 2018 @ 10:57am

      Re: Contradicting own statement Section 230 gives arbitrary power.

      Why can't you understand the simple difference between what Facebook is allowed to do and what Facebook should be encouraged to do?

    • Dan, 15 Nov 2018 @ 10:58am

      Re: Contradicting own statement Section 230 gives arbitrary power.

      • Facebook (or Google, or Twitter, or any other platform) has the absolute legal right to moderate content in any way, on any basis, and with any (or no) degree of transparency they wish.

      • It is foolish (and perhaps even morally wrong) for them to moderate in an arbitrary and opaque manner.

      These two statements are entirely consistent with each other. But for some reason you seem to believe they contradict each other.

    • Anonymous Coward, 15 Nov 2018 @ 11:04am

      Re: Contradicting own statement Section 230 gives arbitrary power.

      Saying that you "can" do something is not the same as saying that you "will" do something, or that you "should" do something. Nothing here suggests that Facebook "can't" be arbitrary and unhelpful in their content moderation decisions, just that they "shouldn't" be.

      Nor did he ever use the term "user rights." Because these are not "rights" that the user has which independently provide power over Facebook, but terms of a contract that Facebook might (or might not) voluntarily agree to.

    • James Burkhardt (profile), 15 Nov 2018 @ 11:11am

      Re: Contradicting own statement Section 230 gives arbitrary power.

      As others point out, you are conflating Facebook's legal rights with moral and ethical arguments about how Facebook should exercise its rights. They are two separate but linked issues.

      Your failure to address this, combined with your combative tone, is why you get a flag.

    • Toom1275 (profile), 15 Nov 2018 @ 1:01pm

      Re: Contradicting own statement Section 230 gives arbitrary power.

      If you're looking to complain about pulling fictional rights out of thin air, perhaps you should call out that deranged dipshit who claims that copyright is a "Constitutional Right."

  • Band N Boston, 15 Nov 2018 @ 10:50am (flagged by the community)

    Let's have some details on your own alleged "voting system".

    "When it is, as currently construed, a giant black box, that leads to a lot more anger and conspiracy thinking over how content moderation actually works."

    The KEY one of course is whether an Administrator okays the censoring with added editorial warning that you euphemize as "hiding".

    You call for others of vastly larger scale to be transparent but, to say the least, don't lead by example.

    • Anonymous Coward, 15 Nov 2018 @ 11:07am

      Re: Let's have some details on your own alleged leaving forever

      No it really isn’t. The key is, they don’t have to tell you shit. Even though it’s been explained to you literally dozens of times. Watching you cry about it does brighten up my commute.

    • James Burkhardt (profile), 15 Nov 2018 @ 11:07am

      Re: Let's have some details on your own alleged "voting system".

      What, do you want a list of IP addresses that flagged you? Should I have to list why I flagged you in addition? Facebook moderation has nothing to do, functionally, with how the report button operates.

    • Anonymous Coward, 15 Nov 2018 @ 12:13pm

      Re: Let's have some details on your own alleged "voting system".

      How's this for transparent:

      I flagged you because you're a belligerent, often incomprehensible fool with massive holes in your understanding of everything and yet insult others for what you perceive (wrongly) to be holes in their understanding.

      There ya go. Now piss off.

    • Anonymous Coward, 15 Nov 2018 @ 12:17pm

      Re: Let's have some details on your own alleged "voting system".

      Since you asked:

      I flagged you because you never have anything useful to contribute, and your lack of substance has ceased to be amusing and moved into the realm of the tiresome.

      Put simply, I'm telling you to shut up and go away. You won't listen, of course, which is why the flag is there.

    • Gary (profile), 15 Nov 2018 @ 2:49pm

      Re: Techdirt has a great voting system

      Blue complains about getting downvoted to oblivion. Everyone else cheers.
      Blue demands "Transparency" and everyone laughs.

      Because you can't email a transparency report to a nameless troll. How would Mike track the stats for the cowards and let them see the info without a logon and a working email?

      Blue lies, and doesn't know what Common Law is.

  • Bamboo Harvester (profile), 15 Nov 2018 @ 11:14am

    The point being missed here is...

    ...Facebook's sales product is... YOU. Users. They mine, farm, and sell, sell, sell all that user data.

    They don't want to boot ANYONE off for any reason - that's an inventory loss each time they do so.

    BIG markets out there for every possible "group", even the most radical hate groups.

    Facebook is a *company*. Companies exist to make money.

    It's not that difficult to figure out.

    • James Burkhardt (profile), 15 Nov 2018 @ 11:35am

      Re: The point being missed here is...

      How does Facebook's unwillingness to kick people off the platform have anything to do with Facebook's unwillingness to be transparent about why they removed content, potentially leading to loss of members, and the EFF proposing a solution?

    • Leigh Beadon (profile), 15 Nov 2018 @ 11:37am

      Re: The point being missed here is...

      The issue here is them moderating TOO much content and providing no way for people to get it reinstated. Not them refusing to boot people off the service. So it's not really clear what you're saying here...

      • Bamboo Harvester (profile), 15 Nov 2018 @ 11:57am

        Re: Re: The point being missed here is...

        They're purging the least profitable inventory.

        Facebook isn't going to kick off high profit users/groups, no matter if they start a Nuke the Gay Whales organization.

        If Neo-Nazis suddenly stop buying tons of "memorabilia" and such crap, they'll get parsed out as well. If gays suddenly stop buying from Facebook ads, they'll get weeded out as well.

        Facebook is a BUSINESS.

        • Anonymous Coward, 15 Nov 2018 @ 12:09pm

          Re: Re: Re: The point being missed here is...

          A business run by human beings.

          The bizarre notion of how you think a business should be run reflects more on you than it does on Facebook.

        • James Burkhardt (profile), 15 Nov 2018 @ 2:16pm

          Re: Re: Re: The point being missed here is...

          So first off, we are largely talking about content moderation, not user account moderation, which suggests a misunderstanding of the base premises.

          Your train of thought is a reason why they should do moderation: pruning off content that would drive away advertisers. But 'should we do moderation' is not in debate. The debate is around how that moderation occurs, because of clear and obvious variance. Your assertion seems to be that the variance happens because they are this nebulous business entity that is just pumping short-term metrics, I guess? EFF, Techdirt, and people who understand that a business is a collection of people believe this variance is occurring due to the need for individuals to make snap value judgments, often without the context to understand the content at issue. This forces personal biases to the forefront. The EFF is proposing a system that requires transparency so that appeals can occur, or corrective action taken.

          • Anonymous Coward, 15 Nov 2018 @ 3:29pm

            Re: Re: Re: Re: The point being missed here is...

            No, I mean that while a successful business' primary motivation is profit, it is not the exclusive motivation. You depict it as the exclusive motivation.

        • Anonymous Coward, 15 Nov 2018 @ 3:15pm

          Re: Re: Re: The point being missed here is...

          "If Neo-Nazis suddenly stop buying tons of "memorabilia" and such crap, they'll get parsed out as well. If gays suddenly stop buying from Facebook ads, they'll get weeded out as well.

          Facebook is a BUSINESS."

          A business that seeks to prosper long term will always have to consider issues other than immediate cash flow.

          For instance, YouTube must have lost plenty of money after implementing the decision a year or two ago to ban paid advertising from firearm manufacturers, which specifically targeted viewers of YouTube's many gun channels. YouTube might not have realized that this policy change would have the ultimate effect of turning most gun reviews into paid promotions when manufacturers switched from running YouTube ads to paying video makers directly. YouTube then cracked down a second time by banning links on YouTube pages going to external gun sites. One thing that YouTube has never taken any interest in is whether a product reviewer is actually a paid shill, an issue that cuts right to the core of ethical conduct, and one type of YouTube moderation effort that viewers would actually welcome.

          Through a continuous game of cat and mouse, it's rather obvious that YouTube is determined to kill off a highly profitable community of consumers rather than profiting from their interests, presumably to show the world (as well as placate its leftist activist employees) that YouTube is on the "correct" side of a highly divisive political issue.

          • Christenson, 15 Nov 2018 @ 5:27pm

            Re: Re: Re: Re: The point being missed here is...

            I think you are mistaken about YouTube and gun control, in that "gun nuts" are a relatively small part of their user population, and a relatively small part of their advertising revenue.

            YouTube has the same problem as any large newspaper or TV station did in the past: they can't afford to piss off the vast majority of their viewers or advertisers.

            I think YouTube, as a corporation, is large enough to have become fairly amoral, and is mostly interested in not angering the majority.

            The same can be said for a "fair" moderation process: unfairness, or the perception thereof, can drive away business.

  • Anonymous Coward, 15 Nov 2018 @ 12:08pm

    I find it interesting how the views of some commentators, including Mr. Masnick here, have evolved over time from "should not tell people why they were censored or else trolls will game it" to "people must be informed of why they were censored and must be able to appeal the decision".

    I'm not criticizing. Just amused.

    • Anonymous Coward, 15 Nov 2018 @ 12:15pm

      Re:

      If the censoring is automatic it is easily gamed. Human review of flagged posts/content is harder to game and should be clear and appealable.

      I get your point though. It's not a black and white situation.

  • Anonymous Coward, 15 Nov 2018 @ 12:14pm

    "it was "likely" that some of the comments on that article were what triggered the problems"

    You know you can use the user agent in an HTTP request to permit/deny or even modify the response, right? It should be near-trivial to omit comments from requests made by spiders.
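For illustration, the user-agent gating this commenter describes might be sketched as follows. The crawler signatures and the `render_page` helper are hypothetical, not Techdirt's actual code; the point is only that the decision to include comments can hinge on a single header check:

```python
# Serve the article body to everyone, but omit the comment section
# when the request's User-Agent matches a known crawler. The list of
# signatures here is a small illustrative sample, not exhaustive.

CRAWLER_SIGNATURES = ("Googlebot", "Bingbot", "AdsBot-Google", "Mediapartners-Google")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known spider."""
    return any(sig in user_agent for sig in CRAWLER_SIGNATURES)

def render_page(article_html: str, comments_html: str, user_agent: str) -> str:
    """Render the article, dropping comments for crawler requests."""
    if is_crawler(user_agent):
        return article_html  # spiders never see user comments
    return article_html + comments_html

# A Googlebot request gets the article only; a browser gets both.
page_for_bot = render_page("<article>post</article>",
                           "<section>comments</section>",
                           "Mozilla/5.0 (compatible; Googlebot/2.1)")
```

Note that, as the reply below this comment points out in the original thread, hiding comments from crawlers also removes them from search indexing, which is its own trade-off.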

    • Christenson, 15 Nov 2018 @ 5:39pm

      Re: Omitting user comments from search

      That *might* work for Techdirt, but it will omit the comments from such things as search results. Gets complicated if I want to, say, find the last complaint about "blue" or something, or if I want to find a suggestion from the comments that I half-remember.

      And by the way, there's a decent chance I wrote the comment involved, using an example to point out that whether a given sentence would be acceptable or not depended heavily on context, which computers are bad at. Just consider your favorite hate speech, then consider someone complaining, quite correctly, about me saying it and quoting it. Quotes of bad stuff are part of a journalist's stock in trade.

  • Darkness Of Course (profile), 15 Nov 2018 @ 12:20pm

    So, nobody at FB/Google is a software dev

    Well, let me clarify that: A good software dev.

    The search flags are set across a wide swath of the web in Google's case, and across the subset of the web that is FB. If something is flagged, the reason for the flag is KNOWN at that time. Proper error returns, or error logging, must show what (and hopefully where) the error was, as well as the type of error.

    TechDirt article X, Hate speech flag, comments section.

    Based on Google's use of Go and their penchant for ML-driven solutions, one would be surprised if they ever managed to do it correctly.

    A detailed error might be difficult for their solution; which is still fn-ing wrong.
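A minimal sketch of the structured flag record this commenter is asking for might look like the following. The field names are illustrative only; no real Google or Facebook logging schema is implied:

```python
from dataclasses import dataclass, asdict

@dataclass
class FlagRecord:
    """One moderation decision, captured at the moment it is made."""
    url: str       # where the problem was found
    rule: str      # which policy clause tripped
    location: str  # which part of the page (body, comments, ...)
    excerpt: str   # the specific text that matched

    def to_log_entry(self) -> dict:
        """Serialize the record for an audit log or an appeal notice."""
        return asdict(self)

# The commenter's example: "TechDirt article X, Hate speech flag,
# comments section." Captured at flag time, this is exactly the
# information a notice to the user would need.
record = FlagRecord(
    url="https://www.techdirt.com/article-x",
    rule="dangerous or derogatory",
    location="comments section",
    excerpt="<offending comment text>",
)
```

The design point is the one the comment makes: the reason for the flag is known when the flag is set, so recording it then costs almost nothing, while reconstructing it later (or never telling the user) is what makes appeals opaque.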

    • Anonymous Coward, 15 Nov 2018 @ 3:21pm

      Re: So, nobody at FB/Google is a software dev

      It is not necessarily valid to place the blame for whatever at the feet of software developers when management directs what they work on, what the requirements are, and what equipment they can use. Developers many times indicate their unwillingness to build substandard product and get down voted by the scrummy Agile twits who suck up to management.

      Guess what happens when the customer does not like the product ... who gets yelled at - never gets old does it?

      • Christenson, 15 Nov 2018 @ 5:46pm

        Re: Re: So, nobody at FB/Google is a software dev

        Agile has a place and a point... you want to find out as soon as you can whether you have the right idea or not, so start small and cover the essentials.

        But I suspect that the end user is often left out of these setups as a stakeholder; remember, users don't write checks to Google.

  • John Smith, 15 Nov 2018 @ 3:34pm

    I've already said that I believe AOL had greater censorship power in the 1990s relative to the size of the internet than any company in history, and that censorship problems in general are self-correcting. Even Gab found a new host and benefitted from the Streisand Effect as attempts to silence it only increased their media exposure.

    Gab itself was a byproduct of Twitter censorship. There is, of course, always USENET for those who truly want free speech. USENET's sharp decline in the past decade or two shows that this is simply not a big priority for a public that seems to want to be spoonfed its information, "fake news" or not.

    Is there any American who could be trusted with absolute censorship power?

    • Anonymous Coward, 15 Nov 2018 @ 5:15pm

      Re:

      Hollywood is still not going to give you free pussy to grab.

      Have an Article 13 vote.

    • Anonymous Coward, 15 Nov 2018 @ 7:43pm

      Re:

      "Gab itself was a byproduct of Twitter censorship."

      When did Twitter stop people from posting on other websites?

      • Anonymous Coward, 15 Nov 2018 @ 10:37pm

        Re: Re:

        When Johnny boy needed them to. Kinda like how he’s a musician, Hollywood producer, best selling author, millionaire businessman, when need be. And most certainly NOT a bitter old has-been nobody, whose dick stopped working around the time the Spice Girls were popular.

  • Toom1275 (profile), 15 Nov 2018 @ 10:14pm

    Re: Re: Buy ibogaine online

    At least it's more constructive and comprehensible than anything Blue/tp/John Smith have ever posted.

  • Anonymous Monkey (profile), 17 Nov 2018 @ 5:02pm

    Typo

    "...haven't implemented something like this alread."

    I think that should be "already"

