Once Again, Algorithms Can't Tell The Difference Between 'Bad Stuff' And 'Reporting About Bad Stuff'

from the stop-demanding-more-algorithms dept

We've discussed many times just how silly it is to expect internet platforms to do a good job of moderating content on their own platforms. Can they do better? Yes, absolutely. Should they put more resources towards it? For the most part, yes. But there seems to be this weird belief among many -- often people who don't like or trust the platforms -- that if only they "nerded harder" they could magically smart their way to better content moderation algorithms. And, in many cases, they're demanding such filters be put in place and threatening criminal liability for failing to magically block the "right" content.

This is all silly, because so much of this stuff involves understanding nuance and context. And algorithms still suck at context. For many years, we've pointed to the example of YouTube shutting down the account of a human rights group documenting war crimes in Syria, in response to demands to pull down "terrorist propaganda." You see, "terrorist propaganda" and "documenting war crimes" can look awfully similar. Indeed, it may be exactly the same footage. So how can you teach a computer to recognize which one is which?
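To see why this is hard, here's a toy sketch (emphatically not any platform's real system -- the blocklist and function names are made up for illustration) of the kind of keyword matching a naive filter does. Propaganda and reporting about that propaganda contain the same words, so they produce the same signal:

```python
# Toy illustration only: a hypothetical keyword-based filter.
# The blocklist below is invented for this example.
FLAGGED_TERMS = {"attack", "martyr", "enemy"}

def naive_filter(text: str) -> bool:
    """Flag text if it contains any blocklisted term (context-blind)."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return bool(words & FLAGGED_TERMS)

propaganda = "Join the attack. Become a martyr against the enemy."
reporting = "The group's video urges viewers to attack and praises each martyr."

# Both trip the filter: the keywords match, the context is invisible.
print(naive_filter(propaganda))  # True
print(naive_filter(reporting))   # True
```

Real moderation systems are far more sophisticated than this, but the underlying problem is the same: the distinguishing feature isn't in the content itself, it's in the intent and framing around it.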

There have been many similar examples over the years, and here's another good one. The Atlantic is reporting that, for a period of time, YouTube removed a video The Atlantic had posted of white nationalist Richard Spencer addressing a crowd with "Hail, Trump." You remember the video. It made all the rounds. It doesn't need to be seen again. But it's still troubling that YouTube removed it, claiming it was "borderline" hate speech.

And, sure, you can understand why a first-pass look at the video might have someone think that. It's someone rallying a bunch of white nationalists and giving a pretty strong wink-and-a-nod towards the Nazis. But it was being done in the context of reporting. And YouTube (whether by algorithm, human, or some combination of both) failed to comprehend that context.

Reporting on "bad stuff" is kind of indistinguishable from just promoting "bad stuff."

And sometimes, reporting on bad stuff and bad people is... kind of important. But if we keep pushing towards a world where platforms are ordered to censor at the drop of a hat if anything offensive shows up, we're going to lose out on a lot of important reporting as well. And, on top of that, we lose out on a lot of people countering that speech, responding to it, mocking it and diminishing its power.

So, yes, I can understand the kneejerk reaction that "bad stuff" doesn't belong online. But we should be at least a bit cautious in demanding that it all disappear. Because it's going to remain close to impossible to easily determine the difference between bad stuff and reporting on that bad stuff. And we probably want to keep reporting on bad stuff.


Reader Comments



  • Anonymous Coward, 28 Mar 2018 @ 12:35pm

    censorship as a recruitment tool

    It's also a given that people like Richard Spencer will make every effort to use this censorship against them as a recruitment tool, recycling the old Nazi "Jews control the world" accusation while pointing to Google and Facebook. There's always a lot of 'mileage' that can be extracted from 'playing the victim.'


  • Anonymous Coward, 28 Mar 2018 @ 12:50pm

    "shutting down an account of a human rights group documenting war crimes"

    Maybe this is their intent.


    • Wendy Cockcroft, 29 Mar 2018 @ 6:08am

      Re:

      I can see that. The first job of censorship is to silence dissent. Proponents will use the language of victims to excuse it by claiming to protect the public — or certain groups — but ultimately it's about control. It always was.


  • Tyson (profile), 28 Mar 2018 @ 12:54pm

    A bit off-topic, but...

    This brings up the censor war occurring on YouTube and Twitter. I understand we are free to use other platforms, and YouTube and Twitter are allowed to do as they please. So you seek an alternative, like Gab.Ai, and it is full of the most extreme alt-right. It's full of the alt-right because they are the only ones forced to seek alternatives.

    Alternative platforms need a chance to thrive prior to getting inundated by people that are only there because they had no choice.

    If both sides of the political spectrum don't fight against censorship, regardless of who is censored, there will not be a practical alternative available when they censor you.


    • Richard (profile), 28 Mar 2018 @ 3:36pm

      Re: A bit off-topic, but...

      Remember that the original "First they came for the..." speech starts with Communists.

      Remember that at that point in time Communism was a murderous cult that was destroying Russia (starving the Ukraine to death) and attempting to take over the world. It looked like the authentic No. 1 evil in the world. Fascism at that point had not really shown its own evil face.

      The implication then is that everyone's rights need to be defended, no matter how vile they seem to be - because the people with the power to persecute them always have the potential to become even worse.


      • Wendy Cockcroft, 29 Mar 2018 @ 6:02am

        Re: Re: A bit off-topic, but...

        I don't have much sympathy with Communism, Richard, but a deeper look at history will show you that the cult was of the personality of Stalin.

        The actual problem with communism is that it does nothing to address human nature; like any other ostensibly harmless philosophy promising blue skies and rainbows in an anarchist Utopia it creates a power vacuum by demolishing the institutions that uphold our society. Like it or not, we need leaders, and when we see chaos all around and somebody promising to restore order, we'll take whatever's going. And that "somebody" is usually a strongman figure. We've seen this played out throughout history wherever an idealist fantasy didn't work out in practice.

        This is why the Far Left and the Far Right have so much in common, including anti-Semitism; they're two sides of the same coin.

        It's why I tend to gravitate towards a centrist conservative position: it's entirely reasonable to be afraid of extremists on any side and I prefer order to chaos.


        • The Wanderer (profile), 29 Mar 2018 @ 7:10am

          Re: Re: Re: A bit off-topic, but...

          I parsed Richard's "at that point in time Communism was a murderous cult" as containing an implicit "generally seen as being" - i.e., he was talking about what those saying and hearing that "First they came for the Communists" speech would have understood Communism to be, not necessarily what those practicing it (or attempting to do so) understood it to be, much less what it actually was.


  • Anonymous Coward, 28 Mar 2018 @ 12:59pm

    Missed your writing and take on the issues, Mike. Hope to see more of it.


  • Anonymous Coward, 28 Mar 2018 @ 1:31pm (flagged by the community)

    If you spent half as much energy as you spent trying to convince us that what google wants is kosher and proper (when nothing could be further from the truth, for example content moderation, you only press a seeming "need" for it because google wants it themselves as a premise for demonetizing more and more content they simply don't want to pay out for) you might actually accomplish something else in life.


    • Anonymous Coward, 28 Mar 2018 @ 1:48pm

      Re:

      "If you spent half as much energy as you spent trying to convince us that what google wants is kosher and proper (when nothing could be further from the truth, for example content moderation, you only press a seeming "need" for it because google wants it themselves as a premise for demonetizing more and more content they simply don't want to pay out for) you might actually accomplish something else in life."

      Would you care to elaborate on any of those accusations or are you just a mud-slinging troll?


      • Anonymous Coward, 28 Mar 2018 @ 6:35pm

        Re: Re:

        The original poster is still salty about how European protectionist moves to push Google News out of their country ended up backfiring, since Google pulling out actually meant less money flowing to their publishers, once Google chose not to play by their extortionist rules.

        What he bitched about has absolutely nothing to do with the article aside from a tenuous reference to Google.


    • Ninja (profile), 29 Mar 2018 @ 7:12am

      Re:

      If you spent 10% of the energy you spend trolling you'd be Buddha.


      • Anonymous Coward, 5 Apr 2018 @ 6:34am

        Re: Re:

        Since when has being a lethargic fat ass who barely has the energy to speak been an aspiration for .... oh... anyone?


  • Carlie Coats, 28 Mar 2018 @ 1:57pm

    Strict liability for such actions

    Such censorship of reporting, under the claim that it is offensive speech, should constitute libel and should be subject to strict penalties for that.

    It would only take a few million-dollar judgements against the censors to make them think twice about such practices.

    FWIW


  • Anonymous Coward, 28 Mar 2018 @ 3:56pm

    I swear, the censorship is a trap.

    You build up an enemy, like say a group of people who think that becoming violent or discriminating based on racial stereotypes is a good thing for the world at large, and then online platforms take their content down because it's "unacceptable".

    Then it's gone. But people from the groups noticed it disappeared and they're convinced even moreso that their side is being unjustly repressed, no matter how ridiculous most people think the group is. It's a political Streisand effect: "Why suppress that group/video/comment/blog/website unless their enemies thought it was a valid threat?"

    Let people who espouse ridiculous views speak. Let them be judged by their own words, because they have no bite when we *choose* not to listen to them.

    I'm not trapped on one web page, communication protocol or social media platform. If there's something I find offensive there, I laugh or cringe and go somewhere else; the Internet's a big place.

    I'm not victimized, and the only offensive thing here is someone trying to "protect" me from things like this, as if I'll instantly get online PTSD or something. :P


    • Anonymous Coward, 28 Mar 2018 @ 4:50pm

      Re:

      Don't mock the notion of "online PTSD or something" - this poor woman (a CEO, no less) contracted PTSD via Twitter; seeing those mean tweets caused her so much distress that she had to go on disability and take a long [paid] leave from her job.

      Who knew that mean tweets can be as harmful to a person's long-term mental health as bullets and bombs on a battlefield?

      http://www.dailymail.co.uk/news/article-2605888/Woman-claims-PTSD-Twitter-cyberstalking-says-bit-war-veterans.html


      • kJsAw, 29 Mar 2018 @ 12:09am

        Re: Re:

        Daily Mail is about the most dishonest, untrustworthy and toxic tabloid trash out there - and is responsible for popularizing the stupid, hateful bullshit causes that motivated the trolls attacking the "poor woman" in your story.

        The scumbags that own and run toxic tabloid media like the Daily Mail are far bigger contributors to the problem than Twitter etc.


      • Anonymous Coward, 29 Mar 2018 @ 2:26am

        Re: Re:

        Citing the Daily Fail is promulgating fake news.


        • Wendy Cockcroft, 29 Mar 2018 @ 6:06am

          Re: Re: Re:

          I howled with laughter at the thought of Our Glorious Leaders bringing in such laws over here: The Daily Fail would be the first to go, followed by the other right wing rags and their hate-filled invective.

          Since these are what's keeping our government in power (plus the ineptitude on the part of Her Majesty's Opposition) I can't see that happening any time soon.


  • Anonymous Coward, 28 Mar 2018 @ 4:37pm

    #1 reason why not: 47 U.S.C. § 230(c)(2)(A). "No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."

    #2 reason why not: libel laws require the speech to be:
    - untrue;
    - presented as fact and not opinion;
    - either causing actual harm or being "innately harmful."

    "Your video was removed because it was determined to be in violation of our community guidelines" does not, in itself, cause actual harm. It's the removal of the video that would cause any problems -- which is not covered by libel.

    It's also not untrue -- the video was determined to be in violation of their guidelines -- and the guidelines themselves are an opinion of what YouTube considers to be hate speech.


  • Anonymous Coward, 28 Mar 2018 @ 9:21pm

    People in power make plans

    Why did anyone think "AI" would be different? It's all a justification for fascism and violence; there is no other purpose, because "on a computer" justifies anything.


  • tin-foil-hat, 29 Mar 2018 @ 3:28pm

    Humanity Taken For Granted

    I have worked in the IT industry for 3 decades. People take for granted all the processing the human brain does. I've had people demand things that are almost impossible to do and practically promise their first born for something that can be produced in 30 seconds. Computers do not understand nuance. It either matches or it doesn't. It's either right or wrong.

    On the other hand some people know exactly what they're doing. Their agenda is about controlling what others do. They know that perfection is impossible and overkill is their intent.


