Flip Side To 'Stopping' Terrorist Content Online: Facebook Is Deleting Evidence Of War Crimes

from the not-for-the-first-time dept

Just last week, we talked about the new Christchurch Call, and how a bunch of governments and social media companies have made some vague agreements to try to limit and take down "extremist" content. As we pointed out last week, however, there appeared to be little to no exploration by those involved in how such a program might backfire and hide content that is otherwise important.

We've been making this point for many, many years: every time people freak out about "terrorist content" on social media sites and demand that it be deleted, what really ends up happening is that evidence of war crimes gets deleted as well. This is not an "accident" or a case of such systems being misapplied; it's the simple fact that terrorist propaganda often is important evidence of war crimes. It's things like this that make the idea of the EU's upcoming Terrorist Content Regulation so destructive. You can't demand that terrorist propaganda be taken down without also removing important historical evidence.

It appears that more and more people are finally starting to come to grips with this. The Atlantic recently had an article bemoaning the fact that tech companies are deleting evidence of war crimes, highlighting how such videos have actually been really useful in tracking down terrorists, so long as people can watch them before they get deleted.

In July 2017, a video capturing the execution of 18 people appeared on Facebook. The clip opened with a half-dozen armed men presiding over several rows of detainees. Dressed in bright-orange jumpsuits and black hoods, the captives knelt in the gravel, hands tied behind their back. They never saw what was coming. The gunmen raised their weapons and fired, and the first row of victims crumpled to the earth. The executioners repeated this act four times, following the orders of a confident young man dressed in a black cap and camouflage trousers. If you slowed the video down frame by frame, you could see that his black T-shirt bore the logo of the Al-Saiqa Brigade, an elite unit of the Libyan National Army. That was clue No. 1: This happened in Libya.

Facebook took down the bloody video, whose source has yet to be conclusively determined, shortly after it surfaced. But it existed online long enough for copies to spread to other social-networking sites. Independently, human-rights activists, prosecutors, and other internet users in multiple countries scoured the clip for clues and soon established that the killings had occurred on the outskirts of Benghazi. The ringleader, these investigators concluded, was Mahmoud Mustafa Busayf al-Werfalli, an Al-Saiqa commander. Within a month, the International Criminal Court had charged Werfalli with the murder of 33 people in seven separate incidents—from June 2016 to the July 2017 killings that landed on Facebook. In the ICC arrest warrant, prosecutors relied heavily on digital evidence collected from social-media sites.

The article notes, accurately, that this whole situation is kind of a mess. Governments (and some others in the media and elsewhere) are out there screaming about "terrorist content" online, but pushing companies to take it all down has the secondary impact of both deleting that evidence from existence and making it that much more difficult to find those terrorists. And when people raise this concern, they're mostly being ignored:

These concerns are being drowned out by a counterargument, this one from governments, that tech companies should clamp down harder. Authoritarian countries routinely impose social-media blackouts during national crises, as Sri Lanka did after the Easter-morning terror bombings and as Venezuela did during the May 1 uprising. But politicians in healthy democracies are pressing social networks for round-the-clock controls in an effort to protect impressionable minds from violent content that could radicalize them. If these platforms fail to comply, they could face hefty fines and even jail time for their executives.

As the article notes, the companies' rush to appease governments demanding such content be taken down has already made the job of those open source researchers much more difficult, and has actually helped to hide more terrorists:

Khatib, at the Syrian Archive, said the rise of machine-learning algorithms has made his job far more difficult in recent months. But the push for more filters continues. (As a Brussels-based digital-rights lobbyist in a separate conversation deadpanned, “Filters are the new black, essentially.”) The EU’s online-terrorism bill, Khatib noted, sends the message that sweeping unsavory content under the rug is okay; the social-media platforms will see to it that nobody sees it. He fears the unintended consequences of such a law—that in cracking down on content that’s deemed off-limits in the West, it could have ripple effects that make life even harder for those residing in repressive societies, or worse, in war zones. Any further crackdown on what people can share online, he said, “would definitely be a gift for all authoritarian regimes. It would be a gift for Assad.”

Of course, this is no surprise. We see this in lots of contexts. For example, FOSTA's focus on going after platforms for sex trafficking hampered the ability of police to find actual traffickers and victims by hiding that material from view. Indeed, just this week, a guy was sentenced for sex trafficking a teenager, and the way he was found was via Backpage.

This is really the larger point we've been trying to make for the better part of two decades. Focusing on putting liability and control on the intermediary may seem like the "easiest" solution to the fact that there is "bad" content online, but it creates all sorts of downstream effects that we might not like at all. It's reasonable to say that we don't want terrorists to be able to easily recruit new individuals to their cause, but if stopping that makes it harder to stop actual terrorism, shouldn't we be analyzing the trade-offs? To date, that almost never happens. Instead, we get the standard moral panic: this content is bad, therefore we need to stop this content, and the only way to do that is to make the platforms liable for it. That assumes -- often incorrectly -- a few different things, including the idea that magically disappearing the content makes the activity behind it go away. Instead, as this article notes, it often does the opposite, making it more difficult for officials and law enforcement to track down those actually responsible.

It really is a question of whether or not we want to be able to address the underlying problem (those actually doing bad stuff) or sweep it under the rug by deleting it and pretending it doesn't happen. All of the efforts to put liability on intermediaries really turn into an effort to sweep the bad stuff under the rug, to look the other way and pretend that if we can't find it on a major platform, it's not really happening.

Filed Under: christchurch call, content moderation, evidence, extremist content, terrorist content, war crimes
Companies: facebook, google, twitter


Reader Comments



  • MathFox, 20 May 2019 @ 9:43am

    Do you say that the politicians that complained yesterday about the Facebook "filter bubble" now want Facebook to filter more?


  • Stephen T. Stone (profile), 20 May 2019 @ 9:51am

    the idea that magically disappearing the content makes the activity behind it go away

    See also: Pornography, violent media, and overt bigotry.


    • Wendy Cockcroft (profile), 21 May 2019 @ 6:36am

      Re:

      Confirmed correct. Banning All The Things! doesn't make them go away, just underground. We need to address the attitudes behind the expression instead of sweeping them under the ban carpet.


  • This comment has been flagged by the community.
    Siniffa Spole, 20 May 2019 @ 9:55am

    What makes you think "flip side"? Facebook deletes Israeli crime

    You're trying to flip this to divert from Facebook's EVERYDAY practices as if it's valiantly struggling to do good.

    Facebook's Secret Censorship Manual Exposed as Platform Takes Down Video About Israel Terrorizing Palestinians

December 30, 2018 "Information Clearing House" - After the New York Times on Thursday published an exposé of Facebook's global censorship rulebook, journalist Rania Khalek called out the social media giant for taking down a video in which she explains how, "on top of being occupied, colonized territory, Palestine is Israel's personal laboratory for testing, refining, and showcasing methods and weapons of domination and control."

    http://www.informationclearinghouse.info/50851.htm

    `Facebook is a private company!' shout people in favor of censoring political content

Once upon a time, Facebook espoused idealistic notions of making the world a more open and connected place. But when it comes to views the US establishment [or UK or Israel] doesn't like, Facebook magically remembers it's also a "private company."

    What's most surprising is how many people (and even journalists) are jumping on the censorship bandwagon and clinging to the "private company" argument instead of admitting that they really don't believe in freedom of information - if it inconveniences them or risks threatening their status in some way, that is.

    https://www.rt.com/news/451872-facebook-private-company-censorship/

    And what I state is that so long as a business is in The Public's markets, then WE have final say in how operates. -- Even if it hadn't agreed to a license.

    Inside FACEBOOK Secret Rulebook for Global Political Speech...

    An Unseen Branch of Government

    Increasingly, the decisions on what posts should be barred amount to regulating political speech - and not just on the fringes. In many countries, extremism and the mainstream are blurring.

https://www.msn.com/en-us/news/technology/inside-facebook*s-secret-rulebook-for-global-political-speech/ar-BBRvjsg

    Now, WHO here is always for surveillance capitalism / corporate censorship and doesn't want these global-mega-corps to be regulated, let alone cut down to size? Masnick.

    This is just more of Masnick's ongoing "left-libertarian" cover story as if he's for free speech and free markets. But like Facebook, what you DON'T see mentioned by Techdirt reveals its true agenda.


    • Anonymous Coward, 20 May 2019 @ 10:31am

      Re: What makes you think "flip side"? Facebook deletes Israeli c

There's a button for "Submit Story" at the bottom of the webpage, you know, if you think a story merits their time. There's also people who read this site, you know, who know pretty much every author here has gone on at length about surveillance being a Bad Thing, and how Facebook caving in to public power every time the gu'mints want something else taken off the Googles is a Bad Thing, even if Twitter having the ability to do the Bad Thing at all in the first place is not itself a Bad Thing. There's also that WE you speak of: why are the Public you and not the people working at Techdirt? Mike said it was OK so now it's common law. I read that on the internets so we know it's legit.


    • Gary (profile), 20 May 2019 @ 11:54am

      Re: Now, WHO here is always for surveillance capitalism

      Now, WHO here is always for surveillance capitalism

      That is a very good question, thanks for asking!

      I'd say it is the Pro-Copyright folks (Blueballs, John, others) that are always screaming about how they want corporations to monitor (and moderate!) all user activities.

Obviously they both want corporations to see everything we do to stop copyright violations. Surveillance Capitalism needs strong copyright law to justify the oversight.


    • Stephen T. Stone (profile), 20 May 2019 @ 12:30pm

      so long as a business is in The Public's markets, then WE have final say in how operates.

Please cite the law, statute, or court ruling that says the public has “final say” in what speech a privately owned platform open to UGC will and will not host.


    • Anonymous Coward, 20 May 2019 @ 3:28pm

      Re: What makes you think "flip side"? Facebook deletes Israeli c

      so long as a business is in The Public's markets, then WE have final say in how operates.

So then I say that all movie studios must release all of their movies via torrents! That is the final say on how they operate, because the business is in The Public's markets!


  • Anonymous Coward, 20 May 2019 @ 10:25am

It seems a bit late to try and control the narrative, but it looks like that is what the cult of conservative is up to. The cat is out of the bag, the horses have left the barn, Pandora's box is open, and it is too late to put the genie back in the bottle, but they will continue to try, because that is all they know, apparently.

    Any good coverup now has to include methods of scrubbing the internet of all instances of said story and, as many here know, this is near impossible.

    Seems it is not the content that they dislike, it is the discussion of terrorism they do not like because of their involvement.


    • Anonymous Coward, 20 May 2019 @ 11:03am

      Re:

The only thing wrong with your comment is the word conservatives, as EVERYBODY who does not like something on the internet is demanding exactly the same thing: that whatever they personally dislike be removed and replaced with something they personally like.


      • Anonymous Coward, 20 May 2019 @ 12:37pm

        Re: Re:

        "EVERYBODY who does not like something on the internet is demanding exactly the same thing"

        I doubt it ... I'm not.

        Many complain about creeping fascism while others point out how the law is not equally enforced. These people are not demanding an internet fairness law nor are they demanding social media carry their message. There is a huge difference between what people want and how they communicate same.

I do not demand removal of the stupid crap people type/say/record/whatever; stupid crap should be given rebuttals, not removal.


      • Stephen T. Stone (profile), 20 May 2019 @ 12:41pm

        EVERYBODY who does not like something on the internet is demanding exactly the same thing

        Including you? Or are you “the exception that proves the rule”?


  • Toom1275 (profile), 20 May 2019 @ 10:26am

See also: how FOSTA is designed to protect sex traffickers by obscuring evidence of their actions.


  • Anonymous Coward, 20 May 2019 @ 11:23am

    Someone needs to take that company and pull the plug on it.


    • That One Guy (profile), 20 May 2019 @ 3:51pm

      Re:

      Which company? The one that's been pressured by politicians to remove content that allows people to find criminals and do something about them, or some other company for... some reason?


  • This comment has been flagged by the community.
    Anonymous Coward, 20 May 2019 @ 12:47pm

    Why Facebook has censorship.

    I do not often read RT news for the simple reason I grew up in the cold-war but on occasions I do.

RT has an article from which I have copied and pasted a portion below, with a few modifications.


    Hollywood actress ***** has just learned that there’s nothing so illiberal as a liberal when the received truths that underpin their worldview are challenged.

    Her experience of being dog piled on social media for daring to echo concerns over the official narrative surrounding *****, conforms to an established pattern.

    It dictates that anyone who dares question the official narrative on things such *****, is subjected to an evermore intense level of character assassination and calumniation.

Focusing first on forensic work undertaken by a group of dissident academics has induced a state of hysteria within the mainstream media.

    For their trouble members of the group have found themselves depicted as enemies of the people in (main stream media), their pictures and personal details published and pressure put on the universities which employ them to sack them. And all for daring to cast doubt on events such as .


    (She) is not part of the , yet for daring to cite their work ***, she found herself being mauled on social media. She was no doubt unaware that in doing so she was guilty of giving succour to ‘Assadists,’ ‘conspiracy theorists,’ and that she’s ‘naive.’
    This is really important. Why aren’t we talking about it?


    "We may have just discovered a major piece of the puzzle explaining how seemingly independent international organizations help deceive us into consenting to wars and regime change interventionism around the world."
    May 17, 2019


    Some subjects are such that anything posted has an immediate response from one group or the other with a barrage of hate so profound and intense that for all practical purposes the author would have been better off committing suicide.

    For verification one can read the unredacted article at
    https://www.rt.com/op-ed/459821-sarandon-media-opcw-syria/


    • Stephen T. Stone (profile), 20 May 2019 @ 12:49pm

      …okay, then.


    • Anonymous Coward, 20 May 2019 @ 1:26pm

      Re: Why Techdirt has Moderation

      Hey blue balls. See that. That’s how you do crazy right. Not the brain drooling slowly out of your mouth like you practice.


    • Gary (profile), 20 May 2019 @ 4:19pm

      Re: Why Facebook has censorship.

      I do not often read RT news

Hey - an actual Russian Troll! And I thought trying to cite Infowars was bad. This shit is amazeballs.

A very stable genius said that Putin is our bestie - so we'd better take heed of RT News.


  • That One Guy (profile), 20 May 2019 @ 3:50pm

    'If I can't see it, it's not a problem.'

    It really is a question of whether or not we want to be able to address the underlying problem (those actually doing bad stuff) or sweep it under the rug by deleting it and pretending it doesn't happen.

    A question that the politicians involved have answered pretty clearly: Brush it under the rug.

    Blaming the companies and forcing them to take the videos down makes the problems/war crimes less visible, so they look like they're Doing Something, whereas actually doing something about the problem/war crimes takes work, and would require having to explain to the easily triggered that yes, objectionable content is still up, because that's the best way to find who did it and stop them, and what politician wants to deal with that when it's easier to just blame the companies and take all the credit for themselves?

    The goal of bills like that isn't to address the underlying problems, to stop the actual criminals and/or rescue the victims, it's merely to present the facade of Doing Something, to make it look like the politicians are addressing the issues because they're less visible when all they've really done is hand a huge gift to the criminals involved by making them harder to find.


    • Anonymous Coward, 20 May 2019 @ 5:50pm

      Re: 'If I can't see it, it's not a problem.'

The problem of censorship vs. discretion became obvious when those cursed terrorists were posting videos of cutting off peoples' heads, and the CEOs of those platforms couldn't be trusted to use common-sense decency and discretion to immediately block and/or remove those acts of inhuman horror.


      • That One Guy (profile), 20 May 2019 @ 6:22pm

        Re: Re: 'If I can't see it, it's not a problem.'

Just a tip for the future, but it helps if you actually read the article (all of it) before commenting, as it allows you to avoid such fumbles as 'advocating for something that would have prevented the person involved in the killings from being caught'.

        The fact that the video was still up is what allowed people to find at least one of those responsible and charge them with the murders, something that would not have been possible had the video, as abhorrent as it was, been immediately taken down.


  • Rekrul, 20 May 2019 @ 4:10pm

    It's part of the new plan to sweep up extremist content;

    Creating A Reliable Program to Exterminate Terrorism


  • Coyne Tibbets (profile), 21 May 2019 @ 6:17am

    Why free speech?

    Censorship always has worse outcomes than unrestricted public discourse.


  • Anonymous Coward, 21 May 2019 @ 6:57am

    Responding to criminal acts with....criminal acts?

    If you look closely at the Federal obstruction statutes, this sounds like some folks are "engaged in misleading conduct towards another person" (lying to Facebook about the effects and consequences of the takedown) "with intent to" (they want the takedown) "cause or induce any person" (Facebook and employees thereof) "to ... withhold a record, document, or other object" (the "terrorist content" they want gone) "from an official proceeding" (any attempt to actually catch and punish The Bad Guys™). (18 USC 1512(b)(2)(A))

    While this doesn't apply directly to Congressmuppets (due to Speech or Debate Clause immunity), it sounds like something that the rest of the muppets who are suggesting "take down all the bad things!" need to be reminded of...


  • Anonymous Cowherd, 21 May 2019 @ 8:44am

    This is by design, because it's ever so much easier for governments to brush problems under the rug than solve them.


