Behind The Scenes Look At How Facebook Dealt With Christchurch Shooting Demonstrates The Impossible Task Of Content Moderation

from the it's-not-that-it's-difficult,-it's-impossible dept

We've been saying for ages now that content moderation at scale is literally impossible to do well. It's not "difficult." It's impossible. That does not mean that companies shouldn't try to get better at it. They should and they are. But every choice involves real tradeoffs, and those tradeoffs can be significant and will upset some contingent who will have legitimate complaints. Too many people think that content moderation is so easy that just having a single person dedicated to reviewing content can solve the problem. That's not at all how it works.

Professor Kate Klonick, who has done much of the seminal research into content moderation on large tech platforms, was given the opportunity to go behind the scenes and look at how Facebook dealt with the Christchurch shooting -- an event the company was widely criticized over, with many arguing that it took too long to react and let too many copies of the video slip through. As we wrote in our own analysis, it actually looked like Facebook did a pretty impressive job given the challenges involved.

Klonick, however, got to find out much more from the people actually involved, and has written up an incredible behind-the-scenes look at how Facebook dealt with the video for the New Yorker. The entire thing is worth reading, but I did want to highlight a few key points. The article details how Facebook has teams of people around the globe who are ready to respond and deal with any such "crisis," but that doesn't make the decisions they have to make any easier. One thing that's interesting is that Facebook does have a policy that they should gather as much information as possible before making a call -- because sometimes what you see at first may not tell the whole story:

The moderators have a three-step crisis-management protocol; in the first phase, “understand,” they spend as much as an hour gathering information before making any decisions. Jay learned that the shooter seemed to be trying to make the massacre go viral: he had posted links to a seventy-three-page manifesto, in which he espoused white-supremacist beliefs, and live-streamed one of the shootings on Facebook, in a video that lasted seventeen minutes and then remained on his profile. Jay forced himself to watch the video, and then to watch it again. “It’s not something I would ask others to do without having to watch it myself,” he said.

If it seems crazy that this might take up to an hour (I should note, this doesn't mean they always wait an hour -- just that it may take that long to gather the necessary information), Klonick demonstrates how the same basic fact pattern can present very different situations when understood in context. For example, you might think that a Facebook Live video of one man shooting and killing another probably shouldn't be shown. But context matters. A lot.

Understanding context is one of the most difficult aspects of content moderation. Sometimes, a post seems clearly destructive. In April, 2017, Steve William Stephens, a vocational specialist, shot and killed Robert Godwin, Sr., an elderly black man who was walking on the sidewalk near his home in Cleveland. Stephens said, bafflingly, that he had decided to kill someone because he was mad at his ex-girlfriend, and posted a video of the killing on Facebook, where it remained for two hours before the company removed it. People were horrified by how long it stayed up....

The fact pattern there is straightforward. A black man was shot and killed on Facebook Live. Facebook should take the video down, right? But...

But disturbing videos may not always be damaging. In July, 2016, Philando Castile, a black school-nutrition supervisor, was shot seven times by a police officer during a traffic stop in Minnesota. Castile’s girlfriend, Diamond Reynolds, live-streamed the aftermath, as Castile bled from his wounds and died after twenty minutes. The footage arrived amid a series of videos depicting police violence against black men but was striking because it was streamed live, which exempted it from claims that it had been edited by activists or the police department before it was released.

If the "rules" say no live video of a shooting, you block the former... but also the latter. Indeed, for a time, Facebook did block the latter, but that resulted in a lot of (reasonable) complaints, and Facebook changed its mind -- even though the basic fact patterns are the same.

Facebook initially removed the video, but then reinstated it with a content warning. To moderators looking at both, the videos might look similar—a grisly shooting of a black man in America—but the company eventually determined that the intentions behind the videos gave them distinct meaning: keeping up Reynolds’s video brought awareness to the systemic racism of the criminal-justice system, while taking down Stephens’s video silenced a murderer’s deranged homage to his ex-girlfriend.

In short: context matters a ton, and you don't always get the context right away. Indeed, sometimes it's very difficult to get the context. And, the same video in different contexts can be quite different. Indeed, this turned out to be some of the problem with the Christchurch video. Klonick details how just removing all copies of the video raised some questions about why some people were posting it:

This created an ethical tangle. While obvious bad actors were pushing the video on the site to spread extremist content or to thumb their noses at authority, many more posted it to condemn the attacks, to express sympathy for the victims, or because of the video’s newsworthiness. For consistency, and in deference to a request from the New Zealand government, the team deleted even these posts. The situation was a no-win for Facebook. Politicians were quick to condemn the company for the spread of extremism, and users who had posted the video in good faith felt unreasonably censored.

In other words, there are tradeoffs, and it's a no-win situation. No matter which choice you make, some people are going to be (perhaps totally reasonably) upset about that decision.

And, of course, there were technical difficulties involved as well, though Facebook did move to minimize those:

By the time the handling of the Christchurch video switched to teams in the United States, some twelve hours after the shooting, moderators discovered a problem that they hadn’t encountered before at such a scale. When they tried to create a hash databank for the shooter’s video, users began purposefully or accidentally manipulating the video, creating slightly blurred or cropped versions that obscured the hash and could make it past Facebook’s firewall. Ahmed decided to try a new kind of hash technology that took a fingerprint from a vector of the video—its audio—which was likely to remain the same across different versions. This technique, combined with others, worked: in the first twenty-four hours, one and a half million copies of the video were removed from the site, with 1.2 million of those removed at the point of upload.
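
Klonick doesn't detail the algorithm, but the core idea of fingerprinting a video by its audio track can be sketched in a few lines. The toy Python below is purely illustrative (it is not Facebook's system, and every function name and parameter here is invented); it simply shows why a fingerprint derived from coarse audio features survives the kinds of manipulations, like blurring and cropping, that defeat a hash of the video frames:

```python
import math
import random

def fingerprint(samples, frame=256):
    """Reduce an audio signal to one bit per pair of adjacent frames:
    did the frame's energy rise (1) or fall (0)?"""
    energies = [
        sum(s * s for s in samples[i:i + frame])
        for i in range(0, len(samples) - frame, frame)
    ]
    return [1 if b > a else 0 for a, b in zip(energies, energies[1:])]

def hamming(a, b):
    """Number of positions where two fingerprints disagree."""
    return sum(x != y for x, y in zip(a, b))

def likely_same(a, b, tolerance=0.25):
    """Treat two clips as copies if few fingerprint bits differ."""
    fa, fb = fingerprint(a), fingerprint(b)
    return hamming(fa, fb) <= tolerance * len(fa)

# Demo: a re-encoded upload (the original audio plus mild noise) keeps
# nearly the same fingerprint; unrelated audio does not.
random.seed(0)
original = [math.sin(i / 30.0) * (1.0 + 0.5 * math.sin(i / 300.0))
            for i in range(8192)]
reupload = [s + random.uniform(-0.02, 0.02) for s in original]
unrelated = [random.uniform(-1.0, 1.0) for _ in range(8192)]
```

Because the fingerprint tracks broad energy trends rather than exact bytes, mild re-encoding noise flips few bits, while unrelated audio mismatches roughly half of them. A production system would use far richer spectral features and a large fingerprint database, but the matching logic is the same in spirit.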

In short, there are lots of good reasons to complain about Facebook and to hate on the company. And it often does a bad job with its moderation efforts (though it has gotten much better). But part of the problem is that when you're doing moderation at that scale, mistakes are going to be made -- some of those mistakes are going to be a big deal, and some will stem from a lack of context.

Assuming that there's some magic wand that can be waved (as Australia, the UK, and the EU have suggested in recent days -- not to mention some US politicians) suggests a world that does not exist. It is not helpful to demand that companies magically solve a problem that is impossible -- one driven by the fact that human beings aren't always good people. A more serious look at the issue of people doing bad stuff online should start with the bad people and what they're doing, not with blaming social media for being used as a tool to broadcast the bad things.

Filed Under: christchurch, content moderation, context
Companies: facebook


Reader Comments

  • Mason Wheeler (profile), 26 Apr 2019 @ 7:10am

    from the things-i-never-thought-i-would-ever-say dept.

    One thing that's interesting is that Facebook does have a policy that they should gather as much information as possible before making a call -- because sometimes what you see at first may not tell the whole story

    Which, surprisingly, makes Facebook significantly better at handling controversial videos than the Washington Post.

    • Anonymous Coward, 26 Apr 2019 @ 7:20am

      Re: from the things-i-never-thought-i-would-ever-say dept.

      How does one measure the better/worse handling of controversial videos? Is it a gut feel sort of thing, you know it when you feel it? Or is there some logic based, repeatable way to do this?

    • Anonymous Coward, 26 Apr 2019 @ 10:13am

      Re: from the things-i-never-thought-i-would-ever-say dept.

      In your continuing self-embarrassment, you cannot tell the difference between content moderation decisions and reporting.

  • Anonymous Coward, 26 Apr 2019 @ 7:24am

    This post was timely as word got out Twitter can't even distinguish white supremacists from elected GOP officials on its platform.

    • PaulT (profile), 26 Apr 2019 @ 7:44am

      Re:

      "Twitter can't even distinguish white supremacists from elected GOP officials"

      Erm, we might need some names, because there is no difference in some cases since they are white supremacists.

      Sure, there's probably some who aren't, who "accidentally" happen to belong to some KKK-style groups and/or share their material, but there's a non-zero number of GOP officials who are exactly that themselves.

      • Anonymous Coward, 26 Apr 2019 @ 7:59am

        Re: Re:

        That was rather the point being made. Something along the lines of "If we moderated white supremacist content the same way we moderate ISIS content, then we'd have to take down the content of a lot of mainstream Republican politicians in the USA (for example), and certainly a lot of activity from their supporters, because white supremacy is mainstream US politics." And they can't do that because the Republicans will shout and scream about their tweets being taken down.

        It's an interesting problem...

        • PaulT (profile), 26 Apr 2019 @ 8:03am

          Re: Re: Re:

          It certainly is. But, that does seem to be true - when sites take down white supremacist and neo-Nazi groups, right-wingers do seem to ask "why are they targeting us and not the left?".

          Whereas, they probably should be asking "why am I politically aligned with neo-Nazis?"

          • Stephen T. Stone (profile), 26 Apr 2019 @ 9:20am

            There’s a tweet about that. (It’s not a dril tweet, sadly.)

          • This comment has been flagged by the community.
            Woody Bosk, 26 Apr 2019 @ 9:33am

            No, the left should be asking why they're Communists.

            Whereas, they probably should be asking "why am I politically aligned with neo-Nazis?"

            Lood Gord, that's crude and feeble smear.

            You (the left) set yourself up as possessed of all moral virtue and divide society into two classes, but of course since it's YOU and your Establishment indoctrination, can't see that you're no different from Brown Shirts except in small details.

            Oh, and you're one of those odious "out of the blue" elitists who PAY to pur your notions above others.

            • Gary (profile), 26 Apr 2019 @ 9:52am

              Re: No, the Right should be asking why Nazis love them

              Woody, thank you for your kind words.
              I do divide the world into groups:

              People who punch Nazis and those that don't.

            • Anonymous Coward, 26 Apr 2019 @ 9:53am

              Re: No, the left should be asking why they're Communists.

              So you're saying it's virtuous to align oneself with the Neo-Nazis? If "the left" believes otherwise and they're not the virtuous camp then by extension the white supremacist politicians are virtuous. You are, in effect, saying that it's ok to be white supremacist and spread racist hate. If you really believe that and that anyone who disagrees with you is wrong, well... Humanity would like to have a word with you.

              And you don't have to be a "leftist" to think <insert color here> supremacism is wrong.

            • Stephen T. Stone (profile), 26 Apr 2019 @ 10:00am

              Last time I checked, actual White supremacists endorsed Republicans far more than they endorse Democrats; in return, Republicans allow racists and xenophobes such as Steve King, Stephen Miller, and Steve Bannon (wow that’s a lot of Steves) to enter the party and influence its policies. Democrats are not angels by any stretch of the imagination, but they are also not linked to White supremacists on a regular basis, nor do their policies endear them to White supremacists.

              Also:

              you're one of those odious "out of the blue" elitists who PAY to pur your notions above others

              You can keep trying this smear, Blue Balls, but it will never take. We all know what “out of the blue” refers to, and it is neither the First/Last Word nor the Debbie Gibson song.

              • Bamboo Harvester (profile), 26 Apr 2019 @ 10:27am

                Re:

                Grand Dragon Senator Robert Byrd

                • PaulT (profile), 26 Apr 2019 @ 10:35am

                  Re: Re:

                  When you have to reach back decades to find a "left wing" example, and not only ignore the southern strategy but also the many years the man spent apologising for his earlier actions, you're proving the original point.

                • Anonymous Coward, 28 Apr 2019 @ 8:46am

                  Re: Re:

                  Are you conflating liberal with democrat?

                • Anonymous Coward, 28 Apr 2019 @ 5:29pm

                  Re: Maybe take a break from being a cop holster and read a book?

                  Something he called "the greatest mistake I ever made," adding: "I know now I was wrong. Intolerance had no place in America. I apologized a thousand times ... and I don't mind apologizing over and over again."

                  So maybe you might want to take that bit back and read a bit of history, bro.

            • PaulT (profile), 26 Apr 2019 @ 10:32am

              Re: No, the left should be asking why they're Communists.

              Actually, I'd be mocking the left-wingers who whine because their groups espousing Pol Pot were taken down as well. But I'm not seeing those, only the people complaining that literal Nazis being taken down is an affront against them.

              But, hey, when you’re dumb enough to believe there’s only 2 points on the politics spectrum, I can forgive your comments reflecting that stupidity.

            • Anonymous Coward, 26 Apr 2019 @ 10:34am

              Re: No, the left should be asking why they're Communists.

              "You (the left) set yourself up as possessed of all moral virtue and divide society into two classes, but of course since it's YOU and your Establishment indoctrination, can't see that you're no different from Brown Shirts except in small details."

              "Lood Gord, that's crude and feeble smear."

              • Funny how you accurately replied to your own comment.

              Project much?

            • Anonymous Coward, 26 Apr 2019 @ 6:53pm

              Re:

              How's that Paul Hansmeier defense fund coming along, blue?

      • Stephen T. Stone (profile), 26 Apr 2019 @ 9:18am

        Re: Re:

        To quote Splinter News:

        Putting aside Twitter's obvious ass-covering, let's get to the real problem here: There is a critical mass of GOP politicians who spout content virtually indistinguishable from what's coming from white supremacists. … Put more simply: Twitter might have a white supremacist problem, but so does American politics.

      • Anonymous Coward, 27 Apr 2019 @ 1:26am

        Re: Re:

        So even if one race IS more intelligent than another, we're not allowed to admit that because of hurt feelz.

    • Anonymous Coward, 26 Apr 2019 @ 8:27am

      Re:

      "This post was timely as word got out Twitter can't even distinguish white supremacists from elected GOP officials on its platform."

      Clearly we need to implement a white list for these obviously not white supremacists so that they can continue to spew their white supremacy propaganda.

  • MakeItSmaller, 26 Apr 2019 @ 7:44am

    TooBigToSucceed

    If it's too big to moderate, the solution is to make it smaller.

    • That One Guy (profile), 26 Apr 2019 @ 7:54am

      Or we could not do that

      Congrats, you just created hundreds of millions of small, non-connected (because what's allowed on one may not be allowed on others, hence you need to silo) platforms, where maybe a few dozen people can post in each one, about as useful as pre-internet social gatherings.

      Good luck finding the funding to pay for all those platforms, not to mention the hundreds of millions of moderators willing to deal with them.

      • Anonymous Anonymous Coward (profile), 26 Apr 2019 @ 8:25am

        Re: Or we could not do that

        Also known as hundreds of millions of small, non-connected echo chambers where no one hears anything they don't like, and since all they hear is in agreement with what they want, they know with an inordinate amount of certainty that they are right.

      • Anonymous Coward, 26 Apr 2019 @ 11:05am

        Re: Or we could not do that

        This is what we had in the 1990s, isn't it?

        • Anonymous Coward, 26 Apr 2019 @ 11:27am

          Re: Re: Or we could not do that

          No, it's not, because at least through the early '90s, the people who were online were predominantly computer and software people. The first Internet offering to the general public was AOL, a centralized and curated version of the Internet.

    • Glen, 26 Apr 2019 @ 7:55am

      Re: TooBigToSucceed

      I'd really love to hear how you propose to accomplish this. Kick users off? That will go over well.

    • Anonymous Coward, 26 Apr 2019 @ 7:55am

      Re: TooBigToSucceed

      So, just murder half of the population?

    • Anonymous Coward, 26 Apr 2019 @ 7:58am

      Re: TooBigToSucceed

      Err... how do you make the human race smaller?

      Just as many moderators, or more, are required if the same volume of posts is spread across many, many platforms.

    • PaulT (profile), 26 Apr 2019 @ 8:01am

      Re: TooBigToSucceed

      There's a lot of gun crime in the US, and it's not been possible to catch every crime in progress. Clearly the answer is to split each state into 100 pieces and prevent them from trading with each other to stop the spread of guns.

  • That One Guy (profile), 26 Apr 2019 @ 8:01am

    'If you REALLY tried/cared, you could make 2+2 equal 27'

    Assuming that there's some magic wand that can be waved (as Australia, the UK, and the EU have suggested in recent days -- not to mention some US politicians) suggests a world that does not exist

    As I've noted in the past, 'Everything is easy when you don't have to do it'.

    It costs the politicians nothing to claim that it can be done, because they know that they aren't the ones who will actually have to put their (impossible) demands into practice, and when the attempts fail they can simply double-down on blaming the companies for not trying hard enough, as all the while they get to preen and brag to the gullible about how they're Doing Something.

  • Anonymous Coward, 26 Apr 2019 @ 8:17am

    Do you think that any politician is actually interested in how impossible it is to moderate content, regardless of what that content is? Once the grandstanding starts, all any politician is after is the number of points they can gain. It's even worse when the real intention is to take control of the Internet, stopping the spread of 'news' about what they and their mega-rich friends are up to while ensuring that they know everything about ordinary people, regardless of which country they are in. What is happening, it seems to me, is what some wanted to happen in WWII, i.e., dominate the planet without firing a shot!!

  • Anonymous Coward, 26 Apr 2019 @ 8:57am

    Of course it's possible to moderate content. It's impossible to do a perfect job of it, but they can do a better job than they are now.

    For example, they do a pretty good job of keeping child porn off of Facebook. You can definitely find examples that slip through occasionally, but by and large, you don't encounter it regularly.

    • Anonymous Coward, 26 Apr 2019 @ 9:07am

      Re:

      Child porn is illegal, and YouTube does not allow anonymous postings, therefore posting child porn on YouTube increases your chance of being arrested. Those facts explain its rarity, rather than any great moderation effort.

    • Anonymous Coward, 26 Apr 2019 @ 9:55am

      Re:

      You managed to agree with the article. Good job.

  • This comment has been flagged by the community.
    Woody Bosk, 26 Apr 2019 @ 9:26am

    The original video forgotten, now to defend the corporation.

    1) Don't need "context": there's no fine points and no loss if taken down before seen all the way through. Facebook employees dithering over this is ridiculous. -- IF ever found to be worthwhile news, it could be restored. The premise that must be allowed "live" is just stupid. Facebook thereby created much problem they now congratulate themselves for fixing.

    2) We have only the story of highly-paid Facebook employees long after events, as doubtless coached by highly-paid PR and lawyer fiends, made to a friendly (no doubt somehow rewarded too) academic clearly intending to put out favorable PR. Not a single bit of the story can be trusted. It's just PR.

    3) Facebook may have done this clean-up at urging of NZ gov't which wanted it VERY MUCH, as proved by outlawing with ten-year sentence. So Facebook due no accolades.

    4) Again Masnick is saying that video of terrorist murders should stay up, but Alex Jones is too dangerous and should be "deplatformed" everywhere and every way. Complete contradiction: political speech within common law versus meaningless actual gore that's always been prohibited.

    • Gary (profile), 27 Apr 2019 @ 8:30am

      Re: Nonsense speech

      Complete contradiction: political speech within common law versus meaningless actual gore that's always been prohibited.

      Good morning Blue-Balls!! And thanks for your silly words of Common Law.

      Since I know you stay up late reading every post I make, let me take this opportunity to point out that you are unable to cite any free speech sites that meet your criteria for your Command Law.

      Also, your love of Corporations is obvious because you get a hardon every time a copyright case comes up and corporate censorship wins.

      So cite or suck it up. You use the word "proof" in almost every post, but you still don't seem to understand what that means either.

  • This comment has been flagged by the community.
    Woody Bosk, 26 Apr 2019 @ 9:27am

    You're saying "Nerd Harder" can make the bad idea work.

    I premise that:

    A) First sheerly mechanical: it's a bad idea for every yahoo and "terrorist" being able to quickly gain world-wide publicity. It's fundamental. You cannot fix with review, context or not, only with "prior restraint" as society has long prohibited and regulated in every prior medium.

    B) There is of course underlying and over-arching problems that moral restraints are eroded everywhere. This too cannot be fixed after the fact but requires the "prior restraint" of fairly unified societal opinion. The best way to control all excesses is to start with The Rich: keep constraints on their income (especially to limit where they profit from destroying civil society) and ability to influence legislators, else the present mess is what happens.

    Masnick has conflicting premises:

    1) That social media and more communication are inherently good, without drawbacks.

    2) Being an Extreme Libertarian (at the least) he actually assumes that there are no bad actors anywhere in economics or society. No amount of actual experience can sway him from this, even though here he directly states that there are such, because to him premise 1 is overwhelming. -- His stating that here is only to take the heat off social media.

    3) For of course his key constant: MONEY! Sex and violence bring highest audience to advertising for highest income, therefore must be no restraints on corporations.

    Clearly social media is now proven a bad idea. It required from start new law such as CDA Section 230 to create immunity that print publishers never had.

    (Again I wonder how did "Communications Decency Act" become permission to host EVERY type of previously prohibited content? Evidently was intended that the good parts be found Un-Constitutional while the corporate-empowering part was kept. At very least, that's the bad result and why it needs changed.)

    • Anonymous Coward, 26 Apr 2019 @ 9:56am

      Re: You're saying "Nerd Harder" can make the bad idea work.

      Clearly social media is now proven a bad idea.

      You were a bad idea. Your family should be banned from reproducing.

    • Stephen T. Stone (profile), 26 Apr 2019 @ 10:25am

      Engaging in some form of prior restraint would ruin the usability of a service such as Twitter. No one would bother posting anything on Twitter if it had to be reviewed before publication, even if the post is something innocuous and otherwise unremarkable like “going to see [movie title]”. Implementing prior restraint of any kind would also turn the platform into an honest-to-God arbiter of speech like you claim they already are; most tweets are punished after being posted, but prior restraint would punish them before they are posted. And while “moral restraints” may be “eroded” (which is not a universal fact that applies to all peoples), prior restraint of speech deemed “unpopular” or “immoral” would do us no favors — after all, defending a gay person’s right to exist and participate in society is still “unpopular” and “immoral” among a not-zero number of people.

      That social media and more communication are inherently good, without drawbacks.

      The fundamental premise of social interaction networks, regardless of form, is neither good nor evil in general terms. SINs can be used to spread positivity or negativity; they can be designed for optimal user experiences, capitalist bullshit, or anything in between. The key is how people use/design a SIN — which makes people, not the SIN or its underlying technology, the real “problem”.

      47 U.S.C. § 230 was implemented because lawmakers foresaw how people could, would, and eventually did find ways to abuse Internet communications. Publishers did not need (and still do not need) 230 protections because they did not offer services such as Twitter or Facebook. They publish words after holding them for “prior restraint” (i.e., fact-checking and editing), and they do not open up their publications in a way where anyone could contribute anything. SINs act as a form of “instant” communication; to engage in prior restraint with thousands — millions! — of posts every day is both an exercise in futility and a surefire way to piss off large swaths of the userbase at large.

      Oh, and 230 applies to corporations, but it also applies to regular jackoffs like me and you. If you host an Internet forum and a third party posts a death threat against a celebrity or politician, 230 gives you the right to delete that threat, ban the user, and report them to the authorities without fear of legal liability for both the threat and the moderation thereof. That corporations make more use of 230 because they are bigger targets with bigger legal teams and bigger bank accounts is irrelevant.

      230 does not need changing and SINs do not need to implement prior restraint. If you can explain exactly why 230 is an overall bad law that needs repealing and how a SIN could implement prior restraint without destroying itself in the process, you would be the first.

      • icon
        That One Guy (profile), 26 Apr 2019 @ 12:42pm

        Re:

        What's particularly funny about them in particular supporting the idea of pre-screening content is that they've regularly lost their minds when their comments get caught in the spam filter and/or flagged by the community. It makes clear what hypocrites they are, demanding that everyone else deal with what they themselves aren't willing to accept.

      Cdaragorn (profile), 26 Apr 2019 @ 10:28am

      Re: You're saying "Nerd Harder" can make the bad idea work.

      So you've based your entire argument on three premises that are easily proven false by this very article, to say nothing of the wealth of writing Masnick has published on this site. There's literally no point in even trying to talk with you about any conclusion you've made based on those.

        Anonymous Coward, 26 Apr 2019 @ 1:38pm

        Re: Re: You're saying "Nerd Harder" can make the bad idea work.

        "There's literally no point in even trying to talk with you about any" thing.

      Mike Masnick (profile), 27 Apr 2019 @ 12:26am

      Re: You're saying "Nerd Harder" can make the bad idea work.

      Masnick has conflicting premises:

      Oh boy, oh boy, oh boy. Can't wait to find out what these are.

      1) That social media and more communication are inherently good, without drawbacks.

      Have never argued this, nor do I believe it. I believe -- as I've said many times -- that social media has many drawbacks, but also many benefits. There are significant tradeoffs.

      2) Being an Extreme Libertarian (at the least) he actually assumes that there are no bad actors anywhere in economics or society.

      Wait. Don't you keep calling me a leftist? Now you're saying I'm an extreme libertarian? Which is it?

      And, uh, no, I believe that there are many bad actors, and that's everywhere: on social media, in economics and society.

      3) For of course his key constant: MONEY! Sex and violence bring highest audience to advertising for highest income, therefore must be no restraints on corporations.

      Man. Strike three. You didn't get a single one right. I think that money as the sole driving force of corporate success is a huge problem (and at some point I was planning a post on the evils of "fiduciary duty to investors"). And I've never said there should be no restraints on corporations that harm the public or competition -- though I'm wary of restraints that lead to worse outcomes.

      You misrepresent nearly everything I actually believe in. Maybe stop responding to the strawman version of me in your head and deal with what I actually believe.

    Bamboo Harvester (profile), 26 Apr 2019 @ 9:50am

    Intent?

    Ridiculous.

    Ban posting of videos showing actual killings and there's no "ethical tangle".

    And the whole "some may have posted it in protest over the killing" argument is ludicrous as well. Chop the audio and any "commentary" from the vid and it's the same damned thing - someone dying.

    Doesn't matter if it's a black guy being shot by a cop or a white child being raped to death by black muslims.

    It's a killing. "Intent" of the poster has zero bearing.

      Anonymous Coward, 26 Apr 2019 @ 9:59am

      Re: Intent?

      That's a rather black and white view of a very gray world. No pun intended.

      Cdaragorn (profile), 26 Apr 2019 @ 10:34am

      Re: Intent?

      So it's never of any value whatsoever to see proof of someone committing a terrible act. That's an incredibly stupid and narrow-minded view of anything, but especially of speech (which video of something most definitely is).

      The fact that you object to seeing it does not magically negate the value others see in having it visible. In fact, you haven't even bothered to explain why you think it's wrong for it to be there. You seem to have jumped to the conclusion that everyone should "obviously" agree with you.

      Your assumption was not correct.

        Anonymous Coward, 26 Apr 2019 @ 1:41pm

        Re: Re: Intent?

        I'm having a hard time applying that to only one video.

        You're defending both videos remaining up, right?

        Bamboo Harvester (profile), 26 Apr 2019 @ 2:12pm

        Re: Re: Intent?

        Your words, not mine.

        But if "intent" is to be the basis, if the shooter streamed it to his imam and the imam posted it to Facebook "in protest" because "not all muslims are like this", it'd be ok by you?

        Helluva way for a family to be notified their kid or parent was killed - live streamed on Facebook...

          Anonymous Coward, 27 Apr 2019 @ 7:34am

          Re: Re: Re: Intent?

          "Helluva way for a family to be notified their kid or parent was killed - live streamed on Facebook..."

          How come I never have to see this stuff I don't want to see?

          Pretty sure if I wanted to, I could find it.

          Wendy Cockcroft (profile), 29 Apr 2019 @ 5:19am

          Re: Re: Re: Intent?

          Agreed, but not all videos displaying violence are displaying real-world violence. If someone shares a clip from a film they found on YouTube, what then?

        Bamboo Harvester (profile), 27 Apr 2019 @ 8:14am

        Re: Re: Intent?

        Nice diversion.

        Facebook considered "ethical reasons" and "intent of posting" as part of their "should we delete it" policy.

        So if I post a video of a person being shot in the head without comment or audio, is that ok?

        If I leave the audio on and the victim is screaming racial epithets at the shooter, is that ok?

        If I leave the audio on and the shooter is screaming racial epithets at the victim, is that ok?

        Oh, sorry. YOU don't get to make that decision. Facebook makes it FOR you.

        And now whines that "it's so hard to determine reason or intent...."

      That One Guy (profile), 26 Apr 2019 @ 2:26pm

      Re: Intent?

      A few people on YT might have a bit of a problem with that.

      On the other hand, the people whose actions they were covering would probably be thrilled with your idea.

      Anonymous Coward, 28 Apr 2019 @ 5:37pm

      Re: Intent!

      Hey bro have you ever seen a historical war documentary?

      Yes?

      Congrats you just played yourself.

      No?

      You’re a liar.

    Anonymous Coward, 26 Apr 2019 @ 12:46pm

    Ahmed decided to try a new kind of hash technology that took a fingerprint from a vector of the video—its audio—which was likely to remain the same across different versions. This technique, combined with others, worked: in the first twenty-four hours, one and a half million copies of the video were removed from the site, with 1.2 million of those removed at the point of upload.

    Oh look, exactly what I suggested to do the morning after.

    Anyway, yeah, it's impossible to automatically and proactively prevent content like this from being streamed, but audible gunshots and screaming must be worth a yellow flag. As long as Facebook (and other social media giants) demonstrate some of their most basic countermeasures to the public, they will probably get less flak about this.
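The audio-based matching the quoted passage describes can be sketched in a few lines. This is a toy illustration, not Facebook's actual system; the banded spectral-bit scheme and every name in it are assumptions chosen for the example:

```python
import numpy as np

def audio_fingerprint(samples, frame=1024, bands=16):
    """Toy perceptual hash: one bit per band per frame, recording whether
    that band's energy rose or fell relative to the previous frame."""
    bits, prev = [], None
    for start in range(0, len(samples) - frame, frame):
        spectrum = np.abs(np.fft.rfft(samples[start:start + frame]))
        # Collapse the spectrum into a few broad energy bands.
        energy = [band.sum() for band in np.array_split(spectrum, bands)]
        if prev is not None:
            bits.extend(int(e > p) for e, p in zip(energy, prev))
        prev = energy
    return np.array(bits, dtype=np.uint8)

def similarity(a, b):
    """Fraction of matching fingerprint bits (1.0 = identical)."""
    n = min(len(a), len(b))
    return float((a[:n] == b[:n]).mean())

# A re-encoded copy keeps essentially the same soundtrack, so its
# fingerprint stays close even when pixel-level hashes diverge.
rng = np.random.default_rng(0)
original = rng.standard_normal(48000)                      # stand-in audio
reencoded = original + 0.01 * rng.standard_normal(48000)   # mild distortion
unrelated = rng.standard_normal(48000)

print(similarity(audio_fingerprint(original), audio_fingerprint(reencoded)))
print(similarity(audio_fingerprint(original), audio_fingerprint(unrelated)))
```

A production system would need a far more robust fingerprint (tolerant of time stretching, indexed for fast nearest-neighbor lookup across millions of uploads), but the core idea is the same: hash the soundtrack rather than the pixels, because the soundtrack survives cropping and re-encoding.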

      That One Guy (profile), 26 Apr 2019 @ 1:14pm

      Re:

      it's impossible to automatically and proactively prevent content like this from being streamed, but audible gunshots and screaming must be worth a yellow flag.

      Have fun wading through the piles of video-game, TV, and movie footage that filter would catch to find the 'real' stuff.

        Stephen T. Stone (profile), 26 Apr 2019 @ 2:35pm

        Videos of Mortal Kombat 11, for example, have been hit with “age restriction” markers and demonetized because the close-up scenes of violence manage to trip YouTube’s systems.

      Anonymous Coward, 26 Apr 2019 @ 1:29pm

      Re:

      audible gunshots and screaming must be worth a yellow flag.

      Fictional dramas excepted. Also, a rule like that would lead to a lot of trolling just to make the moderators work overtime.

  • This comment has been flagged by the community.
    Anonymous Coward, 26 Apr 2019 @ 6:00pm

