Moral Panics: Don't Blame Facebook Because Some Guy Posted His Murder Video There

from the get-a-grip dept

As you've probably heard by now, on Sunday a horrific act of violence happened when a clearly disturbed individual apparently decided to (1) randomly murder an elderly man walking down the street, (2) film the entire process from searching for the guy, approaching him, talking to him and then shooting him, and (3) upload it to Facebook for people to see. The police initially reported that he streamed the murder live, but it was later clarified that, while he had streamed some other commentary live earlier in the day, the murder was filmed separately and then uploaded. Still, as happens all too often in these situations, people are immediately jumping to the moral panic stage and asking, as Wired did quickly after, what kind of responsibility Facebook should take. The title of the article says that Facebook "must now face itself" for streaming the murder -- but then seems to have trouble explaining just what it needs to face (perhaps because... there isn't anything for it to face).

And when the manhunt is over, and the grieving begins, so too will Facebook’s soul-searching.

Facebook is not the first media company to struggle with the prospect of unwittingly broadcasting violence. When news anchor Christine Chubbuck killed herself on live TV in 1974, the station was unable to stop the event from airing, but never showed the footage again. The number of viewers who actually saw the event was minimal. Facebook has taken similar steps, pulling Stephens’ video shortly after it was posted. “This is a horrific crime and we do not allow this kind of content on Facebook,” the company said in a statement. “We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety.”

Uh, right. So... what else does anyone expect Facebook to do? It's not like it can magically stop murders. Or stop people from initially uploading or streaming a murder video. Yes, it can (and does) take those down, and it can (and does) block re-uploading. But to pin this on Facebook seems... really, really weird. It's almost as if whenever there's a murder people want to find someone or something else to blame other than the person doing the killing.

The article kind of admits, later on, that expecting Facebook to do anything is impossible... but that just raises the question of why write a whole article asking what Facebook should do if the answer is "uh, it can't and shouldn't do anything."

Facebook, of course, is a decentralized system, with millions of freelance “reporters” with unfettered access to the public. By the time the company removed the video, thousands had already watched it, and it lives on in other corners of the internet. Meanwhile, the company has resisted calls to use its algorithms to censor videos like this before they are ever posted–not just because it does not want to be accused of violating speech rights, but also because training computers to identify real-time or recent murder is hard. Facebook has long relied on an army of humans to scour videos uploaded to its site. With videos, and especially Live videos, that job goes from hard to impossible—not even Facebook employees can watch a video before it posts.

Currently, Facebook relies on other Facebook users to flag videos that need to be taken down. But that means that someone has to watch the horror before others can be spared it. The onus falls to the viewers, not the company, to determine what is appropriate, what should be shared, and what should be flagged for removal. Traditional media companies have finely-wrought guidelines and policies to help them make these decisions, but Facebook depends on us to do it.

But even after basically admitting that this is an impossibility, the article still then says:

And now it might very well be time for the company to roll up its own sleeves and get to work.

And get to work doing what, exactly? Again, Facebook isn't going to stop a murder. And I don't care how good the AI gets, it's unlikely any time soon to say "hey, that video is some person killing another person, don't stream that." There is no sleeve rolling to do on the Facebook side of the equation, and even exploring this question seems silly. Yes, senseless murders and violence lead people to go searching for answers, but sometimes there are no answers. And demanding answers from a tool that was only peripherally connected to the senseless violence doesn't seem helpful at all.

Filed Under: facebook live, murder, platforms, responsibility, video
Companies: facebook

Reader Comments


    Anonymous Coward, 17 Apr 2017 @ 5:10pm


    You say that as if it's a point against Facebook, but I don't see how it is.

    Facebook has a clear policy against female nipples and they enforce it to the best of their (admittedly awful and inconsistent) abilities.

    However, a nipple is a simple binary decision. It either exists in the video or it doesn't. At first glance, a random murder might seem similar, but we see staged murders in major movies all the time. People who make vine videos can have fake murders. Facebook would equally ban a fake nipple, sure. How much flak would they get if a well-known director had a sneak peek of their new movie deleted and banned simply because someone died in it? How about an unknown director?

    Even if you think they should be able to figure out pretty quick what is what, can you not see the potential for issues? Even if you don't get it, this definitely blows your "it'd be gone before 5 views" argument out of the water. I don't agree with Facebook's blanket ban on female nipples, but it's far easier to confirm those than it is to confirm a "bad" murder over fiction. Some films these days are even being made to look like selfie-cam footage, and you would remove the ability of a filmmaker to release promos for them if they involved violence.

    If you want Facebook to straight up ban all depictions of violence, fine, but at least argue for that. Don't pretend it's easy to quickly ban only what you don't like (and in this case what we all agree is awful) while still supporting the First Amendment.
