Three Lessons In Content Moderation From New Zealand And Other High-Profile Tragedies

from the no-magic-wands-available dept

Following the terrorist attacks on two mosques in Christchurch, New Zealand, social media companies and internet platforms have faced renewed scrutiny and criticism for how they police the sharing of content. Much of that criticism has been directed at Facebook and YouTube, both platforms where video of the shooter’s rampage found a home in the hours after the attacks. The footage was filmed with a body camera and depicts the perpetrator’s attacks over 17 minutes. The video first appeared on Facebook Live, the social network’s real-time video streaming service. From there, Facebook says, it was uploaded to a file-sharing site, a link was posted to 8chan, and the video began to spread.

While the world struggles to make sense of these horrific terrorist attacks, details about how tech companies handled the shooter’s video footage and written manifesto have been shared, often by the companies themselves. Taken together, these details, along with the public discourse on and reaction to what the New York Times called “a mass murder of, and for, the internet,” make clear three fundamental facts about content moderation, especially when it comes to live and viral content:

1. Automated Content Analysis is Not a Magic Wand

If you remember nothing else about content moderation, remember this: There is no magic wand. There is no magic wand that can be waved to instantly remove all of the terrorist propaganda, hate speech, and graphically violent or otherwise objectionable content. There are some things that automation and machine learning are really good at: functioning within a narrow, well-defined environment (rather than at massive scale) and identifying repeat occurrences of the exact same (completely unaltered) content, for example. And there are some things they are really bad at: interpreting nuance, understanding slang, and minimizing discrimination and social bias, among many others. But perfect enforcement of a complex rule against a dynamic body of content is not something that automated tools can achieve. For example, the simple change of adding a watermark was enough to defeat automated tools aimed at removing video of the New Zealand shooter.
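To make that fragility concrete: the simplest matching systems compare a fingerprint of each upload against fingerprints of known-bad files, and any edit, however trivial, produces a different fingerprint. A minimal sketch in Python (illustrative only; the stand-in byte strings are hypothetical, not any platform’s actual pipeline):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint: any change to the input changes the digest."""
    return hashlib.sha256(data).hexdigest()

original = b"...stand-in for the original video bytes..."
watermarked = original + b"\x01"  # stand-in for a copy altered by a watermark

print(fingerprint(original))     # one digest
print(fingerprint(watermarked))  # a completely unrelated digest
# A blocklist keyed on exact digests will not flag the altered copy.
```

This is why platforms layer fuzzier “perceptual” matching on top of exact hashing, and why even that can be defeated, as the examples under the second lesson below show.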

Some have therefore suggested banning all live video. However, that overlooks activists’ use of live streams to hold governments accountable and report on corruption as it happens, among other uses. Further, the challenges of automated content analysis are by no means limited to video. As a leaked email from Google to its content moderators reportedly warned: “The manifesto will be particularly challenging to enforce against given the length of the document and that you may see various segments of various lengths within the content you are reviewing.”

All of this is to reiterate: There is no magic wand, and there never will be. There is absolutely a role for automated content analysis when it comes to keeping certain content off the web. Use of PhotoDNA and similar systems, for example, has reportedly been effective at ensuring that child pornography stays off platforms. However, the nuance, news value, and intricacies of most speech should give pause to those calling for mass implementation of automated content removal and filtering.
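PhotoDNA itself is proprietary, but the general pattern it enables, checking each upload’s signature against a shared list of signatures of known illegal images, can be sketched as a simple set-membership test. Everything below (the sample bytes, the SHA-256 stand-in for a real perceptual signature) is a hypothetical simplification:

```python
import hashlib

# Hypothetical shared blocklist. Real deployments distribute perceptual
# signatures through industry hash-sharing programs, not SHA-256 digests.
known_bad = {hashlib.sha256(b"stand-in for a known illegal image").hexdigest()}

def should_block(upload: bytes) -> bool:
    """Flag an upload whose fingerprint appears on the shared list."""
    return hashlib.sha256(upload).hexdigest() in known_bad

print(should_block(b"stand-in for a known illegal image"))  # True: exact re-upload caught
print(should_block(b"anything else"))                       # False: everything else passes
```

The approach works well for this category precisely because the offending images recirculate largely unaltered and have no legitimate context, conditions that do not hold for most speech.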

2. The Scale, Speed, and Iterative Nature of Online Content, Particularly in This Case, is Enormous

It is a long-standing fact of the internet that it enables communication on a vast scale. Reports from YouTube and Facebook about the New Zealand attack seem to indicate that this particular incident was unprecedented in its volume, speed, and variety. Both of these companies have dedicated content moderation staff, and it would be easy to fall into the trap of thinking that this staff could handily keep up with what seems to be multiple copies of a single live video. But that overlooks a couple of realities:

  • The videos are not carbon copies of each other. Any number of changes can make identifying variations of a piece of content difficult. The iterations could include different audio, animation overlays, cropping, color filters, overlaid text and/or watermarks, and the addition of commentary (as in news reporting). Facebook alone reported 800 “visually distinct” videos. (The sketch after this list shows why such edits defeat even tolerance-based matching.)
  • There is other content (the normal, run-of-the-mill stuff) that continues to be posted and needs to be addressed by the same staff that is now also scrambling to keep up with the 17 copies of the video being uploaded every second to that single platform (Facebook in this case; YouTube’s numbers were somewhat lower, but still reached one video upload every second, culminating in hundreds of thousands of copies).
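To see why “visually distinct” copies are so hard to catch, consider perceptual hashing, the family of techniques platforms use to match near-duplicate images and video frames. A toy version in Python (an 8x8 grayscale “average hash”; the grids and thresholds are invented for illustration, and production systems are far more sophisticated):

```python
# Toy average hash: fingerprint an 8x8 grayscale grid by whether each pixel
# is above the mean brightness, then compare fingerprints by Hamming distance.

def average_hash(pixels):  # pixels: 8x8 grid of 0-255 ints
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

frame = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]  # original frame
filtered = [[min(255, p + 10) for p in row] for row in frame]    # mild color shift
overlaid = [[255 if r < 2 else p for p in row]
            for r, row in enumerate(frame)]                      # banner/text overlay

print(hamming(average_hash(frame), average_hash(filtered)))  # 0: still matches
print(hamming(average_hash(frame), average_hash(overlaid)))  # 30 of 64 bits: no match
```

A mild filter leaves the fingerprint intact, but a banner overlay, one of the exact edits reported here, pushes the distance past any workable threshold. Multiply that by 800 variants and the moderation problem stops looking like simple deduplication.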

It’s worth noting here that not a single person reported the live video stream to Facebook for review. The video was reportedly viewed “fewer than 200 times” while it was live, but the first report came 12 minutes after the stream ended, a full 29 minutes after the broadcast began. That’s a lot of time for a video to be shared and reposted by people motivated to ensure it spread widely, not only on Facebook, but on other sites as well.

In addition to demonstrating the challenges of automated content review, the New Zealand attacks exposed weaknesses in the companies’ own systems, particularly when dealing with emergencies at scale. YouTube, for example, was so overwhelmed by the flood of videos that it opted to circumvent its standard human review process to hasten their removal. Facebook, too, struggled. The company has a process for handling particularly sensitive content, such as an individual threatening to commit suicide; however, that process wasn’t designed to address a live-streamed mass shooting and likely could not easily be adapted to this emergency.

3. We Need Much Greater Transparency

As non-governmental and civil society organizations have hammered home for years, there needs to be more transparency from tech companies about the policies, processes, and practices that impact user rights. One of the more promising developments from 2018 in this space was the release of reports by YouTube, Twitter, and Facebook providing a quick peek under the hood of their content enforcement. While the first year’s reports still leave a long way to go, the reaction to their publication shows a hunger for and deep interest in further information from tech companies about their handling of content.

Among companies’ next steps should be transparency around specific major incidents, including the New Zealand attacks. Social media platforms are still reeling from over a week of whack-a-mole with a heavy side of criticism. But once they are able to identify trends or data points across the incident, those should be shared publicly and contextualized appropriately. For example, how did Facebook identify and handle the 800 distinct versions of the video? Did those include uses of the video in news reporting? How was the Global Internet Forum to Counter Terrorism (an entity formed to share information on images and videos between companies) engaged?

One challenge for companies when providing transparency into their policies and practices is doing so without providing a roadmap to those looking to circumvent the platforms’ systems. However, extant transparency reporting practices (around, for example, government requests for user data) suggest companies have found a balance between transparency and security, tweaking their reports over time and contextualizing the data within their larger efforts.

What’s Next?

There are no quick fixes. There are no magic wands. We will continue to debate and discuss and argue about whether the tech companies “did the right thing” as they responded to the New Zealand shooter’s video, his manifesto, and public reaction to both. But as we do so, we need transparency and insight into how those companies have responded, and we need a shared understanding of the tools and realities of the problem.

As details have emerged about the attacks in New Zealand and how they played out on social media, much of the analysis around internet companies’ handling of user content has fallen into one of two buckets:

  • Tech companies aren’t doing enough and refuse to use their enormous money/power/tools/resources to address the problem; or
  • The problem is unsolvable because the volume of content is too great for platforms to handle effectively.

The problem with presenting the issue as this dichotomy is that it overlooks (really, completely ignores) the fact that an unknown number of viewers watched the live video but did not report it to Facebook. Perhaps some viewers were confused, and maybe others believed it was a joke. But the reality is that some people will always choose to use technology for harm. Given that fact, the question that will ultimately guide this debate and shape how we move forward is: What do we want our social media networks to be? Until we can answer that question, it will be hard, if not impossible, to address all of these challenges.

Reposted from the Center for Democracy & Technology



Comments on “Three Lessons In Content Moderation From New Zealand And Other High-Profile Tragedies”

Anonymous Coward says:

Re: Re: Re:

They are a for profit courier service that knowingly promotes their services to the extremes of society.

By that logic so do car manufacturers, gun manufacturers, knife manufacturers, fertilizer manufacturers, chemical supply companies (wherein you can buy enough nuclear material to make a bomb without ever flagging the authorities), etc…, etc…

They knowingly promote their services to the masses, aware that some people will use them for purposes they did not intend. This is the same for ANY company. You cannot blame one company for this without also blaming them all.

Whether you choose to blame them all or just this one, what do you suggest they do about it? Stop all marketing of their services altogether and hope somebody finds them and uses them anyway? Public marketing is just that, public. You can’t market to the public without also reaching people who intend to misuse the product.

Anonymous Coward says:

Re: Re: Re:2 Re:

None of which has anything to do with how they promote their products, which was your original point.

Please show me how you can beat someone over the head with a social media post and directly cause physical injury or death and then we can talk.

they have laws that apply to them

And there are also laws that apply to social media companies and other internet companies. If you think otherwise I suggest you go read a law book. They also all have rules banning this type of content from their platforms, so your argument fails by default.

Anonymous Coward says:

Re: Re: Re:3 Re:

  1. If that is the case then your original counter argument is also false. Can’t have it both ways.
  2. You are assuming that the only valid injury is physical.
  3. You are also assuming that inciting someone else to violence is OK if it’s done over the internet. It is never OK.
  4. "go read a law book". If you practiced what you preach you would not be calling it "a law book".
  5. Banning certain users without policing your own policy is no excuse. There is no loss of argument by default.
  6. Acting contrary to your written rules is not a shield against scrutiny or accountability.
Stephen T. Stone (profile) says:

Re: Re: Re:4

You are assuming that the only valid injury is physical.

No, we are assuming that liability for injuries caused by other people should be placed on those people instead of on the tools they may have used to cause those injuries.

You are also assuming that inciting someone else to violence is OK if it’s done over the internet.

[asserts facts not in evidence]

(Wait shit that’s someone else’s schtick…)

Anonymous Coward says:

Re: Re: Re:5 Re:

" No, we are assuming that liability for injuries caused by other people should be placed on those people instead of on the tools they may have used to cause those injuries."

Here we have the crux of the matter, you see social media as just a tool, without morality, without responsibility, without accountability. It’s about time we treated them for what they are, companies that make part of their profit from the misery of others. They need to be regulated.

Stephen T. Stone (profile) says:

Re: Re: Re:6

you see social media as just a tool, without morality, without responsibility, without accountability

No, I do not see social media as a tool without morality, responsibility, or accountability. I see social media as a tool that can be used for good or evil — just like any other tool — and the creators and maintainers of that tool as those who think they are above morality, responsibility, and accountability. Yes, social media does need to be held responsible for its failings; people being bastards regardless of the existence of social media is not one of those failings.

It’s about time we treated them for what they are, companies that make part of their profit from the misery of others.

So…we need to treat them like gun manufacturers?

Anonymous Coward says:

Re: Re: Re:7 Re:

So does that mean that you find the funeral industry completely unacceptable because they profit from human misery? Under that logic hiring someone else to dig a grave should be illegal because it means someone will profit from human misery!

Well, unless they are purposely burying people alive, the funeral industry is not the one causing the human misery its "tools" (coffins, cremation, etc.) deal with. Unlike gun manufacturers’ guns, some of which are very definitely used to cause human misery.

Anonymous Coward says:

Re: Re: Re:8 Re:

So, does that mean you find the tool industry completely unacceptable because they regularly profit from human misery, since their tools are routinely used to bash in someone’s head with a hammer, cut off a person’s head with a hacksaw, steal cars with a crowbar, break into homes with a screwdriver, set fire to a house with a portable torch, and other acts with tools that cause human misery on a daily basis?

Anonymous Coward says:

Re: Re: Re:6 Re:

"you see social media as just a tool, without morality, without responsibility, without accountability."

Wait a sec …. my tools are supposed to have morals?
How are my tools supposed to be responsible? … wtf?
Accountability?

It sounds as though you want the companies that make tools to exhibit all these traits … right? Because certainly you realize that an inanimate object is incapable of having human traits.

"It’s about time we treated them for what they are, companies that make part of their profit from"

  • And apparently you agree with me.
Scary Devil Monastery (profile) says:

Re: Re: Re:6 Re:

"Here we have the crux of the matter, you see social media as just a tool, without morality, without responsibility, without accountability."

Because that’s what they are, and have been ever since the first upright-walking ape chose to bear a message for another.

"It’s about time we treated them for what they are, companies that make part of their profit from the misery of others."

Nope. It’s time that YOU go back to school and learn a bit about the role of communications and messenger services in a democratic society.

Or perhaps move to North Korea, because not even China and Russia will sign on to the bullshit you keep spouting.

Anonymous Coward says:

Re: Re: Re:4 Re:

Oh this is going to be fun.

If that is the case then your original counter argument is also false. Can’t have it both ways.

Really now. So you contend that I can take a social media post and physically hit someone over the head with it? You also apparently contend that social media is being marketed as "completely safe and will never cause harm or offense to anyone". Interesting takes, interesting takes.

Well I’d say my original counter argument is still intact then since none of those things are actually part of reality. Social media markets their platforms as ways to communicate and stay in touch with people all over the world. What part of that is false advertising?

You are assuming that the only valid injury is physical.

Because you can’t legislate speech, which can only directly cause mental and emotional harm. I can’t take words I speak, write, or type and directly physically harm someone with them. The First Amendment protects pretty much everything I say, with some EXTREMELY narrow exceptions. There is no law against saying mean, hurtful things.

You are also assuming that inciting someone else to violence is OK if it’s done over the internet. It is never OK.

No, I’m not, nor did I ever state that. But that is completely irrelevant in this topic of conversation. Social media platforms do not actively incite someone to violence over the internet or otherwise. People posting TO social media do that. This is no different than a moron standing on a street corner with a megaphone inciting people to violence. You don’t remove the street corner to stop the moron.

"go read a law book" . If you practiced what you preach you would not be calling it "a law book".

Do tell. What should I call it then? A textbook of law? A law textbook? A book containing the entirety of US law? A book of law? What part of that doesn’t make sense to you? You’re grasping at straws if that’s all you got. The fact of the matter is the law says you’re wrong, and if you read any book about US law, it would tell you the same thing. The specific words I use to describe said books are irrelevant.

Banning certain users without policing your own policy is no excuse.

I’m sorry, I think you are confused. The users get banned for violating the policies. That is, by definition, policing your own policy. Now, that is not to say that some social media companies haven’t at times enforced their policies in a confusing manner, but they HAVE enforced them. Saying otherwise is blatantly ignorant.

There is no loss of argument by default.

You can deny it all you want but your argument floats about as well as a lead colander. When your argument is predicated on false statements and logical fallacies, you fail by default. And all of your statements are easily checked with independently verifiable data, facts, and sources that all say: you’re wrong.

Acting contrary to your written rules is not a shield against scrutiny or accountability.

Please be more clear, there are two ways to interpret what you’ve written here. I’m going to assume that you mean users acting contrary to a social media platform’s rules does not shield said platform from scrutiny or accountability, because the only other way to interpret that sentence is that the platform is acting contrary to its own rules, and that just doesn’t make any sense at all.

So going with the assumption that we’re talking about users on a platform, actually it does as far as the actions of those users are concerned. If the platform itself does something to violate the law, then no, it’s not shielded. But if the users do something, then yes, the law states you can’t hold the platform liable. For the same reason you can’t hold the owner of a house liable if two of his guests get into an argument and one assaults or kills the other. Again, I suggest reading the law, it’s QUITE clear on this point.

Anonymous Coward says:

Re: Re: Re:

The extremes of society need interaction with normal views in society to de-radicalize them. Extreme people are grown under isolation, detached from normal views. The crusade to isolate everyone into ever more isolated pockets is sealing away time-capsules set to spring forward with ever more extreme views at some time in the future.

Anonymous Coward says:

Filtering stuff like this means keeping people in the dark. If we feel it is bad speech and bad actions, then perhaps the governments and people who have a problem with information such as this being made available on the internet should form counter-speech squads.

If you are not exactly likely to fall under the influence of your enemy’s propaganda, it is far more likely to reinforce your opinion against them or galvanize you to action. And when that happens, violent fringe groups or potentially sympathetic people get to see just how many people are not down with their shit. Then they can really whine about how they are discriminated against.

Anonymous Coward says:

Re: Re:

It appears you are arguing the video should remain online. Do you agree that there is a limit? If so, what is it?

I fail to see the validity of an argument for not removing some content that is just wrong. Now, define what that is … I know it when I see it. That doesn’t quite cut it, huh. Can’t please all the people all the time – blah blah.

What about kicking puppies? Should that type of video remain online? Why?

Scary Devil Monastery (profile) says:

Re: Re: Re: Re:

…and that’s the main issue with modern society. When we see evidence about heinous wrong-doing our first response isn’t "Holy cr*p, someone should do something about that vile scumbag!".

It’s become "Holy cr*p, someone needs to take this down stat, so I don’t have to know this shit is happening!".

Baghdad Bob/Bobmail/Blue, needless to say from his prior argumentation, is a staunch supporter of the second view.

GetOfMyLawn says:

Too Big, Should Fail!

This… Last bucket, "The problem is unsolvable because the volume of content is too great for platforms to handle effectively."

Use the FCC model prior to Michael Powell, local content rules, break ownership up if they cross regions or have more properties in said region. For Facebook it would mean not owning Instagram, not having news feeds that go across state lines, content derived locally. National content comes from approved trusted sources.

Reality – not gonna happen, but sure would be a boon to local news services, mom and pop shops and the main street you see evaporating.

Anonymous Coward says:

Re: Old man is irrelevant

There is no such thing as "too big". Define "too big".

Use the FCC model prior to Michael Powell, local content rules, break ownership up if they cross regions or have more properties in said region.

And how do you propose to break up Facebook’s core service? Or Reddit? Or YouTube?

For Facebook it would mean not owning Instagram

Fair enough. Though I disagree.

not having news feeds that go across state lines

Pffftt. HAHAHAHAHAHAHA!!! So tell me genius, I live in State A, the rest of my family lives in State B. Do pray tell how I would get their status updates in my news feed in such a scenario. Come back when you actually know how technology works.

content derived locally

What if I don’t want to see my local content? What if my local content sucks? It is an infringement of my rights for the government to define what legal content I am and am not allowed to see in my news feeds.

National content comes from approved trusted sources.

You do realize that Facebook doesn’t actually publish any news, right? They just aggregate it from other sources. So, mission accomplished?

Reality – not gonna happen

Yes but likely not for the reasons you are thinking of.

sure would be a boon to local news services, mom and pop shops and the main street you see evaporating.

If they can’t adapt then they should die. I have no sympathy for a legacy company that can’t adjust to new technology and new ways of doing things. You want a law that keeps things the same as they were 50 years ago. Well guess what, technology and ways of doing things 50 years ago sucked: they were inefficient, unsafe, and EXTREMELY SLOW.

So you can take your old man yelling at kids to get off his lawn schtick and stay off my internet.

Scary Devil Monastery (profile) says:

Re: Too Big, Should Fail!

"Use the FCC model prior to Micheal Powell, local content rules, break ownership up if they cross regions or have more properties in said region. For Facebook it would mean not owning instagram, not having news feeds that go across state lines, content derived locally. National content comes from approved trusted sources."

So basically roll back modern communications technology to 1960?

Anonymous Coward says:

I’d say this and the last article are just two facets of the same jewel (or, if you prefer, two flakes off the same cow pie.) "Nerd harder" to fix any human problem is always and only a recipe for failure–like the old tribal custom of killing the medicine man if the patient died, it may briefly satiate the relatives, but it does nothing to reduce the mortality rate.

schluckauf wichserin says:

A lot of snowflakes here

“High-Profile Tragedies”

LOL

High-Profile FALSE FLAG and psych ops to manipulate brain grilled zombies like the pricks who go on TV with veils and other fascist submission tools for jerks!

How can the Kiwis be so stupid?

Well, they are pretty isolated and have not a clue about what the nazi cult called “islam” is really about.

Somehow, people being as dumb as vegetables deserve what they will get…

aerinai (profile) says:

Show me an idiot-proof realworld analog

For all these people complaining that ‘bad actors’ are ‘using the internet wrong’, look at the real world and tell me any aspect of the real world that is bad actor proof.

Some Examples:

  • Vehicles — Europe has had several high-profile murders by vehicle. You don’t see people blaming Mercedes and GM.

  • Household Chemicals — Tide Pod challenges, etc. Need I say more?

  • Prescription Drugs — Some people need opioids, some people abuse them

  • Firearms — Probably the closest analog to the internet in terms of people being up in arms about ‘never let any bad thing ever happen with your product’.

  • Power Tools

I’d rather people understand the problem and just accept that, whatever we build, people will find a way to pervert it. Every weapon we have created over the course of human history is a testament to that.

Anonymous Coward says:

Re: Show me an idiot-proof realworld analog

"I’d rather people understand the problem and just accept that anything we build, people will find a way to pervert it. Every weapon we have created over the course of human history is a testament to that."

I assume by that you don’t run antivirus software either.

Scary Devil Monastery (profile) says:

Re: Re: Show me an idiot-proof realworld analog

"I assume by that you don’t run antivirus software either."

The same way he doesn’t drink because dihydrogen monoxide is toxic and doesn’t breathe because oxygen is a dangerous oxidizer?
I believe he was commenting on the phenomenon of "dual-use".

That little phrase we who live in the real world use to describe concepts such as people being able to use a hammer on the head of a nail or the head of a fellow human being with roughly equal facility.

I don’t know whether "aerinai" uses or does not use antiviral software because his arguments indicate neither.

YOUR arguments, however, indicate that you refuse to drink water, which people all too often use to murder other people with.

Anonymous Coward says:

Re: Show me an idiot-proof realworld analog

Opioid makers are settling lawsuits with states left and right.

Carmakers are not required to control their automobiles on the highway. Internet providers have the means to do this with relative ease. If too much content is being uploaded, then tax it. It costs $55 to register a copyright in the US, but why is that not free? Because the LOC would be overwhelmed, as would PACER if they eliminated that fee.

Anonymous Coward says:

Re: Re: Show me an idiot-proof realworld analog

"Carmakers are not required to control their automobiles on the highway. Internet providers have the means to do this with relative ease."

Interesting … internet providers have the capability to control vehicles on the highway. I was unaware of this development, when did this occur and how was it approved by the state … and which state was it?

Scary Devil Monastery (profile) says:

Re: Re: Show me an idiot-proof realworld analog

"Internet providers have the means to do this with relative ease."

Under the same necessary assumptions that would enable car manufacturers to control cars, yes.

So let me get this straight, Baghdad Bob – are we back to the argument you kept harping on back on Torrentfreak years ago where you thought Internet providers had a duty to monitor all of their customers and cut them off if something…offensive…showed up in what should have been confidential communication?

Scary Devil Monastery (profile) says:

Re: Re: Re:2 Show me an idiot-proof realworld analog

"Did you just describe a sociopath’s wet dream?"

Well, I was describing a paradigm old Bobmail/Baghdad Bob/Blue has been advocating persistently and at length for years.

This may be a wet dream for sociopaths but given the tone Bobmail used when he was posting on it he was close to staining his trousers while awake. Not sure what that makes him.

Anonymous Coward says:

Re: Re: Show me an idiot-proof realworld analog

Opioid makers are settling lawsuits with states left and right.

Objection your Honor! Relevance?

Carmakers are not required to control their automobiles on the highway.

Because, quite rightly, they are not responsible for what users do with their vehicles on said highway after they obtain them from the manufacturer.

Internet providers have the means to do this with relative ease.

No, they really don’t. Or have you not seen the massive number of false detections of offensive/infringing content all the detection systems generate?

Regardless of that, why should they be required to be responsible for things their users upload? Internet providers didn’t upload the content, why should they be held responsible? There is literally no difference here, except maybe the internet providers are more akin to the highway than a car. Your argument is fundamentally flawed.

If too much content is being uploaded, then tax it.

Define too much. Who gets to define what too much is? There is no such thing. To define it and subsequently tax it is censorship and a direct and flagrant violation of the First Amendment because it limits people’s speech online.

Not to mention you ignore the many other use cases, such as families backing up their home videos to cloud storage systems like OneDrive and DropBox. People already pay for their internet access and those storage services. You really want to charge them A THIRD TIME for the amount they upload per minute? Get out.

It costs $55 to register a copyright in the US, but why is that not free? Because the LOC would be overwhelmed, as would PACER if they eliminated that fee.

Objection your Honor! Relevance! These are completely unrelated!

Anonymous Coward says:

Re: Re: Re: Show me an idiot-proof realworld analog

"Opioid makers are settling lawsuits with states left and right.
Objection your Honor! Relevance?"

Poster may have been trying for some sort of equivalence where there is none. Opioid manufacturers have been implicated in kickback schemes, you should’ve seen the story some time ago. Car manufacturers were not bribing dealers to over prescribe vehicles to drivers were they?

Anonymous Coward says:

Have to call out New Zealand for their ham-fisted response that completely embraced the shooter’s stated objectives, and carried out those wishes to a T. The shooter’s manifesto clearly lays out his intent to use the event to force an over-sized response by the State to curtail the rights of good people. More people might be aware of this, but it’s apparently illegal for those in New Zealand to read the document for themselves and recognize that the government picked up the ball and ran forward with all the things the shooter hoped to accomplish.

Anonymous Coward says:

The authorities in New Zealand also ran up against the impossible task of finding anyone inside Facebook willing to talk or able to take down the stream. These companies don’t want to spend money dealing with complaints, or the public. We shouldn’t be too worried if they have to spend more money to deal with complaints.

Anonymous Coward says:

Re: Re:

the impossible task of finding anyone inside Facebook willing to talk or able to take down the stream

Are you talking about the live stream? Facebook didn’t even know about it until after the live stream ended BECAUSE NO ONE REPORTED IT TO THEM.

If you’re talking about the copies that spread after it ended, this is blatantly false, as evidenced by the fact that Facebook engaged for days in an endless game of whack-a-mole, removing it wherever they found it on their site. Same goes for YouTube and some of the other sites it was uploaded to.

You might want to check your facts before you go off like that.

Anonymous Coward says:

Re: Re: Re:

So you approve that this snuff film is shown, that we should allow the mentally weak to be desensitised to suffering to death. What, do you think the world needs more copycats? Facebook has been working the demographics of its subscribers for years, if they were to allow this sort of message in their system, like-minded people would be barraged with this content, then the only thing the rest of us could do is bet where the next slaughter would be.

Anonymous Coward says:

Re: Re: Re: Re:

That’s a really great strawman you’ve constructed there, and lots of words you’ve shoved in my mouth. But I’m feeling a comeback coming on so let’s get to it.

So you approve that this snuff film is shown

Actually, yes, in certain scenarios, I do approve of it being shown. It is a historical record now of a terrible act that was committed. It is useful to people such as historians and people who study human behavior. Not to mention there are likely clips from the video where the gunman is speaking that give an insight into his motivation, which could help us prevent things like this from happening again by confronting those issues.

that we should allow the mentally weak to be desensitised to suffering to death

Nowhere did I say it should be left up in its entirety for the public to see on public platforms. What I did say was that you are incorrect in your statement that no one at Facebook was willing to do anything about taking it down. They did, for days! And the only reason they didn’t do it sooner was because no one reported it to them. Get your facts straight.

What, do you think the world needs more copycats?

Perhaps you should start pointing fingers at Hollywood then? There’s lots of crime TV shows and movies that show EXACTLY how to commit terrible acts of suffering to death.

Facebook has been working the demographics of its subscribers for years, if they were to allow this sort of message in their system, like-minded people would be barraged with this content

Allow me to introduce you to Facebook’s terms of service that explicitly state this type of content is NOT ALLOWED. Let me also point you to the example in this very case where Facebook spent DAYS trying to scrub their platform of this video.

Please check your strawman and rejection of reality at the door.

Anonymous Coward says:

A few weeks ago I was listening to a podcast: Tomorrow with Joshua Topolsky, Episode 152. The hosts were discussing the conditions in which Facebook’s moderators work. They get paid garbage wages for the work they do, which psychologically scars them and gets them to start believing in conspiracy theories and other nasty things after looking at so much of it for so long. And this is only to stop a small portion of that nasty content from existing on the platform as there’s no way for them to get it all.

One of the hosts of the podcast, Ryan Houlihan, posited the idea that we should actually be questioning the concept of a site where users upload as much as they want of anything that they want and they’ll then use underpaid workers and ineffective algorithms to sort out just some of it as a healthy, valid business model that a corporation should have. Regarding YouTube, he had this to say: “YouTube has 400 hours of content uploaded a minute. Maybe that’s an impossible thing to moderate and it’s not responsible for a company to allow for 400 hours of content to be uploaded a minute?” He makes a good point.
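To put that figure in perspective, a rough back-of-envelope calculation (all assumptions mine and purely illustrative: one full human viewing per upload, standard eight-hour shifts, no automation):

```python
# Back-of-envelope: what "400 hours uploaded per minute" would mean for
# purely human review. All assumptions are illustrative simplifications.
upload_hours_per_min = 400
hours_per_day = upload_hours_per_min * 60 * 24   # 576,000 hours of new video daily
shift_hours = 8                                  # one reviewer-shift
reviewers_needed = hours_per_day / shift_hours   # 72,000 reviewer-shifts per day
print(f"{hours_per_day:,} hours/day -> {reviewers_needed:,.0f} reviewer-shifts/day")
```

That is 72,000 full shifts every day just to watch each upload once, before any judgment calls, appeals, or re-uploads, which is the scale problem being gestured at here.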

You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”

Al says:

Re: Re:

The jews are behind all this

Zuckerberg, facebook: jews, google: jews, corrupt bankers: jews, lobbies: jews again and again, soros jew, the media, the fake news outlets, and Hollywood, CNN and all the other fake news outlets: jews of course!

They are the ones behind the fake financial crisis to grab power, the false flags, the fake news, the fake terrorist attacks, the fake “migrations”, they are the ones who created isis, the wars, pushing islamic hordes and the scum of the third world onto our lands.

To rule they must divide, create chaos and permanent crisis, so they can continue to manipulate us and make us fight each other when we should fight them and their muslim allies.

Hitler didn’t want to get rid of them for nothing.

That One Guy (profile) says:

Re: 'Are you with a label/studio/publisher? No? Then beat it.'

You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”

And when/if someone takes you up on that, and you want to post something only to find out that nah, you aren’t in the ‘in group’ and therefore you get to watch but not participate, maybe then you’ll see the problem with that idea.

Anonymous Coward says:

Re: Re:

My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”

Why should Facebook manage people any better than any government has ever achieved?

The way to fertilize the ground for violence is to isolate small groups of people, because it is people who feel isolated from society, and do not see a future for themselves, who are most likely to fall for the ideas of an extremist leader and become their human bombs, suicide attackers, and other forms of cannon fodder.

Scary Devil Monastery (profile) says:

Re: Re: Re: Re:

"The opposite side of that coin are people who feel empowered because they found like minds online and confuse that with mainstreaming."

Ah, you mean networks like "Der Stürmer" and "Breitbart".

They already exist so there’s no need at all to change anything on that account.

Unless your actual argument is that we need to change things because these minorities feel outraged that the "mainstream" as a whole insists on bringing to their notice that they are a small minority group not accepted by the general citizenry?

Anonymous Coward says:

Re: Re:

And this is only to stop a small portion of that nasty content from existing on the platform as there’s no way for them to get it all.

Then maybe we should find better ways of moderation and accept that bad people do bad things and you’re never NOT going to be exposed to it.

we should actually be questioning the concept of a site where users upload as much as they want of anything that they want

So you’re for government censorship then and want to ditch the First Amendment? Because that’s the only way to make that happen.

they’ll then use underpaid workers and ineffective algorithms to sort out just some of it

Well, moderation at that scale is nearly impossible to get right. Should they have better working conditions? Absolutely. But maybe we should all stop getting our panties in a twist because we saw something we didn’t like and subsequently tried to force a third party to do something about it when we should be trying to go after the root cause.

Regarding YouTube, he had this to say: “YouTube has 400 hours of content uploaded a minute. Maybe that’s an impossible thing to moderate and it’s not responsible for a company to allow for 400 hours of content to be uploaded a minute?” He makes a good point.

No, he makes a stupid point. All that content is being uploaded by individuals. If you say you can’t upload that much content per minute, you have now just infringed on individuals’ rights to free speech, creation, and expression. The only way to not allow that much content to be uploaded at that rate is for the government to dictate what people can and cannot do online. I’m sorry but that’s completely stupid.

My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”

Welcome to the human race, a ludicrous, impossible-to-moderate-at-scale-free-for-all social network and visual space, where bad people will always find a way to do bad things to other people using any and all available tools at their disposal. What you want is censorship and loss of freedom so that you don’t have to see anything bad or offensive. That’s not the real world and can never work.

Besides that, if you do get something much smaller, then you have now turned back communication, arts, technology, expression, and jobs about 20-30 years and put a cap on progress.

Scary Devil Monastery (profile) says:

Re: Re: Re: Re:

"It’s not FREE speech if it costs money to moderate."

You are – literally – an idiot, aren’t you?

If the user had to pay the people censoring him now THAT would be un-free speech indeed.
Not to mention that once your suggestion is implemented you have given whatever minority is eager and keen to censor other people’s opinions free rein.

Good going there, Baghdad bob. You’ve actually provided an argument that we here on TD shouldn’t just be able to flag your offensive commentary to keep it hidden – we should be able to remove you completely, and get paid doing so.

This somehow doesn’t reconcile well with your frequent rants about "censorship"…but after all these years of reading your confused rants I’m not even surprised when you offer to self-destruct every prior argument you’ve offered in a fit of "Because ME!".

That One Guy (profile) says:

Re: Re: Re:2 'I didn't mean apply those rules to ME!'

Good going there, Baghdad bob. You’ve actually provided an argument that we here on TD shouldn’t just be able to flag your offensive commentary to keep it hidden – we should be able to remove you completely, and get paid doing so.

It’s a source of endless amusement that the trolls infesting the site push for rules and/or laws that would, if actually applied, impact and silence them first.

Whether it’s claiming that those that are ‘rude’ or do nothing but insult people should get the boot while doing nothing but that, or copy/pasting content from another source to defend a law that would make that act too risky to allow, their short-sighted hypocrisy truly knows no bounds.

Scary Devil Monastery (profile) says:

Re: Re: Re:3 'I didn't mean apply those rules to ME!'

"It’s a source of endless amusement that the trolls infesting the site push for rules and/or laws that would, if actually applied, impact and silence them first."

It is indeed. Although given the time I’ve seen these sock puppets post both here and on Torrentfreak I’m fairly convinced we’re really just talking about that one guy who keeps trying to pretend it’s not just him desperately rushing one sock puppet to the defense of his other when people have been too mean to it.

I keep saying that since his stated objective is commercial his first order of business should be to open a Patreon account or at least put out a hat. No clown should have to work unpaid, and his persistence in doing just that for years without a single dime received is a crying shame.

Unless some kind soul every now and then tosses 50 cents into his cubicle. That’s always possible.

Anonymous Coward says:

Re: Re: Re: Re:

It’s not FREE speech if it costs money to moderate.

That’s either really stupid of you or a very pathetic strawman.

Free speech does not refer to the cost of speaking, it refers to the freedom and ability to say whatever you want without being told you can’t say that. I suggest you read up on the subject.

Let the users who overwhelm the content system pay for its moderation.

This is a terrible idea and wouldn’t solve anything. Throwing more money at it isn’t going to make the problem go away because this is a human interaction problem, not an economic one.

They are not censored.

Nor should they be. Freedom of speech and all that.

You can even offer X amount of content tax-free, or waive the tax for the indigent if you want to be that principled.

Again: FREEDOM. OF. SPEECH. It’s kind of a law in the US that what you suggest is pretty much the number one thing the government is NOT allowed to do. The fact that you don’t understand that is telling.

Scary Devil Monastery (profile) says:

Re: Re:

"You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”"

So essentially you mean "freedom of speech" is a bad idea and should get shitcanned.

Your answer, basically, is what I’d expect to hear from some 18th-century time traveler outraged that the comfy small publications and Old Boys Gossip Network of his heyday have gone the way of the dodo.

Anonymous Coward says:

"No, he makes a stupid point. All that content is being uploaded by individuals. If you say you can’t upload that much content per minute, you have now just infringed on individuals right to free speech, creation, and expression. The only way to not allow that much content to be uploaded at that rate, is for the government to dictate what people can and cannot do online. I’m sorry but that’s completely stupid."

Because multiple sites, each of which has a manageable amount of content, couldn’t possibly exist, as that would threaten Google’s practical monopoly in the viral-video space?

Anonymous Coward says:

Re: Re:

Moderating all the output of the human race is not possible. Whether you have a few big sites or lots of little sites, you need the same number of moderators, because the problem is dealing with the volume of output that the human race can generate, and employing enough people as moderators to deal with that flow.

Anonymous Coward says:

Re: Re: Re: Re:

That way you start a bidding war for people to get their ideas in front of the public, with no guarantee those ideas will not cause violence and disruption of society. Calvin, Luther, and Marx, to name a few, were people in a position to have their words published in eras when few people had that privilege.

That One Guy (profile) says:

Re: Re: Re: You first

I’m seeing at least eight posts from you in this comment section alone, that’ll be $4, to be donated to TD for hosting your content under the ‘You want to post, you got to pay’ system you’re proposing.

If your input is really worth posting I’m sure you’ll have no problem paying the content uploading tax, unless of course the system you want to foist on others shouldn’t apply to you, but as that would be grossly hypocritical I’m sure that won’t be the case and you’ll be making your donation quickly.

Scary Devil Monastery (profile) says:

Re: Re: Re:2 You first

"I’m seeing at least eight posts from you in this comment section alone, that’ll be $4, to be donated to TD for hosting your content under the ‘You want to post, you got to pay’ system you’re proposing. "

Hey, I flagged a lot of his posts as "stupid". That means, according to the "get paid to moderate" idea he had, he owes me a few bucks as well.

Since we can’t bill an AC we should insist that he doesn’t get to post at all anymore until he registers an account and settles his account.

Anonymous Coward says:

Re: Re: Re: Re:

You can regulate it by making people pay for the cost of their "output."

This is called censorship and is prohibited by the First Amendment.

That’s the logic behind anti-SPAM rules

You obviously don’t understand anti-spam rules then. They are free and don’t force anyone to pay anything for sending spam.

unsolicited faxes

Pfft. What time period do you come from, the 60s? I mean I guess there is a rare occasion that a company who still has a fax machine might get an unsolicited fax but realistically, that technology is all but dead. The average joe doesn’t even have a fax machine.

and texts

Again, they don’t really cost anything to send other than what it would cost to get your own cellphone. And there are some online services that let you do it for free.

So, where is the cost of all this again? And how is it not censorship by forcing people to pay for it?

Scary Devil Monastery (profile) says:

Re: Re:

"Because multiple sites, each of which has a manageable amount of content, couldn’t possibly exist, as that would threaten Google’s practical monopoly in the viral-video space?"

Uh, those sites already DO exist, obviously. There are thousands of social networks run by individuals.

But as usual…don’t let factual reality get in the way of your false assumption and dishonest rhetoric, Baghdad Bob.

Anonymous Coward says:

Re: Re:

Because multiple sites, each of which has a manageable amount of content, couldn’t possibly exist

Actually no, they can’t.

as that would threaten Google’s practical monopoly in the viral-video space

But not because of this. See the many other video hosting sites currently available to set fire to your strawman.

They can’t exist because there is no way to feasibly limit how much content gets uploaded to a platform without pretty much destroying that platform’s ability to provide any kind of hosting services. Not to mention the fragmentation of having to manage multiple accounts across multiple platforms just so you can upload all the content you need/want.

Please educate yourself before you speak and make a fool out of yourself.
