Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Studies: Facebook Removes Militia Event Following A Shooting (August 2020)

from the moderating-groups dept

Summary: Following the shooting of Black man Jacob Blake by Kenosha police officers, protests erupted in the Wisconsin town.

As law enforcement attempted to rein in the damage, citizens aligning themselves with private “militias” discussed taking action during the civil unrest.

Some of this organizing began on Facebook. A Facebook “event” created by the Kenosha Guard account (and promoted by conspiracy theorist/far right website Infowars) possibly caught the eye of 17-year-old Kyle Rittenhouse. Rittenhouse traveled from his home in Antioch, Illinois with his weapons to the protest/riot occurring less than 30 minutes away in Kenosha, Wisconsin. Before the night was through, Rittenhouse had killed two residents and injured one other.

The “event” posted by the Kenosha Guard account — one the account itself described as a “call to arms” — was eventually taken down. Posts by the group asked “patriots” to “take up arms” against “evil thugs.” Facebook deemed the event a violation of its policy on “Dangerous Individuals and Organizations.” The company also said it could find no link between the account or its event and Kyle Rittenhouse.

Some viewed this response by Facebook as too little, too late. Someone had already apparently heeded the call to “take up arms” and had taken people’s lives. According to a report by BuzzFeed, the event had been reported 455 times before it was removed. Four moderators had responded to multiple flags with a determination that the event (and the account behind it) did not violate Facebook’s rules. During an internal meeting with moderators, CEO Mark Zuckerberg admitted the company should have reacted sooner to reports about the event.

Decisions to be made by Facebook:

  • Should moderators be given more leeway to remove events/accounts/pages (at least temporarily) that have generated hundreds of complaints, even if they don’t immediately appear to violate policies?
  • Would a better, more transparent appeals process let moderators make more judgment calls, addressing issues like this more quickly, since mistaken removals could be undone if no violation occurred?
  • How does the addition of more forms of content to the “unwanted” list complicate moderation efforts?

Questions and policy implications to consider:

  • Does the seemingly constant addition of new forms of content to “banned” lists invite closer government inspection or regulation?
  • Is a perceived failure to react quickly enough an impetus for change within the company?
  • Are policies in place to allow for judgment calls by moderators? If so, do they encourage erring on the side of caution or overblocking?
  • Does taking credit for actions not actually performed by Facebook make it appear more focused on serving its own interests, rather than its users or public safety in general?

Resolution: The event flagged by hundreds of users was ultimately removed, but not by Facebook, as was earlier reported. The group that posted the event took it down following the shooting in Kenosha. Facebook appears to recognize that its delay may have contributed to the events that unfolded in Kenosha and has since put policies in place to make things clearer for moderators.

Companies: facebook


Comments on “Content Moderation Case Studies: Facebook Removes Militia Event Following A Shooting (August 2020)”

33 Comments
Bobvious says:

Perhaps widen the skillset of moderators?

This seems like a classic case of "increased chatter", that someone with proper military and/or signals intelligence experience could assist with in terms of managing the moderation. I’m not suggesting this will only be one person or is entirely simplistic, but this can’t just be left to people who are using "profanity and copyright" filtering algorithms.

While the platforms are not responsible for morons and stupid behaviour, they have the opportunity to limit the negative outcomes of that, through thinking beyond the next financial incentive.

This comment has been flagged by the community.

Libby Ruhl DeLuzionnes says:

The KEY criminal act is tried to stab a police officer.

According to the release, Blake admitted to investigators that he did have a knife at the time and a knife was recovered from the driver’s side floorboard of his vehicle. No other weapons were found, the report said.

But that’s simply elided here at dishonest TD and in the Rittenhouse piece:

It was past 11 p.m. local time Tuesday, the third night of protests after a Kenosha police officer shot Jacob Blake seven times in the back.

Scary Devil Monastery (profile) says:

Re: The KEY criminal act is tried to stab a police officer.

"According to the release, Blake admitted to investigators that he did have a knife at the time and a knife was recovered from the driver’s side floorboard of his vehicle."

This is still the United States we’re talking about, where a white man is not to be questioned about carrying a loaded AR-15, a shotgun, or a crate of handguns or rifles in the trunk, but apparently, a black man possessing a knife in his vehicle is justification for said black man being killed?

Thank you very much, Baghdad Bob, for giving us yet more evidence you’re a hypocritical racist shitbag. As if we needed that given your previous assertions about how law shouldn’t require evidence or a courtroom.

"But that’s simply elided here at dishonest TD and in the Rittenhouse piece…"

"dishonest" for bringing to attention that if Blake had been white he could have had his cars stuffed with shotguns or worn them openly and the officers wouldn’t have so much as unholstered their weapons. In fact a white person already known to have committed murder could walk through a police line still carrying his AR without even being questioned.

This comment has been flagged by the community.

Libby Ruhl DeLuzionnes says:

Another KEY part you omit: "third night of protests"

Among protesters, the rumor spread: Hundreds of white men with guns had answered an online call from a self-described militia group known as the Kenosha Guard and would be waiting in the park to shoot them.

Not nearly that many armed men showed up, but they were impossible to avoid. Some joined the marchers and pledged to protect them. Many protesters still felt more afraid than secure.

The police weren’t keeping order. Vigilantes decided to, as is the responsibility of sane people.

That’s why to stop riots immediately. Turned into armed confrontation.

And of course YOU omit that "the marchers" also had guns on their side.

That One Guy (profile) says:

Re: Re: Re:

You missed the best part in that their argument could easily be turned around and used in favor of those they would vilify, as people are pissed off and protesting and rioting because the police have shown that they aren’t interested in order or laws but merely protecting their own no matter what, such that people have to step in and try to fix a broken system, ‘as is the responsibility of sane people’.


This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

It’s absolutely clear that "the left" won’t back off.

Who was it that was charged this morning with shooting up and helping burn down a police precinct in Minneapolis? Was it…

  1. Antifa
  2. Black Lives Matter
  3. A right-wing nutjob associated with the Boogaloo movement and had contact with a cop-killer before the precinct burned down but after said cop-killer had already killed a cop?

…it—it’s #3. The answer is #3.

That One Guy (profile) says:

Re: Re: The answer is always #3

Adding to that which side is it that’s driven vehicles into crowds multiple times now, killing at least one person in the process?

I do so laugh when someone tries to portray ‘the left’ as this pile of violent nutjobs that the police need to protect the helpless public from, as while some pissed off rioters might have torched some property when it comes to violence against people their side seems to have no problem not just crossing that line but vaulting over it with no shortage of outright sadistic glee.

This comment has been flagged by the community.

Libby Ruhl DeLuzionnes says:

And the sum is that slanted bigotry isn't "moderation".

You’re totally biased and try to stop all other viewpoints, especially for any organizing, formula for conflict.

Now, flatly, not to stint you because you’ll go berserk anyway: I don’t find any wrongdoing by police or Rittenhouse, but DO by "the marchers". Leftists just pounce on any pretext without regard to facts.

Scary Devil Monastery (profile) says:

Re: Re: Re:

"You, uh…you friendly with the Proud Boys, Brainy?"

Pretty obvious, innit? Old Baghdad Bob’s pulled out all the stops in this one; Blake, being unarmed, gets shot 7 times in the back and old Bobmail justifies it by saying he had a knife in his car.

Then in the very next sentence Bobmail turns right around and glorifies the right-wing extremist violence as vigilantism.

With the only notable differences between the violence Baghdad Bob condones and condemns being the skin color of the victim it’s pretty damn clear where he’s standing.

Not, bluntly put, that this should be any surprise. We’ve known since his Torrentfreak days he’s a paranoid schizophrenic with an unhealthy attraction to fascism, racism, bigotry and outright sadism.

Or to summarize, a loser in life who compensates for nothing going his way by inventing a world inside his head where he’s the eternal victim of other people and only showing up from that world to write the most recent case of thuggery in on "his" side of that narrative.

He’d be tragic and pitiable if he wasn’t such an obvious and persistent asshole.

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Re: And the sum is that slanted bigotry isn't "moderation".

The subject at hand is a kid who, inspired and emboldened by the same right-wing echo chamber he lives in, went and killed two people – one of whom was in the process of trying to legally apprehend him after he had been witnessed murdering the first. The kid was seen palling around with cops just before going on his murder spree, who let him get away comfortably with his gun to his home in another state, in stark contrast to the violent deaths handed out to black men who they suspect might have a weapon.

Dickhead has to take a viewpoint like the one he’s taking, because he has no leg to stand on morally or factually if he takes another one.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Not the best look...

Reported literally hundreds of times, reviewed for moderation four times, and a page that called for armed responses to ‘thugs’ ended up being taken down by the people that put it up in the first place, who apparently realized quicker than Facebook what a bad look keeping it up made for.

Even if it didn’t result in death what did they think that sort of ‘event’ was likely to result in, and how in the hell did not one but four moderators conclude that a page like that didn’t violate rules against ‘Dangerous individuals and organizations’, because honestly if a page calling for armed responses to a group of people doesn’t meet the bar required I struggle to think of anything that could.

This comment has been flagged by the community.

That One Guy (profile) says:

Re: Re: Re: Re:

Oh how I wish the Some Asshole initiative was the rule, I can only imagine how many scumbags wouldn’t bother knowing that what they did would be all that was mentioned with the main focus being on how the people/community was healing as all the while they would just be known as ‘Some Asshole’ and treated as just a minor footnote.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:

Would you be ok with shooting a convicted bank robber? Or a convicted burglar? Or a convicted drug dealer? Or a convicted drunk driver? Would they deserve to be murdered in cold blood for their crimes that they have already served their sentence for?

They have been convicted of their crime and have paid the price for it. It is not the job of everyday citizens to enforce the law. That’s what the police are (supposed to be) for.

Cops get away with murdering people all the time. But ordinary citizens? Murdering someone (no, it was not "self-defense") is a crime. Point blank.

You can’t get away with murdering someone just because you don’t like them for whatever reason. Even if you think your reason is valid.
