Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved in making those decisions. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Google Refuses A Law Enforcement Agency Demand To Remove A Video Depicting Police Brutality (2011)

from the government-takedowns dept

Summary: Google began documenting government requests for content removal in 2009. Periodic transparency reports informed users about demands made by government agencies, breaking requests down by country and targeted service (YouTube, Google search, Blogspot, etc.).

In October 2011, the section dealing with US government agency requests included a note indicating Google had refused a questionable demand to take down a YouTube video.

We received a request from a local law enforcement agency to remove YouTube videos of police brutality, which we did not remove. Separately, we received requests from a different local law enforcement agency for removal of videos allegedly defaming law enforcement officials. We did not comply with those requests, which we have categorized in this Report as defamation requests.

News coverage about the unusual request pointed out YouTube’s value as an archive of public interest recordings. Later reports issued by Google added the name of the law enforcement agency: the Greensboro Police Department — one that apparently has a history of officers deploying excessive force.

This isn’t the only time Google has rejected an apparent effort to remove content that does not reflect well on the agency making the request. Subsequent reports show Google has rejected government requests targeting recordings of abuse of inmates by corrections officers, articles detailing a police officer’s work history, videos containing information about a law enforcement investigation, and five requests to remove videos that “criticized local and state government agencies.”

Decisions to be made by Google:

  • Does targeted content contain sufficient public interest to justify rejecting government demands?
  • Does pushing back on requests usually result in government agencies withdrawing or dropping questionable takedown demands?
  • If copyright claims are raised as justification for seemingly inappropriate requests, should fair use be raised in defense of leaving the video up?

Questions and policy implications to consider:

  • Does YouTube have a duty to preserve content with sufficient public interest, even if the original uploader/creator tries to remove it?
  • Does Google/YouTube have a duty to the public in general, even though it’s a private company? If so, does this conflict with Google’s obligations to its shareholders?
  • Does challenging questionable government demands provide users with more value than quiet compliance would?

Resolution: As noted in Google’s transparency report, the targeted video was not taken down. This initial oddity was followed by other questionable requests over the years. Following this showdown with a local law enforcement agency, Google began highlighting requests that “may be of public interest,” allowing users to gain more insight into questionable government activities, as well as similarly-notable requests made by private citizens and businesses.

Companies: google, youtube


Comments on “Content Moderation Case Study: Google Refuses A Law Enforcement Agency Demand To Remove A Video Depicting Police Brutality (2011)”

That One Guy (profile) says:

I wonder if there's a term for that sort of thing...

Following this showdown with a local law enforcement agency, Google began highlighting requests that "may be of public interest," allowing users to gain more insight into questionable government activities, as well as similarly-notable requests made by private citizens and businesses.

Oh to be a fly on the wall of the police departments that tried to hide officer brutality and/or corruption only to have a huge freakin’ spotlight pointed at it instead, though the best part has got to be that unless I’m misreading something multiple requests of that sort have been made, which means some people really did not do their research before attempting to hide their rotten actions.

This comment has been flagged by the community.

ECA (profile) says:

14 billion is the question.

What is the answer?
How much money every man/woman/child could have paid to win this election.? $46.67

How much was spent on this election, to be the all time most.?

Where is the F’ did they get that much in the first place? From all the tax savings in the last year?

Then the question.
Where does all the extra cash go After the election? I dont think it gets returned.

This comment has been deemed insightful by the community.
Uriel-238 (profile) says:

Does Google/YouTube have a duty to the public in general, even though it’s a private company? If so, does this conflict with Google’s obligations to its shareholders?

Google thrives on being popular with the general public, and this popularity is driven (in part) by preserving public-interest content, and serving as a reliable public forum.

So regardless of whether it has a duty to the public, by serving the public well, it is executing its duty to the shareholders to make bigger profits. Granted this might piss off some shareholders who are invested in the stability of the establishment, but that’s not Google’s direct problem until enough of them complain to the board of directors and dictate Google policy.

Google doesn’t have any civic responsibility to the government either. It may have to act on court orders just because it is a company under US jurisdiction, but these typically run contrary to the services it provides to the public (which includes individual privacy of private data), so it has every reason to contest government agents trying to encroach on these services.

Anonymous Coward says:

Re: government agents encRoach

"so it has every reason to contest government agents trying to encroach"

…. we all have every reason to contest & resist the constant stream of police-state encroachments

unfortunately Google & BigTech generally … do strongly support heavy government interventions into most aspects of our lives

This comment has been deemed insightful by the community.
Uriel-238 (profile) says:

Re: Re: strong support of heavy government interventions

Generally, yes, of course they do. The establishment system affords these companies a lot of wealth and power which runs contrary to the welfare and the good of the public.

But the point is we’re seeing Google not facilitating government encroachment the same way we see, for example, AT&T who sells fast-track phone record access to Law Enforcement, a practice which would run entirely contrary to the rights expressed in the Fourth Amendment of the United States Constitution… if it weren’t for Third Party Doctrine (which has precedent but not due to the Constitution).

And the question is, why is Google paying its lawyers to challenge the will of state agents / law enforcement, especially in a way that Facebook does not?
