Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Vimeo Moderates Uploads Of 'Commercial-Use' Videos Using Unclear Guidelines (2009)

from the what-is-commercial-use dept

Summary: Vimeo, the video-hosting website created by CollegeHumor’s parent company in 2004, has always presented itself as a destination for creators who wished to free themselves from YouTube’s limitations and aggressive monetization. Vimeo remains ad-free, supporting itself with subscription fees.

Other efforts were made to distance Vimeo from YouTube. Its fairly aggressive content policy forbade plenty of things that were acceptable on Google’s platform, including videos promoting commercial services.

The terms of service didn’t explicitly forbid content that merely related to commercial services without attempting to sell anything directly to other Vimeo users. Even so, user experience consultant Paul Boag found his videos targeted by Vimeo, which gave him a week to move them to another hosting service. While some of Boag’s videos rode the edge of the terms of service ban on commercial videos, others offered nothing more than marketing advice or reviews of browser plugins.

At that point, Vimeo also banned the embedding of hosted content on sites that served up ads. Unfortunately for Boag, his own site contained ads, making it a violation of the terms of service to embed his own videos on his own site. And even this rule wasn’t applied consistently: Vimeo rather unhelpfully clarified that it did allow embedding on some ad-supported sites, but only at Vimeo’s sole discretion.

Vimeo players cannot appear on domains running ads, its a decision we made in the beginning and have been going back and forth with allowing or disallowing it, but so far we cannot allow it unless it is with one of our partners.

Decisions to be made by Vimeo:

  • Does distinguishing the service from YouTube with more onerous restrictions on content ultimately lower moderation costs by attracting a user base that self-selects?
  • Is the risk of losing paid users an acceptable tradeoff for preventing Vimeo from “devolving” into just another YouTube-like service?
  • Can decisions about “commercial-use content” be made fairly when they appear to rest largely on moderators’ subjective judgment?
  • Is Vimeo large enough to comfortably absorb any damage to its reputation or user goodwill when its moderation decisions affect content that doesn’t actually violate its policies?

Questions and policy implications to consider:

  • Would allowing users to pay to upload commercial-use videos move the platform closer to the competitors Vimeo has tried to distance itself from?
  • Would a transparent and open challenge process help Vimeo avoid losing paying users?

Resolution: Paul Boag’s videos were removed and Boag chose to use a different platform to host his content, rather than continue to struggle with Vimeo’s unclear content policies.

A few years later, Vimeo changed course and began allowing Pro users to upload content that would have violated its 2009 terms of service. The restrictions on commercial-use content have since been relaxed even further; users are now forbidden from posting only certain kinds of commercial-use content:

We do not allow content that promotes:

  • Illegal schemes (like Pyramid/Ponzi schemes)
  • Businesses that promise wealth with little or no effort
  • Unregistered securities offerings (absent a legal basis)
  • Illegal products or services 
  • Products or services (even if legal) using deceptive marketing practices 
In addition, users may not use messaging capabilities for unsolicited direct marketing purposes.

Originally posted on the Trust & Safety Foundation website.

Companies: vimeo

