Content Moderation Case Study: xHamster, The 20th Biggest Site On The Internet, Moderates Content Using Unpaid Volunteers (2020)
from the one-way-to-do-it dept
Summary: Formed in 2007 and operated out of Limassol, Cyprus, xHamster has worked its way up to become the 20th most-visited site on the internet. The site boasts 10 million members and hundreds of millions of daily visitors despite being blocked by a number of governments around the world.
Being in the pornography business poses unique moderation challenges. Not only do moderators deal with a flood of both amateur and professional submissions, they must take care to prevent the uploading of illegal content. This goes further than policing uploads for unauthorized distribution of copyrighted material. Moderators must also make decisions — with facts not in their possession — about the ages of performers in amateur videos to prevent being prosecuted for the distribution of child pornography.
Given the stakes, users would expect a well-staffed moderation team trained in the difficult art of discerning performers’ ages, or at least one given the authority to block uploads until information about performers is obtained from uploaders.
Unfortunately, this does not appear to be the case. An undercover investigation by Vice shows one of the biggest sites on the internet has chosen to lower its costs by relying on an all-volunteer moderation team.
One member of the team is “Holger,” a user created by VICE News to infiltrate the content moderation team and observe its inner workings. Holger finds himself among more than 100 unpaid volunteers called “the Reviewers Club,” which gives him partial control over which photos stay online and which are taken down.
Moderators are guided by a 480-page manual that explains what images and videos are permitted. The “Reviewers Club” then works its way through thousands of content submissions every day, making judgment calls on uploads in hopes of preventing illegal or forbidden content from going live on the site.
Decisions to be made by xHamster:
- Does relying on unpaid volunteers create unnecessary risks for the site?
- Would paying moderators result in better moderation? Or would paid moderation result in only nominal gains that would not justify the extra expense?
- As more revenge porn laws are created, does xHamster run the risk of violating more laws by turning over this job to volunteers who may personally find this content acceptable?
Questions and policy implications to consider:
- Given the focus on child sexual abuse material by almost every government in the world, does the reliance on an all-volunteer moderation team give the impression xHamster doesn’t care enough about preventing further abuse or distribution of illicit content?
- Does asking content consumers to make judgment calls on uploads create new risks, like an uptick in uploads of borderline content that appeals to members of the volunteer staff?
- Can the site justify the continued use of volunteer moderators given its assumed profitability and heavy internet traffic?
Resolution: Despite the site’s popularity, xHamster has not moved to a paid moderation team independent of site users, whose personal preferences may result in unsound moderation decisions. The investigation performed by Vice shows some moderators are also content contributors, which raises further concerns about moderation decisions on borderline uploads.
While xHamster informs users that all uploaded content requires the “written consent” of all performers, there’s no evidence on hand that shows the site actually collects this information before approving uploads.
Further skewing moderation efforts is the site’s highly unofficial “reward” program, which grants “badges” to reviewers who review more content. The site’s guidelines only forbid the worst forms of content, including “blood, violence, rape” and “crying” (if it’s determined the crying is “real”). Underage content is similarly forbidden, but reviewers have admitted to Vice that policing underage content is “impossible.”
Moderation decisions are backstopped by the site, which requires several “votes” from moderators before making a decision on uploaded content. The “democratic” process helps mitigate questionable decisions made by the volunteer staff, but it creates the possibility that illicit content may obtain enough votes to skirt the site’s internal guidelines.
Originally published on the Trust & Safety Foundation website.