Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Valve Takes A Hands Off Approach To Porn Via Steam (2018)

from the know-it-when-we-see-it dept

Summary: Different platforms have different rules regarding "adult" content, but they often prove difficult to enforce. Even the US judicial system has declared that there is no easy way to define pornography, leading to Justice Potter Stewart's famous line, "I know it when I see it."

Many, if not most, websites have rules regarding such adult content, and in 2017 Valve's online game platform, Steam, started getting more serious about enforcing its rules, leading to some smaller independent games being banned from the platform. Over the next few months more and more games were removed, and some began pointing out that the policy and the removals were hitting independent game developers the hardest.

In June of 2018, Valve announced that it had listened to various discussions on this and decided to take a very hands-off approach to moderating content, including adult content. After acknowledging that there were widespread debates over this, the company said that it would basically allow anything on the platform, with very, very few exceptions:

So we ended up going back to one of the principles in the forefront of our minds when we started Steam, and more recently as we worked on Steam Direct to open up the Store to many more developers: Valve shouldn’t be the ones deciding this. If you’re a player, we shouldn’t be choosing for you what content you can or can’t buy. If you’re a developer, we shouldn’t be choosing what content you’re allowed to create. Those choices should be yours to make. Our role should be to provide systems and tools to support your efforts to make these choices for yourself, and to help you do it in a way that makes you feel comfortable.

With that principle in mind, we’ve decided that the right approach is to allow everything onto the Steam Store, except for things that we decide are illegal, or straight up trolling. Taking this approach allows us to focus less on trying to police what should be on Steam, and more on building those tools to give people control over what kinds of content they see. We already have some tools, but they’re too hidden and not nearly comprehensive enough. We are going to enable you to override our recommendation algorithms and hide games containing the topics you’re not interested in. So if you don’t want to see anime games on your Store, you’ll be able to make that choice. If you want more options to control exactly what kinds of games your kids see when they browse the Store, you’ll be able to do that. And it’s not just players that need better tools either – developers who build controversial content shouldn’t have to deal with harassment because their game exists, and we’ll be building tools and options to support them too.

The company admitted that it would likely struggle with this plan, especially given different laws around the globe, but that it wanted to put the onus on end users, rather than itself.

Decisions to be made by Valve:

  • Is it really possible to allow anything that isn't "illegal or straight up trolling"?
  • How do you define "straight up trolling"?
  • How do you make sure that parts of the Steam store are safe for younger users?
  • What tools need to be provided to users to set their own filters?

Questions and policy implications to consider:

  • With more and more pressure from governments to clean up the internet, will taking a "hands-off approach" lead to even more regulatory threats?
  • Does taking such a hands-off approach create greater legal liability?
  • Can a hands-off approach make users feel that the company is putting all of the responsibility on them, rather than on itself?

Resolution: Over the following few months, Valve released more ways to filter content in its store, including an adult filter. It also began approving more explicit games, in line with the new policy.

At around the same time, it did continue to remove games, supposedly for violating its new "no trolling" policy. The company admitted that the "no trolling" policy was intentionally vague.

It is vague and we’ll tell you why. You’re a denizen of the internet so you know that trolls come in all forms. On Steam, some are simply trying to rile people up with something we call “a game shaped object” (ie: a crudely made piece of software that technically and just barely passes our bar as a functioning video game but isn’t what 99.9% of folks would say is “good”). Some trolls are trying to scam folks out of their Steam inventory items, others are looking for a way to generate a small amount of money off Steam through a series of schemes that revolve around how we let developers use Steam keys. Others are just trying to incite and sow discord. Trolls are figuring out new ways to be loathsome as we write this. But the thing these folks have in common is that they aren’t actually interested in good faith efforts to make and sell games to you or anyone. When a developer’s motives aren’t that, they’re probably a troll.

Our review of something that may be “a troll game” is a deep assessment that actually begins with the developer. We investigate who this developer is, what they’ve done in the past, their behavior on Steam as a developer, as a customer, their banking information, developers they associate with, and more. All of this is done to answer the question “who are we partnering with and why do they want to sell this game?” We get as much context around the creation and creator of the game and then make an assessment. A trend we’re seeing is that we often ban these people from Steam altogether instead of cherry-picking through their individual game submissions. In the words of someone here in the office: “it really does seem like bad games are made by bad people.”

This doesn’t mean there aren’t some crude or lower quality games on Steam, but it does mean we believe the developers behind them aren’t out to do anything more than sell a game they hope some folks will want to play.

The company has still faced some criticism over these policies. In 2019, an anti-pornography group complained publicly that it was too easy to find adult content on Steam despite the new filters, calling them "mere speedbumps."

In late 2020, Steam began experimenting with a revamp of how it organizes content, which may include a dedicated area for explicit games.

Originally published to the Trust & Safety Foundation website.



Comments on “Content Moderation Case Study: Valve Takes A Hands Off Approach To Porn Via Steam (2018)”

29 Comments
GHB (profile) says:

I really hope they don't go the Tumblr route.

Tumblr in the past allowed porn, it was just only viewable when logged in and aged 18 or older. Then 2018 was a grim year: Tumblr used an algorithm to remove porn from its site. The problems were these:
-Lots of false positives
-If a post was flagged and the user was inactive, the post could be permanently gone forever; the user had to be active and appeal the claim.

And because the TOS can be changed in the future, once-allowed content is always prone to being banned later.

Christenson says:

Mere Speedbumps to Pronography

Heya Mike and Techdirt commentariat:
Here’s wondering if it’s actually possible to implement anything more than a speedbump to our favorite bad content if it’s popular, like pronography or an orange-haired ex-president, assuming arguendo he wasn’t an idiot.

Beyond a credit card, with all of its problems, any suggestions?

Anonymous Coward says:

Re: Mere Speedbumps to Pronography

On the internet? Impossible.

On Steam? You need a bank account / credit card to buy most games.

As it involves that, we should leave it up to the parents to manage what kids have access to. Any purchases should be run through them. I certainly wouldn’t leave someone alone with my credit card to buy whatever they want without my input.

I don’t think this is as much of a problem as we think it is here.

Anonymous Coward says:

Re: Mere Speedbumps to Pronography

Any suggestions?

The only ones that would work are an authoritarian’s wet dream:

Whitelist of Users allowed to access the content: Requires full national identity registration and verification to get on the whitelist.

Why? A CC authorization can be faked; just ask any 10-year-old who raided their mother's purse / dad's wallet while they were sleeping. You'd think parents would spot odd transactions on the bill or their bank account, but most won't bother unless the charge goes way outside of their "normal" monthly expenditures and causes them to notice a significant change in their balance. I.e., a $1.00 – $2.00 charge isn't going to cut it for verification purposes.

National ID numbers can be stolen; again, just dump them from any poorly secured corporate HR server. Hell, in the US you can guess correct numbers with public information.

Email addresses are a dime a dozen.

Essentially nothing short of a manual review and verification process is going to make the situation any better. There's no means to pre-detect when people are going to be offended by something. If there were, it would already be mandated by law everywhere for everything. Shockingly, the best solution is: "Sticks and stones" in addition to "be a parent", but we as a society have determined that those are no longer good enough. Instead we prefer: "Forced silencing of others" and "Legislative child rearing."

Anonymous Coward says:

Re: Re: Mere Speedbumps to Pronography

"Sticks and stones" in addition to "be a parent",

Yes, that is what was once called "the American way", and anyone espousing that viewpoint should be advised that it is explicitly not Techdirt’s philosophy.

Masnick’s Bois feel that any opinion expressed by any self-respecting, mentally healthy White person should be censored and silenced, for fear that it might set off an avalanche of white supremicisismsmsmacy who would start annudah shoah.

Scary Devil Monastery (profile) says:

Re: Mere Speedbumps to Pronography

"Here’s wondering if it’s actually possible to implement anything more than a speedbump to our favorite bad content if it’s popular…"

Not really, no. I refer to the long-running "wars" on pornography, Entartete Kunst, copyright infringers and government dissidents everywhere as exhibit A regarding how a speedbump on popular goods remains the very best you can hope for.

Not all the king's horses nor all the king's men, as it were.

This side of a full-on dictatorial dystopia where computers are abolished and literacy and math are both heavily restricted, there's nothing to be done about that.

Anonymous Coward says:

I’m concerned "exploits children" could be twisted against anime type games. We know how Japanese culture can be there, and it could force a fair bit of censorship / games being dropped from the store. For better or for worse, some of the content can be a bit sexualized.

For the record, I am against games which exploit actual children. This should go without saying.

Scary Devil Monastery (profile) says:

Re: Re:

"We know how Japanese culture can be there, and it could force a fair bit of censorship / games being dropped from the store."

Could? Try "Will".

I still remember the old debates about youthful-looking 30-year-old Asian erotic models posing as high school girls, and the way Japanese games sexualize anything remotely female. There's a reason many of the more popular Japanese games never make it to western markets…

Candescence (profile) says:

It was a good thing, but not perfectly handled.

I’ve seen cases of Valve removing certain adult games permanently without appeal due to "underage content" even when context clearly states there are no underage characters (and I mean stuff like characters being in college), which happens because the visual style isn’t realistic and makes how old characters look extremely subjective.

I do think Valve allowing adults-only content on Steam was a good move; there are plenty of adult games with actual gameplay and artistic value beyond just being interactive porn. And let's be frank, anti-porn groups will never be happy until porn ceases to exist, period, and in this case there's really not a lot Valve can do about what they're complaining about.

Sure, we don’t need adult content to be everywhere (far from it), but having such content available on more mainstream platforms is a good thing. The fact that Twitter permits adult content is surprising, but it feels like Facebook is kind of an outlier nowadays considering there are a surprising number of social networks now allowing such content. Tumblr tried to kill adult content in a massive overreaction to worse problems, and it basically killed the site.

Anonymous Coward says:

Re: It was a good thing, but not perfectly handled.

which happens because the visual style isn’t realistic and makes how old characters look extremely subjective.

And by "extremely subjective," what we actually mean is deliberately young looking.

I don’t have problems with it as such, but pretending that this is some kind of miscommunication is simply false. The visual styles used are designed (oftentimes explicitly so) to make characters look young.

Depending on where the game came from (cough Japan), even the "clear context" is often a lie, created by the translators to fit perceived standards of acceptability in western markets.

Christenson says:

Re: Re: It was a good thing, but not perfectly handled

So, how long before we have the following situation, since for at least some, playing adult games with kiddie characters will always be a thing?

Adult-oriented game offers an open "skin" interface for the characters
Deepfake "skin" generator lets you set the characters to anything you like — Jeffrey Epstein, a gorilla, or something that makes the game illegal

Anonymous Coward says:

Re: Re: Re: It was a good thing, but not perfectly handled

So, how long before we have the following situation, since for at least some, playing adult games with kiddie characters will always be a thing?

Adult-oriented game offers an open "skin" interface for the characters
Deepfake "skin" generator lets you set the characters to anything you like — Jeffrey Epstein, a gorilla, or something that makes the game illegal

You don’t need to wait.

Hot Coffee was disabled within the game’s files, and required external modification to re-enable it. Didn’t save Rockstar from the ESRB and massive public outcry when the modification method was made public.

Never mind that most games, especially FPS, tend to not have children in them, let alone allow modding. It would not take much to have that controversy right now. Just pick the most recent CoD or Halo iteration and throw children into it. Maybe add something to encourage people to download and use the mod (a meme or the like) and wait for it to hit TikTok. You'll have CNN on it by the end of the week…

Stephen T. Stone (profile) says:

Re: Re: Re:2

Technically, the Hot Coffee content didn’t make GTA San Andreas illegal. It forced the ESRB to re-rate the “Hot Coffee version” from M to AO, but that didn’t affect whether the game could be legally sold — only who would sell it. (And it forced Rockstar to re-release San Andreas after having taken out the Hot Coffee code and recalled the “Hot Coffee version”.) Even if it had somehow made the game illegal to sell, that could’ve been the start of a lawsuit that would’ve given First Amendment protections to videogames.

(Technically, San Andreas…kinda did that anyway? That game and its GTA predecessors likely helped inspire the California law that led to Brown v. Entertainment Merchants Association. Hot Coffee probably made the lawmakers who passed it believe they were doing the right thing.)

Also, games that allow for modding typically have EULAs with ass-covering “the publisher isn’t responsible for third-party mods” clauses. Someone modding kids into a Halo game wouldn’t open up Microsoft to any kind of lawsuit — though it might make Microsoft reconsider whether (or to what degree) people can mod Halo games.

Scary Devil Monastery (profile) says:

Re: Re: Re: It was a good thing, but not perfectly handled

"So, how long before we have the following situation, since for at least some, playing adult games with kiddie characters will always be a thing?"

Well, I guess it'll start happening around the same time people start using computers to play games with, and IBM finally caves on their assumption that 64 kbit might be a bit less RAM than most people are likely to use…

"Adult oriented game offers open "skin" interface for the characters"

There are already plenty of hacks to make that happen, and in no few cases, such as the Bethesda lineup, the developers have actively encouraged the player base to mod the game. If there are modders who spend weeks compiling a full set of environmental textures to improve the looks of random shrubbery over what the devs have already created for a given game, you can bet your ass there are plenty of kinky people eager to pursue their own vision of Rule 34 through the medium of the game.

Candescence (profile) says:

Re: Re: It was a good thing, but not perfectly handled.

I'm not even talking about Japanese-made games; the main example I'm thinking of is from a western indie dev involving mature-looking characters in a college setting. Though in this case, the fact that it's a "school-like" setting at all is what apparently spooked Valve, even though it's, you know, college.

Anonymous Coward says:

Re: Re: Speedbumps

So, if filters are roadsigns, and I’m a parent, how am I supposed to protect my little kid from the big bad internet?

You watch your kids’ online activities, and punish them when they ignore your rules. It’s not everyone else’s job to make decisions for you nor enforce those decisions for your children.

Of course, if you go to the various imageboards, apparently those road signs are meant to tell the people on them what the public will allow them to post…..shrugs

PaulT (profile) says:

Re: Re: Speedbumps

"how am I supposed to protect my little kid from the big bad internet?"

By acting like a parent. Parents who take a real interest in what their kids are doing, monitor their access and lay down rules for having that privilege are usually doing OK. It’s usually the parents who use it as a babysitter who are having problems.

Scary Devil Monastery (profile) says:

Re: Re: Speedbumps

"So, if filters are roadsigns, and I’m a parent, how am I supposed to protect my little kid from the big bad internet?"

You don't. Now, as before, the one choice the parent has is whether the kid gets their sex education from you and from responsible sources you can point them to, or from whatever they can glean from the confused, cobbled-together "wisdom" of their peers and whatever they stumble across in random online searches for bare breasts.

Darkness Of Course (profile) says:

Oddly enough, today I'm thinking about Pres Clinton

With the current political goings on, Clinton came to mind.

Every time Clinton comes up, I think of Hillary and Tipper. Or, as I called them at the time: The Nazi Twins. They wanted to prevent selling CDs containing bad words. Think of the children.

Lots of BS flung around and they were blocked, 1st A!

We did get something good out of it. Now when I look at the CD or its listing I look for ‘explicit’ to know I got the adult version.

Oddly enough, Steam’s solution is close to that (IMO).

Anonymous Coward says:

Steam has become a lot less appealing to me as a video game customer since the switch-over to no moderation at all. It's not a moral outrage that there are porn games on Steam; rather, it's the sheer volume of total garbage porn games on Steam. There is absolutely NO quality control, and as a result the rare high-quality games are lost in an ocean of piss. Early Access, Steam Greenlight, and finally the opening of porn on Steam have all but killed PC gaming in a few short years, and it's sad.

Scary Devil Monastery (profile) says:

Re: Re:

"There is absolutely NO quality control and as a result rare high quality games are lost in an ocean of piss."

Yeah, Valve had a golden opportunity in opening their platform but lost much of it when they failed completely in curation. Some people do like to wade through the muck to find the bits of gold but most people aren’t really keen on having to spend that effort.
