Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts

from the this-again? dept

Like a teenaged heterosexual boy, it appears that Facebook has no clue how to deal with naked female breasts. Going back over a decade, the quintessential example used to show the impossibility of coming up with clear, reasonable rules for content moderation at scale is Facebook and breasts. In the early days, as Facebook realized it needed to do some content moderation and had to establish a clear set of rules that could be applied consistently by a larger team, it started with a simple "no nudity" policy -- and then, after that raised questions, it was narrowed down to define female nipples as forbidden. As a wonderful episode of Radiolab detailed last year, questions kept getting raised about how specific the rules needed to be (each paragraph here is a different speaker, but since Radiolab doesn't supply transcripts, I'm not entirely sure who's speaking):

So, for example, by then, nudity was already not allowed on the site. But they had no definition for nudity. They just said "no nudity." And so the site integrity team -- those 12 people at the time -- they realized that they had to start spelling out exactly what they meant.

Precisely. All of these people at Facebook were in charge of trying to define nudity.

The first cut at it was "visible male and female genitalia." And then "visible female breasts." And then the question is "well, okay, how much of a breast needs to be showing before it's nude?" And the thing that we landed on was, if you could see essentially the nipple and areola, then that's nudity. And would have to be taken down.

This might have seemed like a straightforward rule... until mothers posting breastfeeding photos started complaining after a bunch of their photos got blocked. Stories about this go back at least to 2008, when the Guardian reported on the issue after a group of mothers protested the company, leading Facebook to come up with this incredibly awkward statement defending the practice:

"Photos containing a fully exposed breast, as defined by showing the nipple or areola, do violate those terms (on obscene, pornographic or sexually explicit material) and may be removed," he said in a statement. "The photos we act upon are almost exclusively brought to our attention by other users who complain."

More public pressure, and more public protests, resulted in Facebook adjusting its policy to allow breastfeeding photos, but photos still kept getting taken down, leading the company to keep changing and clarifying its policy, as in this statement from 2012:

When it comes to uploaded photos on Facebook, the vast majority of breastfeeding photos comply with our Statement of Rights and Responsibilities, which closely mirrors the policy that governs broadcast television, and which places limitations on nudity due to the presence of minors on our site. On some occasions, breastfeeding photos contain nudity – for example an exposed breast that is not being used for feeding – and therefore violate our terms. When such photos are reported to us and are found to violate our policies, the person who posted the photo is contacted, and the photos are removed. Our policies strive to fit the needs of a diverse community while respecting everyone's interest in sharing content that is important to them, including experiences related to breastfeeding.

In the Radiolab episode they pointed out that photos of babies sleeping after having breastfed were getting taken down because the baby's head was no longer blocking the nipple.

In 2014, Facebook clarified its policies on nipples again:

“Our goal has always been to strike an appropriate balance between the interests of people who want to express themselves with the interests of others who may not want to see certain kinds of content,” a Facebook spokesperson told the Daily Dot. “It is very hard to consistently make the right call on every photo that may or may not contain nudity that is reported to us, particularly when there are billions of photos and pieces of content being shared on Facebook every day, and that has sometimes resulted in content being removed mistakenly.

“What we have done is modified the way we review reports of nudity to help us better examine the context of the photo or image,” the spokesperson continued. “As a result of this, photos that show a nursing mothers’ other breast will be allowed even if it is fully exposed, as will mastectomy photos showing a fully exposed other breast.”

Right. And then, just a few months later, people started protesting again, as more breastfeeding photos were taken down.

Again in the Radiolab program, they discuss how this gets even more confusing, as some people started posting "breastfeeding porn" -- photos that appeared to show breastfeeding, but not of infants. So Facebook modified the rule to say the individual being breastfed had to be an infant. But how does Facebook determine who is and who is not an infant? We're right back to the definitional problem. The original rule Facebook put in place was "does the kid look old enough to walk?" -- which raises other problems, since many kids breastfeed long after they can walk. Facebook had to keep amending and changing. It eventually allowed one (just one) nipple/areola showing if it appeared related to breastfeeding... and then, after some time, a second one could be shown.

But as Radiolab documented, every time you set a definition, a new exception comes up. In the midst of the breastfeeding mess, this happens:

Literally every time this team at Facebook would come up with a rule that they thought was airtight--ka-plop--something would show up that they weren't prepared for, that the rule hadn't accounted for.

As soon as you think, "yeah, this is good," the next day something shows up to show you that you didn't think about this.

For example, sometime around 2011, this content moderator is going through a queue of things--accept, reject, accept, escalate, accept--and she comes upon this image: the photo was of a teenage girl, African by dress and skin, breastfeeding a goat -- a baby goat. And the moderator throws her hands up and says, "what the fuck is this?"

And we Googled breastfeeding goats and found that this was a thing. It turns out it's a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries -- in a drought, a known way to help your herd get through it is, if you have a woman who's lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there's nothing sexual about it.

.... And theoretically if we go point by point through this list: it's an infant--it sort of could walk so maybe there's an issue there--but there's physical contact between the mouth and the nipple. But (obviously) breastfeeding as we intended, anyway, meant human infants. And so, in that moment, what they decide to do is remove the photo. And there was an amendment, an asterisk, under the rule stating "animals are not babies." So in any future cases people would know what to do.

This then raised new problems and so on and so on.
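The pattern the episode describes -- a base rule that keeps accreting amendments and asterisks -- is easy to see if you sketch it as code. Here's a purely illustrative toy decision function; the field names and branches are my invention, not Facebook's actual policy:

```python
# Toy sketch of a moderation rule as it accretes amendments.
# All field names and logic are hypothetical, for illustration only.

def violates_nudity_policy(photo: dict) -> bool:
    """Return True if the (hypothetical) rules say the photo comes down."""
    # Original definition: nudity means a visible nipple/areola.
    if not photo.get("nipple_or_areola_visible"):
        return False

    # Amendment: breastfeeding photos are allowed...
    if photo.get("breastfeeding"):
        # ...asterisk: "animals are not babies" -- the nursling
        # must be a human infant for the exception to apply.
        if photo.get("nursling_is_human_infant"):
            return False
        return True

    # Default: visible nipple/areola is nudity.
    return True


# The Kenyan goat photo: breastfeeding, but not a human infant,
# so the amended rule says remove it.
goat_photo = {
    "nipple_or_areola_visible": True,
    "breastfeeding": True,
    "nursling_is_human_infant": False,
}
print(violates_nudity_policy(goat_photo))  # → True (removed)
```

Every new edge case forces another branch, and every branch invites the next exception -- which is exactly why the rules never stop changing.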

And so consider me not at all surprised that Facebook is still facing this very same issue. Late last week there were reports in Australia of some (reasonably) outraged people who were angry that Facebook was taking down a series of ads featuring breast cancer survivors.

Facebook has come under fire from outraged breast cancer awareness groups after it banned online advertisements that featured topless survivors, claiming they violated the platform’s nudity policy.

The Breast Cancer Network of Australia (BCNA), in partnership with Bakers Delight, launched its annual Pink Bun Campaign yesterday to raise awareness and money for charity.

As the article notes, the ads showed "10 topless breast cancer survivors holding cupcakes to their chests". In another article Facebook gives its reasoning, which again reflects much of the history discussed above:

Facebook said it rejected the ads because they did not contain any education about the disease or teach women how to examine their breasts.

It said since the ads were selling a product, they were held to a higher standard than other images because people could not block ads the way they could block content from pages they followed.

So, clearly, the rule has evolved over time to include some sort of amendment saying that there needs to be an educational component if you're showing breasts related to breast cancer (remember: years back, Facebook had already declared that mastectomy photos are okay, and at least some of these ads do show post-mastectomy photos).

The charity in question is furious about this and calls the whole thing "nonsensical," but it's actually the opposite of that. It's totally "sensical" once you understand much of the history, and the fact that Facebook keeps having to change and adapt these rules, often multiple times a month, to deal with the "new" cases that keep showing up that don't quite match. And you could (and many do) argue that it's "obvious" why these ads should be allowed, but that forgets that the company can't just rely on something being "obvious." It employs over 10,000 people who are in charge of making these decisions, and what's obvious to one of them may not be obvious to another. And thus it needs clearly spelled out rules.

And those rules will never encompass every possible situation, and we'll continue to see stories like this basically forever. We keep saying that content moderation at scale is impossible to do well, and part of that is because of stories like this. You can't create rules that work in every case, and there are more edge cases than you can possibly imagine.

Filed Under: breast cancer, breastfeeding, breasts, content moderation, content moderation at scale
Companies: facebook


Reader Comments



  • identicon
    Anonymous Coward, 6 May 2019 @ 10:04am

    And if it's hard enough to codify moderation rules for humans to follow, how can we expect an automated filter to be able to do the job?


  • identicon
    Scote, 6 May 2019 @ 10:12am

    " Like a teenaged heterosexual boy, it appears that Facebook has no clue how to deal with naked female breasts."

You can't blame that entirely on Facebook. Rather, it is society that is erratic about the display of breasts, where male nipples are OK to display in public, but female nipples, even if they look exactly like male nipples, are not. It's not that moderation doesn't scale; it's that our society's rules about nipples are ridiculous.


    • identicon
      Anonymous Coward, 6 May 2019 @ 10:25am

      Re:

      The flaw with content moderation is that one person is deciding what another person can or cannot see. No company can get it right, as people differ on what they think other people should be allowed to see.


    • identicon
      Anonymous Coward, 6 May 2019 @ 1:03pm

      Re:

It's not that moderation doesn't scale; it's that our society's rules about nipples are ridiculous.

      That is exactly the reason that moderation doesn't scale. Once you get all 7.5 billion people to agree on what's acceptable, then content moderation will be easy... because no moderation will be necessary.


  • identicon
    bob, 6 May 2019 @ 10:13am

    Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts

    they just suck at it?


    • icon
      Stephen T. Stone (profile), 6 May 2019 @ 10:35am

      In re: "They suck at it"

I think photos of breastfeeding are okay.


    • icon
      Jeffrey Nonken (profile), 6 May 2019 @ 10:35am

      Re:

      Ah hah hah hah. I was gonna downvote you, then I Saw What You Did There.


      • identicon
        bob, 6 May 2019 @ 2:45pm

        Re: Re:

That's a very comforting, snuggly feeling; I wasn't sure if people would. Definitely FB tries, and any moderation team should get a lot of credit even if they aren't producing the results others cry for.

        But my attempt at playing with the subject matter, undoubtedly would tweak some people. We can't all be like babies in life, you sometimes just need to latch onto it and suck out every drop of truth or you miss a key part of growing. It's a good nurturing function that I wish more people would grab hold of and tease till they are completely satisfied.

        But alas these things come in all shapes and sizes. Some don't like to be pulled and squished the same way as others. But in the end, we all get longer with age.


  • icon
    John85851 (profile), 6 May 2019 @ 10:20am

    What about artwork

    Then there's the issue of whether nudity is acceptable in artwork. If that happens, who will be the judge as to whether the artwork is "artistic"?

On a related note, I went to Florence recently and took pictures of Michelangelo's David (in the Accademia) and of the Sistine Chapel (showing a nude Adam). Yet the photos are still up!
So either someone at Facebook recognizes the photos as classical artwork, or the photos don't have naked female breasts, or (more likely) no one complained that they were "indecent".


    • icon
      Mike Masnick (profile), 6 May 2019 @ 10:23am

      Re: What about artwork

      The Radiolab podcast linked above actually discusses that. Facebook had to write into the rules a "definition" for what is art when it comes to naked breasts...


  • identicon
    Anonymous Coward, 6 May 2019 @ 10:27am

    Facebook definitions

    So when will facebook be publishing their dictionary?



  • identicon
    Anonymous Coward, 6 May 2019 @ 10:36am

    like a dog chasing its tail

The central core of the problem could be Facebook's progressivism. Why not just go by the FCC's much more conservative traditions on what you can and can't show regarding nudity on entertainment television broadcasts on public airwaves? That seems to be pretty much the way Facebook started out, since those rules have always been relatively simple and comparatively constant, and concepts such as "fairness" and "inclusiveness" have never been a major goal of the FCC's rather staid "my way or the highway" censorship policies.

    Perhaps it's not unlike the way that more and more letters keep having to get added to "LGB" and ever more creative types of genders keep having to get added to the trans-gender definition. Because the moment someone finally figures it all out and establishes a completely and universally inclusive standard, someone will crawl out of the woodwork and complain "but what about me? It's so unfair!!!"

    and the cycle continues ...


    • icon
      TKnarr (profile), 6 May 2019 @ 10:55am

      Re: like a dog chasing its tail

Because it isn't Facebook's progressivism so much as its users' progressivism conflicting with other users' lack of progressivism. And unfortunately there's no "Don't show my content to conservatives" setting for the progressive users to use.

      Most of the FCC's rules are around how to deal with content in a medium defined to be child-safe (at least during certain times). The Internet in general is simply not child-safe, never has been, never should be. In fact it's not adult-safe either (I've seen things that make Goatse look like a pleasant daydream), and the variation in users is so great that generating rules for content can only be done on the consumer's end where they only have to satisfy one or a relative few users.


    • icon
      Stephen T. Stone (profile), 6 May 2019 @ 10:58am

      Public airwaves are “government ‘property’ ”; Facebook servers are not. Whereas the “moderation” of public airwaves will always trend towards conservative values due to the nature of that particular beast, private entities can have whatever values they want so long as they do not break the law.

      Oh, and by the way: Your “joke” about queer people is sad. “Oh noes, we have to keep finding new ways of changing the language for reasons of inclusivity!” The issue there is…what, exactly? We alter our language patterns all the time; the word “thick” evolved into the memetic variant of “thicc” because enough people altered their own pattern and pushed the new variant into wider usage. The language surrounding sexual identity is constantly evolving toward a more “accurate”, more personally considerate vocabulary (e.g., “cisgender” instead of “normal”). If the evolution of that language bothers you, perhaps you should consider how badly the issue affects you…and whether that issue is the most important one in front of you at the moment.


    • identicon
      Anonymous Coward, 6 May 2019 @ 2:25pm

      Re: And no I won’t get off your lawn

      To answer your extremely stupid question.

      Because it’s not the 1950’s anymore.


  • icon
    Zgaidin (profile), 6 May 2019 @ 10:39am

    Which is, more or less, why large scale social media isn't likely to last and probably isn't good for us, or maybe we're not good enough for it. You can't cram a billion people from all over the world into one gigantic room (digital or otherwise) and not expect endless problems. It's why Reddit will probably outlast Facebook. If I don't like a specific topic, I don't ever go to the subreddit for it. Why would I? That, in turn, allows each subreddit to mostly moderate its much smaller userbase as it sees fit. They never have to try to make one size fits all content rules because they didn't cram a billion people in one big room. They made a bunch of rooms, let users make an endless supply of new rooms, and then let them wander freely between rooms. Meanwhile Facebook, by its very nature, can never escape the hunt for one size fits all, because it's just one big room.


    • identicon
      Anonymous Coward, 6 May 2019 @ 10:44am

      Re:

      Meanwhile Facebook, by its very nature, can never escape the hunt for one size fits all, because it's just one big room.

Not really, as various groups exist on Facebook, and people decide who to follow. It could and should be more flexible than Reddit when it comes to people associating with each other.


      • icon
        Zgaidin (profile), 6 May 2019 @ 11:17am

        Re: Re:

        I found out about this afterwards (shows you how long it's been since I left Facebook). That's a step in the right direction, but I doubt it's particularly successful long term, at least so long as Facebook continues to try to create a globally family/child friendly environment since there's no universally agreed upon definition for it.


    • icon
      Stephen T. Stone (profile), 6 May 2019 @ 11:03am

      Which is, more or less, why large scale social media isn't likely to last and probably isn't good for us, or maybe we're not good enough for it.

      Pretty much both, yeah. Humanity was not ready — and may never be ready — for the kind of communications made possible by Twitter, Facebook, and their ilk. The ideal, at least for me, is an old-school forum with a cap on userbase size to prevent things from getting too out of hand in re: moderation and community bullshit (with an optional chatroom for “live” communications). Discord is about the closest modern equivalent, at least to my knowledge and usage of it.


      • identicon
        Anonymous Coward, 6 May 2019 @ 1:09pm

        Re:

The millennials and younger should be the ones to make those sorts of decisions, as they are the ones growing up with global communications. The rest of us grew up with limited-range communications, and while some have adapted to global communications, others have gone over the top pushing their agendas.


      • icon
        Bamboo Harvester (profile), 6 May 2019 @ 3:19pm

        Re:

        You've hit on one of the two big problems - the sheer size of the community. Etiquette is a result of a culture defining the social norms it will tolerate, and those items which it will not.

        When dealing with global communications, you're going to have clashes, sometimes severe, of what the different groups find acceptable.

        The other problem is the willingness to let a tiny fraction of a population dictate special rules for their particular case.

        Which means the one-armed, left-handed, gravitationally challenged, brain damaged individual who "identifies" as a diseased goat's penis (aka: Jhon) gets to sue everyone else for not conforming to "it's" desire to define "normal".

        BTW, "normal", just like "sane" is whatever 51% of the polled population says it is.


  • identicon
    Anonymous Coward, 6 May 2019 @ 10:52am

And we Googled breastfeeding goats and found that this was a thing. It turns out it's a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries -- in a drought, a known way to help your herd get through it is, if you have a woman who's lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there's nothing sexual about it.

    TIL

If we didn't live in a regressive society that always views nudity as inherently immoral/sexual in nature we'd be a lot better off (we could learn a lot from the "naturist" movement of nudism).


  • icon
    Uriel-238 (profile), 6 May 2019 @ 11:13am

    Nudity taboos meet the Sorites Paradox

Nudity is offensive when someone feels it is, but different people find different levels of nudity offensive -- case in point: naturist camps, in contrast to hijab mandates (and, in the West, the offense once taken at referring to "legs" rather than the more appropriate "limbs").

Which makes the subject of nudity an apt topic for the Sorites Paradox in its most common form: what is the threshold of a heap of sand? At what point is it one grain away from no longer being a heap?

    For thresholds we feel rather than define in discreet terms, it's going to very from person to person. Judges know porn when they see it, but some judges see porn where others do not. Curiously, the art collections of the Vatican serve as an anthropological history of fine artists and offended clergymen with different works in varying states of censorship. (Fig leaves were added later not to cover genitals but the blank spot where they were chiseled away).

    What it means here: Even if we outline terms of nudity that a computer can check for and follow (with trees of rules and exceptions and exceptions to exceptions) someone's going to be offended that something is too much or too little and will disagree with the next person, both of whom will insist they represent a societal norm.

    Maybe NSFW gates that can be set in personal settings to be all-on / all-off / choose by visited account / choose by picture? I'm just guessing that might be the best compromise.


    • icon
      Uriel-238 (profile), 6 May 2019 @ 11:16am

      Ugh...spelling errors.

      For thresholds we feel rather than define in discreet terms, it's going to vary from person to person.

      Because homophones confuse the brain.


      • icon
        Thad (profile), 6 May 2019 @ 12:09pm

        Re: Ugh...spelling errors.

        Since you brought it up: you also used "discreet" when you meant "discrete".


        • icon
          Uriel-238 (profile), 6 May 2019 @ 1:50pm

          Discretion

          In fact discrete and discreet are two of my favorite words to confuse.


        • icon
          stderric (profile), 6 May 2019 @ 5:56pm

          Re: Re: Ugh...spelling errors.

Weirdly enough, my discrete mathematics professor always handed back our exams in class from highest to lowest score, so that we all knew how well/poorly each other did. To this day, I don't know if it was a brilliant joke or gratuitous cruelty.


          • identicon
            Anonymous Coward, 6 May 2019 @ 11:24pm

            Re: Re: Re: Ugh...spelling errors.

            I once had a math teacher who said she didn't bother with an answer key but instead just used my exam to correct the others.


    • icon
      stderric (profile), 6 May 2019 @ 5:44pm

      Re: Nudity taboos meet the Sorites Paradox

      ...but some judges see porn where others do not.

      Except in Alex Kozinski's chambers. They all saw porn there.


  • identicon
    christenson, 6 May 2019 @ 11:56am

    Gonna remain impossible, until...

    We start asking and grouping users into what they want....
    We allow those answers to change according to circumstance....
    We recognize that CONTEXT matters...(exact same content is or is not "bad" depending on how it is framed)
    We recognize that fame is also a criterion... think of Goatse...famous enough that even if I disapprove of him, he is a subject of general discussion and at least some of his pictures need to be available!


  • identicon
    Anonymous Coward, 6 May 2019 @ 12:30pm

    "animals are not babies"

    And there was an amendment, an asterisk, under the rule stating "animals are not babies." So in any future cases people would know what to do.

    Of course humans, as everyone knows, are animals, so we can guess what the next asterisk might be.


  • icon
    Ben (profile), 6 May 2019 @ 1:13pm

    and internationally speaking?

So far the article and the discussion seem quite US-oriented (which is understandable given the nature of Techdirt and the broad distribution of the audience here). However, Facebook has a much larger audience than just the US, so the suggestion, for example, that FCC rules could or should apply is entirely inappropriate. Indeed, much of what the FCC permits is deeply offensive when seen through Malaysian or Arabian or Mongolian eyes (just examples, not meant as either an inclusive or an exclusive list of places where culture differs from the US), and there is no doubt such cultures would accept content USians would quail to see.
In the end, I think the only answer to content moderation at scale is to leverage the user base as a starting point, accepting that a) determined bad actors can maliciously 'moderate' content away, and b) occasionally things will be missed from your personal point of view.


  • identicon
    Anonymous Coward, 6 May 2019 @ 1:36pm

    Companies who make baby formula sexualized the breast to increase their market share.


    • icon
      Uriel-238 (profile), 6 May 2019 @ 2:28pm

      Sexualization of the breast

Humans are unique among primates in exhibiting breasts even when not lactating. It's one of those things zoologists flip out over in sexy space alien girls in space opera clearly designed for the human male gaze: human boobs are about as species-specific as a giraffe's neck.

      We're pretty sure men have been sexualizing knockers since women have had them. So maybe when we were Australopithecus afarensis? It seems to correlate with walking upright concealing the vulva and nipple stimulation becoming part of foreplay.

      We don't consistently sexualize them in the bigger-is-better fetish stereotypical of the US, Australia and Japan. Rather breasts indicate age and nubility by development and sag. The primal offspring-seeking male brain wants protuberant minimally sagging mammae and a high hip-to-waist ratio.

It's also why women in their teens and early twenties are often cast by Hollywood as foils to men twice their age in action and suspense thrillers. Their primary quality is breedability.


    • identicon
      Anonymous Coward, 6 May 2019 @ 2:29pm

      Re: 🍅🍅

Congratulations on saying the dumbest thing on the internet today.


      • identicon
        bob, 6 May 2019 @ 11:18pm

        Re: Re: 🍅🍅🍅🍅🍅

Uriel is right: humans sexualize female human breasts way too much. Breasts exist for a biological reason: to feed infants.

The fact that people enjoy them for purposes of arousal isn't bad, but it's easy to go overboard on the practice as well.


        • icon
          Uriel-238 (profile), 7 May 2019 @ 1:41am

          "Way too much"

          WTF?

Boobs and stuff get us horny because that's what's propagated the species for millions of years. I'm pretty sure those who failed to go overboard died off before we even left the oceans, let alone developed hands or breasts.

Now yes, we live in a culture that has been defined for fifteen hundred years by a church that controlled the laity by withholding sex except through relations it specifically condoned, and as a result of centuries of suppression we're very hung up about sex. Hence we fear we lust too much, when medically and interpersonally we would be better off having more sex than we do.

We cover ourselves up for modesty's sake because that's a norm we've established. But in doing so, it only exacerbates our fixation on the physical qualities of others and our desire to see them. In contrast, the excitement quickly wears off in nudist and clothing-optional societies. In those societies we also have healthier, more realistic expectations of what normal human bodies look like.

          So no, I think our norms should be way more relaxed than they are. We'd be healthier and not freak out over seeing other people's bits. Fewer people would have to adhere to uncomfortable dress codes and Facebook would have fewer problems to solve.


          • identicon
            bob, 7 May 2019 @ 10:57am

            Re: "Way too much"

But in doing so, it only exacerbates our fixation on the physical qualities of others and our desire to see them.

            This is what I mean by way too much. Sure people find them attractive, I am one of them. And yes that is a good thing. I was just stating some people take that fixation too far.

    Anonymous Coward, 6 May 2019 @ 2:59pm

    "Too expensive" and "impossible" are not the same thing.

    USENET survived just fine until those who monetized their USENET audiences no longer had an interest in the very free speech that built those audiences. It still exists today as mostly a common carrier, though few have any interest in promoting its use. We don't have internet free speech because it appears we do not want it.

    Large social-media networks should become common carriers like the phone company. An internet company may be private, but the internet airwaves are public, and once a company connects to those airwaves, it should not be allowed to censor without a court order.

    Also, those who see senators tying Section 230 protection to political neutrality are not claiming that neutrality is mandated by 230, but rather that it is the senators' condition for continuing to extend that immunity.

      Stephen T. Stone (profile), 6 May 2019 @ 3:24pm

      Assume you own and operate a forum that is open to the public. (The niche/topic for the forum is irrelevant.) One day, someone joins your forum and immediately starts spamming White supremacist propaganda¹. None of their spam advocates for any illegal acts (thus making it legally protected speech), but all of it is at least distasteful. How would you feel if you could not immediately delete their postings and ban them from your forum because the government said “go get a court order first”?

      ¹ — Feel free to replace “White supremacist” with “homophobic”, “anti-Semitic”, “pro-Movie Sonic”, or any other adjective that describes the beliefs of an abhorrent group of people.

        Anonymous Coward, 6 May 2019 @ 4:14pm

        Re:

        Users can filter out the messages, and SPAM policies (like USENET's rules) can be content-neutral. Just restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message. This of course doesn't solve the problem of OTHERS reading the content, which is what people don't like. It's one thing to say you don't want to read something, quite another for you to say you don't want ME reading it.

        What we define as "hate" or "trolling" is subjective. That's what makes moderation impossible. Should we unplug their telephones and refuse to deliver their mail next? The USPS used to moderate the mail until it was illegal to do so. AOL used to have "guides" moderate its chatrooms until their volunteer status and free-account compensation ran afoul of wage laws.
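
        The scheme described above can be sketched in a few lines. This is a toy illustration only: the 50-post cap, the class names, and the muted-word sets are assumptions for the sketch, not anything USENET or any platform actually ran. The point is that neither mechanism ever inspects what a post says, only how many there are and what each individual reader chose to hide.

```python
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 50  # assumed cap; any value works, the policy ignores content


class RateLimiter:
    """Content-neutral server-side limit: X posts per user per day."""

    def __init__(self, daily_limit=DAILY_LIMIT):
        self.daily_limit = daily_limit
        self.counts = defaultdict(int)  # (user, day) -> posts so far today

    def allow_post(self, user: str, today: date) -> bool:
        key = (user, today)
        if self.counts[key] >= self.daily_limit:
            return False  # over the cap, regardless of what was said
        self.counts[key] += 1
        return True


def visible_posts(posts, muted_words):
    """Client-side filter: each reader hides only what *they* chose to mute."""
    return [p for p in posts
            if not any(w in p.lower() for w in muted_words)]
```

        Note the division of labor this implies: the server never judges content, and each reader's mute list affects only that reader's own view, which is exactly why it does nothing about other people seeing the post.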

          Stephen T. Stone (profile), 6 May 2019 @ 4:54pm

          SPAM policies (like USENET's rules) can be content-neutral

          How would a content-neutral automatic moderation policy that includes racial slurs differentiate between a post using the N-word as a racial slur and a post using the N-word as a discussion of the word itself/a news story where someone else said it? Until you can design a policy or automated system that can account for context and nuance, it will always find false positives and “censor” speech that would otherwise be unobjectionable.
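
          A toy blocklist filter makes the point concrete. Here "badword" stands in for any slur, and all the names are illustrative, not any real platform's code: a plain word match flags the abusive use and the news report quoting it identically, because it has no notion of use versus mention.

```python
BLOCKLIST = {"badword"}  # stand-in for any slur; illustrative only


def flags(post: str) -> bool:
    """Naive moderation: flag any post containing a blocklisted word."""
    words = post.lower().split()
    return any(w.strip('.,!?"') in BLOCKLIST for w in words)


abusive = "you are a badword"
news = 'The senator apologized for saying "badword" on air.'
# Both come back flagged: the filter cannot see context, so the
# news discussion is a false positive.
```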

          restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message

          I agree that users should have more controls over what does and does not show up in their timeline. But that alone does not solve the problem of the person posting the content in the first place, which — according to your proposition — could not be removed without a court order, which can take time and money you might not be able to spare.

          This of course doesn't solve the problem of OTHERS reading the content, which is what people don't like.

          The problem would not necessarily be others reading it. The problem would be the inability to prevent it from showing up, or proactively delete it after it shows up, because the government would have said you could not do so without first going to court.

          What we define as "hate" or "trolling" is subjective. That's what makes moderation impossible.

          Truly objective moderation is, has been, and always will be impossible. All moderation is subjective; the only difference between “rulesets” is who makes them and what the rules say is “outlawed” on a given platform. If you dislike the rules of one platform, you can leave it and go to another one — or make your own platform with blackjack and hookers and Futurama references.

          Should we unplug their telephones and refuse to deliver their mail next?

          Booting someone off Facebook is not the equivalent of denying them delivery of their mail.

            That One Guy (profile), 6 May 2019 @ 6:17pm

            'You first'

            You missed the best part of their comment: they advocated for a limit on the number of comments allowed per day, a limit that, as numerous comment sections have made clear, they would likely max out on very quickly, such that they could no longer post unless they tried to evade the very rules they're trying to foist on others.

            Also worth pointing out that they completely ignored your question/hypothetical, but I suppose, as that's par for the course for them, it's hardly surprising.

            Anonymous Coward, 6 May 2019 @ 6:50pm

            Re:

            SPAM policies (like USENET's rules) can be content-neutral
            How would a content-neutral automatic moderation policy that includes racial slurs differentiate between a post using the N-word as a racial slur and a post using the N-word as a discussion of the word itself/a news story where someone else said it? Until you can design a policy or automated system that can account for context and nuance, it will always find false positives and “censor” speech that would otherwise be unobjectionable.

            First they laughed at smokers who complained about the cigarette tax, and now they have a soda tax. Why stop at the N-word? Disability-based slurs are still mainstream. They'd have to be banned. Next up is poor-shaming ("winner" versus "loser"), etc. Content moderation is simply a SPEECH CODE. Since I'm against that, I say let users block it, and limit the ability to post to X per day if one feels overwhelmed, and buying another server isn't an option.

            restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message
            I agree that users should have more controls over what does and does not show up in their timeline. But that alone does not solve the problem of the person posting the content in the first place, which — according to your proposition — could not be removed without a court order, which can take time and money you might not be able to spare.

            With free speech, there is no "problem" with any legal speech that can't be solved by blocking and post-limiting (Google can afford a high limit of posts obviously). The "problem" with free speech is that other people can talk back and no one can commandeer it.

            This of course doesn't solve the problem of OTHERS reading the content, which is what people don't like.
            The problem would not necessarily be others reading it. The problem would be the inability to prevent it from showing up, or proactively delete it after it shows up, because the government would have said you could not do so without first going to court.

            What is the problem with free speech "showing up?" I think it's the opposite: proof that we are hearing all voices. Someone who doesn't like a TV show can change their channel, but to change MY channel they need it censored.

            What we define as "hate" or "trolling" is subjective. That's what makes moderation impossible.
            Truly objective moderation is, has been, and always will be impossible. All moderation is subjective; the only difference between “rulesets” is who makes them and what the rules say is “outlawed” on a given platform. If you dislike the rules of one platform, you can leave it and go to another one — or make your own platform with blackjack and hookers and Futurama references.
            Should we unplug their telephones and refuse to deliver their mail next?
            Booting someone off Facebook is not the equivalent of denying them delivery of their mail.

            It's not quite mail denial, but it does influence public discourse to a significant degree. I still see these complaints as more financial than political. Phone calls cost money because they use system resources. The internet could function the same way, with moderation built into the equation.

              Stephen T. Stone (profile), 6 May 2019 @ 8:37pm

              Content moderation is simply a SPEECH CODE.

              Yes, and this is a problem…why, exactly? Few places, if any, allow every type of protected speech “no matter what”.

              The "problem" with free speech is that other people can talk back and no one can commandeer it.

              …which is a problem if someone runs a platform meant for use by a marginalized community — LGBT people, for example — and said community comes under attack from assholes who would love to marginalize that group even further. Post limits and client-side filters will not change human behavior in the way you think it will, and they will not discourage the assholes more than solid moderation can and will.

              What is the problem with free speech "showing up?"

              The problem is that, within the framing of your “moderation by judicial order” plan, any speech that a platform does not want to host — regardless of whether it is protected by law, regardless of how the userbase at large acts — would be forced upon that platform for as long as the platform lacks a court order saying “you can delete that specific instance of that specific speech” or “you can ban this user and remove all their posts” or whatever.

              That even gets back to my original question, which you did not directly answer: How would you feel if, as a platform owner, you could not immediately delete content you absolutely did not want to host and ban the poster from your platform because the government said “go get a court order first”?

              I think it's the opposite: proof that we are hearing all voices.

              And the problem with this mindset is thinking all voices deserve to be heard and treated with equal respect. Someone who sincerely believes in the Flat Earth theory, for example, does not deserve the same respect as everyone else. To act as if they do because of some ridiculous “view from nowhere” mindset is to fool yourself into thinking all speech is “created” equal.

              it does influence public discourse to a significant level.

              This does not change the fact that using a platform such as Facebook is a privilege, not a right, and that privilege can be revoked if you violate the terms of service. Your “moderation through court order” system cannot exist within this framework. It would metaphorically spit in the face of every law, statute, and court ruling that says a platform for third-party speech is under no legal obligation to host any specific speech from any specific third party.

              Phone calls cost money because it uses system resources to call someone. The internet could function the same way, with moderation built into the equation.

              Yes, because every platform for third-party speech becoming a (likely expensive) paywalled service because they need the money for lawyers who can handle filing motions for moderation decisions with the court would totally be okay with everyone~.

                Anonymous Coward, 6 May 2019 @ 11:30pm

                Re:

                And the problem with this mindset is thinking all voices deserve to be heard and treated with equal respect.

                Some animals more equal than others?

                Equal access to public internet airwaves is not equal respect. My problem with censorship is that no one deserves that type of power.

                Those who can't stand the existence of the flat-earth society are welcome to block that content. Banning it would lead to a slippery slope where those who question all scientific dogma, like the Big Bang, are also banned. Same for other unpopular beliefs. It becomes groupthink.

                  Anonymous Coward, 7 May 2019 @ 2:42am

                  Re: Re:

                  Objection to "censorship": the topic under discussion is a private company, not a government entity. If company policy excludes a subgroup, that subgroup may find success in funding their own platform. If it were a government policy, (reference human history).

                  Stephen T. Stone (profile), 7 May 2019 @ 4:37am

                  When discussing sociopolitical ideologies? Yes, racist beliefs are less equal to anyone who does not believe, say, “I’m a Christian and my Christian beliefs are you don’t do interracial marriage. … [W]hen it comes to all this stuff you see on TV, when you see blacks and whites together, it makes my blood boil because that’s just not the way a Christian is supposed to live.” The same goes for Flat Earthers, anti-vaxxers, and anyone who sincerely enjoyed Batman v Superman. Some views are so toxic, so harmful, so absolutely ridiculous on their face that we need not treat them as having credibility.

                  Client-side blocking of such content is all well and good, but it still does nothing about the content being there in the first place. And if you explicitly do not want to host content both you and your userbase find distasteful (which would be within your rights), telling everyone else that you cannot do anything about it because “the government says I can’t” will not appease people who think you have cowed to those who post that content. Getting rid of the content would prevent your platform from being associated with it; leaving it up because “free speech” would prevent your platform from being associated with any speech but the distasteful content. 8chan became associated with GamerGate (among other things) because 4chan kicked the Gaters out; that 8chan is both celebrated by its users for its “free speech” ideals and considered a breeding ground for the kind of bullshit you find in manifestos left behind by mass shooters is no coincidence.

                  Oh, and as for the “banning speech” thing: Any platform not owned by the government has every right to ban whatever speech it wants. Facebook could ban advocacy for the Flat Earth theory later today and nothing — not a single goddamn thing — could be done to legally force Facebook into hosting Flat Earther content. Using a third-party platform is a societal privilege, not a legal right. If’n you hate the rules of that platform, go find one with a ruleset you prefer. I hear 8chan is still a thing…

      Anonymous Coward, 6 May 2019 @ 5:59pm

      Re:

      "Too expensive" and "impossible" are not the same thing.

      They're not. The problem is that vested interests in government and copyright enforcement keep trying to frame "impossible" as simply "too expensive" and insisting that the issue is everyone else not wanting to foot their bill.

        Anonymous Coward, 6 May 2019 @ 6:41pm

        Re: Re:

        Copyright law is more efficient than a zillion individual publishing contracts.

        You can't eliminate copyright protection because you'd have to abolish contract law. Copyright law is therefore more like divorce law, which is effectively a government prenup.

          Anonymous Coward, 10 May 2019 @ 2:33am

          Re: Re: Re:

          Unfortunately for you copyright law could be abolished today and contracts would still exist.

          Try again.

    Anonymous Coward, 6 May 2019 @ 7:11pm

    Rules are a lot simpler to define for traditional media. One reason is the cost of producing content. TV shows can cost millions of dollars per episode, and movies can be even more expensive. When there is that much money riding on a show, studios want to stay away from any gray areas in the rules. So with breasts, for example, shows will either cover up more than needed so that they don't accidentally show anything, or they will have many naked breasts attached to characters having sex to justify a mature rating. Either way, no one questions the rating the show was given.

    With Facebook, it is so easy to post photos that I believe millions of them were probably posted by accident. While Facebook does enforce rules, the penalty of a few photos getting removed is small. People will choose what photos they upload based on their own morality instead of thinking about what Facebook wants, and some people will even deliberately post pictures to show that the rules of Facebook, or its users, are arbitrary.

    Anonymous Coward, 7 May 2019 @ 4:15am

    It's why Reddit will probably outlast Facebook.

    crade (profile), 7 May 2019 @ 8:57am

    And they haven't even hit the problem of defining who is and is not "female" yet.

    Rekrul, 7 May 2019 @ 10:04am

    "When it comes to uploaded photos on Facebook, the vast majority of breastfeeding photos comply with our Statement of Rights and Responsibilities, which closely mirrors the policy that governs broadcast television, and which places limitations on nudity due to the presence of minors on our site."

    Um, I've seen full frontal female nudity on network TV at least twice. Both times it was in a miniseries. The first was back in the 70s or 80s. I forget what the miniseries was, but there was a nude native woman, her body partially covered with mud or some type of markings. The second time was in the 90s or early 00s in a miniseries about war. The scene was of women having their heads shaved in a concentration camp. I've also seen bare breasts fully exposed in documentary shows about breast cancer and reconstructive surgery.

    So, the rules governing broadcast TV aren't exactly clear either.

      Uriel-238 (profile), 7 May 2019 @ 11:44am

      I, Claudius

      I, Claudius, starring Derek Jacobi, featured a surprising amount of nudity and was essentially the Game Of Thrones of the seventies (albeit taking place in classical Rome). It was shown without censorship on PBS when I was first getting more buzz from sex than violence.

      More poisonings than stabbings, and without the benefits of classical Hollywood budgets or CGI but still enjoyable.

        Rekrul, 9 May 2019 @ 9:25am

        Re: I, Claudius

        Damn! I always used to see I, Claudius listed in TV Guide and never bothered to watch it once I saw that it was a drama about ancient Rome.


