Content Moderation At Scale Is Impossible: YouTube Says That Frank Capra's US Government WWII Propaganda Violates Community Guidelines

from the say-what-now? dept

You've heard me say it over and over again now: Masnick's Impossibility Theorem holds that it is literally impossible to do content moderation well at scale. There will always be dumb mistakes. The latest example? Rogue archivist Carl Malamud had posted filmmaker Frank Capra's classic Prelude to War on YouTube. If you're unfamiliar with Prelude to War, it's got quite a backstory. During World War II, the US government decided that, in order to build public support for the war, it would fund Hollywood to create blatant American propaganda. They had Frank Capra, perhaps Hollywood's most influential director of the 1930s, produce a series of films under the banner "Why We Fight." The very first of these was "Prelude to War."

The film, which gives a US government-approved history of the lead-up to World War II, includes a bunch of footage of Adolf Hitler and the Nazis. Obviously, it wasn't made to glorify them; the idea is literally the opposite. However, as you may recall, last summer, when everyone was getting mad (again) at YouTube for hosting "Nazi" content, YouTube updated its policies to ban "videos that promote or glorify Nazi ideology." We already covered how this was shutting down the accounts of history professors. And now, it's apparently leading YouTube to take US propaganda offline as well.

Malamud received a notice saying that the version of "Prelude to War" he had uploaded had been taken down for violating community guidelines. He appealed, and YouTube rejected his appeal, apparently standing by its decision that an anti-Nazi US propaganda film, financed by the US government and made by famed director Frank Capra... is against the site's community guidelines.

Obviously, this is ridiculous. But it once again highlights the impossibility of content moderation at scale. YouTube hosts so much content, with so much more uploaded every minute, that it can't possibly review it all. And for the content it can review, it can't have people researching the history and provenance of every clip. So, here, it appears likely that the moderation team used a fairly simple heuristic: it's a film showing Germans celebrating Hitler; therefore, it's not allowed. It's even entirely possible that the tone of the film made it clear that it was a propaganda film... it's just that it would take too much effort to figure out for whom it was propaganda.

Of course, as Malamud himself points out, what's particularly ridiculous is that this isn't the only version of the film he's uploaded. So while that one is still down, another one is still there. You can watch it now. Well, at least until YouTube decides this one also violates community standards.

Filed Under: carl malamud, content moderation, content moderation at scale, frank capra, masnick's impossibility theorem, nazis, prelude to war, propaganda, why we fight, world war ii
Companies: youtube


Reader Comments



  • Anonymous Anonymous Coward (profile), 15 Jan 2020 @ 7:49am

    How big is a 'community'?

    Articles like this tend to make me wonder what moderators consider a 'community'. Is it a 'community' of one? Two? Do they take things out of context and make assumptions about the whole (which cuts both ways, for both the content and the 'community's' desires)?

    There is probably some explanation in this case: even with the appeal, watching an entire 52-minute video would be beyond the scope of a mere moderator (or maybe even an appeal reviewer) who has to review and rule on many thousands of content objections in their meager 8-hour shift.

  • Anonymous Coward, 15 Jan 2020 @ 9:32am

    If content moderation at scale is impossible, then should YouTube and the other platforms just give up and let anybody post anything?

    Or maybe if they can't do a decent job moderating at their current size, then they are too big?

    • Anonymous Anonymous Coward (profile), 15 Jan 2020 @ 9:45am

      Re:

      It isn't just size; it is also about removing bias, bias that is inherent to all beings. Your flavor of bias is different from my flavor of bias, and both are different from that person's over there. So even if you break YouTube up into many, many, many different organizations (which would defeat much of its value to users), they would all still be big, and subject to bias. The number of needed moderators would not change; they would just be working for different organizations, which may or may not be able to quash the individual moderators' bias to some degree.

      Then there are the ways that governments impose themselves, each with their own view on what is or isn't acceptable, so what works here, doesn't work over there and both want their way to be the only way. And now they have many, many, many different organizations to monitor, and under the presumption that the number of uploads does not go down after breaking up YouTube, the scale of the task does not change.

      • Anonymous Coward, 15 Jan 2020 @ 10:11am

        Re: Re:

        Bias doesn't bother me. National Review and The Nation have different biases (or points of view) and that isn't a problem.

        As far as not being able to employ enough moderators, I don't believe it. That just means they aren't paying them enough.

        • That One Guy (profile), 15 Jan 2020 @ 10:22am

          Re: Re: Re:

          As far as not being able to employ enough moderators, I don't believe it. That just means they aren't paying them enough.

          Then you should educate yourself on the matter, because when you've got literally days' worth of content uploaded per minute (a quick DDG search found an article that mentioned 300 hours per minute as of Dec 2017, a number which has almost certainly only gone up), 'just pay the moderators more/get more' is an absurd argument.

          • rangda (profile), 15 Jan 2020 @ 11:15am

            Re: Re: Re: Re:

            Some back of the napkin math should be fun.

            300 hours = 18,000 minutes per minute
            18,000 minutes per minute = 8,640,000 minutes per 8 hour shift.

            Let's assume that 10% of the content gets flagged by an automated system, that's 864,000 minutes of content that needs to get double checked per 8 hour shift.

            Let's further assume that a human can review on average 1 minute of content in 5 minutes.

            864,000 * 5 = 4,320,000 minutes worth of moderator time per 8 hour shift.

            We can assume our poor moderators get no break time at all (the floggings will continue until morale improves) and work 8 hours straight so that gives us 480 minutes of moderation time per moderator.

            4,320,000 / 480 = 9,000 moderators per shift.

            Based on this quick math, YouTube would need 27,000 moderators working nonstop to keep up with content circa 2017, assuming my optimistic numbers are even possible.

            And even if you had these 27,000 moderators you'd have to train them so they all have some idea what they are doing. And keep in mind that moderation rules might vary wildly from country to country and that might affect moderation[1]. You also need a system to moderate the moderators (who watches the watchmen?) to ensure that any individual moderator isn't applying some sort of inherent bias and not following your guidelines.

            [1] For example the USA and UK have radically different libel rules and what is considered libel in the UK would just be an opinion in the USA.
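The arithmetic above can be sketched directly. This is just a check of the comment's own stated assumptions (300 hours uploaded per minute, 10% flagged, 5 reviewer-minutes per minute of video, breakless 8-hour shifts, 3 shifts per day), not an official YouTube figure:

```python
# Back-of-the-napkin estimate of moderators needed, circa 2017.
UPLOAD_MIN_PER_MIN = 300 * 60   # 300 hours/min = 18,000 minutes of video per minute
SHIFT_MIN = 8 * 60              # 480 minutes in an 8-hour shift

uploaded_per_shift = UPLOAD_MIN_PER_MIN * SHIFT_MIN   # 8,640,000 min/shift
flagged = uploaded_per_shift * 0.10                   # 864,000 min to double-check
review_minutes = flagged * 5                          # 4,320,000 reviewer-minutes
mods_per_shift = review_minutes / SHIFT_MIN           # moderators per shift

print(int(mods_per_shift), int(mods_per_shift * 3))   # 9000 27000
```

Three rotating shifts gives the 27,000 figure used in the rest of the thread.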

            • bhull242 (profile), 15 Jan 2020 @ 11:32am

              Re: Re: Re: Re: Re:

              And that’s actually a conservative estimate.

            • Anonymous Coward, 15 Jan 2020 @ 11:51am

              Re: Re: Re: Re: Re:

              Moderation isn't a problem, it's an opportunity.

              If they want to keep things more or less in-house, then they could charge uploaders $1 / minute of video and then pay moderators $50 / hour to clear uploads. If they can't hire enough moderators at $50 / hour all they have to do is increase the fee and moderation pay until the upload rate equals their moderation rate.

              Alternatively, they could close YouTube from end-user uploads entirely and require everything to go through partner clearing houses. The partner would guarantee nothing violates YouTube guidelines and it would be up to each clearing house to figure out a business model that works.

              • Anonymous Coward, 15 Jan 2020 @ 12:14pm

                Re: Re: Re: Re: Re: Re:

                And both proposals silence the majority of people. You would turn YouTube and all other platforms into publishers that accept a fraction of a percent of submitted content.

              • Anonymous Coward, 15 Jan 2020 @ 12:16pm

                Re: Re: Re: Re: Re: Re:

                You apparently did not read or understand any of what was posted in reply to your first post.

                In a nutshell, moderation that does not suck is impossible.

                You want to charge users for the privilege of being censored by humans rather than bots? Scalability problems, you say? Sure, just hire more people. Eventually you will reach equilibrium, because all those censors will be at work rather than online. How much would such a thing cost? Probably nothing to do with the actual cost of doing business, as it would be mandatory and face no competition. I imagine the price would be prohibitive to the point that only rich folk would be online. Is that their real goal?

              • Stephen T. Stone (profile), 15 Jan 2020 @ 12:54pm

                they could charge uploaders $1 / minute of video

                Alternatively, they could close YouTube from end-user uploads entirely and require everything to go through partner clearing houses.

                At which point YouTube bites the dust. The whole point of YouTube is that you can upload a video for free and share it immediately(-ish). Putting up a paywall, holding back content for an arbitrary time delay, forcing users to go through third parties just to post a video, or doing any combination of the three would discourage practically anyone who isn’t either rich as fuck or working for a major media conglomerate from uploading content to YouTube. You might think YouTube could survive with those kinds of policies intact, but you’re wrong.

                YouTube lives and dies based on “regular” people using the service, either to upload or to watch. If those people stop using the service (uploading or watching), no amount of deals with major media conglomerates is going to save YouTube from extinction.

                If they can't hire enough moderators at $50 / hour all they have to do is increase the fee and moderation pay until the upload rate equals their moderation rate.

                Paying moderators more money will run up YouTube’s costs to a point where even YouTube couldn’t afford to pay all its moderators. Besides, higher wages can’t and won’t guarantee quality moderation based on rules that everyone will always interpret the same and follow to the letter. Neither will tossing more bodies at the problem.

              • That One Guy (profile), 15 Jan 2020 @ 1:26pm

                Re: Re: Re: Re: Re: Re:

                Never fails to amuse when someone posting anonymously on a free platform suggests requiring payments to upload content, which would also require stripping anonymity.

                • Scary Devil Monastery (profile), 16 Jan 2020 @ 3:48am

                  Re: Re: Re: Re: Re: Re: Re:

                  "Never fails to amuse when someone posting anonymously on a free platform suggests requiring payments to upload content, which would also require stripping anonymity."

                  And there's exactly ONE person I've observed suggesting exactly that, time and time again, for many years.

                  Looks like Bobmail/Blue/Jhon is temporarily out of his Hamilton suit.

                  • PaulT (profile), 16 Jan 2020 @ 4:02am

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    My favourite is always when one of these guys loads up their FOSS browser and comes to a site built on FOSS and free protocols in order to tell everybody that free software is useless.

                    • Scary Devil Monastery (profile), 16 Jan 2020 @ 6:17am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re:

                      "My favourite is always when one of these guys loads up their FOSS browser and comes to a site built on FOSS and free protocols in order to tell everybody that free software is useless."

                      Oh yes. We used to have a lot of fun back on Torrentfreak when Bobmail/Hamilton/Jhon/Blue used to harp about how no one would ever use open source commercially at a time when some 95% of the world's servers ran Apache.

                      Ironically, that was in the period when he was steadfastly trying to argue from self-appointed authority as an "expert" on IT technology and law. Which was, I must say, not exactly the smartest move for that dimbulb to pull on a site where half the audience consisted of actual experts in IT and law...

                      Haven't seen him try to pull that card much around here although he's still delivering his false assumptions and broken logic with the same casual arrogant assurance he did back then - when he isn't having one of his Hamilton episodes, that is.

                      It's almost guaranteed that there isn't more than one person spamming down all those posts. He tried the same sockpuppeting back then and was caught doing so more than once.

              • Anonymous Coward, 15 Jan 2020 @ 3:39pm

                Re: Re: Re: Re: Re: Re:

                That sounds like a good way to cripple YouTube and clear the space for someone else. As for moderating content at scale... not so much.

              • Scary Devil Monastery (profile), 16 Jan 2020 @ 3:46am

                Re: Re: Re: Re: Re: Re:

                "If they want to keep things more or less in-house, then they could charge uploaders $1 / minute of video and then pay moderators $50 / hour to clear uploads."

                So in other words, youtube ends as a concept.

                Your assumption still tries to make YouTube pay, by the extremely conservative napkin estimate provided by rangda, above, 1 million USD per day to moderate the content, after the uploader has paid their dues.

                And do note that out of those "hired moderators", half of them would be online trolls eager for a chance to fuck shit up for money. Unless you also hire managers to oversee their moderation and educators to train them in what amounts to a full-time job.

                "Alternatively, they could close YouTube from end-user uploads entirely and require everything to go through partner clearing houses. The partner would guarantee nothing violates YouTube guidelines and it would be up to each clearing house to figure out a business model that works."

                Ah, look, the old "Let's give all the power back to the Guild of Stationers" argument. Welcome back, Baghdad Bob.

                Seriously, Blue, don't you get it by now? The copyright cult propaganda still won't fly, even if you wrap it in polite language.

            • Code Monkey (profile), 15 Jan 2020 @ 12:46pm

              Back of the napkin math...

              Still waiting for AI to take over content moderation. At scale.

              {shudders}

              • Anonymous Coward, 18 Jan 2020 @ 11:14pm

                Re: Back of the napkin math...

                That's your job. That will be your legacy! Code monkey gave birth to AI moderation!

            • urza9814 (profile), 16 Jan 2020 @ 1:35pm

              Re: Re: Re: Re: Re:

              You COULD hire those 27,000 moderators and pay them $30k/year (exactly average for a US wage, although most of these moderators are in third-world countries anyway) for around $2.2 million per day. YouTube doesn't release much about their finances, but the only estimate I can currently find* puts their profits at $1.9 million per day back in 2013. Presumably their profits are higher now, just as the amount of video uploaded certainly is.

              I don't think that's so clearly impossible. It would certainly consume the majority of their absurdly massive profit margin, but it might still be feasible. They could certainly do significantly more than they do right now. You say they need 27k moderators to actually moderate effectively; they currently seem to have about 2k total employees and PLENTY of money to hire more if they actually wanted to.

              I'm also not sure I'm buying that 10% of the content is getting flagged and disputed. According to Google's transparency report, over three months they removed 8.7M videos. The average YouTube video appears to be just under 12 minutes long. So at 300 minutes per minute, you get ~39 million minutes of video uploaded over three months, and 1.7 million minutes of video removed. That's 4%, not 10%, and human moderators would only have to review the subset of these videos that later get disputed. So hire 15k moderators and pay them $50k/year, YouTube can probably afford that and that probably would be enough to verify absolutely everything that gets flagged and disputed.

              Of course, there's probably ways to MASSIVELY reduce that requirement too. I wouldn't really care if they refused human moderation until you reached a certain threshold of views/videos/subscribers for example...gotta wonder how many of the current removals are trolls/bots/etc who create an account just to upload that one video...could also have trusted channels who can skip the review or stop allowing human reviews if your channel has reached a certain threshold of failed disputes...

              Finally, consider that right now YouTube has very little incentive to reduce false-positives, because it doesn't really cost them anything as long as the rates aren't high enough to drive off the users. If they had to actually pay someone to review all those false-positives, they might start looking for ways to improve that particular aspect of their algorithms...
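The salary figure above checks out on the comment's own assumptions (27,000 moderators at a $30k/year average wage; whether those assumptions are realistic is exactly what the replies below dispute):

```python
# Rough daily payroll for the proposed moderation workforce.
moderators = 27_000   # the thread's estimate of moderators needed
salary = 30_000       # assumed average annual wage, USD

per_day = moderators * salary / 365   # total payroll spread over the year
print(round(per_day / 1e6, 1))        # 2.2 (million USD per day)
```

Note this counts salaries only: no training, management, benefits, office space, or the review infrastructure itself.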

              • Anonymous Coward, 16 Jan 2020 @ 2:01pm

                Re: Re: Re: Re: Re: Re:

                Just how do you define moderation rules that are suitable for every nation, ethnic and religious group?

              • bhull242 (profile), 16 Jan 2020 @ 4:22pm

                Re: Re: Re: Re: Re: Re:

                Did you miss the part about how you cannot just throw money and people at the problem? People have their own biases and opinions, and it’s impossible to satisfy everyone.

                • urza9814 (profile), 17 Jan 2020 @ 6:06am

                  Re: Re: Re: Re: Re: Re: Re:

                  Who says the only goal of moderation is to "satisfy everyone"? The goal of moderation is to enforce some set of rules. YouTube can set whatever rules it wants; it's a private website. The only point that matters is whether or not they are actually capable of enforcing the rules that they implement, or if they're going to be shutting people down basically at random because they don't actually bother to verify "violations".

                  • bhull242 (profile), 17 Jan 2020 @ 12:52pm

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    My point was that a lot of these rules involve subjective judgements that not everyone will agree on. In other words, reasonable people could disagree on whether certain videos do or do not follow the rules/guidelines/ToS set by the platform/service. Additionally, one must also consider nuance, which can be difficult or impossible for either an algorithm or a human moderator to consider every time.

                    But based on what you said, you’re focused on the ones that aren’t exactly close calls. Okay, let’s tackle this from a different angle. (And for the sake of argument, we’ll be looking only at cases that objectively follow/break the rules.) Even if the moderation was 99.9% effective for such videos, there would still be a lot of mistakes (both false positives and false negatives) that get through, and those mistakes tend to get noticed disproportionately compared to the times they get it right. No matter how many people, how much money, or how good an algorithm you use, it is fundamentally impossible to have a 100% success rate (0 false positives and 0 false negatives). And even if you get as close to 100% effectiveness as reasonably possible, a lot of mistakes will still pop up. And when even one mistake in either direction can cause a media circus…

                    As for lacking incentives, you’re not entirely right there. YouTube makes less money off of ad-free or demonetized videos than the videos with full ads, and they make no money off of videos that have been removed entirely. So false positives do reduce YouTube’s revenue. False negatives, of course, create their own problems due to ad companies threatening to or actually deciding to remove support. You’re still not entirely wrong as the incentives do tend to lean more in favor of eliminating false negatives than to preventing false positives, but it’s untrue to say that they have very little incentive to prevent false positives. It’s just that they have more incentive to go the other way.

                    So yes, it’s fundamentally impossible to enforce any set of rules without getting a sizable number of false negatives and/or false positives, even with manual enforcement. People make mistakes, after all, especially when they have to go through a lot of videos to make subjective, context-sensitive judgements on each. Now, it’s possible for a smaller service to get lucky and not get any errors by virtue of fewer chances to make errors and smaller stakes for getting it wrong a few times, but the issue is moderation at scale. As the amount of content (both in terms of number of videos and amount of time, both in total and per time period) and the number of users and outside viewers grow, the likelihood of failure grows even if the amount of money and number of moderators grow, too.

                    Plus, with more moderators, you get increased chances of inept moderators, bad actors among moderators, and (even among adept, motivated, well-meaning moderators) an increased chance at inconsistent moderating decisions being made. You could hire overseers for the moderators, but that has its own problems, and it throws a monkey wrench into your calculations. Then there’s the law of diminishing returns. Even if adding new moderators had more benefits than downsides, with each new moderator, the amount of benefits provided by adding another moderator go down. The costs, on the other hand, are pretty constant, so soon enough the costs will outweigh the benefits. And, of course, there’s still the fact that the flagging system itself creates false negatives as well, which occurs before any human employee gets involved in the process, leading to some videos that clearly violate the rules that will never be removed without some outside interference. And that gets even worse when you consider the false negatives from the idea that only those flagged videos that are actually disputed get reviewed by human moderators.

                    Finally, there’s a pretty significant issue with your calculations. See, you calculated that it would take an estimated $2.2 million per day to afford to hire 27k moderators (a conservative estimate of the number of moderators YouTube would need to hire just to watch all the content that gets uploaded and flagged on YouTube once without falling behind, assuming 8-hour days without breaks and a rotating shift schedule and that each person is equally qualified to moderate each and every video, not to effectively moderate every video, which would require additional time and some breaks). You noted that the only available information on YouTube’s revenue or profit estimates that it earned about $1.9 million per day in profits in 2013, but claim that that has surely increased since then. Of course, this is an unsupported assumption that ignores that quite a bit has happened since then that would tend to reduce YouTube’s profits, such as the so-called Adpocalypse (and its sequel) and added costs from additional features and removed or reduced sources of revenue in some instances.

                    Could they improve? Possibly. But the fact is that it’s impossible to moderate at scale without making a fair number of mistakes. Plus, I haven’t found any statistics regarding the number of false positives based on videos removed regardless of whether they’re reinstated or not (or even after factoring in those that go through human moderation), so I cannot find any evidence that the number of false positives for any aspect of the moderation process comprise a substantial portion of the amount of content on YouTube, as a whole or per time period, or how many of the false positives that start to go through moderation get caught vs. those that make it all the way through. So I’m not even sure how effective the moderation actually is. Sure, there are definitely mistakes, but that’s inevitable given the sheer amount of content uploaded to YouTube, and there’s no way to create a moderation system that never has false positives. And adding more moderators may create new problems, as well, beyond cutting into YouTube’s profits.

                  • bhull242 (profile), 17 Jan 2020 @ 1:01pm

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    I think that part of the problem you’re ignoring is that applying these rules to a particular video is often highly subjective, context-sensitive, and can vary by culture, religion, and personal values and experiences.

                    Of course, you also ignore the fact that language is a barrier to moderation as well.

              • That One Guy (profile), 16 Jan 2020 @ 8:46pm

                Re: Re: Re: Re: Re: Re:

                they currently seem to have about 2k total employees and PLENTY of money to hire more if they actually wanted to.

                ...

                So hire 15k moderators and pay them $50k/year, YouTube can probably afford that and that probably would be enough to verify absolutely everything that gets flagged and disputed.

                You uh, wanna re-read what you just wrote out there, see if you can spot the absolutely insane part you're suggesting? Hiring more moderators would probably help (assuming they're all trained up right and adhere to a set of rules everyone agrees with, and good luck with that), but when you reach the point where you're talking about a company increasing its number of employees by that amount, it's probably a good idea to take a step back and ask yourself exactly how reasonable your suggestions actually are, perhaps by applying that same logic to other fields/professions.

                • PaulT (profile), 17 Jan 2020 @ 12:31am

                  Re: Re: Re: Re: Re: Re: Re:

                  "You uh, wanna re-read what you just wrote out there, see if you can spot the absolutely insane part you're suggesting?"

                  Indeed. If the suggestion is "it's easy, all you have to do is hire 8x the total number of employees you currently have", you don't have a suggestion that can be taken seriously.

                  Also, if you say that then later complain at any point that YouTube/Google have a monopoly position, you seriously need to get your head examined.

                  • urza9814 (profile), 17 Jan 2020 @ 6:12am

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    I never said it was easy. You act like I said they should just hire ten thousand people by noon today. Of course it's going to take a long time. The point is that they make these massive profits by refusing to hire enough staff to actually get the job done properly. It's not that they can't afford to do it, it's not that it's "impossible", they just don't want to make the investment. And of course they don't, because they're ALREADY essentially in a monopoly position for streaming user content, so they have no real incentive to make their service fair or effective.

                    • PaulT (profile), 17 Jan 2020 @ 6:25am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re:

                      "The point is that they make these massive profits by refusing to hire enough staff to actually get the job done properly"

                      Citation?

                      But, the biggest point is that staffing alone will not and can not achieve the result you want. It's not about time, money and staffing and by concentrating on those factors you miss the whole point. At best, YouTube would get a 95% effective service and in doing so establish a position that no competitor would ever be able to topple.

                      "they're ALREADY essentially in a monopoly position for streaming user content, so they have no real incentive to make their service fair or effective"

                      Then start using competitors rather than demanding that YouTube implement something that no competitor could afford to replicate. OK, let's go by your suggestion and say YouTube can afford to hire 8x their workforce. Is any smaller site going to be able to do that? What about new startups who don't have access to the automation that backs up YouTube's human moderators?

                      reply to this | link to this | view in chronology ]

                      • icon
                        urza9814 (profile), 17 Jan 2020 @ 6:40am

                        Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:

                        My citation is the math in my previous comment.

                        And yeah, any site that's going to put in moderation rules needs to have some effective way of actually enforcing those rules. I do use some YouTube competitors, and I pay a monthly fee for the privilege, and they have none of these issues. Floatplane doesn't have this problem; Nebula doesn't have this problem...largely because they've designed a business model from the ground up to avoid those costs.

                        reply to this | link to this | view in chronology ]

                        • icon
                          PaulT (profile), 17 Jan 2020 @ 7:27am

                          Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:

                          "My citation is the math in my previous comment."

                          The math that ignores every single factor involved except the salary of the employees? You might need to work on that a bit.

                          "And yeah, any site that's going to put in moderation rules needs to have some effective way of actually enforcing those rules."

                          They do. Then people attack them for any mistake the automated systems make and demand they hire tens of thousands of human beings to replicate the work.

                          "Floatplane doesn't have this problem; Nebula doesn't have this problem...largely because they've designed a business model from the ground up to avoid those costs."

                          Well, I'll admit I'd not heard of those particular platforms until you mentioned them. Thanks for the introduction!

                          I dare say your time is perhaps better spent advocating for those platforms rather than arguing about fundamentally incorrect math estimates.

                          reply to this | link to this | view in chronology ]

                    • identicon
                      Anonymous Coward, 17 Jan 2020 @ 6:35am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re:

                      If they implement an effective manually supported moderation system, then every government in the world will be demanding that they meet that government's standards for moderation, and hire more people if they need to do so to meet those standards.

                      reply to this | link to this | view in chronology ]

                • icon
                  urza9814 (profile), 17 Jan 2020 @ 6:08am

                  Re: Re: Re: Re: Re: Re: Re:

                  Yeah, it's easy to make things sound ridiculous when you cut out all of the evidence to show that it can actually be done.

                  Sure, they've been spending many years refusing to hire sufficient staff to get the job done. They have a lot of catching up to do. But they do have enough profits to do it, if they actually tried.

                  reply to this | link to this | view in chronology ]

                  • icon
                    PaulT (profile), 17 Jan 2020 @ 6:26am

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    "But they do have enough profits to do it, if they actually tried."

                    ...and, again, if you think that spending is the issue, you don't understand the problem.

                    reply to this | link to this | view in chronology ]

                    • icon
                      urza9814 (profile), 17 Jan 2020 @ 6:43am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re:

                      As I'm pretty sure I've already answered that concern, I think you are the one who is not understanding. The problem is absolutely about logistics. I don't care what policies they implement and have their moderators enforce; I care that they actually enforce some concrete policy instead of just taking stuff down essentially at random.

                      reply to this | link to this | view in chronology ]

                      • icon
                        PaulT (profile), 17 Jan 2020 @ 7:33am

                        Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:

                        "The problem is absolutely about logistics"

                        Then why do all your suggestions involve ignoring every aspect of logistics except for employee numbers and salary?

                        "I care that they actually enforce some concrete policy instead of just taking stuff down essentially at random"

                        The fact that you think this is the case right now exhibits your lack of understanding of the matter. They don't take things down at random, they just have imperfect automated systems that are struggling to deal with conflicting and illogical information. There's room for improvement, perhaps even with extra human involvement, but it's no more random than it would be by having tens of thousands of people from different social, cultural and geographical backgrounds trying to implement the same policy.

                        reply to this | link to this | view in chronology ]

              • icon
                PaulT (profile), 17 Jan 2020 @ 12:28am

                Re: Re: Re: Re: Re: Re:

                "You COULD hire those 27,000 moderators and pay them $30k/year (exactly average for a US wage, although most of these moderators are in third-world countries anyway) "

                Exactly. So... when you have a global team, most of whom have different cultural upbringings, how do you maintain a consistent standard of moderation for US consumption?

                Again, if you think it's just money or number of asses on seats that's the main issue, you don't understand the problem.

                reply to this | link to this | view in chronology ]

              • identicon
                Anonymous Coward, 17 Jan 2020 @ 1:19am

                Re: Re: Re: Re: Re: Re:

                You COULD hire those 27,000 moderators and pay them $30k/year (exactly average for a US wage, although most of these moderators are in third-world countries anyway) for around $2.2 million per day.

                Plus the cost of managers and human resources people to look after them, plus the cost of buildings for them to work in, along with the staff to look after them, plus the cost of the computers and networking needed for them to do their job. The costs involve far more than the payroll of those doing the work.
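The salary-only arithmetic quoted above is easy to check; here's a minimal sketch using only the figures from the comment (27,000 moderators at $30k/year), deliberately ignoring all the overhead costs just listed:

```python
# Back-of-envelope check of the quoted moderator payroll.
# Both inputs come from the comment above; everything else
# (management, HR, buildings, equipment) is excluded.
moderators = 27_000
salary_per_year = 30_000  # USD per moderator

payroll_per_year = moderators * salary_per_year  # 810,000,000
payroll_per_day = payroll_per_year / 365         # ~2.22 million

print(f"${payroll_per_year:,} per year, ${payroll_per_day:,.0f} per day")
```

So the "$2.2 million per day" figure holds up, but only as a floor: it's salary alone.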

                reply to this | link to this | view in chronology ]

                • icon
                  PaulT (profile), 17 Jan 2020 @ 1:36am

                  Re: Re: Re: Re: Re: Re: Re:

                  "Plus the cost of managers and human resources people to look after them"

                  Which, it must be stressed, is far from trivial. The role would likely be on par with call centre jobs, which are notoriously high pressure, creating a lot of staff turnover and constant training requirements for new staff (ever wonder why those people mindlessly follow scripts? It's because most of them don't last long enough in the hostile work environment to become independently experienced). Add the types of pressure that would be unique to this role (such as how to deal with people who are constantly exposed to things they find offensive during their work day), and you have high costs that grow exponentially the more people you hire.

                  reply to this | link to this | view in chronology ]

              • identicon
                Anonymous Coward, 18 Jan 2020 @ 11:18pm

                Re: Re: Re: Re: Re: Re:

                What kind of a life would $30K annually be? Who could live on that today? Maybe YouTube could sponsor a college course for 'Moderation 101' and give students extra credit as volunteers.

                reply to this | link to this | view in chronology ]

        • icon
          Stephen T. Stone (profile), 15 Jan 2020 @ 10:32am

          I don't believe it.

          How many hours of footage do you think a single person can review in an average eight-hour workday, even if they’re reviewing only part of a given video?

          How many people do you think YouTube would need to properly moderate even one hour’s worth of uploads?

          How many people do you think YouTube would need to cover an entire day’s worth of uploads, every day?

          And the $64,000 question: How can you believe all of those people would (or could) interpret YouTube’s rules and moderate content in the exact same way, such that all of YouTube’s moderators are objective?
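For a sense of the scale behind these questions, a back-of-envelope sketch. Both inputs are assumptions, not anything stated in this thread: ~500 hours uploaded per minute is the figure widely attributed to YouTube around 2019, and 8 review-hours per reviewer per workday is deliberately generous (real time, no breaks, no appeals, no rewatching):

```python
# Rough estimate of reviewers needed just to watch uploads once.
# Inputs are assumptions: ~500 hours/minute was the upload rate
# widely attributed to YouTube circa 2019.
upload_hours_per_minute = 500
review_hours_per_workday = 8  # generous: nonstop viewing

upload_hours_per_day = upload_hours_per_minute * 60 * 24  # 720,000 hours/day
reviewers_needed = upload_hours_per_day / review_hours_per_workday

print(f"{upload_hours_per_day:,} hours uploaded per day")
print(f"{reviewers_needed:,.0f} reviewers per day, just to watch once")
```

And that's before accounting for shifts, turnover, appeals, or the consistency problem raised in the last question, which no headcount solves.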

          reply to this | link to this | view in chronology ]

        • icon
          Scary Devil Monastery (profile), 16 Jan 2020 @ 3:39am

          Re: Re: Re:

          "That just means they aren't paying them enough."

          No. The amount of money paid doesn't allow people the luxury of actual time dilation.

          By the time you've got enough moderators to manually handle even a fraction of youtube's uploads you've more or less solved the US unemployment problem - and youtube would be losing a hundred dollars for every dollar they bring in.

          At some point you're going to have to run a cost-benefit analysis and find that at certain scales any moderation will become completely useless due to being too liberal, too restrictive, too expensive, or two of the above. You never get to the point where it's "just right" in the level of restriction without it also being ridiculously expensive.

          If a company can't operate a certain segment of its business without losing more money than the company earns in total, then that segment has to go away. It's a good guess that trying to employ enough moderators to even take a bash at YouTube would put all of Google in the red.

          reply to this | link to this | view in chronology ]

          • icon
            PaulT (profile), 16 Jan 2020 @ 4:00am

            Re: Re: Re: Re:

            "No. The amount of money paid doesn't allow people the luxury of actual time dilation."

            Also, anyone who thinks it's just a money issue really doesn't understand the real problems. Simply keeping staffing levels up while maintaining a consistent standard of moderation across all sites and moderators would be a massive undertaking. It's not simply a matter of getting human eyes on these things, it's about ensuring that you don't get a situation where the way a video is moderated depends on the location and mood of the moderator - in which case the entire exercise is pointless. It's not about number of employees, it's a massive logistics issue.

            Anyone who thinks that these things are just about throwing money at the problem has obviously never been near a position of responsibility in any company close to the size of something like Google - or were very bad at that job when they had it.

            reply to this | link to this | view in chronology ]

            • identicon
              Anonymous Coward, 16 Jan 2020 @ 5:09am

              Re: Re: Re: Re: Re:

              Just how do you moderate to meet the sensibilities of right-wing Christians and Muslim extremists at the same time, while appeasing the Chinese government, and Tayyip Erdogan along with Donald Trump, amongst others?

              reply to this | link to this | view in chronology ]

            • icon
              Scary Devil Monastery (profile), 16 Jan 2020 @ 6:25am

              Re: Re: Re: Re: Re:

              "Simply keeping staffing levels up while maintaining a consistent standard of moderation across all sites and moderators would be a massive undertaking."

              Concur.

              I just don't think the argument that Soviet-model bureaucracy doesn't work will go very far with Baghdad Bob/Bobmail who, if memory serves me correctly, has on numerous occasions glorified and advocated such systems with a frightening lack of understanding that human beings aren't perfect machinery.

              "it's a massive logistics issue. "

              Infrastructure isn't Baghdad Bob's strong suit. Nothing much beyond "It's the law, of course people should follow it!".

              "Anyone who thinks that these things are just about throwing money at the problem has obviously never been near a position of responsibility in any company..."

              What can I say, the man has faith? I've seen Bobmail/Blue suggest, in apparent earnest, that the solution to race riots is to just use more police, and that the proper approach to piracy is to abolish burden of proof when it comes to ip-based evidence. He likes simple solutions.

              reply to this | link to this | view in chronology ]

              • identicon
                Rocky, 16 Jan 2020 @ 7:02am

                Re: Re: Re: Re: Re: Re:

                Simple solutions for complex problems are usually put forward by simpletons, or by people with a specific goal in mind that's inimical to the majority affected by said solution.

                Baghdad Bob strikes me to be of the former persuasion since he often argues from an extremely dated position and refuses to change it even though the world has marched on for decades. His inability to adjust and learn new things outside his preconceptions is why I think he's a simpleton, and simpletons "know what they know because everyone else is wrong".

                reply to this | link to this | view in chronology ]

                • icon
                  Scary Devil Monastery (profile), 17 Jan 2020 @ 1:07am

                  Re: Re: Re: Re: Re: Re: Re:

                  "Simple solutions for complex problems are usually put forward by simpletons or people with a specific goal in mind that's usually inimical to the majority affected of said solution."

                  The specific goal Baghdad Bob/Blue/Jhon has in mind appears to be pirates and copyright enforcement. Notwithstanding every time he tries to tear a thread apart by spamming it with white supremacy in his "Hamilton" guise, his recurring theme is that he has, for many years on end now, tried VERY hard to skew every narrative towards what seems to be the official propaganda peddled by the RIAA and the MPAA.

                  "Baghdad Bob strikes me to be of the former persuasion since he often argues from an extremely dated position and refuses to change it even though the world has marched on for decades."

                  I believe it's both. He is no doubt a simpleton, but that's mainly because his rhetoric tricks are shit to the point where the only thing he ever accomplishes is to leave bystanders with the idea that copyright adherents are malicious lunatics.

                  But his agenda, especially for us who've had to suffer him on both Torrentfreak in the past and Techdirt today, is pretty obvious. It's become clear that he has a real-life vested interest in his real name not coming up in googled searches - so that's where his bile towards google and section 230 comes from - and his pet hate against piracy appears to stem from some shit of his leaking to the pirate bay way back when and not even pirates wanting to touch it.

                  Add that to his frequent defense of Prenda, Shiva-who-did-not-invent-email, ACS:Law and other copyright troll outfits and the picture of a failed con man involved in copyright trolling and fraud emerges.

                  Yes, he's a simpleton because he can't even troll right. He's also personally invested into anything which will make it easier for a failed snake oil salesman to make a living conning people by way of fraud, extortion, or tort.

                  He's basically a dumb, malicious-minded carnie with entitlement issues who feels persecuted because the marks are on to him.

                  reply to this | link to this | view in chronology ]

                • identicon
                  Anonymous Coward, 19 Jan 2020 @ 12:05am

                  Re: Re: Re: Re: Re: Re: Re:

                  Education also varies from one generation to the next. Stuff that gets embedded tends to stay there! And why in the hell shouldn't it? That education cost a ton of money.

                  reply to this | link to this | view in chronology ]

          • identicon
            Anonymous Coward, 18 Jan 2020 @ 11:22pm

            Re: Re: Re: Re:

            If they can't do it right, they shouldn't do it at all. They should not be forced to do it at all regardless.

            reply to this | link to this | view in chronology ]

            • identicon
              Anonymous Coward, 18 Jan 2020 @ 11:29pm

              Re: Re: Re: Re: Re:

              This whole debacle started when Takedown notices started for infringement without any evidence. Let those entities concerned with infringement do the moderation, but force them to provide evidence before takedown notices are issued.

              reply to this | link to this | view in chronology ]

            • icon
              bhull242 (profile), 20 Jan 2020 @ 10:00pm

              Re: Re: Re: Re: Re:

              I agree that they shouldn’t be forced to, but I have no issues with them doing so on their own (except copyright infringement).

              reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 15 Jan 2020 @ 10:03am

      Re:

      Moderation at scale is impossible, but note that the scale is the human race and not the size of the platform. Limiting the amount of material published silences people, because someone has to select who can be published. So perhaps the best answer is for people to use the close button if they run into something that they find offensive, rather than insist on a third party doing something to protect their sensibilities.

      reply to this | link to this | view in chronology ]

      • icon
        Stephen T. Stone (profile), 15 Jan 2020 @ 10:21am

        You’re responsible for curating your experience on YouTube (or any other similar service). YouTube is responsible for upholding its terms of service. You can generally avoid “offensive” content and still report videos for violating the TOS when you do come across them. So the best answer is “close videos you think are offensive, but report those videos first if you think they break the rules”.

        reply to this | link to this | view in chronology ]

        • identicon
          Anonymous Coward, 15 Jan 2020 @ 10:37am

          Re:

          So the best answer is “close videos you think are offensive, but report those videos first if you think they break the rules”.

          The problem is the reporting, as there are a lot of people who will report whatever they consider offensive, and make as much noise as possible if YouTube does nothing. Moderation on YouTube has been creeping in scope because of this.

          reply to this | link to this | view in chronology ]

          • icon
            Stephen T. Stone (profile), 15 Jan 2020 @ 10:58am

            No content reporting system is foolproof or free from malicious manipulation (e.g., reportbombing). But removing such a system is not the answer to fixing, or at least mitigating, such manipulation.

            reply to this | link to this | view in chronology ]

          • identicon
            Anonymous Coward, 18 Jan 2020 @ 11:37pm

            Re: Re:

            So what if they're offensive. Some people love that stuff. It is a violation to take that stuff down. That is why this is a pussified generation.

            reply to this | link to this | view in chronology ]

            • icon
              bhull242 (profile), 20 Jan 2020 @ 8:57pm

              Re: Re: Re:

              The platform holder is not obligated to host content they deem too offensive if they don’t want to.

              reply to this | link to this | view in chronology ]

            • icon
              PaulT (profile), 21 Jan 2020 @ 12:40am

              Re: Re: Re:

              "Some people love that stuff"

              There's plenty of venues for them to view and host that content as well. Just because you like something, does not mean that you have the right to force people who don't to host and/or view it. Some people love hardcore porn - that doesn't mean the Disney Channel has to host it...

              "That is why this is a pussified generation."

              No, the pussies are the wimps who piss and moan that they can't force everyone else to conform to what they want. People who aren't spoiled brats understand that the world doesn't revolve around them, and act accordingly.

              reply to this | link to this | view in chronology ]

    • icon
      That One Guy (profile), 15 Jan 2020 @ 10:12am

      Let's apply that thinking elsewhere, shall we?

      People use the roads to commit illegal acts, if the state can't keep that from happening then they should shut them down.

      People use the mail to commit illegal acts, if the post office can't keep that from happening then they should shut down.

      People make use of items bought in stores to commit illegal acts, if the stores can't prevent that then they should shut down.

      If content moderation at scale is impossible, then should YouTube and the other platforms just give up and let anybody post anything?

      That's a mighty fine false-dichotomy there, sure hope no one shoots it down by pointing out that 'anything goes' or 'perfect moderation/moderation good enough to satisfy everyone currently complaining about it' aren't the only two options.

      reply to this | link to this | view in chronology ]

      • identicon
        Anonymous Coward, 15 Jan 2020 @ 12:19pm

        Re: Let's apply that thinking elsewhere, shall we?

        People that commit illegal acts breath air, if the state can't keep that from happening then they should shut down the air.

        • EPA.gov

        reply to this | link to this | view in chronology ]

        • identicon
          Anonymous Coward, 18 Jan 2020 @ 11:40pm

          Re: Re: Let's apply that thinking elsewhere, shall we?

          Is that an actual quote from the EPA? Because it is my experience that the EPA just shuts down the Air Quality Monitoring Stations, not the actual AIR.

          reply to this | link to this | view in chronology ]

      • icon
        Scary Devil Monastery (profile), 16 Jan 2020 @ 3:51am

        Re: Let's apply that thinking elsewhere, shall we?

        "That's a mighty fine false-dichotomy there, sure hope no one shoots it down by pointing out that 'anything goes' or 'perfect moderation/moderation good enough to satisfy everyone currently complaining about it' aren't the only two options."

        And Baghdad Bob, as usual, won't give a fuck. Old Jhon/Blue has been looping this message like an old scratched vinyl record since he was trying to peddle the concept of gatekeeper supremacy back on Torrentfreak.

        reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 15 Jan 2020 @ 10:18am

      Re:

      If content moderation at scale is impossible, then should YouTube and the other platforms just give up and let anybody post anything?

      That would be a better idea for cases like this. Let people block Nazi content if they choose. But don't remove it completely, because it's useful for some people. Pro-Nazi and anti-Nazi propaganda are both valid research topics.

      reply to this | link to this | view in chronology ]

      • icon
        Stephen T. Stone (profile), 15 Jan 2020 @ 10:27am

        And if YouTube’s algorithms start recommending pro-Nazi propaganda to everyone because of malicious system manipulation (i.e., people gaming the algorithms), for what reason should we accept that as “the cost of being on YouTube”?

        reply to this | link to this | view in chronology ]

        • identicon
          Anonymous Coward, 15 Jan 2020 @ 1:31pm

          Re:

          A "never recommend this video to anyone" flag is almost as easy to implement as "never show this to anyone". That's basically what Google does for their search auto-completion. They'll try never to auto-complete a pro-Nazi phrase for you, but you can type it yourself and the search will work.

          reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 15 Jan 2020 @ 10:40am

      Re:

      Or maybe if they can't do a decent job moderating at their current size, then they are too big?

      What are you suggesting, then? That they stop growing?

      reply to this | link to this | view in chronology ]

      • icon
        bhull242 (profile), 15 Jan 2020 @ 11:36am

        Re: Re:

        For one thing, the problem isn’t the size of the company running YouTube but the size of the userbase that uploads content or flags it.

        reply to this | link to this | view in chronology ]

      • icon
        Scary Devil Monastery (profile), 16 Jan 2020 @ 3:53am

        Re: Re:

        "What are you suggesting, then? That they stop growing?"

        He's implying they should go away. That's all Old Baghdad Bob ever tries to say. Because in his mind everything works so much better if what people can see and hear has to go past an MPAA-approved gatekeeper first.

        reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 15 Jan 2020 @ 11:52am

      Re:

      Or maybe if they can't do a decent job moderating at their current size, then they are too big?

      Size is irrelevant. If there was a small YouTube, Vimeo perhaps, that handled only 1% of the traffic YT does then their revenue is also 1% of YT's, or less. They couldn't afford to hire enough moderators and pay for the infrastructure to moderate their content any better than YT.

      Nice try though. Your agenda is clear, at least.

      reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 15 Jan 2020 @ 1:24pm

      Re:

      Maybe pointing out the reality of a situation isn't the go-sign for offering really stupid ideas or assumptions about intent?

      reply to this | link to this | view in chronology ]

  • identicon
    Agammamon, 15 Jan 2020 @ 9:47am

    They had Frank Capra, perhaps Hollywood's most influential director during the 1930s, produce a bunch of films under the banner "Why We Fight." The very first of these was "Prelude to War."

    In other words 'fake news' - so, indeed, it wasn't a mistake and this upload does violate Youtube's terms of service.

    Sounds like content moderation at scale done well to me.

    reply to this | link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 15 Jan 2020 @ 9:54am

      In other words 'fake news'

      Please define “fake news” in objective terms that everyone can agree to accept, then explain why Prelude to War counts as “fake news”.

      reply to this | link to this | view in chronology ]

    • identicon
      Glen, 15 Jan 2020 @ 10:15am

      Re:

      How is it "fake news"? Please put into context. Otherwise it isn't. If it shows the lengths the US government went to in order to prep its citizens for war, then it isn't fake news now, is it?

      reply to this | link to this | view in chronology ]

      • identicon
        Agammamon, 15 Jan 2020 @ 11:01am

        Re: Re:

        If it shows the lengths the US government went to smear a potential opponent in order to influence the public and make going to war easier, then it's a straight-up call to violence, isn't it?

        reply to this | link to this | view in chronology ]

        • icon
          bhull242 (profile), 15 Jan 2020 @ 11:40am

          Re: Re: Re:

          1. How is that a “call to violence”, exactly?

          2. More importantly, that doesn’t answer how it is “fake news”, which was the question being asked. A call to violence isn’t necessarily “fake news”, nor is “fake news” necessarily a call to violence. Neither is remotely indicative of the other.

          reply to this | link to this | view in chronology ]

          • icon
            PaulT (profile), 16 Jan 2020 @ 1:19am

            Re: Re: Re: Re:

            "How is that a “call to violence”, exactly?"

            To be fair, it is called "Prelude To War", the first part of the series entitled "Why We Fight" and produced with the express purpose of preparing for the possibility of joining the war in Europe.

            "More importantly, that doesn’t answer how it is “fake news”, which was the question being asked"

            Sadly, due to the antics of current Nazis, their supporters and their enablers, that term has come to mean "something I don't like" rather than something that's factually incorrect.

            reply to this | link to this | view in chronology ]

        • identicon
          Anonymous Coward, 15 Jan 2020 @ 11:56am

          Re: Re: Re:

          Maybe... in the past. Don't be so obtuse. You're borderline trolling now and it's not a good look.

          reply to this | link to this | view in chronology ]

        • identicon
          Anonymous Coward, 15 Jan 2020 @ 1:26pm

          Re: Re: Re:

          Which lengths were those, exactly, in this case?

          reply to this | link to this | view in chronology ]

        • icon
          Scary Devil Monastery (profile), 16 Jan 2020 @ 3:57am

          Re: Re: Re:

          "If it shows the lengths the US government went to smear a potential opponent in order to influence the public and make going to war easier then its a straight up call to violence, isn't it?"

          You mean, by showing the US public what appears to be largely unedited clips of actual Nazi propaganda?

          I'll take ad notam that you believe it's a smear tactic to factually quote a 3rd party and judge your logic broken.

          reply to this | link to this | view in chronology ]

  • icon
    Norahc (profile), 15 Jan 2020 @ 10:13am

    The more we try to obliterate the mistakes from our past, the more we actually promote them and risk repeating them again.

    reply to this | link to this | view in chronology ]

  • icon
    Code Monkey (profile), 15 Jan 2020 @ 10:20am

    Mike's impossibility theorem

    "Masnick's Impossibility Theorem is that it is literally impossible to do content moderation at scale well."

    I believe Iran and North Korea have done a STUNNINGLY good job at it......

    reply to this | link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 15 Jan 2020 @ 10:23am

      That’s not so much “content moderation” as it is “censorship”. YouTube doesn’t say “you can’t post this video on another site” if it removes that video from YouTube.

      reply to this | link to this | view in chronology ]

    • icon
      James Burkhardt (profile), 15 Jan 2020 @ 10:40am

      Re: Mike's impossibility theorem

      If you read about the linked theorem, you'd find that what Masnick talks about is the satisfaction of 1) Users who post the content, 2) Viewers who consume the content, and 3) Those who are for personal, moral, or ethical reasons concerned with policing the substance of the content.

      Iran and North Korea satisfy (3) by removing any concerning content, at the expense of (1) and (2). In doing so, they face criticism from groups (1) and (2), and international members of group (3). So no, under the framework of Masnick's Impossibility Theorem they have indeed failed to moderate content in a way that satisfies all 3 groups.

      reply to this | link to this | view in chronology ]

    • icon
      bhull242 (profile), 15 Jan 2020 @ 11:50am

      Re: Mike's impossibility theorem

      By “do[ing] content moderation at scale well”, we mean both:

      1. removing all content that is sufficiently objectionable (essentially or exactly no false negatives) and

      2. not removing any content that is not objectionable (essentially or exactly no false positives)

      without making any mistakes, and where the definition of what is “objectionable” is agreed to by essentially all users, governments, corporate interests (excluding competitors), activist groups, social classes, etc., though the theorem also applies to privately run platforms when using the publicly stated terms for what is objectionable.

      Iran and North Korea don’t satisfy 2 at all, and what is “objectionable” is solely defined by the government, so that’s also outside the ruleset.
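The two conditions above can be made concrete as false-negative and false-positive counts. A minimal sketch (the labels and decisions are invented purely for illustration, not any platform's real data):

```python
# Hypothetical ground-truth labels vs. a moderator's decisions.
# True = "objectionable", False = "fine". Purely illustrative data.
truth     = [True, True, False, False, False]
decisions = [True, False, True, False, False]

# Condition 1 fails on false negatives: objectionable content left up.
false_negatives = sum(t and not d for t, d in zip(truth, decisions))

# Condition 2 fails on false positives: fine content taken down.
false_positives = sum(d and not t for t, d in zip(truth, decisions))

print(false_negatives, false_positives)  # -> 1 1
```

Moderating "well" in the theorem's sense requires both counts to stay at essentially zero while everyone agrees on the labels, which is the part that never happens at scale.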

      PaulT (profile), 16 Jan 2020 @ 1:20am

      Re: Mike's impossibility theorem

      Have they, though? It's still very possible for people in those countries to access information that they are told they should not access, it's just that most people there don't want to risk the consequences of doing so.

      Scary Devil Monastery (profile), 16 Jan 2020 @ 3:58am

      Re: Mike's impossibility theorem

      "I believe Iran and North Korea have done a STUNNINGLY good job at it......"

      Err....no, "pulling the plug" isn't called "moderation".

    Code Monkey (profile), 15 Jan 2020 @ 11:19am

    My bad

    Fair point, I shall state another way....

    "Masnick's Impossibility Theorem is that it is literally impossible to do content moderation at scale well."

    WARNING: The following thought is purely one of sarcasm, based on the notion that not only do I wholeheartedly agree with Mr. Masnick's theorem, but I also want to poke fun at the ridiculousness of oppressive regimes, such as Iran and North Korea, who completely take the internet (and most, if not all, forms of free speech) away from its citizenry. Obviously this must be spelled out overtly, lest someone misconstrue that it was, indeed, sarcasm and not an endorsement of said oppressive regimes. We apologize for any misunderstanding.....

    <sarcasm>

    I believe Iran and North Korea have done a STUNNINGLY good job at it......

    </sarcasm>

      James Burkhardt (profile), 15 Jan 2020 @ 1:50pm

      Re: My bad

      Under Poe's Law, without the /s it was impossible to determine if you were serious or not.

      Without the /s, the massive disclaimer just implies you are a douche with a bad case of butthurt because people didn't realize you were joking.

      Scary Devil Monastery (profile), 16 Jan 2020 @ 4:00am

      Re: My bad

      I wish Poe's Law didn't reign on these forums.

      Sadly, we DO have a resident copyright troll posting one absurdity after the other, in full earnest, so the /s tag is necessary.

    Chuck Sod, 15 Jan 2020 @ 11:35am

    Why we fight.
    I thought it was obvious: we fight to ensure our corporate overlords and their C-suite minions have everything their little hearts desire.

  • This comment has been flagged by the community.
    Zof (profile), 15 Jan 2020 @ 11:48am

    All sorts of things that were just fine 40 months ago are "bad"

    Because they are rigging an election.

  • This comment has been flagged by the community.
    Zof (profile), 15 Jan 2020 @ 11:54am

    It's amazing how we got all these

  • This comment has been flagged by the community.
    Zof (profile), 15 Jan 2020 @ 11:54am

    It's amazing how we got all these content moderation stories

    Right as social media started censoring themselves for the DNC.

    Anonymous Coward, 15 Jan 2020 @ 12:35pm

    The YouTube mods are probably mostly young people on low wages. Capra's WWII films are famous, but it's probably optimistic to assume every mod knows what this film is about, given that it's nearly 80 years old. I can understand YouTube being extra cautious, lest it be accused of hosting ANY content that might be seen to promote Nazism. It's simply impossible to moderate content at scale on a website where hundreds of hours of video are uploaded every minute.

    Until recently, any game that showed Nazi symbols was banned in Germany, even if it was a simple shooter about shooting German soldiers. Maybe there should be a new rule of thumb when it comes to moderation: simple rules or vague laws will very often censor legal content in order to protect the platform from legal action by politicians or other users who may resent being silenced. Mods don't have time to watch every video and think about the context.

    I expect many American websites to be blocked in the EU when Article 13 comes into effect, since they show audio or images uploaded by users without the permission of the IP holder.

    sehlat (profile), 15 Jan 2020 @ 5:15pm

    The entire "Why We Fight" series is up on YouTube.

    Anybody want to go after those idiots for being anti-American? :)

    Rekrul, 15 Jan 2020 @ 7:46pm

    So, here, it appears likely that the moderation team used a fairly simple heuristic:

    You're giving YouTube too much credit. I doubt any humans (if there are any left) have even looked at this. Everything at YouTube and Google is automated. Whatever human employees still work there don't interact with the public in any way.
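A keyword filter is the bluntest form of that automation. As a purely hypothetical sketch (not YouTube's actual classifier, whose logic is unknown), it shows why an anti-Nazi film trips the same wire as pro-Nazi content:

```python
# Toy moderation heuristic: flag any video whose description contains
# a banned term. Entirely hypothetical -- not YouTube's real logic.
BANNED_TERMS = {"nazi", "hitler"}

def flags_video(description: str) -> bool:
    words = description.lower().split()
    return any(term in words for term in BANNED_TERMS)

# Capra's anti-Nazi propaganda film trips the filter anyway, because
# keyword matching sees the subject matter but not the intent.
print(flags_video("Frank Capra's Prelude to War: footage of Hitler and the Nazis"))  # -> True
print(flags_video("Cute cat compilation"))  # -> False
```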

    I'll say it again: the only way this is going to change is if YouTube gets sued by someone with deep pockets. Anything less than legal action is dismissed as the complaint of a disgruntled user who is obviously wrong.

      PaulT (profile), 16 Jan 2020 @ 1:28am

      Re:

      "The only way this is going to change is if YouTube gets sued by someone with deep pockets."

      They already did, which is expressly why things like ContentID exist.

      The only real way this is going to change is for people to stop whining and start using competitors.

        That One Guy (profile), 16 Jan 2020 @ 9:21am

        Re: Re:

        The only real way this is going to change is for people to stop whining and start using competitors.

        The problem with that is getting a viable competitor. While YouTube has Google to sign the checks for any legal fights, any competitor without a similarly large company or individual willing to do the same would almost certainly be sued out of existence, potentially resulting in a string of really bad legal decisions, not because the platform was in the wrong but simply because it couldn't afford to fight back.

          PaulT (profile), 17 Jan 2020 @ 12:07am

          Re: Re: Re:

          "The problem with that is getting a viable competitor"

          Define "viable". Define "competitor". Because from where I'm sitting, there are plenty of places that fit both definitions.

          YouTube has numerous competing sites, some of which have been running almost as long as YouTube itself. They just don't get the traffic, because people would rather stay on YouTube and bitch endlessly about how they dislike its methods than actually use a different site.

          Legal issues are a valid concern, but they're no excuse for not attempting to use the existing competitors.

        Rekrul, 16 Jan 2020 @ 12:07pm

        Re: Re:

        They already did, which is expressly why things like ContentID exist.

        They got sued and slapped a ham-fisted solution on the problem. Now they need to get sued again to realize that their "solution" still needs work.

        The thing is that Google wants to take an entirely hands-off approach and just let automation handle everything.

        The only real way this is going to change is for people to stop whining and start using competitors.

        Does such a thing exist? None of the ones I'm aware of come close to YouTube in features. I used to visit a video download site that would show trending videos on various sites, including Vimeo. If I followed its link, I could see the video page, but if I went to the main site and searched for the title of the video, nothing came up. I'd search for various topics that should have videos, and hardly anything showed up. Dailymotion isn't much better.

          Anonymous Coward, 16 Jan 2020 @ 12:19pm

          Re: Re: Re:

          The thing is that Google wants to take an entirely hands-off approach and just let automation handle everything.

          The thing is, if you want to let everybody in the world have a voice, you have to use automation, which also applies to the other major social media sites like Twitter and Facebook.

            PaulT (profile), 17 Jan 2020 @ 12:14am

            Re: Re: Re: Re:

            There is a difference between "use automation" and "only use automation with no human interaction to deal with things automation cannot handle". I think the complaint is the latter.

              Anonymous Coward, 17 Jan 2020 @ 1:24am

              Re: Re: Re: Re: Re:

              The problem, as discussed higher up in the thread, is the number of people required even to handle the things that are disputed. Handling the technology takes a small team compared with handling all the complaints sent the way of the large sites.

          PaulT (profile), 17 Jan 2020 @ 12:13am

          Re: Re: Re:

          "They got sued and slapped a ham-fisted solution on the problem"

          Yes, with hindsight there are probably better solutions. Nevertheless, the point is that the current situation exists because they were sued by a serious threat at the time. Suing them again may not lead to the result you want; it might lead to worse things.

          "The thing is that Google wants to take an entirely hands-off approach and just let automation handle everything."

          Yes... and if you dislike that, move your business to a site that operates in a different way.

          "Does such a thing exist? "

          Yes.

          "None of the ones I'm aware of come close to YouTube in features"

          Ah, I see the problem. You don't want a YouTube competitor. You want a clone. One that copies everything YouTube does, but magically does so with an entirely different business model.

          "If I went to the main site and searched for the title of the video, nothing came up"

          That certainly is a problem, and possibly one that would be fixed by more users and more data / complaints to work on. But, bear in mind what you're complaining about here - sites that work primarily for video hosting have weaker search functionality than the company that works primarily as a search engine. Well, yes.

          The thing is, all these sites have their strengths and weaknesses, but you have the choice. If you choose to use YouTube for their search because it's more important to you than other features, don't be surprised when YouTube prioritises their search and ad features over your needs. They probably won't change unless the market changes, which involves people using competing products.

            Rekrul, 17 Jan 2020 @ 1:57pm

            Re: Re: Re: Re:

            Yes... and if you dislike that, move your business to a site that operates in a different way.

            I watch videos, I don't post them. If I go to other sites, they usually don't have the videos that I'm looking for. If I need a tutorial on how to fix something, how does it help if I go to another site and such a video doesn't exist?

            It's a catch-22: People won't use the sites without them having a large volume of videos that they want to watch, but without people using the sites, they'll never have a large volume of videos to attract users. Sure you can say that everyone should start uploading videos and make them more popular, but that's not a realistic view. Just like Windows, YouTube is king and convincing people to switch is like pushing a boulder up the face of a cliff.

            Ah, I see the problem. You don't want a YouTube competitor. You want a clone. One that copies everything YouTube does, but magically does so with an entirely different business model.

            People get used to a certain set of features in a product, and anything that doesn't have those features is never going to be as popular. There's a reason that the majority of people today own smartphones instead of flip phones that can't run apps or access the net.

            But, bear in mind what you're complaining about here - sites that work primarily for video hosting have weaker search functionality than the company that works primarily as a search engine. Well, yes.

            Um, being able to actually find the videos that have the content that you're looking for is kind of an important function. If the search sucks, people aren't going to spend an hour browsing through videos trying to find what they're looking for. Consequently, people won't use it and they'll go to a site cough-YouTube-cough where they can find what they want. That's just the way the world works.

            Actually, other sites have the opportunity to surpass YouTube in search functionality and video organizing. We've argued in the past about how YouTube's search often seems to return unrelated videos and how I think they should also offer more date-based search options for a channel's videos.

            They probably won't change unless the market changes, which involves people using competing products.

            The way to succeed is to create a site that's even better than YouTube with more features. Even then, with the volume of videos YT has, it will be VERY difficult for any site to compete.

            When you want to buy something, do you go to a small store that probably won't have what you're looking for, or do you go somewhere that you know is almost certainly going to have what you want?

              PaulT (profile), 20 Jan 2020 @ 2:04am

              Re: Re: Re: Re: Re:

              "It's a catch-22"

              You have this right, but the fix for it is for people to not depend on YouTube for everything. Nothing stops them posting to multiple sites.

              "People get used to a certain set of features in a product and anything that doesn't have those features is never going to be as popular."

              Then, if they refuse to move to a competitor because they want feature X, they also have to accept that feature Y that pisses them off isn't going to go away, especially if that feature is something that benefits the paying customers of YouTube (usually not the person bitching about it).

              "Um, being able to actually find the videos that have the content that you're looking for is kind of an important function"

              It is. But what's the reason for the "failure"? Is it because the search is weak, or because the site is designed in a different way than YouTube? I can understand frustration, but if the problem is visiting sites that are designed not to be YouTube clones, and your only complaint is that you can't use them like you would YouTube... well, you see the problem, I hope. Alternatively, if you find the search weak and just go back to using YouTube instead of informing the site of the problems you have, there may never be a change.

              "The way to succeed is to create a site that's even better than YouTube with more features"

              Which isn't going to happen so long as the response to problems with YouTube is just to attack them and cause the barrier to entry in the market to go ever higher.

              "When you want to buy something, do you go to a small store that probably won't have what you're looking for, or do you go somewhere that you know is almost certainly going to have what you want?"

              That depends. Am I looking for a mass produced brand name or a specialised item? Is the big player something I disagree with on a fundamental moral level, or are they ethically the same as the small guys? That attitude is why you have such a lopsided industry - if you never give the smaller guys a chance because you assume they'll be inferior, then what you get is the same major corporations controlling everything, and you wind up bitching that what suits the multinationals is not what you want.

                Rekrul, 22 Jan 2020 @ 12:15pm

                Re: Re: Re: Re: Re: Re:

                It is. But, what's the reason for the "failure". Is it because the search is weak, is it because the site is designed in a different way to YouTube. I can understand frustration, but if the problem is visiting sites that are designed not to be YouTube clones and your only complaint is that you can't use it like you would YouTube... well, you see the problem I hope.

                Using Vimeo as an example: I sometimes see links to sexually explicit content on there. It never seems to get taken down, as the author's "channel" will have videos going back years. Yet if I use the search box and search for adult content on my own, it rarely returns anything that could be considered explicit. Most of the time it won't even include videos with nudity in the search results, even though I set the safe-search filter to off.

                It's not just adult videos either. There might be a link to a product review, but when I search for product reviews on my own, I get a ton of videos that aren't reviews, but which have the word "review" in the title.

                Alternatively, if you find the search weak and just go back to using YouTube instead of informing the site of the problems you have, there may never be a change.

                I've sent feedback, but it never seems to make any difference. I usually get back a polite "Thanks for letting us know, we'll look into it" response.

                That depends. Am I looking for a mass produced brand name or a specialised item? Is the big player something I disagree with on a fundamental moral level, or are they ethically the same as the small guys? That attitude is why you have such a lopsided industry - if you never give the smaller guys a chance because you assume they'll be inferior, then what you get is the same major corporations controlling everything, and you wind up bitching that what suits the multinationals is not what you want.

                Recently I had a cold and I wanted to get some sugar-free (I'm slightly diabetic) Cold-Eeze lozenges, which seem to help. The only place I could find them was in the big chain stores. A while back I needed a new answering machine and the only place I could find one locally was Walmart. Back in October, I needed a new crankset for my bike. The local bike shop told me that they could order me one for $60. Amazon had a compatible one for $30 delivered. I guess $30 is the cost of my principles. I did buy the crank puller tool from them though, same price as online. I wanted an adapter to allow you to connect a bare internal style hard drive to a USB port. None of the local computer shops had any. They were all out of stock and didn't get any more in for at least the next two weeks, which was when I stopped checking. Amazon has plenty of models to choose from...

                  PaulT (profile), 23 Jan 2020 @ 12:26am

                  Re: Re: Re: Re: Re: Re: Re:

                  "Using Vimeo as an example. I sometimes see links to sexually explicit content on there"

                  So, you're searching for stuff that by all normal standards would be blocked outright on YouTube, and using that as an example of why Vimeo is weaker?

                  "Most of the time it won't even include videos with nudity in the search results, even though I set the safe search filter to off."

                  I don't know much about Vimeo as I don't use it as often myself, but is that actually a problem with Vimeo's search or some other setting that's excluding that specific content? Hell, it could just be a problem with stored cookies in your browser that's not letting the safesearch setting be changed correctly. Also, are there flags that the uploaders can set that would exclude them from general search (something likely to be abused if the uploaders are trying to bypass filters)?

                  "There might be a link to a product review, but when I search for product reviews on my own, I get a ton of videos that aren't reviews, but which have the word "review" in the title."

                  So, the problem is the people uploading the videos lying in their titles, and not Vimeo failing to return relevant results based on those titles?

                  I understand the frustration you're describing with both of these issues, but neither of them necessarily scream that Vimeo's search doesn't work to me. Maybe it does suck, but that's not really proof.

                  "I've sent feedback, but it never seems to make any difference. I usually get back a polite "Thanks for letting us know, we'll look into it" response."

                  That's bad on them, but at least you are being proactive. Lots of people I know will just complain to themselves and never try letting the service know there's a problem.

                  As for the rest of what you're saying, I understand but it's purely subjective based on your personal needs and the local stores you have access to. Others will have different experiences. The point is, if you find Amazon are the ones you automatically default to all the time because they usually have what you need, any issues you have with them are unlikely to be fixed while they lack real competition in whatever space you use them in.

                  If a company has a default monopoly because nobody bothers using the competition, then the effects of competition (such as raising the quality bar to stay ahead) won't happen.

                    Rekrul, 24 Jan 2020 @ 1:35pm

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    So, you're searching for stuff that by all normal standards would be blocked outright on YouTube, and using that as an example of why Vimeo is weaker?

                    It's an example because if they allow that content, you should be able to find it.

                    I don't know much about Vimeo as I don't use it as often myself, but is that actually a problem with Vimeo's search or some other setting that's excluding that specific content? Hell, it could just be a problem with stored cookies in your browser that's not letting the safesearch setting be changed correctly.

                    I don't see any other settings that would pertain to it and my browser deletes the cookies when I close it.

                    As for the rest of what you're saying, I understand but it's purely subjective based on your personal needs and the local stores you have access to.

                    To be fair, I live in a relatively small city that doesn't have a lot of specialty stores. I once watched a video on how to make a life-sized corpse decoration for Halloween. Most of the materials were easy to get, but then it said you needed about a quart of liquid latex, which it said would be about $10 at a theater supply store. There aren't any such stores around here, and the only thing I could find was tiny little containers (like 5-6 oz) at craft stores for about $7. I could order it online, but the shipping costs were almost as much as the cost of the product.

                    There are no electronics stores left. We used to have a couple, but they closed many years ago, and the only place left with any electronic parts was Radio Shack; now they're gone too. There are no independent hardware stores left, only the chains, and even they don't have the more obscure stuff. I just went to Home Depot to look for small 'eye plates' for a project and they didn't have any. I think there's only one hobby shop left. On the other hand, if I want to get pierced or tattooed, there are about 4 places within walking distance...

                  Anonymous Coward, 23 Jan 2020 @ 2:46am

                  Re: Re: Re: Re: Re: Re: Re:

                  You do realize that there are videos on YouTube that will not appear in searches or on a channel's video list, as videos can be set as unlisted, so that only people who are given the link can play them.

                    Rekrul, 24 Jan 2020 @ 1:24pm

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    You do realize that there are videos on YouTube that will not appear in searches or on a channel's video list, as videos can be set as unlisted, so that only people who are given the link can play them.

                    Yes, I've done that myself with videos that I only wanted friends to see. However, it's rare that you search for something in general and only get back one or two matches for what you wanted.
