YouTube Copyright Filters Suck: The 'Beat Saber' And 'Jimmy Fallon' Edition

from the beat-down dept

At this point, it's plainly obvious that ContentID, YouTube's platform for automatically taking down videos that supposedly infringe copyright, is a full-on mess. That mess is multi-pronged. The filters themselves suck at identifying actual infringement and throw up all kinds of false positives. The filters are also applied so broadly that building any nuance into what is blocked and what isn't is basically impossible. Finally, the whole system is so wide open to abuse that it's laughable.

The latest iteration of this concerns Beat Saber, a virtual reality rhythm game in which you essentially wield two lightsabers to match the beats and rhythms of the music. The game has become so wildly popular that it was recently featured on The Tonight Show Starring Jimmy Fallon. That's where things went sideways.

Brie Larson played Beat Saber on The Tonight Show Starring Jimmy Fallon, which resulted in the video being uploaded to the show’s YouTube channel. Unfortunately, subsequent uploads with similar gameplay are getting copyright strikes because they appear to share similar gameplay footage, possibly from the same levels played on the show.

Here is one of the tweets that highlighted the issue about the Beat Saber copyright strikes:


In case you can't see that tweet, it's essentially Beat Saber's team responding to one of the many people who had a let's play video taken down due to a takedown notice... from Jimmy Fallon's show. Confused as to why NBC is taking down videos that include only game footage of Beat Saber? Well, Fallon and his guest played those same levels on his show, leading the ContentID filters to think that the let's play videos were playing part of Fallon's show, when it was actually the other way around: Fallon's show included game footage. In other words, ContentID got it exactly backwards.
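To make the failure concrete, here's a toy sketch (hypothetical code, nothing like ContentID's actual internals) of how a naive fingerprint matcher gets the direction wrong: it only checks whether two videos share matching segments, never which footage existed first.

```python
# Toy fingerprint matcher (hypothetical). A video is reduced to a set of
# segment hashes, and any sufficient overlap is treated as "infringement"
# of whichever video was registered as the reference -- regardless of
# which footage actually came first.

def segment_hashes(frames):
    """Reduce a video (a list of frame strings, for this toy) to a set of hashes."""
    return {hash(f) for f in frames}

def naive_match(reference_frames, upload_frames, threshold=0.5):
    """Flag the upload if enough of it overlaps the registered reference."""
    ref = segment_hashes(reference_frames)
    up = segment_hashes(upload_frames)
    if not up:
        return False
    overlap = len(ref & up) / len(up)
    return overlap >= threshold

# Fallon's clip *contains* Beat Saber gameplay...
gameplay = ["beatsaber_level_frame_%d" % i for i in range(100)]
fallon_clip = ["fallon_intro", "fallon_outro"] + gameplay

# ...so an independent let's play of the same level overlaps it heavily.
lets_play = gameplay[:80]
print(naive_match(fallon_clip, lets_play))  # True: flagged, direction ignored
```

Any independent let's play of the same levels overlaps the TV clip heavily, so the matcher flags it, even though the gameplay footage is what the TV clip borrowed in the first place.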

And, it should be noted, the folks behind Beat Saber absolutely do want you to upload video of game footage to YouTube.

“This was not planned by anyone, that’s just a really messed up youtube algorithm,” stated a subsequent tweet on Beat Saber’s account. “I wouldn’t be surprised if Jimmy’s team didn’t even know about the fact that this is happening. I will reach out to Jimmy’s team. Maybe they can help, but I am not sure about that. :(”

The Beat Saber team has turned off ContentID detection for the track, but this particular situation is somewhat out of their hands. Fortunately, the developers may have a solution underway for the Beat Saber copyright strikes. A follow-up tweet states that the people behind The Tonight Show Starring Jimmy Fallon are working with YouTube to resolve the issue. In the meantime, the developers advise that it may be helpful for uploaders to dispute the claim should their video be one of those affected.

And, yet, there are hundreds of these takedowns. No, the Beat Saber folks aren't being copyright jerks. No, NBC wasn't trying to take down let's plays of Beat Saber. Instead, everyone is relying on an automated system that fully sucks at getting copyright questions correct. It sucks so bad, in fact, that it got the order of operations here exactly backwards.

If this isn't example enough that automated filters can't do copyright enforcement, you will never be satisfied.

Filed Under: beat saber, censorship, contentid, copyright, jimmy fallon, takedowns, youtube
Companies: youtube


Reader Comments

    Thad (profile), 8 May 2019 @ 3:53pm

    And, lest we forget, filters like these are now mandatory in the EU.

    I say "like these" -- actually, most companies don't have Google's resources. Filters are now mandatory in the EU -- and most of them will be worse than ContentID.

      Seegras (profile), 9 May 2019 @ 5:01am

      Re:

      "Will be mandatory", actually. As soon as every country has implemented a law following the directive.

      And, as soon as the courts decide the new copyright directives are NOT in fact illegal violations of human rights; and thus that the countries that did not implement the directive after two years are at fault, do have to pay the fines imposed because of that, and do have to implement the directive.

      Which is not going to happen, because some of the articles, notably article ex-13, are actually in violation of the European declaration of human rights and should never have been accepted. Now it will have to be rendered void by judges. Again.

      But I expect article ex-11 (ancillary rights) and ex-12 (expropriation of authors by publishers) will remain. We'll see.

        Scary Devil Monastery (profile), 9 May 2019 @ 6:22am

        Re: Re:

        "But I expect article ex-11 (ancillary rights) and ex-12 (expropriation of authors by publishers) will remain. We'll see."

        Depending on how much of a splash they make. Article 11 will choke the smaller publications to death and make things harder for the major actors...

        ...but the thing is that many smaller newspapers happen to be the unofficial heralds of political parties in the affected countries. There will be a lot of national outrage when those parties lose the ability to have their chosen court reporters render the latest proclamation immediately visible on Google.

      Bobvious, 9 May 2019 @ 5:18am

      Re: Filtering is EUseless

      The Hitchhiker's Travel Guide describes the Copyright Filtering Department of the Internetics Corporation as:

      "A bunch of mindless jerks who'll be the first against the wall when the revolution comes."

      Curiously, an edition of the Encyclopedia Galactica which conveniently fell through a rift in the time-space continuum from 1000 years in the future describes the Copyright Filtering Department of the Internetics Corporation as:

      "A bunch of mindless jerks who were the first against the wall when the revolution came."

      Unfortunately the same Copyright Filtering Department of the Internetics Corporation had issued an automated takedown for that edition of the Encyclopedia in order to thwart this outcome, but unbeknownst to them, a later edition fell through an even earlier rift in the time-space continuum, and it contained the very words used in the takedown order, thus establishing them as copyright protected a priori and therefore preventing their use as a takedown order.

    Anonymous Coward, 8 May 2019 @ 4:31pm

    Dirty pirates....

    Obviously those @BeatSaber guys are just dirty pirates trying to steal Jimmy Fallon's show from NBC.

    I for one am glad they got the #BeatDown they deserved. This is what happens when you kids try to steal from those intertube things, now get off my cloud.

    Gary (profile), 8 May 2019 @ 4:32pm

    First to File

    It's simple really - Copyright is a "First to File" system. Doesn't matter who created it, the first one to file their work with ContentID gets the credit and everyone else becomes an infringer.

      Anonymous Coward, 9 May 2019 @ 6:47am

      Re: First to File

      That sounds like a really bad system

        Bergman (profile), 19 May 2019 @ 2:35am

        Re: Re: First to File

        It IS a really bad system. It's a complete dumpster fire on its best day. And it's the best in the world at what it attempts to do, by at least an order of magnitude.

        Watching what happens to Europe, now that pretty much every site in the EU is mandated to use filters nowhere near as good, will be HILARIOUS. But only from a safe distance.

    TKnarr (profile), 8 May 2019 @ 4:42pm

    I think what Beat Saber should do is publish a letter from its own lawyers stating that the content in question is owned by Beat Saber, not NBC Universal; that its use on the Jimmy Fallon show is an authorized use; that the use by all the uploaders is also authorized by Beat Saber; and that NBC Universal has been informed that it's attempting to claim ownership of material copyrighted by Beat Saber. Then every single person who receives a notice files a DMCA 512(f) claim with that letter attached. Judges may be willing to toss individual 512(f) claims, but I'd bet that if every judge in, say, Los Angeles was suddenly faced with a couple hundred 512(f) claims, each with a letter from the actual copyright holder attached saying the defendant knew it didn't hold the copyright and wasn't authorized to issue a takedown notice, those judges would be much more skeptical of the defendant's claims that it isn't liable for issuing false takedowns.

      Thad (profile), 8 May 2019 @ 6:02pm

      Re:

      NBC Universal has been informed that it's attempting to claim ownership of material copyrighted by Beat Saber

      It isn't, though. It's attempting to claim ownership of a clip of a TV show that includes gameplay of Beat Saber.

      I've discussed this before: when a rightsholder uploads a video that it partially owns but which also includes material it doesn't own, the filters don't know how to tell which parts are which. Because computers are not magic.

        PaulT (profile), 9 May 2019 @ 12:07am

        Re: Re:

        Yes, that's exactly the problem. Which is why it's also impossible for filters to account for fair use and other legal uses of content. Which is why the magic wands people are trying to demand do not and cannot exist.

          Thad (profile), 9 May 2019 @ 10:06am

          Re: Re: Re:

          Yes, exactly.

          It's often difficult for people to determine whether or not a clip is fair use. A computer being able to do it is far off into the realm of science fiction.

        Anonymous Coward, 9 May 2019 @ 9:37am

        Re: Re:

        So the proper action by NBC should have been to EXCLUDE any portions of its show that contain other copyrighted material from the feed it provides to Content ID (not filtered on the YouTube channel).

        This would be the PROPER way to inform Content ID of what is actually owned by NBC, but that would take actual WORK (TM - corporations don't do this anymore so they have to outsource it). Someone would have to look at the show, find the parts to exclude from Content ID, and either flag them or filter them out before sending it to Content ID, and this would COST money. So just think about how much money NBC will STEAL when they take over the rights and monetize others' pages that they have no right to.

        I mean, when you are stealing from the 'little guys' it's easier to make it appear to be just a system process (we just send 'our show' to Content ID), and not a carefully planned out scheme to take control of the internet. After all: put everything on a television show, claim copyright on the 'broadcast' of that content (even if they don't actually own what they showed and only had rights to use it on the show), and then flag and monetize everything. MUAHAHAHAH.

        Now get off my cloud.

      Anonymous Coward, 8 May 2019 @ 6:08pm

      Re:

      faced with a couple of hundred 512(f) claims each with a letter from the actual copyright holder attached saying the defendant knew they didn't hold the copyright and weren't authorized to issue a takedown notice

      Except for one major point: this was NOT a DMCA takedown request. It was Google's own ContentID, an automated system that will remove, demonetize, or disable a video without ever needing a DMCA takedown request.

        Gary (profile), 8 May 2019 @ 6:17pm

        Re: Re:

        Also - the judge still wouldn't care. Hundreds or thousands of bad notices (even official DMCA ones) are the price of doing business. There is no wristslap forthcoming.

        Anonymous Anonymous Coward (profile), 8 May 2019 @ 6:56pm

        Re: Re:

        Doesn't ContentID need some input from a copyright holder for it to match against? Yes it does. Someone put sufficient information into ContentID to make it respond this way. Now the real question is: who was that, and how do we berate THEM?

        PaulT (profile), 9 May 2019 @ 12:14am

        Re: Re:

        Yes, which is a big part of the problem. ContentID exists because the legacy corporations were trying to get YouTube stripped of DMCA protections by claiming that it wasn't doing enough to stop infringement. So ContentID got created as a way of stopping some of the lawsuits. Now these people can get things blocked without having to go through the perjury aspect of a DMCA notice, and thus don't face any charges when they get it wrong.

    Anonymous Coward, 8 May 2019 @ 5:46pm

    It was copyright for the people's sake, clearly. If Beat Saber footage wasn't copyrighted why would Jimmy Fallon ever be motivated to appear on TV again? /sarc

    Anonymous Coward, 8 May 2019 @ 5:59pm

    This just means there needs to be a review system that minimizes incidents like this.

      PaulT (profile), 9 May 2019 @ 12:17am

      Re:

      Not really. The problem is that the volume is still huge (and thus takes a long time to filter, especially if the review is done by humans) - and what do you do with the stuff flagged for review? If you hold it until it's reviewed, then it's still effectively blocked, potentially for weeks or longer, losing the viewership and revenue for that video for that period. If you let it play until it's been reviewed and deemed blockable, then the filters are pointless for the first few weeks of any video (probably its busiest time).

        Anonymous Coward, 9 May 2019 @ 2:43am

        Re: Re:

        If you hold it until it's reviewed then it's still effectively blocked, potentially for weeks or longer,

        If the queue of clips to be reviewed keeps on growing, then the only solution is to drop clips from the queue without reviewing or unblocking them. This is a problem and solution well known to publishers, and why they give preferential treatment to those authors whose works they have already published.

        The legacy industries would love the online platforms to become publishers, with everything reviewed before publication, because it would cause 99% plus of new content on the Internet to disappear into that black hole of works awaiting review.

          PaulT (profile), 9 May 2019 @ 2:49am

          Re: Re: Re:

          Yes, that is also sadly true. Unless everybody in the queue is deemed equal, legacy industries would just push to the front of the queue. Which leads to more situations like the above, where they get to hide independent content from public view just because they did something vaguely similar.

      Scary Devil Monastery (profile), 9 May 2019 @ 2:11am

      Re:

      "This just means there needs to be a review system that minimizes incidents like this."

      Not possible, even in theory. The technology required to make that distinction would be, for all practical purposes, actual magic.

        PaulT (profile), 9 May 2019 @ 2:55am

        Re: Re:

        As Arthur C. Clarke once said: "Any sufficiently advanced technology is indistinguishable from magic."

        It's not impossible to create such a system, it's just way beyond our current technology. It's like asking the Wright Brothers to create a passenger airliner. They may be able to conceive of such a thing, but they're a long, long way from it actually being possible.

          Stephen T. Stone (profile), 9 May 2019 @ 6:02am

          Making an automated filter system that can understand context and Fair Use is not impossible…but it may as well be, since no technology short of a sentient artificial intelligence would ever be capable of doing that job.

          Scary Devil Monastery (profile), 9 May 2019 @ 6:27am

          Re: Re: Re:

          "It's not impossible to create such a system, it's just way beyond our current technology. It's like asking the Wright Brothers to create a passenger airliner."

          It's a bit worse than that and Clarke's paradigm bites harder than what is merely "conceivable".

          By the time automated filters can accomplish what everyone now has to ask of them we will be in the middle of debating whether warp drives, matter replicators and AI's should be allowed as toy components.

            PaulT (profile), 9 May 2019 @ 6:46am

            Re: Re: Re: Re:

            Well, yes, by the time we have AI that's capable of doing these things, we'll have other new things to be concerned with. For example: when an AI has the creative ability to assess subjective matter as well as a human, what are the implications of that creative ability applied to creating new art? Does its creative output qualify for copyright? If not, what are the implications when a human copies its work? What happens when an AI can literally create any combination of notes that it's possible for any human composer to conceive of? And so on...

            The magical solutions that are being demanded do introduce a lot more problems once the magic is provided.

              Bamboo Harvester (profile), 9 May 2019 @ 9:02am

              Re: Re: Re: Re: Re:

              Won't happen. Big Red Button is the most likely scenario.

              Five days at most after the first AI attains sentience, humans are extinct.

          Anonymous Coward, 9 May 2019 @ 8:01am

          Re: Re: Re:

          They may be able to conceive of such a thing, but they're a long, long way from it actually being possible.

          Which, by definition, means it is impossible, i.e. not possible.

          It's not impossible to create such a system, it's just way beyond our current technology.

          So this is wrong in that it is impossible to create such a system. Boom! Out-pedanted.

      Rekrul, 9 May 2019 @ 9:18am

      Re:

      This just means there needs to be a review system that minimizes incidents like this.

      There is. Here's video of Google's customer service department hard at work:

      https://www.youtube.com/watch?v=Wgw-tzbKJ60

    Anonymous Coward, 8 May 2019 @ 7:18pm

    And every notice, even if disputed and quashed, probably still counts as a strike. Each strike brings a channel closer to being banned and its content deleted.

      Anonymous Coward, 8 May 2019 @ 10:13pm

      Re:

      The courts need to step in to let the websites know that a failed notice is not a strike.

        Anonymous Coward, 9 May 2019 @ 12:50am

        Re: Re:

        Several issues with that, despite the simple and intuitive idea that failed notices should really not be strikes:

        • Someone, or some panel, needs to decide that the notice counts as a failed notice. Lenz v. Universal comes to mind; a sufficiently rich plaintiff is fully capable of dragging out your contested claim for as long as it takes.

        • You run into the same issues with why human review systems don't work for copyright filters: the sheer number of claims to deal with. The amount of time taken for a team of reviewers to decide if a claim is valid or not, long as it is, pales in comparison to the time taken for a court to see if a plaintiff is bullshitting and decide whether or not to tell the plaintiff to jump in the lake.

        • Your file, video, sound, etc. which was falsely flagged? Still stays down. Even longer if things get dragged to the court level. Which, for the plaintiff, was the aim all along. So what incentivizes a plaintiff against submitting countless false claims for the courts to process? Absolutely nothing.

    That Anonymous Coward (profile), 8 May 2019 @ 10:43pm

    Google/Alphabet whatever...
    Somewhere along the line they made one simple decision to streamline their product. It is based on a horribly flawed idea, but it protects them from the Content Cartels and their large cash warchests making their lives suck.

    The default assumption is "Only Corporations Can Own Copyrights".
    If Google didn't jump when they got a DMCA notice, they got sued & the law promises huge damages, even for shitty claims.
    Google started vetting the claims only because of smart moves like the HBO enforcement company demanding HBO.com be delisted.
    The law 'suggests' there might be a penalty for bogus claims... you let me know how much those are when they finally happen.
    The law clearly defines how high recipients have to jump, how fast, and how much it hurts to even stop and think for a moment.

    ContentID "solved" the problem of having to pay humans to make decisions (decisions that, even when correct, pissed off corporations who would rattle their sabres to get their way), because only corporations can own copyrights. We'll match everything to content you send us that you 'own' and the world will be better. A news report features a video taken by someone else? Well, there is no way the uploader who put it up 5 days before owns that content; they are time traveling bastards stealing from the poor corporations.
    The other fun side effect is "corporations" that exist just to make claims, drop claims, make claims again to steal cash... which has now moved on to "pay us (well, b/c most YTers don't make that much) or we'll delete you with copyright strikes."

    Imagine if someone just used a tiny bit of metadata in the automated system... This video was uploaded 4 days before this new clip was added; perhaps it is the original & a human should verify this simple fact. Of course, then that invalidates a portion of what ContentID scans for, & the Cartels can imagine how smart pirates would just fake it all to steal all the content & money from them, so we can't have common sense involved.
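    That timestamp check is trivial to sketch. Here's a toy version in Python (all names hypothetical; this is obviously nothing like the real system's plumbing):

```python
from datetime import datetime, timedelta

# Hypothetical sketch: before auto-claiming, compare upload timestamps.
# If the "infringing" video predates the claimant's reference clip, the
# claim is routed to a human instead of being applied automatically.

def route_claim(reference_uploaded_at, match_uploaded_at):
    """Return what the system should do with a fingerprint match."""
    if match_uploaded_at < reference_uploaded_at:
        # The matched video existed first; it can't be a copy of the
        # reference. Flag for human review instead of auto-striking.
        return "human_review"
    return "auto_claim"

fallon_clip_time = datetime(2019, 5, 6)
lets_play_time = fallon_clip_time - timedelta(days=4)  # uploaded 4 days earlier

print(route_claim(fallon_clip_time, lets_play_time))  # human_review
```

    One `if` on data the platform already has, and the let's play uploaded days before the TV clip never gets auto-struck.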

    YouTube gets paid, no matter who gets the revenue in the end.
    They have no reason to worry, there are what 10 years of content uploaded each hour, each one churning out a tiny fraction of a cent & as long as Google gets 9/10s of that fraction they are fine with keeping the corps pacified.

    Until the law reflects that society has rights & standing enough to be deserving of compensation from the law on par with what corps get, it's going to keep being lopsided & abused. Corps talk about how their content is abused & millions of dollars are lost b/c a segment of a tv show used someone else's video that existed first, but no one notices the downside: the original is often deleted & that creator wonders why bother, because their content can be stolen from them wholesale.

    We have a heavily lopsided system that will hopefully soon reach the breaking point, where all ideas are owned by corporations & the best we can hope for is Sharknado 89.

      Anonymous Coward, 8 May 2019 @ 11:29pm

      Re:

      but it protects them from the Content Cartel's and their large cash warchests making their lives suck.

      I can't imagine that Google / Alphabet doesn't have a bigger chest of money to blow on lawyers if they really wanted to fight the Content Cartel. Once they moved from geek culture to corporate culture in the upper management levels, spending money on things because it was the right thing to do was probably the first of the budget cuts. Think about Google Fiber: if they wanted to, they could spend the money to keep the fight going and still not need to worry that it would bankrupt the company, but instead those types of expenses were the first to go.

        Zgaidin (profile), 9 May 2019 @ 4:02am

        Re: Re:

        Do they have a bigger warchest than NBC Universal? Absolutely. Bigger than NBC + ABC + CBS? Probably. Bigger than all the TV production companies + movie production companies + music production companies + video game production companies + book/magazine production companies? Absolutely not.

        Anonymous Coward, 9 May 2019 @ 5:47am

        Re: Re:

        Pretty sure ContentID came about after Google spent nearly 10 years fighting Viacom in court and the best they could get was a draw (which was better than several other internet companies that went bankrupt fighting).

        The main problem is the DMCA, which is so horribly one-sided (mainly because it was written by the legacy content industries) that Google are limited in what they can do with ContentID if they don't want to be dragged back into court, and the only real defence to a DMCA claim, as the US ISPs are finding out, is to just accept the notices as fact regardless of how false they are.

        Though one way for Google to put pressure on rightsholders would be to stop whitelisting domains in search and instead just honour every takedown request. So the next time HBO claims its own website is a pirate site, just take down the links; same when they claim the BBC, imdb or amazon are infringing - delist them, but also send them an email informing them who reported their sites.

        Whilst another option would just be for Google to buy out a couple of the legacy entertainment companies - somewhat surprised they (or the likes of facebook, amazon or apple) didn't bid for Fox.

        That Anonymous Coward (profile), 9 May 2019 @ 6:27am

        Re: Re:

        And all it would have taken is a motion to have YT taken offline & the servers frozen to destroy a large source of income for them.

        One might point at what they got the feds to do to Mega...
        'refusing to come here to be served is a flight from prosecution on charges you've never been served on, because you don't really have ties to the US.'

        Look at all of the blogs they got the FBI to take down on their say-so, never turned over alleged evidence of crimes, courts saw evidence the label supplied the music & authorized them to post it online... the government still held onto the domain & servers for months after they knew they had been lied to.

        We might not like it, but Google picks the battles they have to.
        They gave them ContentID, they accepted DMCA notices & acted on them despite that not actually being in the law (because they don't host it, just index it).

        You might think the courts would totally understand all of this & rule correctly... I would ask: have you recently suffered trauma to your brain? Courts routinely rule wrong on tech b/c no lawyer would ever lie to them & they are often too old to give a shit. Tell them a story about how it's like backing a truck up to a series of tubes & that clogs them, & you get rulings saying you can't have faster speeds because...

    Anonymous Coward, 9 May 2019 @ 4:59am

    No, NBC wasn't trying to takedown let's plays of Beat Saber.
    This is where the article is mistaken. This is precisely what NBC did.

    The ContentID system isn't just a "dumb" algorithm and it didn't "get it backward". It did precisely what it was told to do.

    What many don't know is the movie studios and record labels have a special administration panel to YouTube's ContentID feature.

    When they upload new content, like a music video or movie trailer, they have the option to enable special parameters of the ContentID system.

    If you've ever wondered why so many bad Sonic trailers still remain on YouTube, this is because Paramount didn't set the flag to remove the trailer. Most movie studios rarely do, because they want the movie to be advertised.

    But as for the Fallon clip, some dipshit set the option to remove content similar to the new clip.

    Thus, ContentID went to work applying its 45%+ matching algorithms to the most popular videos first, before filtering down to the rest of YouTube.

    This was helped further by the music score in the clip because, and this should surprise no one, music is more easily identified than a partial video match.

    Jim Sterling picked up on this with his videos (which he doesn't use for ad revenue). Rather than upload a rant video, he twisted the rules: he adds in copyrighted material from multiple studios/labels so NONE of them can claim the video.

    Pure genius, but goes to show it's not ContentID at fault here.

    It's the fact Google gave the gatekeepers additional options we simpletons aren't allowed to access, so Google can continue raking in billions in ad revenue profits.

    If everyone applied the same tactics to their videos as Jim Sterling does, well, maybe then Google and the gatekeepers would take notice.

    Until then, expect 100% more uploads of content creators bitching because they don't fucking understand FAIR USE IS STILL COPYRIGHT INFRINGEMENT AND NO LAW HAS SET WHAT CONSTITUTES "FAIR USE".

    Want to avoid takedowns? Stop using content of others.

    It's literally that simple.

    Anonymous Coward, 9 May 2019 @ 6:51am

    So ContentID is similar to FacialID in that the results are less than impressive. But no worries, full speed ahead with deployment and the bugs will get worked out later. Yeah, uh-huh. lol

    bobob, 9 May 2019 @ 1:54pm

    The people who are having their videos taken down erroneously, either because of overzealous nitpicking on the part of the entertainment industry or because Google is catering to the entertainment industry, need to band together for a class action. With enough claimants, the loss of revenue from demonetization could be substantial enough to make it worth filing, and a win would go a long way toward reducing frivolous takedowns.
