EFF Argues That Automated Bogus DMCA Takedowns Violate The Law And Are Subject To Sanctions

from the yes,-but... dept

Having just been victimized by a bogus DMCA takedown notice that censored our content, I’m certainly aware of ways in which the process needs to improve (a notice-and-notice provision, rather than notice-and-takedown, would be a big, big start). However, as we have detailed here in the past, these automated takedowns are pretty typical, and they’re becoming an issue in a particular lawsuit. Hollywood went after Hotfile pretty strongly, but as Hotfile’s countersuit showed, Warner Bros. in particular seemed to have a habit of issuing takedown notices on content it had no rights to.

That’s a pretty big concern, no matter the “intentions” of those breaking the law. Warner Bros.’ response takes a pretty cavalier attitude, more or less amounting to “hey, mistakes were made; no biggie.” The specific law on bogus takedowns, Section 512(f) of the DMCA, only provides punishment for those who “knowingly materially misrepresent.” Warner Bros., of course, insists that merely making a mistake does not trip that wire.

The EFF has now jumped in with an amicus brief arguing otherwise. The argument is pretty straightforward: if you’re issuing automated or semi-automated takedown notices without reviewing them, the effort is so careless and negligent that it clearly misrepresents the claims needed for a legitimate DMCA takedown. The filing notes that such automated takedowns are a real problem (even citing our recent experience), and that if they aren’t subject to sanctions under 512(f), then that section is effectively meaningless.

Indeed, if Warner were correct, which it is not, Section 512(f) would become largely superfluous. Any company could sidestep accountability for improper takedowns by simply outsourcing the process to a computer. What is worse, copyright owners would have a perverse incentive to dumb-down the process, removing human review so as to avoid the possibility of any form of subjective belief. The tragic consequences for lawful uses are obvious: untold numbers of legal videos would be taken down, whether or not the uses were fair or even licensed.

Imagine the potential for mischief: Let’s say that Warner does not like competition from Universal. It could set a computer to search through Universal’s online presence, with the loosest possible settings, and issue takedown after takedown to Universal’s ISP for spurious claims. Nor is this scenario far-fetched: as noted above, supra at 4-5, anticompetitive uses of the DMCA takedown process are commonplace.

Among other things, the EFF filing highlights the Lenz v. Universal ruling, which found that those filing takedowns have to take fair use into account, and points out that you can’t take fair use into account if you’re automating takedowns.

Unfortunately, 512(f) has historically been a pretty toothless part of the law in response to bogus takedowns; the bar has been way too high. This is partly why we thought the parallel “remedy” found in SOPA was likely to be equally useless, and attempts to make it stronger were rejected because those behind the bill knew it was toothless. Having the court agree with the EFF’s position on this would be a huge help in giving victims of bogus takedowns a tool to fight back.



Comments on “EFF Argues That Automated Bogus DMCA Takedowns Violate The Law And Are Subject To Sanctions”

Anonymous Coward says:

I think bogus takedowns are definitely a problem, but isn’t the argument that companies are passing off liability by having an automated system almost precisely the same argument that’s used to absolve aggregation sites from incidental infringement?

If there’s so much content being uploaded to sites like YouTube that it’s basically impossible to monitor it all by humans, doesn’t that argument cut both ways? How can a company protect itself across the internet without an automated system? And aren’t there always going to be some mistakes made by such a system?

I mean, what’s the ratio of bogus takedown requests to valid ones? What’s the ratio of legal sharing of a piece of media to infringing uses? The argument that DMCA is broken because of some bogus requests seems suspiciously close to the reasoning that because some uses of a site are infringing, then that site must be taken down.

Lobo Santo (profile) says:

Double-Edged Sword.

The answer to your initial queries boils down to “individual responsibility.”

The individual person/company who uploaded something to your ineptly dubbed ‘aggregation site’ is the one responsible for its content, i.e. the party to be sued for having done something illegal.

Likewise, the individual person/company who sends a fraudulent DMCA notice is the one responsible for it, i.e. the party to be sued for having done something illegal.

Ta-Da! Logic is easy and fun.

GMacGuffin says:


The point being that they can use the automated system to find potentially infringing materials, but before a letter goes out virtually guaranteeing content will be removed, a human being should look at it to determine whether it even contains the material claimed (see the bogus Techdirt SOPA article takedown), and after that, whether the content is actually fair use (see Prince playing in the background for a few seconds while a baby dances). Have at it with automated finding systems. DMCA takedowns have to be signed under penalty of perjury, and thus far, robots are not legally competent to do that.

Anonymous Coward says:


Aggregation sites are of a summarizing nature, while takedowns are actually destructive: they remove (aka censor) information from the internet.

Content on YouTube etc. is uploaded by users, who could be held liable in case of infringement. In the case of automated takedowns, the list of content is not a summary of what other users have done: rather, it can be attributed to whoever initiates the automated takedowns (i.e. Warner in this case).

I might agree with you if the automated systems simply aggregated takedown requests from other sources, but in that case those other sources should be held liable in case of spurious takedowns.

John Fenderson (profile) says:


isn’t the argument that companies are passing off liability by having an automated system almost precisely the same argument that’s used to absolve aggregation sites from incidental infringement?

Not quite. The argument about sites that allow users to post content (not necessarily “aggregation sites”) includes a few factors that don’t apply to content producers.

The main one that applies here is not simply volume alone, but also that it is very difficult for third-party sites to tell whether content is infringing even when a human looks at it. Combine that with volume and it becomes impossible for sites to safely host user-supplied content at all. Add the fact that this law places a burden on these sites that is close to, if not outright, unfair to them, and you have a strong reason to give those sites greater leeway.

Copyright holders have an easier time of it. They only care about a subset of the total content, so the burden is lower for them. They presumably know who their licensees are, so can more easily tell if the content might be infringing, and, lastly, it’s ethically their problem in the first place, so they should bear the burden.

Rekrul says:

How about this: each time a company is proven to have filed a false DMCA complaint, it is barred from submitting any others for a period of one month. These penalties would be cumulative, so if a company is found to have filed 12 bogus takedown notices, it’s barred from filing any more for a year.

At the rate these companies send out notices, after the first round they probably wouldn’t be able to send any more for a decade.

Anonymous Coward says:

Intentional bogus takedowns are certainly an issue: companies using the DMCA to take down something they don’t like, without merit, is a problem.

Unintentional or automated takedowns are, I think, a different game. Just as the DMCA permits an “oops,” I think there should be an equal level of acceptance of error on the other side. Content owners are seemingly responsible for policing the entire internet, and you cannot blame them for trying to automate at least part of that endless task.

EFF really needs to start understanding that there are issues for everyone involved, on both sides of the situation.

ToFit says:


The “oops” has real costs for independent artists. As a share of total income, one mistaken takedown might be enough to take down a newcomer upstart media business; the established guys can survive a few takedowns and legal fights. The penalty needs to fit the size. Since an individual can be sued for several million over one piece of intellectual property, the corporate punishment for falsely claiming ownership should be at least 10x the individual amount. Plus, the new media business should still be able to pursue a suit for lost business opportunity as well.

Benjo (profile) says:


Content owners policing the entire internet are wasting time and money that could be better spent trying to understand their customer base.

And you seem to be drawing a line between intentional and unintentional/automated. The way the law currently stands, the takedowns fall in the realm of intentional, automated, and anti-competitive.

I do agree with one thing: you cannot blame them for committing this shitty practice when they face almost no legal repercussions for it (and considering that they think it’s good business practice, which probably couldn’t be further from the truth).

simple simon says:


Doesn’t “always going to be some mistakes” equal “knowingly materially misrepresent”? I would think that the first time is a mistake, every time after that is “knowingly.” If not, how many times can they make the same mistake with the same software before it becomes “knowingly”?

If they can use automated software to mistakenly take down websites, can’t I use automated software to mistakenly download their files?

Digitari says:

RE Estimated Vs Actual

I’ve never understood how “alleged” loss is more important than “actual” loss.

I have yet to see any actual, proven loss of income due to infringement, yet have read many stories of actual loss in the pursuit of the alleged loss.

Aren’t facts supposed to carry more weight than guesses in court?

tony says:

We need both: Notice-Takedown and Notice-Notice

The problem with Notice-Takedown is that the party issuing the notice has almost all the power, due to the toothlessness of current laws.

It seems we need a second option for rights holders: Notice-Notice. The Notice-Notice system would be relatively toothless for the issuer; think of it as a safe harbor. However, failure to comply with a valid notice would create or increase penalties for the recipient.

This should not replace Notice-Takedown, but complement it. For Notice-Takedown we should greatly increase the penalty for the issuer.

Having both Notice-Takedown and Notice-Notice would allow rights holders to build automated systems on Notice-Notice with little liability, while retaining their Notice-Takedown rights with greatly increased liability.

TtfnJohn (profile) says:


Assuming it’s a song or part of one, then it ought to be named in the takedown notice, and the person reviewing it only needs to listen to the “infringing” upload and the “legitimate” one the company undoubtedly has.

If it’s a deep mix, beat, or whatever, then yes, you need a musician to identify it. But so what? You’re attempting to protect a copyright and, as the RIAA tirelessly tells us, a valuable source of income.

And I’m sure the RIAA has to have access to a few highly trained musicians with great ears and knowledge needed to do that work. 🙂

DCX2 says:


Why should they be equal? Do you sign anything under penalty of perjury when e.g. submitting a video to YouTube?

Even if we ignore that non-trivial detail, in order to get an equal level of acceptance of error on both sides, there should be an opportunity to file a counter-notice before the content is taken down. This would prevent the irreparable harm that has already been suffered by multiple parties in response to false DMCA takedowns.

Anonymous Coward says:


You don’t think an “oops” in posting someone’s music, or a software package with a hack, to the torrents is any better?

The rest of your post makes no sense. Already the content producers face incredible difficulties and costs with DMCA, and now you are suggesting that their risk of error should punish them so greatly that they wouldn’t want to even try?

Why not just abolish copyright and get it over with?

Anonymous Coward says:

RE Estimated Vs Actual

If taking down a page causes “loss”, then using someone’s work without permission also created “loss” – at least in loss of control of something they made and own.

Facts have weight in court, but clearly the fact that piracy is widespread, and that entire business models are predicated on it won’t make the case any easier for you piracy apologists to make.

Anonymous Coward says:


“Already the content producers face incredible difficulties and costs with DMCA, and now you are suggesting that their risk of error should punish them so greatly that they wouldn’t want to even try?”

What “incredible difficulties and risks” do content licensors face? (They’re not “producers”; artists are the producers.)

ChrisB (profile) says:


I can’t tell if you’re being sarcastic or not. If it really is that hard to tell whether something is infringing, I’m going to suggest it isn’t. If you have to strain to hear a snippet of a pop song in a home video uploaded to YouTube, it is probably fair use.

In addition, if it would take a “highly trained musician” to tell if stuff on YouTube is infringing, how can YouTube be expected to tell?

Ninja (profile) says:

We need both: Notice-Takedown and Notice-Notice

I don’t know how your comment isn’t marked insightful. While it doesn’t address all the problems with copyright nowadays, it would reduce the number of bogus takedown notices.

The key is indeed a penalty for misuse. Notice-Notice should also get a limiter: if a notice has been sent and challenged, the automated system should not be allowed to resend it. The only remaining route would be the takedown process, which would force the copyright holder to review the content to avoid penalties for sending bogus takedown notices.

Of course, copyright law needs to be revised to include broader and clearer fair use and public domain provisions, and it must explicitly address remixes, reuse and non-commercial use.

Mario says:

Automated upload

If an automated takedown can be considered an honest mistake, why not start automated uploads of material? “How could I have known it’s pirated? The computer decided to upload the stuff...” “My computer made a mistake ripping this song from a disc and uploading it to a cyberlocker of its choice; I will promptly reset it...” “A flaw in my super_ripper_uploader software caused this mishap...”

Machin Shin (profile) says:


What would you think if someone came and took your car, leaving you a note that it had been reported stolen? You then go to ask why it was taken, and the response you get is:

“Oh, we are sorry, car theft is just so rampant we set up an automated system to repossess cars. It makes mistakes sometimes. We are very sorry about that. Your car is out back; it might be a bit dinged up now, though. We are so sorry about that.”

Machin Shin (profile) says:

Automated upload

Or why not write a story or song? Make something, just about anything, and copyright it. Then set up your own “automated system” that looks for anything matching your copyrighted content, but only searches the sites belonging to big media. Then, when you spit out tons of totally bogus takedown requests, you can point and say, “I didn’t know it wasn’t infringing; my system was checking them.”

Anonymous Coward says:


Have you looked at how many takedowns there are?

Let’s use the old YouTube logic: if it’s too hard for YouTube to check all the videos uploaded by hand, then the SAME EXACT STANDARD should be used for rights holders making DMCA claims. They should be given the same sort of “too big to check” leeway.

It’s hard to justify any other solution.

Anonymous Coward says:


No, it doesn’t go both ways. As a copyright owner you know exactly what things you own the copyrights to. Which is a minuscule fraction of all copyrighted works. Finding possibly related infringing works by computer and then using humans to find actual infringement (of the severely small subset) is significantly less work than an aggregator looking at ALL their content for ALL copyrighted works.

Even with a magic system that knew all copyrighted content, having an aggregator look through all the possible infringing items and comparing them to all copyrighted content is theoretically next to impossible.

Copyright owners need only search for the (relatively) few works they own and take down those. That’s a significantly easier problem to solve.

Anyway, asking another industry to do YOUR job would be laughed at in any case besides copyright monitoring. Copyright owners need to pay for their own monitoring, not demand that Google pay the bill.

Anonymous Coward says:


Google has to check all uploaded songs against all copyrighted works (music, video, images, etc.). This can be lowballed at X total uploads times 100M copyrighted works.

A copyright owner need only check all uploaded songs against their own small subset of works. This can be lowballed at X total uploads times ~50K copyrighted works for a pretty large owner.

Do you see the huge difference in scale?
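That difference can be made concrete with rough back-of-the-envelope arithmetic. The sketch below just uses the illustrative catalog sizes from the comment above (100M works for a host, ~50K for a large rights holder); the numbers are hypothetical, not real figures:

```python
# Rough workload comparison for naive one-to-one matching,
# using the illustrative numbers from the comment above.
uploads = 100_000_000        # X: total uploads to check (hypothetical)
all_works = 100_000_000      # catalog a host must match against (hypothetical)
owner_works = 50_000         # catalog one large rights holder owns (hypothetical)

host_comparisons = uploads * all_works      # host's matching problem
owner_comparisons = uploads * owner_works   # rights holder's matching problem

# The upload count cancels out; only catalog sizes differ.
ratio = host_comparisons // owner_comparisons
print(ratio)  # 2000: the host's workload is ~2,000x larger
```

Since the number of uploads appears on both sides, the ratio depends only on catalog size, which is the commenter's point about scale.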

Anonymous Coward says:


No real difference of scale here: the same number of uploads would have to be checked. With digital fingerprinting and the like, Google/YouTube would have a very good chance of catching much of it.

Heck, here’s a simple deal: a song or music video not uploaded by the original source could just be linked to the original source, if they have an approved version online. End of issue.
