Hollywood Is Betting On Filtering Mandates, But Working Copyright Algorithms Simply Don't Exist
from the hello-1st-amendment dept
Facebook whistleblower Frances Haugen may not have mentioned copyright in her Congressional testimony or television interviews, but her focus on artificial intelligence (“AI”) and content moderation has a lot of implications for online copyright issues.
For the last decade, Hollywood, the music industry and others have been pushing for technical solutions to online copyright infringement. One of the biggest asks is for internet companies to “nerd harder” and build algorithms that can identify and remove infringing content. They claim content filters are the solution, and they want the law to force filters onto companies.
And they have been successful in parts of the world so far. For example, the recent European Union Copyright Directive placed a filtering mandate on internet platforms. Hollywood and the record labels are pushing the U.S. to follow suit and make platforms liable for copyright infringement by users. They want NIST to develop standards for filtering software, and they are using the power of Congress and the U.S. Copyright Office to push for legislation and/or voluntary agreements to create more filters.
There is one huge problem with all of this: the technology does not exist to do this accurately. What the Facebook whistleblower made clear is that even the most sophisticated AI-based algorithms cannot accurately moderate content. They make tons of mistakes. Haugen even suggested that a huge part of the challenge is the belief that “nerding harder” will work. She blamed Facebook’s mantra to solve problems through technology as the main reason they are struggling with content moderation.
Copyright presents a uniquely context-dependent challenge for algorithms. It is not easy to automatically determine what is copyright infringement and what is not. Even under today’s existing systems, about a third of takedown requests are potentially problematic, requiring further analysis — and most of these problematic takedowns are generated by algorithms. This analysis can be extremely complicated even for the American judicial system, so much so that the Supreme Court recently had to clarify how to apply the four-factor fair use test. In court, each fair use case gets an individual, fact-based analysis. Current AI-based algorithms are nowhere close to being able to perform the analysis needed to determine infringement in fair use cases.
So why is there a big push from Hollywood, the music industry and others on this? They are smart enough to know that algorithmic solutions are nowhere close and may never be able to filter for infringement accurately.
The reason is that they do not want filtering technologies to be accurate. They want filtering technologies to over-correct and take anything that might be infringing off the internet. Congress cannot directly legislate such an overcorrection, because it would be a clear violation of the First Amendment. But Congress might be able to pass legislation that creates a de facto mandatory filtering requirement. Mandatory filtering imposed by changing the platform liability regime of Section 512 of the Digital Millennium Copyright Act would lead companies to “voluntarily” implement over-correcting filters — or otherwise face a constant barrage of losing lawsuits and legal bills for any and all alleged infringement by users. And this could create an end run around the First Amendment if a court decided that the company was implementing the filters “voluntarily.”
At this point it is important to recognize the types of activity that we are talking about here: transformative works of creativity, pop art, criticism and parody. This includes teens sharing lip sync TikToks and videos of your little kids dancing to a song. But fair use doesn’t apply to just the creative arts. It also covers collaborative efforts on internet platforms to develop cybersecurity solutions that require reverse engineering, and it allows teachers to share materials with students on online education platforms. Documentarians depend heavily on fair use, and efforts to distribute documentaries online would face stiff challenges.
All of these important capabilities would be severely at risk if we forced filtering requirements onto internet platforms via threat of liability. If we let Hollywood and music industry elites and the Members of Congress who do their bidding get their way, the rest of America will lose out.
Josh Lamel is the Executive Director of the Re:Create Coalition. This article was originally posted to the Re:Create Coalition blog.