Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Electric Truck Company Uses Copyright Claims To Hide Criticism (2020)

from the copyright-as-censorship dept

Summary: There are many content moderation challenges that companies face, but complications arise when users or companies try to make use of copyright law as a tool to block criticism. In the US, the laws around content that allegedly infringes on a copyright holder’s rights are different than most other types of content, and that creates some interesting challenges in the content moderation space.

Specifically, under Section 512 of the Digital Millennium Copyright Act (DMCA), online service providers who do not wish to be held liable for user-posted material that infringes copyright need to take a few steps to be free of liability. Key among those steps is having a “notice-and-takedown” process, in which a copyright holder can notify the website of allegedly infringing material; and if the website removes access to the work, it cannot be held liable for the infringement.

This process creates a strong incentive for websites to remove content upon receiving a takedown notice, as doing so automatically protects the site. However, this strong incentive for the removal of content has also created a different kind of incentive: those who wish to have content removed from the internet can submit takedown notices claiming copyright infringement, even if the work does not infringe on copyright. This creates a difficult challenge for companies hosting content: determining when a copyright takedown notice has been submitted for illegitimate purposes.
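The incentive structure described above can be sketched as a toy decision model. Everything here is hypothetical and simplified for illustration; it is not any platform's actual implementation, and the field names are invented stand-ins for the statutory notice requirements:

```python
from dataclasses import dataclass

@dataclass
class TakedownNotice:
    """A DMCA 512(c) takedown notice (hypothetical, heavily simplified fields)."""
    claimant: str
    video_id: str
    identifies_work: bool           # notice identifies the copyrighted work
    has_good_faith_statement: bool  # good-faith statement required by 512(c)(3)

def host_response(notice: TakedownNotice) -> str:
    """Toy model of a host's incentive under the DMCA safe harbor.

    Removing promptly on a facially valid notice preserves the safe harbor
    regardless of whether the underlying claim would survive a fair use
    analysis -- which is exactly the incentive problem described above.
    """
    if not (notice.identifies_work and notice.has_good_faith_statement):
        return "ignore"  # legally deficient notice; no obligation to act
    return "remove"      # facially valid: the safest action is removal

notice = TakedownNotice("Nikola", "vid123", True, True)
print(host_response(notice))  # a facially valid notice -> remove
```

Note what is missing from the model: nothing in the decision path examines whether the use was actually infringing. That gap is the opening that bad-faith notices exploit.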

In September of 2020, a research firm published a report criticizing Nikola, an alternative energy truck company, alleging that a promotional video showing its new hydrogen fuel cell truck driving along a highway was faked and that the truck did not move under its own propulsion. As it turned out, the truck did not actually have a hydrogen fuel cell and had instead been filmed rolling downhill; Nikola admitted that it had faked its promotional video. In its response, Nikola conceded that the truck did not move on its own, but it still claimed that the original report was “false and defamatory.” While the response does highlight areas where Nikola disagrees with the way in which the research firm wrote about the company’s efforts, it does not identify any actual “false” statements of fact.

Soon after this, many YouTube creators discovered that their videos about the incident were being removed due to copyright claims from Nikola. While the creators did use some footage from the faked promotional video, they noted that this was clearly fair use: they were reporting on the controversy and used only short snippets of Nikola’s video, often within much longer videos with commentary.

When asked about the situation, Nikola and YouTube spokespeople seemed to give very different responses. Ars Technica’s Jon Brodkin posted the comments from each side by side:

“YouTube regularly identifies copyright violations of Nikola content and shares the lists of videos with us,” a Nikola spokesperson told Ars. “Based on YouTube’s information, our initial action was to submit takedown requests to remove the content that was used without our permission. We will continue to evaluate flagged videos on a case-by-case basis.”

YouTube offered a different description, saying that Nikola simply took advantage of the Copyright Match Tool that’s available to people in the YouTube Partner Program.

“Nikola has access to our copyright match tool, which does not automatically remove any videos,” YouTube told the [Financial Times]. “Users must fill out a copyright removal request form, and when doing so we remind them to consider exceptions to copyright law. Anyone who believes their reuse of a video or segment is protected by fair use can file a counter-notice.”

Company Considerations:

  • Given the potential liability from not taking down an infringing video, how much should YouTube investigate whether or not a copyright claim is legitimate?
  • Is there a scalable process that will allow the company to review copyright takedowns to determine whether or not they are seeking to take down content for unrelated reasons?
  • What kind of review process should be put in place to handle situations like Nikola’s, where a set of videos is reported as infringing and taken down even though the videos used the copyrighted material for news reporting or commentary, making the takedown requests improper?
  • Improper takedowns can reflect poorly on the internet platform that removes the content, but often make sense to avoid potential liability. Are there better ways to balance these two competing pressures?

Issue Considerations:

  • Copyright is one of the few laws in the US that can be used to pressure a website to take down content. Given that the incentives support both overblocking and false reporting, are there better approaches that might protect speech, while giving companies more ability to investigate the legitimacy of infringement claims?
  • Under the current DMCA 512 structure, users can file a counternotice with the website, but the copyright holder is also informed of this and given 10 business days to file a lawsuit before the content is restored. The threat of a lawsuit often disincentivizes counternotices. Are there better systems enabling those who feel wrongfully targeted to express their concerns about a copyright claim?
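The counternotice timeline can be modeled the same way. Under 17 U.S.C. § 512(g)(2)(C), the host restores the material no less than 10 and no more than 14 business days after receiving a valid counternotice, unless the claimant notifies the host that it has filed suit. This sketch just computes that restoration window (a simplified illustration that ignores holidays):

```python
from datetime import date, timedelta

def restoration_window(counter_notice_received: date) -> tuple[date, date]:
    """Return the (earliest, latest) dates on which a host may restore
    material after a counternotice, per 17 U.S.C. 512(g)(2)(C): not less
    than 10 and not more than 14 business days, unless the copyright
    claimant notifies the host that it has filed suit.
    """
    def add_business_days(start: date, n: int) -> date:
        d = start
        while n > 0:
            d += timedelta(days=1)
            if d.weekday() < 5:  # Monday through Friday
                n -= 1
        return d

    return (add_business_days(counter_notice_received, 10),
            add_business_days(counter_notice_received, 14))

# Counternotice received on a Monday; window spans roughly 2-3 calendar weeks.
early, late = restoration_window(date(2020, 9, 14))
print(early, late)
```

The point of modeling it is to show how long the exposure lasts: during those two-plus weeks the content stays down, and filing the counternotice at all hands the claimant a ready-made window in which to sue, which is the chilling effect noted above.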

Resolution: After the press picked up the story of these questionable takedown notices, many of the YouTube creators found that Nikola had withdrawn its takedown demands.

In July of 2021, nine months after the news broke of the faked videos, Nikola’s founder Trevor Milton was charged with securities fraud by the SEC for the faked videos.

Originally posted to the Trust & Safety Foundation website.

Companies: nikola, youtube


Comments on “Content Moderation Case Study: Electric Truck Company Uses Copyright Claims To Hide Criticism (2020)”

Ehud Gavron (profile) says:

The personal touch of great customer service.

Mike has covered the issues with automated takedown requests and the resulting takedowns ad nauseam. TL;DR – It doesn’t work.

On the ISP end, we received notices daily of our clients’ file sharing. No matter how we replied we got crickets. Finally we updated our website’s DMCA policy to say that if they can’t comply with the law we’ll be happy to educate them, at our posted rates. More crickets, but we could just delete and ignore. ALL takedown notices in a 20 year period were legally deficient.

Moral of that one: end-user servicing companies can delete and ignore. More on this in a second.

Second Example: At home we have two open access points, and every now and then an email appears from CableCo giving a notice of infringement. No matter how I’ve replied that too gets crickets.

Moral of that one: intermediary companies don’t care either.

Now on all the "delete and ignore" stuff: people often frame this (as in this article) as the safe harbor provisions meaning "if you bow to every request you’ll be safe from expensive litigation, and NOBODY wants to do that."

Safe Harbor: there are plenty of cases where that has not worked out for the service provider who did nothing wrong and everything right. A "safe harbor" that sinks your boat is not worth anything.

Lawsuit Costs: when litigation occurs both sides pay a lot. With the exception of absurdly overinflated statutory fines, it’s often not worth the squeeze to get the juice.

My conclusions: don’t hire law-profs and junior newbs to represent you against Sony. Sorry, Mr. Tenenbaum. Secondly, if you’re going to hack Sony’s equipment, don’t settle by admitting you won’t do stuff you didn’t admit you did. Sorry, Mr. Hotz. And finally, if we don’t tolerate bullying in schools (but it’s there) and the military (it’s there) and the workplace (it’s there) … why do we allow these companies to bully everyone… and then blame the victim for not obeying instructions, or not blame the bully?

Ehud
Tucson, AZ US

Ehud Gavron (profile) says:

Re: Re:

Yes, there is that. There are also penalties under Section 512(f) that are supposed to apply to those who file false DMCA complaints. To the best of my knowledge no court has applied those penalties.

In the current shove to remove women’s rights, undo legal votes, and allow right-wing nutjobs to undo the 1st amendment and any subsequent protections (like section 230)… it might behoove these elected non-representatives to take a step back from their petty 2-party squabbles and consider:

  • clearinghouse for notices so that the ability to verify a human identity of the notice filer exists. See more on why below…
  • real penalties to those whose DMCA notices contain false information. Note: a script is unable to swear under oath.
  • add penalties for legally deficient notices. Judges don’t let deficient filings get by – neither should the law.
  • fee shifting. If a DMCA notice is challenged and removed, filer pays the winner. This also implies no script notices.
  • fair-use IS part of the copyright law (17 U.S.C. § 107) and it should be considered as a RIGHT … PRIOR to any takedown. Failure to do so should be penalized. How about if someone takes down my fair-use content I get statutory damages of $150,000 multiplied by everyone who attempts to access that content?

Of course none of this will happen, because

Oh and they flat out own enough congress critters to get their way. — That Anonymous Coward

My very respectful regards and thanks to the veterans who have served to protect the freedoms we have today, and the sacrifices made by them and their families. Veterans’ Day in the US is all day. Thank a vet.

E
