Was The NO AI FRAUD Act Written By A Fraudulent AI? Because Whoever Wrote It Is Hallucinating

from the maybe-call-it-the-no-more-parody-or-satire-law? dept

A couple of weeks ago, a friend sent me Rep. Maria Elvira Salazar’s and Rep. Madeleine Dean’s proposed NO AI FRAUD Act, which purports to “protect Americans’ individual right to their likeness and voice against AI-generated fakes and forgeries.”

Now, there are legitimate concerns about uses of generative AI tools to create fake images, videos, audio, etc., but most of the fears to date have been pretty overblown. As we noted a few years back, it seems that most of those concerns can be handled directly. It’s possible that, with these tools getting better and more accessible all the time, things could change, but that’s no reason to rush through poorly thought-out laws that take a sledgehammer to some pretty fundamental principles.

While the bill’s framing suggests it’s focused on generative AI, it’s really a giant, cumbersome federal publicity rights law. For nearly 15 years we’ve been talking about the problems of publicity rights, which have basically created a chaotic newish right that people have been abusing left and right to silence speech. The laws are all state-based (currently) and thus vary by state, but there’s been a concerted effort to lift publicity rights up to the federal level and enshrine them in the fictional bucket of “intellectual property” alongside copyright, patents, and trademarks.

But that’s not the point of publicity rights. They were initially designed for a pretty straightforward and reasonable purpose: to stop false claims of endorsement. That is, you can’t put Famous Hollywood Actor in your ads looking like they endorse your product unless Famous Hollywood Actor has agreed to it.

That’s reasonable.

But where it goes off the rails is when people use those laws to silence things that have nothing to do with false or misleading endorsements in commercial ads. You have celebrities suing, saying you can’t tweet pictures of them, for example. Or the owner of a horse demanding a cut of a prize-winning photo because the horse was in it. Or a former Central American strongman suing a video game maker for including a character based on him.

The list goes on and on. There are all sorts of cases where it makes perfect sense to be able to use someone’s likeness, even without their approval. We see it all the time in TV shows and movies depicting real-life events. The makers of those shouldn’t have to license everyone’s likeness. Or you see it in parodies on TV and online. Saturday Night Live regularly parodies real people. It should never need permission and a license to do so.

Unfortunately, the NO AI FRAUD Act doesn’t seem to understand any of that. It creates a very, very broad federal publicity right (which it officially deems an “intellectual property” right — thereby exempting it entirely from Section 230) with no consideration of just how broad and problematic the law is. It also extends publicity rights past death (something most state publicity rights laws don’t do, and the ones that do have found it to be a total mess).

As Elizabeth Nolan Brown at Reason notes, this law is just ridiculously broad:

So just how broad is this bill? For starters, it applies to the voices and depictions of all human beings “living or dead.” And it defines digital depiction as any “replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or part using digital technology.” Likeness means any “actual or simulated image… regardless of the means of creation, that is readily identifiable as the individual.” Digital voice replica is defined as any “audio rendering that is created or altered in whole or part using digital technology and is fixed in a sound recording or audiovisual work which includes replications, imitations, or approximations of an individual that the individual did not actually perform.” This includes “the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital means, technology, service, or device.”

These definitions go way beyond using AI to create a fraudulent ad endorsement or musical recording.

They’re broad enough to include reenactments in a true-crime show, a parody TikTok account, or depictions of a historical figure in a movie.

They’re broad enough to include sketch-comedy skits, political cartoons, or those Dark Brandon memes.

They’re broad enough to encompass you using your phone to record an impression of President Joe Biden and posting this online, or a cartoon like South Park or Family Guy including a depiction of a celebrity.

And it doesn’t matter if the intent is not to trick anyone. The bill says that it’s no defense to inform audiences that a depiction “was unauthorized or that the individual rights owner did not participate in the creation, development, distribution, or dissemination of the unauthorized digital depiction, digital voice replica, or personalized cloning service.”

And because the law also covers intermediaries (and gets itself exempted from Section 230 by declaring the rights to be “intellectual property”), it means that people will be able to sue Meta or YouTube or whoever for merely hosting such content.

EFF has put out a warning about how terrible the bill is.

First, the Act purports to target abuse of generative AI to misappropriate a person’s image or voice, but the right it creates applies to an incredibly broad amount of digital content: any “likeness” and/or “voice replica” that is created or altered using digital technology, software, an algorithm, etc. There’s not much that wouldn’t fall into that category—from pictures of your kid, to recordings of political events, to docudramas, parodies, political cartoons, and more. If it involved recording or portraying a human, it’s probably covered. Even more absurdly, it characterizes any tool that has a primary purpose of producing digital depictions of particular people as a “personalized cloning service.” Our iPhones are many things, but even Tim Cook would likely be surprised to know he’s selling a “cloning service.”

Second, it characterizes the new right as a form of federal intellectual property. This linguistic flourish has the practical effect of putting intermediaries that host AI-generated content squarely in the litigation crosshairs. Section 230 immunity does not apply to federal IP claims, so performers (and anyone else who falls under the statute) will have free rein to sue anyone that hosts or transmits AI-generated content.

That, in turn, is bad news for almost everyone—including performers. If this law were enacted, all kinds of platforms and services could very well fear reprisal simply for hosting images or depictions of people—or any of the rest of the broad types of “likenesses” this law covers. Keep in mind that many of these services won’t be in a good position to know whether AI was involved in the generation of a video clip, song, etc., nor will they have the resources to pay lawyers to fight back against improper claims. The best way for them to avoid that liability would be to aggressively filter user-generated content, or refuse to support it at all.

Now, I’ve seen some defenders of the bill push back on the concerns people have raised by pointing to the bill’s odd “First Amendment Defense” section, which says you can use the First Amendment as a defense against claims under the bill.

But… the First Amendment can always be used in defense against an attempt to suppress speech, so there’s no reason to put this clause into the bill. Unless, that is, it’s actually designed to limit First Amendment defenses. And that’s exactly what’s happening here. Again, the EFF explains:

Lastly, while the defenders of the bill incorrectly claim it will protect free expression, the text of the bill suggests otherwise. True, the bill recognizes a “First Amendment defense.” But every law that affects speech is limited by the First Amendment—that’s how the Constitution works. And the bill actually tries to limit those important First Amendment protections by requiring courts to balance any First Amendment interests “against the intellectual property interest in the voice or likeness.” That balancing test must consider whether the use is commercial, necessary for a “primary expressive purpose,” and harms the individual’s licensing market. This seems to be an effort to import a cramped version of copyright’s fair use doctrine as a substitute for the rigorous scrutiny and analysis the First Amendment (and even the Copyright Act) requires.

In other words, it’s a “First Amendment*” defense, where the First Amendment doesn’t really mean the First Amendment.

It really feels like this bill was written by people who got completely sucked into a moral panic about AI-generated deepfakes, but with absolutely no understanding or knowledge of free speech or publicity rights issues.



Comments on “Was The NO AI FRAUD Act Written By A Fraudulent AI? Because Whoever Wrote It Is Hallucinating”

Anonymous Coward says:

it applies to the voices and depictions of all human beings “living or dead.” And it defines digital depiction as any “replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or part using digital technology.”

To create a likeness without using digital technology requires painting or sketching on paper or canvas, old-school photography using light-sensitive materials, or carving a likeness in wood, stone, or clay. Everything else uses digital technology, including scanning an image and posting it on the Internet.

Just think ow what the cops could do with such a law, as using a phone to record their actions is using digital technology.

sumgai (profile) says:

Re:

Just think ow [sic] what the cops could do with such a law, as using a phone to record their actions is using digital technology.

Or look at it from the other direction. Think about how:

  • Body cameras won’t be worn any more, because they’re digital.
  • No more fax machines, they’re all digital now.
  • Ditto for printers!
  • Wanted posters will once again be hand-drawn for manual distribution.
  • Wire-tap recordings will once again be tape recorded in analog, not with digital means.
  • A resurgence in film and paper will come about, because long-distance surveillance will require analog cameras – nothing digital will be allowed.

And that’s just off the top of my head; there are probably way more limits on the police using digital tech to definitively describe a “suspect” than I can come up with in just a few moments.

Of course, if I can see this, then so can others in a position to insert an amendment to exclude law enforcement activities from this particular steaming pile of bullshit.

Anonymous Coward says:

Much like Trademark...

Greetings, Mr Placeholder.

We have been reviewing content on YouTube and your videos came up.

Your voice in your YouTube video sounds entirely too much like Mariah Carey’s. Understanding that YouTube videos are digitally produced, we have no choice but to sue you to preserve Ms Carey’s Intellectual Property Rights to her likeness.

While we sympathize with your position that you were exercising your first amendment rights, the law gives us no recourse if we want to preserve our client’s rights.

ECA (profile) says:

This is going to be fun

How many people can do a basic drawing that could represent at least 4-10 people with one drawing?
How far do you want to take digital recording? By copying a digital recording, or the production of a digital recording?

Do understand, anything that says digital. Analog recordings?? Included or not?
How many persons can imitate Grover? Arnie? Miss Piggy? And these are not digital, but can be changed enough not to be, but are, similar? And is that illegal also?

This is a RIAA/MPAA joyride. Not even counting taking snippets of Trump/Biden/Carter/any political person, just to explain what they said. The bill does not say for what purposes it can be done.

Uriel-238 (profile) says:

This genie is out of the bottle.

Malicious actors who want guns for their malicious acts are not going to be stopped by registries or criminalization. Similarly I infer malicious actors who want to create deepfakes for political ends are not going to be halted by legal proscriptions.

And here’s the thing: you can download open-source image-generation models and fine-tune them on your own data set. There are communities on Lemmy where AI artists challenge each other to create images around a specific prompting theme, often to stress-test the model (say, to explicitly define who is hugging whom in an image, or to get the cat smoking a cigarette rather than Humphrey Bogart).

This is to say, if a political task force wants to create deepfakes to confirm a specific narrative, the means and talent will be available to them, no matter if it’s a crime to make or distribute such content in some regions (the internet will likely do the distribution for them).

So even if some law passes in an effort to keep deepfakes out of the 2024 election, we can expect (it’s happened already!) deepfakes in the 2024 election. A better approach would be to develop an array of tools that detect deepfakes, even good ones that have pretty hands.

PS: AI artists have already figured out how to make hands pretty, and anatomically correct. It’s technique, rather than technology for now, but even that will change in time.

Crafty Coyote says:

it applies to the voices and depictions of all human beings “living or dead.”

Won’t anyone please think of the Disney World animatronic Presidents. This stupid law will do to Fake Abraham Lincoln what John Wilkes Booth did to REAL Abraham Lincoln.
https://www.youtube.com/watch?v=10249ddTdPY

Anonymous Coward says:

Re:

The old animatronics are still legal, as they used mechanical mechanisms and analogue recordings; however, photographing and filming them might be a problem these days, as that is where they enter the digital realm.

The proposed law is so badly written that an actor using prosthetics can be made up to look like anyone, and they can imitate someone’s voice as well. It only becomes illegal when a digital recording is made of the actor.
