'Deep Fake' Legislation Is On The Way, Threatening Free Speech Protections

from the how-do-you-solve-a-problem-like-AI-making-Maria-sing-DJ-Assault-tracks dept

The proliferation of deep fake videos is going to start having an effect on First Amendment protections. Hint: it's not going to make these protections any stronger.

"Deep fake" may be easier to define than "fake news," but that doesn't mean there won't be collateral damage. The issue isn't a new one. Faking reality has been around nearly as long as reality itself. Cheap tools that make this anyone's game is the only thing new. Before we had deep fakes, we had Photoshop and its imitators.

Video used to be the last bulwark of truth because it couldn't be faked easily. But it too has been abused for years: editing footage to show what the editor wants it to show is a well-worn tactic. Now, however, new tools make it possible to put words in people's mouths, as was demonstrated to devastating satirical effect when a video of Facebook founder Mark Zuckerberg was tricked out to make it appear as though Zuckerberg was promising to swallow every user's data and privacy.

This is prompting legislators to act. Concerns over the potential of deep fakes to mislead people or, in some cases, destroy an unwitting participant's reputation are producing legislation from people not entirely sure what they're dealing with.

Apparently shaken by a deep fake video of former president Barack Obama calling President Trump a "dipshit" and Housing Secretary Ben Carson "brainwashed," a California assemblyperson is pitching anti-deep fake legislation. Ben Christopher of CalMatters has the details:

“I immediately realized, ‘Wow, this is a technology that plays right into the hands of people who are trying to influence our elections like we saw in 2016,’” said Assemblyman Marc Berman, a Democrat whose district includes Silicon Valley.

So Berman, chair of the Assembly’s election committee, has introduced a bill that would make it illegal to “knowingly or recklessly” share “deceptive audio or visual media” of a political candidate within 60 days of an election “with the intent to injure the candidate’s reputation or to deceive a voter into voting for or against the candidate.”

This bill may be narrowly crafted to target only perceived election interference, but that still isn't enough to ward off possible Constitutional problems. For one, this law would punish anyone "knowingly" sharing something "deceptive." The problem is the word "deceptive." It doesn't just cover deep fakes that put words in candidates' mouths. It would also cover videos edited to show candidates in a bad light by taking comments or statements out of context. That has never been illegal before, and the fact that tech now lets people do scary new things with video processing tools is no reason to start criminalizing common campaign tactics.

Unsurprisingly, this legislative effort is opposed by the ACLU, EFF, and two major California journalism organizations. The news publishers point out this effort will do damage to protected speech while doing almost nothing to ensure election integrity.

Whitney Prout, staff attorney with the publishers’ association, called the bill “an ineffective and frankly unconstitutional solution that causes more problems than it solves.” She warned that, if enacted into law, it could discourage social media users from sharing any political content online, lest it be a fake and they be held legally liable. Another possible consequence, she said, is that campaigns plaster every attack ad with a deepfake disclosure to shield themselves from lawsuits, leaving the voting public even more confused.

This issue isn't going to go away, though, and it's inevitable that laws will be passed to try to curtail the harm caused by deep fakes. At the federal level, the discussion has gotten a bit hyperbolic, with senators calling deep fakes a threat to national security. When those words are used to justify Congressional action, the American public always comes out on the losing end.

Filed Under: 1st amendment, california, deception, deep fakes, free speech, intent, legislation, marc berman


Reader Comments



  1. This comment has been flagged by the community.
    D. Bunker, 9 Jul 2019 @ 10:56am

    Yet you SWOON for VENEER FAKES like "Trump-Russia collusion"!

    There was / is not a grain of truth to that, yet Techdirt ran it for months until my hooting convinced Masnick to drop it.

    That was only Deep State / media / masnicks making allegations and expecting to create a milieu in which they could over-turn the election. Period.

    Bet there are still some here who believe the story fully -- without being able to state a single crime!

    And meanwhile, you deliberately overlook the actual crimes of Comey, Strzok and other gov't officials in going ahead knowing it was a fabrication.

    Why did / do you believe that without basis, Techdirt? Why do you now, after two years of investigation that turned up ZERO, NOT want the Deep State criminals prosecuted for attempting to over-turn an election? ... Answer is obvious.

    You may be a little skeptical of videos in future, yet you're not going to look into the past and consider whether you've been fooled by other "narratives", like the alleged WMD in Iraq, because that WOULD definitely undermine everything you believe. You can't handle reality.

