Trump Promises To Abuse Take It Down Act For Censorship, Just As We Warned
from the take-what-down,-mr.-president? dept
During his address to Congress this week, Donald Trump endorsed the Take It Down Act while openly declaring his plans to abuse it: “And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”
(You might think a sitting president openly declaring his intent to abuse a content moderation law would be big news. The media, apparently swamped with other Trump outbursts, didn’t even seem to notice.)
This is, of course, exactly what we (and many others) warned about in December when discussing the Take It Down Act. The bill aims to address a legitimate problem — non-consensual intimate imagery — but does so with a censorship mechanism so obviously prone to abuse that the president couldn’t even wait until it passed to announce his plans to misuse it.
And Congress laughed. Literally.
Let’s talk about non-consensual intimate imagery (NCII) for a minute. (People used to call it “revenge porn,” but that’s a terrible name — it’s not porn, it’s abuse.) The tech industry, after a fairly slow start, has actually been reasonably good more recently at trying to address this problem. You’ve got NCMEC’s Take It Down system helping kids get abusive images removed. You’ve got StopNCII.org doing clever things with hashes that let platforms identify and remove bad content without anyone having to look at it. These aren’t perfect solutions, but they show what happens when smart people try to solve hard problems thoughtfully.
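The hash-matching approach used by systems like StopNCII.org can be sketched roughly as follows. This is a simplified illustration with hypothetical names, not StopNCII’s actual code: the real system uses perceptual hashes (such as PDQ) so that resized or re-encoded copies still match, whereas the cryptographic hash below only catches byte-identical copies. The key idea it shows is that only fingerprints are shared, so no one has to view or transmit the image itself:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint. Real systems use perceptual hashes
    # (e.g. PDQ) so near-duplicates also match; SHA-256 here only
    # catches byte-identical copies.
    return hashlib.sha256(image_bytes).hexdigest()

class HashList:
    """Hypothetical sketch of a shared hash list for NCII matching."""

    def __init__(self) -> None:
        self.reported: set[str] = set()

    def submit(self, image_bytes: bytes) -> None:
        # The hash is computed on the victim's own device; only the
        # fingerprint is shared with platforms, never the image itself.
        self.reported.add(fingerprint(image_bytes))

    def matches(self, upload_bytes: bytes) -> bool:
        # Platforms check each new upload against the shared list
        # without ever needing to see the original image.
        return fingerprint(upload_bytes) in self.reported

hash_list = HashList()
hash_list.submit(b"victim's private photo")
print(hash_list.matches(b"victim's private photo"))    # True: known image blocked
print(hash_list.matches(b"unrelated vacation photo"))  # False: nothing else removed
```

Note the contrast with the Take It Down Act’s mechanism: here, removal is tied to a specific, pre-identified image, rather than to whatever anyone claims in a complaint.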
But Congress (specifically Senators Ted Cruz and Amy Klobuchar) looked at all this work and said “nah, let’s just make websites legally liable if they don’t take down anything someone claims is NCII within 48 hours.” It’s the “nerd harder or we fine you” approach to tech regulation.
You can’t just write a law that says “take down the bad stuff.” I mean, you can, but it will be a disaster. You have to think about how people might abuse it. The DMCA’s notice-and-takedown system for copyright at least tried to include some safeguards — there’s a counternotice process, there are (theoretical) penalties for false notices. But TAKE IT DOWN? Nothing. Zero. Nada.
We already see thousands of bogus DMCA notices attempting to remove content with no basis in the law, even with those safeguards in place. What do you think will happen with a law that has no safeguards at all? (Spoiler alert: The president just told us exactly what will happen.)
Even given the seriousness of the topic, and the president’s support, you might think that Congress would care about the fact that the bill almost certainly violates the First Amendment, and thus would stand a high likelihood of being tossed out as unconstitutional. CDT tried to warn them, explaining that forcing websites to take down content without any court review creates some thorny constitutional problems. (Who knew that requiring private companies to censor speech based on unverified complaints might raise First Amendment concerns? Well, everyone who’s ever taken a constitutional law class, but apparently not Congress.)
Congress could have fixed those problems. But chose not to.
As currently drafted, however, the TAKE IT DOWN Act raises complex questions implicating the First Amendment that must be addressed before final passage. As a general matter, a government mandate for a platform to take down constitutionally protected speech after receiving notice would be subject to close First Amendment scrutiny. The question is whether a narrowly drawn mandate focused on NDII with appropriate protections could pass muster. Although some NDII falls within a category of speech outside of First Amendment protection such as obscenity or defamation, at least some NDII that would be subject to the Act’s takedown provisions, even though unquestionably harmful, is likely protected by the First Amendment. For example, unlike the proposed Act’s criminal provisions, the takedown provision would apply to NDII even when it was a matter of public concern. Moreover, the takedown obligation would apply to all reported content upon receipt of notice, before any court has adjudicated whether the reported image constitutes NDII or violates federal law, let alone whether and how the First Amendment may apply. Legally requiring such take-down without a court order implicates the First Amendment.
Even if you think the concerns about fake takedown notices are overblown, shouldn’t you want to make sure that the law would pass First Amendment scrutiny when it goes to court? It seems important.
Unfortunately, it does not appear that Congress paid attention. The Senate recently passed the Act via unanimous consent, and it’s now headed to the House with strong support. Earlier this week, Melania Trump endorsed the bill, and Donald Trump briefly mentioned it during his address to Congress, where, as noted above, he explicitly revealed his plans to abuse it:
And Elliston Berry, who became a victim of an illicit deepfake image produced by a peer. With Ellison’s help, the Senate just passed the Take It Down Act and this is so important. Thank you very much, John. John Thune. Thank you. Stand up, John. [Applause] Thank you, John. Thank you all very much. Thank you and thank you to John Thune and the Senate.
Great job. To criminalize the publication of such images online is terrible, terrible thing. And once it passes the House, I look forward to signing that bill into law. Thank you. And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.
There it is — a sitting president openly declaring his intent to abuse a content moderation law to remove speech he doesn’t like. This isn’t speculation or paranoia about potential misuse — it’s an explicit promise, made in front of both houses of Congress, as well as multiple Supreme Court Justices, of his intent to weaponize the law against protected speech.
So here we are. Civil liberties groups have been jumping up and down and waving their arms about how this bill needs basic safeguards against abuse. The media, apparently suffering from Trump-crazy-statement-fatigue, has mostly yawned. Congress, eager to show they’re “doing something” about online abuse, doesn’t seem interested in the details.
And why would they be? The bill is framed as protecting people from having compromising imagery posted online. Who could be against that? It’s like being against puppies or ice cream.
But here’s the thing: When someone tells you they plan to abuse a law, maybe… listen? When that someone is the President of the United States, and he’s saying it in front of Congress and multiple Supreme Court Justices, maybe pay extra attention?
The good folks at EFF have set up an action alert asking people to contact their representatives about the bill. But realistically, the bill has a strong likelihood of becoming law at this point.
Look, I can already hear the counterargument: “NCII is so harmful that we need strong measures, even if there’s some collateral damage to free speech.” And yes, NCII is genuinely harmful. But here’s the problem — a law designed with giant, exploitable holes doesn’t actually solve the problem. If it becomes primarily a tool for the powerful to suppress criticism (as Trump just promised), victims of actual NCII will be left with a discredited law that courts may eventually strike down entirely. The real goal should be a targeted, constitutional solution — not a censorship free-for-all that the president openly plans to weaponize against his critics. That serves no one except those who want to silence opposition.
We’ve spent the last two decades watching the DMCA’s takedown system be abused to silence legitimate speech, even with its (admittedly weak) safeguards. Now we’re about to create a similar system with no safeguards at all, precisely when the president has announced — to laughter and applause — his plans to weaponize it against critics.
Congress is building a censorship machine and handing the controls to someone who just promised to abuse it. That’s not fighting abuse — that’s enabling it.
Filed Under: 1st amendment, amy klobuchar, censorship, donald trump, free speech, ncii, take it down, ted cruz




Comments on “Trump Promises To Abuse Take It Down Act For Censorship, Just As We Warned”
Unsurprising, he wants to stop people (rightfully) making fun of him, as well as articles depicting his ugly mug in a suitably ugly light.
That any dems voted yes for this is a sign that they need to go. The democrat party needs candidates who actually know their shit.
Blockhead
“… because nobody gets treated worse than I do online, nobody.”
And, he STILL hasn’t taken the hint. Hey Donald, maybe ask yourself why, and look at what you can do to alter the way you are being. I know, I know, it’s not your fault…
This comment has been flagged by the community.
Just like the section 230 repeal, this will pass easily and we’ll all be worse off for it.
This one may at least have a chance of a limited lifespan, reducing the long-term harms a little.
But the fact remains that a potentially good bill has gone down the drain because it was written more like a weapon than a shield.
Re:
They will not be able to repeal section 230 and it will not be easy.
This comment has been flagged by the community.
Re: Re:
just flag them
Re: Re:
Congress could repeal Section 230 if Republicans both hold firm in the House and find enough Dems to vote with them in the Senate. That isn’t likely to happen, but it isn’t outside the realm of possibility. To think otherwise is to indulge in a fantastical optimism that borders on believing 230 will be protected forever by God, Allah, and Eris. You’d be better off preparing for a repeal of 230 while hoping it never comes to pass.
Re: Re: Re:
If the MAGA psychos seem to be protected from direct consequence by God, it sure seems unfair that the internet can’t be too.
This comment has been flagged by the community.
Re:
can you shut the fuck up about 230 already
One of the big arguments against a revenge porn law 10+ years ago was “how do you separate NCII from something newsworthy like Anthony Weiner’s scandal”? Apparently Congress has decided to enable as many future Weiners as possible by allowing them to get those stories shut down.
Trump porn? Eye bleach plz..
In his current state of decay, I don’t see how NCII of Trump wouldn’t inherently qualify as parody or satire.
It almost certainly wouldn’t be wank fodder, unless someone has some seriously bizarre fetishes beyond the fever dreams of even furries
Re:
Deep fake of Trump with a feminine body?
Yes, it is a terrible thing. Thanks, Donald, for the reminder.
But I’m waiting for Musk to stop Donald from signing this into law, since it’s too much work for Twitter.
Key to the threat is:
The notifier declares that the image is NCII. … but the image only has to be identifiable as depicting the requester. It doesn’t have to be intimate, just identifiable.
A press photo of Trump would qualify.
This comment has been flagged by the community.
Censorship is a good thing, though. It keeps children safe.
Re:
Are you taking the piss?
Re:
Censorship is the tool of deceivers and fascists. They want you ignorant and gullible — willing to believe anything they say and treat it as the unerring Word of God — which is why they try to censor anything that might make you less ignorant and less gullible. As for “protecting children”: You can protect your own child from information and/or media you think they shouldn’t have, but demanding that the law “protect” everyone else’s children based on your fears (and possibly your desire for control of people who are outside of your control) is a step down the path of fascism. And I know that because I’m not a gullible idiot.
Re:
-Anonymous Coward enters the Techdirt comment section.
-Says “Censorship is good.”
-Refuses to elaborate.
-Leaves.
-Gets labeled as spam.
Re: Re:
The daily routine, innit?
Re:
trump: “I’ve stopped all government censorship and brought back free speech in America”
also trump: “I’m going to use that bill for myself [to censor anything unflattering said about me]”
dick warrell: “I’m okay with this!”
Re:
Censorship is good to MAGAts indeed.
If you mean safely in the clutches of those who would abuse them, you are correct.
Dear Mr. Chump: welcome to the First Amendment, where prancing assholes get bitch-slapped till they run away crying. If you think things are going to get better for you by being a bigger Nazi, then you’re no doubt going to be very surprised. Ultimately, the only thing you’re going to accomplish is getting the GOP declared a terrorist organization. So, please, just keep doing what you’ve been doing…
'This will lead to the removal of entirely legal speech!' 'Well yeah, that's the point.'
Any law that doesn’t include a penalty for abuse, after that abuse has been pointed out as possible under a plain reading of the law, is one where the abuse is considered a feature, not a bug, by its authors.
That’s not a mistake, it’s the way the game has been played for years now. They want ridiculous, unconstitutional laws that will get struck down. Then they get to have the bullet point in their fund-raising emails, and rail against the courts for striking it down, all without the actual messiness of having to try and write a good law that, (everyone knows,) will only partially work.
Trump came along and took all this fictional rhetoric seriously, and the Democratic Congress-critters are too focused on ‘get back to normal’ to see that they have a hand on this clown-car wheel as it races into the nearest solid wall.
To criminalize the publication of such images online is terrible, terrible thing.
Dude isn’t even coherent.
Remember when internet communities used to be good at identifying horrible internet censorship laws packaged in this kind of ‘not voting for it is basically saying you want to kill puppies’ way? And then organized and acted on it to put pressure on congress?
What a brief, hopeful time. It’s unfortunate that large corporations and governments realized so quickly how to co-opt those social channels to completely overwrite actual conversation, debate and organization with siloed communities, reactionary content, rampant misinformation and rewarding users financially who communicate as a walking billboard to dumb down any conversation that does happen.
This is the type of shit that in 2008-2012 would have people flooding congress’ phone lines and emails for months on end with mainstream media reluctantly picking it up and weakly trying to spin it against those weird young internet activists, only to walk it back after they realized they didn’t control the conversation. Man, we didn’t know how good we had it.
Now we’re busy sharing content rather than taking any action, or you’re so far down whatever hole you’re in, you’re just circle jerking memes and calling it ‘organizing’. Or you’re one of the ones who fell down the conservative meme hole and let a walking meme into the president’s office, jubilant as he tears apart the government and years of peace among allied nations, soon to deliver some of the final nails into the internet with draconian censorship and by making sure there are no safeguards against the torrent of AI-generated content and bots we’re bracing against.
It’s been fun.
I think it should also be mentioned how misused NCMEC’s reporting system, the CyberTipline, has become. This new act will expand the net. Things like CSAM and NCII are horrid, but invading personal private data and calling innocent family photos CSAM, or innocent private intimate photos between couples NCII, is worse. This will pump up the volume of false reports. The consequences, if modeled on the actions taken against suspected CSAM, could result in the destruction of people’s digital lives. Much like what happened in the NY Times report on Google and two fathers, among the tons of other news reports on the issue. We need data protection and data privacy. No one should be harmed. I was harmed by Google’s algorithms falsely disabling my account, and there was no human to intervene or court option to undo Google’s mistake. That was almost three years ago and I am still suffering from the trauma. These zero-tolerance, tough-on-crime approaches have not historically been helpful to the abused. They often cause more abuse in the process. It’s like using a sledgehammer vs a scalpel.
Denial of service
Given the lack of safeguards like having a counter-notice procedure, one way to protest this law would be to file a bunch of complaints against non-NCII content to the point where ISPs have to either take down everything or exercise actual discretion in what they take down.