Your Job Was Stopping CSAM? Trump Says No Visa For You!
from the censoring-the-non-censors-for-anticensorship dept
You want to see actual government censorship in action? And have it done by people claiming they’re doing it to stop censorship? Check out last week’s revelation (originally reported by Reuters) that the US State Department will now start denying H-1B visas for anyone who has anything to do with trust & safety, fact checking, content moderation, or mis- or disinformation research. The government is now punishing people for speech—specifically, punishing them for the false belief that their work constitutes censorship.
The cable, sent to all U.S. missions on December 2, orders U.S. consular officers to review resumes or LinkedIn profiles of H-1B applicants – and family members who would be traveling with them – to see if they have worked in areas that include activities such as misinformation, disinformation, content moderation, fact-checking, compliance and online safety, among others.
“If you uncover evidence an applicant was responsible for, or complicit in, censorship or attempted censorship of protected expression in the United States, you should pursue a finding that the applicant is ineligible,” under a specific article of the Immigration and Nationality Act, the cable said.
It’s like JD Vance’s “the rules were you weren’t going to fact check me” taken to a new level.
This policy censors non-censors for not doing the thing that the White House and MAGA folks are actively doing every day. MAGA knows content moderation is necessary—they’re super eager to have it applied when it’s speech they don’t like. As we’ve recently discussed, they’ve suddenly been demanding social media companies stop foreign influence campaigns and remove anything mean about Charlie Kirk. At the same time, the White House itself is engaged in a twisted version of what it claims is fact checking and demanding that media orgs hire MAGA-friendly censors.
The hypocrisy is the point. But it’s also blatantly unconstitutional. As Carrie DeCell, senior staff attorney at the Knight First Amendment Institute at Columbia University, said in response to this news:
People who study misinformation and work on content-moderation teams aren’t engaged in ‘censorship’ — they’re engaged in activities that the First Amendment was designed to protect. This policy is incoherent and unconstitutional.
Incoherent and unconstitutional is being too kind.
The real work that trust & safety professionals do makes this policy even more perverse. As trust & safety expert (and occasional Ctrl-Alt-Speech guest host) Alice Hunsberger told (the recently defunded) NPR:
“Trust and safety is a broad practice which includes critical and life-saving work to protect children and stop CSAM [child sexual abuse material], as well as preventing fraud, scams, and sextortion. T&S workers are focused on making the internet a safer and better place, not censoring just for the sake of it,” she said. “Bad actors that target Americans come from all over the world and it’s so important to have people who understand different languages and cultures on trust and safety teams — having global workers at tech companies in [trust and safety] absolutely keeps Americans safer.”
So the administration is now barring entry to people whose work includes stopping child sexual abuse material and protecting Americans from foreign bad actors—all while claiming to oppose censorship and demanding platforms remove content about Charlie Kirk. The only way this makes sense is if you understand what the actual principle at work is: we get to control all speech, and anyone who might interfere with that control must be punished.
There are no fundamental values at work here beyond “we have power, and we’re going to abuse it to silence anyone who stands in our way.”
Filed Under: 1st amendment, censorship, content moderation, disinformation, donald trump, fact checking, free speech, h1-b visas, marco rubio, misinformation, trust and safety


Comments on “Your Job Was Stopping CSAM? Trump Says No Visa For You!”
No bloody wonder. Trump is a pedophile.
Fact checking Trump style
Please check your facts at the door. You will get them back when you are leaving the country.
Amazing
In politics, proof isn’t enough; lies work better. You can supply all the data you want while someone pays to write the letters or post all over the net.
Lies are easy when you have dirty money, ignorant supporters, no fact checks, and no monitoring.
Fact checkers?
So journalists from foreign countries are now ineligible to enter the USA?
You’re not a country anymore, you’re a criminal syndicate
Notwithstanding all the other censorship this administration does, now they are doing something else related to T&S. They should all be deported under their own rules.
Well, “content moderation” is absolutely “censorship.” But it’s not government censorship, which is what is unconstitutional (aka illegal). What Trump’s abomination is doing is what’s illegal, but then, it is a day of the week ending in ‘y.’ Another day at the White House just like every day before it since Trump moved in and every day after it until Trump leaves.
Re: Damn, haven't had to use this one in a while.
Moderation is a platform/service owner or operator saying “we don’t do that here”. Personal discretion is an individual telling themselves “I won’t do that here”. Editorial discretion is an editor saying “we won’t print that here”, either to themselves or to a writer. Censorship is someone saying “you won’t do that anywhere” alongside threats or actions meant to suppress speech.
Re: Re:
You’re using “is” to mean “my personal definition” again.
Re: Re: Re: You’re getting an extra-large helping of snark today.
So what? I’m not changing that copypasta because you alone have an issue with the phrasing. Any change to the wording will happen because someone makes an argument for an improved nuance with which I agree. But if you really want me to “improve” it by making it needlessly wordier so that it can be better registered as an opinion just to make you happy, fine. For one time and one time only, this is for you:
Does that cover enough bases for you, or is there anything I need to add so I all but disclaim basically everything I believe while also making the actual point of the copypasta more about making people understand what I’m saying is my opinion and less about the actual belief I’m expressing between the lengthy-yet-necessary(-for-you) disclaimers?
Re: Re: Re:2
I mean, tbh I’d prefer it be fixed in the main article(s). It’s pretty clearly a consistent problem, so much so that you have a copypasta for it, with an easy fix (a link and/or quick callout, like we would normally do). If it was just me, I wouldn’t bother bringing it up, but it’s clearly not. Although I should’ve taken it out more on the main article.
But what makes you think the copypasta is going to be any clearer to them than the article? You’re essentially just restating the article’s wording, but with more detail. You say it’s needless, but you already know they’re not going to clock it as a normative claim, based on how they responded to the main article with the exact same verbiage.
As far as wordiness, you don’t need to write an entire paragraph every time. A very minor wording change would cover it: typically just the “is/ought” distinction. (Or heck, even just a link as part of the copypasta.) And you’re perfectly capable of doing this normally; you’re usually very clear about when you’re asserting descriptive vs. normative. So is Mike. You even did it yourself when you wrote the original article on it (without making it notably longer). You don’t even have to write anything new, because you already have something to refer people to. You just… don’t?
Or honestly, don’t bother, I’ll just link it myself, no work on your end required. It just seems silly and completely avoidable. If people grokked it, that’d be a different story.
Re: Re: Re:3
[None of the following text should be taken as any kind of objective meaning of any word in any dictionary or thesaurus, professional or urban, and nothing I am about to say has the force of law to back up what is ultimately my subjective and personal opinion.]
In my personal opinion which has no strength of fact or law, I personally and subjectively believe that if I wanted to change my copypasta in any way, then I would, subjectively speaking from a non-objective viewpoint that has no actual social weight and can be dismissed without consideration, change that copypasta on my own terms instead of, subjectively speaking from a non-objective viewpoint that has no actual social weight and can be dismissed without consideration, being bullied into changing it for the sake of, subjectively speaking from a non-objective viewpoint that has no actual social weight and can be dismissed without consideration, pleasing literally the only person who’s ever tried to jump down my throat about it this much.
[None of the preceding text should be taken as any kind of objective meaning of any word in any dictionary or thesaurus, professional or urban, and nothing I have said has the force of law to back up what is ultimately my subjective and personal opinion.]
Re:
The thing is that language evolves with usage and the term censorship has become more commonly used to describe specifically unlawful government censorship, in the same way that saying “you have attitude” means “you have a bad or rude attitude.” When people talk about censorship, they’re often referring to violations of “free speech” rights which are only guaranteed by the 1st Amendment and that only restricts government censorship. There’s a whole swath of trolls and idiots who think they have a constitutional right not to be “censored” by non-governmental actors.
There’s the other aspect that content moderation by a platform owner/representative is just more speech rather than censorship, because it’s the platform owner choosing not to host speech they don’t want to host, and the alternative is forcing them to functionally repeat speech against their will. Refusing to repeat the words of others isn’t censorship, but your own freedom of speech. And the ability of a troll to say something somewhere else means they’re not censored except in a very small, specific context that is functionally irrelevant to their ability to speak freely — so “technically” censorship by some definitions, but not meaningful enough to bother calling it that.
Re: Re:
It’s more commonly used that way, but it’s not quite universal, either. Really it’s best not to assume. You still have people like EFF or Cory Doctorow who use it that way, or are worried about it.
While it’s not as hot a topic, it’s not just trolls. E.g., sex work often gets heavily censored on most major platforms (and not just one). Things like calls to violence, as well. Basically that narrow band of things that aren’t considered advertiser-friendly but are technically legal. These tend not to be controversial for the most part, so there isn’t a big fuss, but they are content moderation decisions.
While the “muh freezepeach” thing is often brought up by trolls trying to work the refs, since the major platforms mostly haven’t abused it in a concerted way, there are some legitimate concerns around speech and platforms. Especially these days as those major platforms start to get more consolidated.
Re: Re: Re:
Note that the EFF specifically modified the term as “platform censorship.” That’s a good method for differentiating to prevent confusion. I don’t disagree with Cory’s assertions on the topic, but that’s also not mutually exclusive with what I said. He’s using the technical definition in a particular context in which it is appropriate.
But sex work is in some jurisdictions illegal and platforms can get in trouble legally for hosting it (Backpage, hello!).
Calls to violence are generally illegal and not protected speech. Platforms have a good reason to “censor” that.
I would dispute the idea that these are technically legal. It would depend on the specific examples and the nuance may require a court case to determine. In general, calls to violence and soliciting sex work can be considered illegal.
Re: Re: Re:2
I guess I wasn’t clear on the wording, but I was thinking more of the legal type. Like porn, which is unambiguously 1A-protected in the U.S. Particularly the darker stuff like snuff videos, after United States v. Stevens.
In the U.S., generally they require things like imminence (under Brandenburg). Stuff like general pro-terrorism content would not be illegal.
And there are other examples, like gore, or covid misinformation. In the past I would normally say stuff like Nazism, but that seems to have loosened in recent years. There’s a lot of not-technically-illegal content that’s contained to shithole sites like Stormfront/Kiwi Farms or the various chans (and in those examples, they’ve had difficulty even getting hosting, services like Cloudflare won’t touch them anymore)
Re: Re: Re:3
I’m okay with private operators deciding whether porn or calls for violence are appropriate for their platforms. I don’t consider that censorship unless the policies aren’t applied evenhandedly. That’s a free speech right for the owners of the platform. There are platforms where porn is welcome. There are platforms where calls for violence are accepted.
If Mike blocks a porn post here in the comments, more power to him. It’s not a porn site where people are looking for or expecting that. That’s on the person making the post for thinking it’s the appropriate place to post it. Usually platforms will have a stated policy about what content is allowed.
Re: Re: Re:4
I don’t disagree, just trying to get at the evenhandedness part, and that there’s more to it than just being blocked on more than one site. That said:
That would mean someone exercising their policies unevenly would be exercising their free speech rights while also censoring, right? A part of free speech is that someone can lie, be deceptive, or biased (with a few narrow exceptions for fraud/defamation, etc).
And you can kind of see this in policy. While most sites have a TOS, there isn’t really any mechanism to see if it’s actually being applied evenhandedly. Moreover, most TOSes have a clause that allows them to remove things at their sole discretion (which, to be fair, is important to stop rules-lawyering trolls). This is true in person as well: a host doesn’t have to be evenhanded about who they remove (barring a few protected classes).
Why oh why would a cult headed by Trump be against CSAM monitoring...
The regime upon it being pointed out that the people they’re keeping out are the ones that find and take down CSAM:
In public: That’s nothing but a liberal lie from the woke mob. The GOP cares deeply about the well-being of children, and we would never attempt to make platforms less able to spot and take down CSAM in a timely manner. If they fail to do so, that’s because they are lazy and corrupt and are refusing to.
In private: Working as intended, hopefully this’ll mean our guys will stop being caught so often now…
Fairly brave to speak out, given how spineless Columbia has acted
We have enough native Karens (you) trying to censor the internet already, k thx.
Stop using “CSAM” as an excuse. Kiddie porn is best found by AI anyway.
Re:
Do you have, like, an actual point, or are you just talking?
Re:
I like how you try to bury being utterly full of shit in something that could be partially true if you squint at it from the right angle, and totally ignore your obvious intent through overmuch charitability.
Re:
hæh
The government is now punishing people for speech
blocking