In Upholding The TikTok Ban, SCOTUS Compromises, And With It The First Amendment
from the what-it-means-going-forward dept
As we wrote in our amicus brief (which it appears the justices did not read – guess they didn’t have time…), if the TikTok ban is blessed, it provides a roadmap for how to get around the Constitution’s command to “make no law” abridging free expression. All the government needs to do is declare that it is acting for national security purposes, or perhaps to address some other similar exigency, and, to seal the deal, set such an accelerated timeline for enforcement that it becomes impossible for the courts to appropriately review what the government is doing. (In fact, either claiming a provocative reason or rushing enforcement might be enough on its own to help the government get away with an unconstitutional attack on speech.)
Here, for example, is how the Court sidestepped the question of whether the law was even aimed at content:

We need not determine the proper standard for mixed-justification cases or decide whether the Government’s foreign adversary control justification is content neutral. Even assuming that rationale turns on content, petitioners’ argument fails under the counterfactual analysis they propose: The record before us adequately supports the conclusion that Congress would have passed the challenged provisions based on the data collection justification alone.
The decision finding that the law effectively banning TikTok is somehow constitutional is a bad one, with all sorts of bad consequences, not the least of which is that it tells the world that we’re not really all that serious about protecting speech when the chips are down, and so maybe other governments need not care about it so much either. The consequence this post is focused on, however, is the degree to which the First Amendment’s protection of speech has been undermined altogether here in America. In short: it’s been undermined, although possibly not as badly as it could have been.
But that there might be a glimmer of modest hope does not exonerate this otherwise inexcusable decision. This case should not have been hard: speech interests were affected by this law, whose terms did not even match the most reasonable justification offered for it. (As TikTok pointed out, if data protection was the motivating concern, why were no other platforms targeted? Or even just other Chinese-owned platforms, like Temu?) Because speech interests were affected – those of the platform, as well as those of its users – strict scrutiny should have been applied to the law, at which point the Court should have seen that the lack of narrow tailoring (the law took out a whole platform!) put the law beyond anything the Constitution would permit.
Yet the Supreme Court still somehow found otherwise.
The question now is whether the decision is indeed as narrow as the Court claims it is: a true exception that leaves other, stronger First Amendment precedent untouched. And there do seem to be a few bright spots. For instance, the Court appears to accept a few important notions, namely that platforms do have First Amendment rights, and that algorithms implicate that protected editorial discretion. It is also good, perversely, that in finding that only intermediate scrutiny applied, the decision left the stronger strict scrutiny standard untouched. One concern with the DC Circuit’s decision was that if the TikTok law could survive strict scrutiny, then just about any unconstitutional action could. We would no longer have any robustly meaningful test to protect us against incursions on speech rights, or any rights at all. So, at least, in the wake of this decision, strict scrutiny remains intact and useful.
On the other hand, what’s the point of it remaining a useful test if the Court can so easily find a basis not to use it? The fundamental problem with this decision is that it takes a law with huge impacts on speech interests and declares it to be a law that is not speech-related. Technically the ruling hinges on the law being “content neutral,” but the upshot is that the Court basically says, “La la la we can’t hear you,” to any speech concerns raised by TikTok or its users.
The challenged provisions are facially content neutral. They impose TikTok-specific prohibitions due to a foreign adversary’s control over the platform and make divestiture a prerequisite for the platform’s continued operation in the United States. They do not target particular speech based upon its content, contrast, e.g., Carey v. Brown, 447 U. S. 455, 465 (1980) (statute prohibiting all residential picketing except “peaceful labor picketing”), or regulate speech based on its function or purpose, contrast, e.g., Holder v. Humanitarian Law Project, 561 U. S. 1, 7, 27 (2010) (law prohibiting providing material support to terrorists). Nor do they impose a “restriction, penalty, or burden” by reason of content on TikTok—a conclusion confirmed by the fact that petitioners “cannot avoid or mitigate” the effects of the Act by altering their speech. Turner I, 512 U. S., at 644. As to petitioners, the Act thus does not facially regulate “particular speech because of the topic discussed or the idea or message expressed.” Reed, 576 U. S., at 163.
[From page 10]
Instead, by ignoring those speech interests, and the heightened scrutiny that should have applied as a result, the Court applied what was essentially little more than rational basis review, even though it called it intermediate scrutiny. In short, according to the Court, because the government had good reason to be concerned with how TikTok slurped up user data and shared it, the government was free to do whatever it wanted in response, no matter how unduly destructive to speech interests (and ineffective in support of its own intended ends) its actions were.
The problem here is that not only was this decision an avoidance of the normal constitutional rule that should have better protected the affected speech interests, but there’s little to keep this particular sort of cop-out limited to this particular case. It will be very easy for other government actions that impact speech to be forgiven in the future, just as this one was, because nothing actually confines this reasoning to this case. The same flimsy reasoning could easily be applied elsewhere, despite the Court’s insistence to the contrary. We’ve seen it happen before*, when the Court tries to take a baby step to walk back the First Amendment but ends up with a decision that gets stuck on the books as a giant leap backwards, leaving everyone much less protected than they were before.
* Holder v. Humanitarian Law Project, another case dealing with foreign pressure on First Amendment rights, comes to mind. There was language in that decision explaining how its reasoning curtailing those rights was allowable in that case’s context, and just that context. (“We conclude that the material-support statute is constitutional as applied to the particular activities plaintiffs have told us they wish to pursue. We do not, however, address the resolution of more difficult cases that may arise under the statute in the future.”) Yet that decision nevertheless reverberates in other contexts, including this case, as the Court rested part of its analysis regarding the TikTok ban on that earlier exception it had somehow found itself constitutionally able to make.
The TikTok decision is a bad decision, and the per curiam nature of the opinion hints that even the Court knows it. It reads like a compromise – an attempt to sacrifice TikTok without sacrificing everything – in a situation where, on an extremely tight timeline, the Court needed at least five votes to do something, and there wasn’t enough agreement about what that something should be. At oral argument, and later during the Free Speech Coalition v. Paxton argument earlier this week, it became clear that several justices were uncomfortable issuing a stay or an injunction to buy more time to adjudicate this case, and the important issues it implicates, more carefully. And it seems there weren’t five votes to say the law was unconstitutional – probably, as oral argument also revealed, because some justices were extremely spooked by the national security implications of the data collection practices.
So if TikTok was going to lose – and it would have effectively lost even if the Court did nothing, given that the deadline for divestment was rapidly approaching – the compromise may have been to make it lose in the way least damaging to protective First Amendment precedent. As it was, both Justices Gorsuch and Sotomayor could, correctly, see that the law implicated speech interests, and that ability to recognize them will be important in the future when we need the Court to see them again. But as their concurring opinions made clear, they still would have found the law constitutional, despite its utter lack of the narrow tailoring that strict scrutiny requires. They would have left us with a decision no better than the one the DC Circuit had issued, where strict scrutiny would become all but useless to protect speech interests.
Under the circumstances, then, this decision may have been the least damaging one the Court could come up with, at least in the available time. But the hope that it wasn’t damaging at all seems naïve. The best we can hope for is that this decision somehow turns out to be the government’s one free bite at the apple, because if it happens again, with the government adopting this roadmap to act unconstitutionally against speech interests, even this Court might start to notice the constitutional problem with such laws and finally decide to do something about them.
Filed Under: 1st amendment, content neutral, free speech, intermediate scrutiny, strict scrutiny, supreme court, tiktok ban
Companies: bytedance, tiktok


Comments on “In Upholding The TikTok Ban, SCOTUS Compromises, And With It The First Amendment”
Nah. The justices have undermined the constitution and I would call all of them traitors. I have zero faith this will not be used to further remove rights.
This comment has been flagged by the community.
Re:
Cool off with the doomposting.
This comment has been flagged by the community.
Re:
please shut up
Concerning, but not surprising. TikTok had the entire damn US government majorly spooked for… being Chinese, on top of shaky evidence of it being a psy-op?
I guess it checks out with the paranoia aspect. Here’s to hoping it won’t affect something like the Paxton case negatively.
Re:
Xitter is a psyop. Why wouldn’t TikTok be?
Re: Re:
When it’s purchased by a US billionaire, it will be.
Re: Re:
Not saying either of them aren’t, it’s just that the evidence isn’t exactly empirical.
Re: Re:
The platform being run by a psychologically unstable narcissist doesn’t really make it a psyop.
Re:
“Paxton case negatively.” i don’t think they’re connected so i think it didn’t really do much for that
Re: Re:
They aren’t.
Re: Re: Re:
then yea i don’t think it would affect that case
Re: Being Chinese
TikTok is Singaporean-traded. With a Singaporean CEO. I’m not denying that the Chinese government could lean heavily on ByteDance, or that the company is responsive to Chinese sensibilities and self-censors to some extent. But a lot of US companies also have Chinese investors.
Personally, I’m amused by how much the US seems to be looking at China and saying “Yeah, we’ll have some of that!” Over the years I’ve seen reports of companies opening up subsidiaries in China and having Chinese Party members make the foreign owners an offer they can’t refuse, buying the subsidiary for a bargain price with the alternative being prison. Now we see a similar (sell to one of our billionaires or we’ll shut you down) scam being pulled off in the US.
“Progress”…
Hypocrisy / ISPs
This ban is beyond hypocritical when viewed in the context of Section 702.
Also, while this ban specifically only goes after hosting providers and app stores, I wouldn’t be surprised if, especially if TikTok makes a mobile website, we see calls from legislators to institute site-blocking at the ISPs – and once that happens, it’s a bell that cannot be unrung. I would not be surprised if the MPAA and RIAA love this ban for this exact reason. The MPAA, in particular, is a very public supporter of site-blocking: https://www.techdirt.com/2024/04/17/the-motion-picture-association-doesnt-get-to-decide-who-the-first-amendment-protects/
Remember the Sdarot case, when site-blocking was extremely close to becoming a reality in the US but the orders were rescinded at the last minute? And how the rightsholders’ block page had a part that was directly screenshotted from the MPAA/ACE domain seizure page? https://www.techdirt.com/2022/05/04/who-needs-sopa-judge-orders-every-us-isp-to-block-entire-websites-accused-of-enabling-piracy/
And remember the Newzbin2 case, when site-blocking which was already in use in the UK for other reasons was expanded to become a means of copyright enforcement in a case brought by none other than the MPAA? https://www.techdirt.com/2011/07/28/uk-court-orders-bt-to-block-access-to-usenet-site-hollywood-hates/
In short: the MPAA is pure evil.
Re:
That’s a big if, as Trump is working to reverse it.
The Court’s argument that TikTok, as a generic platform, doesn’t try to silence any particular speech is very weak.
Even if the service is not known to be used exclusively by specific communities, that doesn’t mean there aren’t people who use only TikTok to communicate in a specific way, and that would be lost in a ban.
It’s like banning an entire music streaming service, and all the bands selling their songs on it, just because it’s Chinese/foreign.
Other platforms do fall under Sec 2(g)(3)(b). They’re just not directly called out, but the coverage is there. Although (legally, not morally) underinclusiveness isn’t inherently fatal, either.
Just having speech interests affected usually isn’t enough to trigger strict scrutiny. It usually has to be content based. This is normal, although strict scrutiny might be better.
Only if they can find a non-speech important reason. And that is still going to be a pretty big limiter (albeit an imperfect one, and less than the standard for compelling/narrowly tailored/least restrictive, which it should be). It’s not that trivial, unless the Court wants to pretend to be blind.
Taking out an entire platform doesn’t make something not narrowly tailored. To be narrowly tailored, the law just needs to be written to fulfill only its stated goals, and not be overbroad. That can be satisfied while taking out a platform.
I mean, that’s just the strict (and intermediate) scrutiny roadmap. And it has to do a bit more than declare it, it has to be plausible enough for deference (which is not guaranteed).
The fact that a bunch of TikTok users have jumped to Red Note quite prominently makes the case that the users of the app weren’t concerned about their data privacy. After all, a bunch of prominent large American platforms have been hacked or otherwise use and abuse user data.
This seems like the same scenario as conservative Christians claiming that others are grooming children. They’re jealous of the perceived competition for their own pool of victims.
Who cares about a “communist” government across the ocean when you have oligarchs at home more likely to abuse you?
Re:
I think it’s less about that and more about sending a “fuck you” to the U.S. government for banning TikTok (and on a weak premise, at that). Maybe I’m wrong, but it doesn’t really feel like a “who gives a fuck about data privacy” move to me.
Re: Re: Big Brother is watching either way
The ones I’ve corresponded with have basically said that there’s no meaningful difference between corporations watching their every move and the Chinese government watching their every move. They’re not Chinese, after all.
Re: Re: Re:
Women in many US states would probably be better off using a CCP app for period tracking than anything domestic.
Re: Re: Re:2
The question, then, is whether the CCP would sell out American women to state governments run by a bunch of anti-abortion fuckheads who believe the law should force 13-year-old girls raped by their fathers to birth the child conceived by that rape.
Re: Re: Re:2
I suggest that use of any app will compromise the data.
Best use paper and pencil if anything at all.
Re: Re:
I agree. If one’s data is potentially at risk from Chinese apps and definitely at risk from American apps, then flocking to Chinese apps seems more like a big “Fuck you for trying to control where I spend my time online” to the US Government.
Re: Re:
I don’t think those positions are mutually exclusive. I’d even say that they’re just two coinciding reasons among a variety that are driving the move. I’ve seen videos of people specifically saying they aren’t worried about their data privacy and gleefully fantasizing about the dismay they’re causing to people who thought they would care and react out of fear to return to Facebook or Twitter, etc.
They said what they meant
Under the circumstances, then, this decision may have been the least damaging one the Court could come up with, at least in the available time.
Complete and utter bollocks. If they were concerned about not having enough time to have a more comprehensive trial and careful ruling they could have just ordered that the law be stayed until they could do so, not rushed it at the last hour.
(They also could have just tossed it out as the evidence-free, bigotry and corruption-based political stunt that it is if they wanted to do more than just punt it down the line for a bit…)
They chose to rule how they did and when they did, and as such deserve no ‘Well they did the best they could in the narrow time frame they had’-benefit of the doubt.
Re:
But issuing a stay of any length meant the Biden administration would never get to enforce the law – which matters a lot when Trump has said he won’t enforce it. For like 99% of laws in that situation you’d probably be really mad if SCOTUS issued a stay. You just don’t like this law.
That said, if they didn’t have time to properly consider the case, then they probably should have just let the lower court ruling stand rather than rush things.
Re: Re: non-enforcement is meaningless
Trump can say he won’t enforce it, but as long as the law is on the books carriers would need to obey the law and not carry TikTok. They can’t just ignore it on Trump’s word that there won’t be consequences.
Re: Re: Re:
Which is why there are no marijuana businesses in states which have legalized it – they know the federal government could start enforcing federal law at any time.
Oh, wait, there are a bunch of them? Well then.
Re: Re: Re: 'And I'll continue to not enforce it so long as you continue to do what I want...'
Not just the carriers, so long as the law is on the books convicted felon Trump can say all he wants that he won’t enforce it today but that says nothing about what he might do tomorrow, making it a sword hanging over TikTok’s head to be used any time he wants to wield it against them for any perceived slight.
Re:
You are absolutely right, here, except…
It was evidence-free because the Court decided not to review the national security evidence, and even bragged that it wasn’t going to do that. Whatever “solid” evidence they had available, they declined to consider.
Re: Re:
Well that’s one way to tell the world that your ruling has nothing to do with evidence and everything to do with politics…
At least you’re mostly positive.
In theory, if TikTok shuts down, would the Anderson case regarding Section 230 from the Third Circuit still be heading for SCOTUS? And is that a good or bad thing, considering the tomfuckery the ruling would cause if it remains standing?
This comment has been flagged by the community.
The evil CCP could just sell TikTok to a US-approved buyer. That they don’t speaks volumes. SCOTUS must be respected and revered for defending our national interests against evil, slant-eyed, rice-eating commies.
Eh, the Court, given its current behaviors, could just decide to revisit the case on its own and change the law out of the blue, just like with Roe v Wade. Right? lol
I don’t understand why TikTok (or any of these “platforms”) is arguing on the basis of free speech. They’ve been insisting for years that they are not the speaker, mostly to avoid being SLAPPed around by anyone who doesn’t like the opinions being posted. CDA section 230 and all? If they could make themselves look like a common carrier à la FedUPS or HellsBell System or whatever, they would.
Now suddenly they’re the speaker? Can’t have it both ways.
Even more rich is that this supposed lecture about free speech is coming from commie China, one of the worst enemies of free speech on the planet. Great Firewall of China aka “Golden Shield Project” anyone?
I’m no fan of the US, but Beijing has no right to lecture anyone.
Re:
Hosting a platform is free speech, the same way publishing a newspaper full of the words of other people is freedom of the press.
Re: Re:
Publishing a newspaper full of the words of other people means having to take legal liability for those words.
The “platforms” therefore loudly insist they’re not this, much like fleaBay insists they’re “just a venue” when their vendors sell you rubbish.
Re: Re: Re:
You’ll note that I referenced freedom of speech and freedom of the press. I didn’t conflate the two. Hosting a platform is an exercise of free speech. 230 means they have no liability for the speech of others that appear on their sites. But that doesn’t mean they aren’t expressing themselves by hosting the platform itself or moderating what appears on it.
Re: Re: Re:
Yes, and there’s a reason for that: Newspaper editors know all the content that is going to be in a given issue because they see the content before it’s published and give approval to publish that content. Interactive web services don’t do that because the people who own and operate such services don’t generally see content before it goes live. 230 exists to place liability for speech where it belongs—the speaker—and short circuit lawsuits that try to place liability on the tool used to post that speech.
YouTube has an ungodly amount of hours of content posted to its servers every day. If YouTube moderators had to pre-screen all of the content posted to the site in a single day for the sake of avoiding legal liability for defamation, copyright infringement, et cetera, it would probably take weeks to process that single day’s worth of content. I doubt you want that fate for YouTube—but under a standard that would force YouTube into the same legal jeopardy as a newspaper that prints whatever the hell it wants, that fate would become reality.
Re: Re: Re:2
And yet when a website does this by pre-approving comments, they get Section 230 protection on content they had every opportunity to know was potentially defamatory (for example) before it was posted. This is wrong and should be changed, but changing it does not require any alteration of the text of 230, only some of the case law on it.
Re: Re: Re:3
Got any evidence to back up that delusion?
Re: Re: Re:3
And yet when a website does this by pre-approving comments
[Citation Needed]: which specific platform(s) ‘pre-approve’ comments, and where in their TOS does it say they do so?
Re: Re: Re:4
Blogs frequently pre-approve comments, but such is the nature of those websites that they have no TOS for commenters, which doesn’t matter as Section 230 applies regardless of TOS. Here’s one example of a blog that pre-approves comments.
Re: Re: Re:5
You seem to be using “pre-approve” to mean “allow comments to be posted without first holding them for moderation and review.” That’s not “pre-approval.” That’s lack of pre-moderation. Approval means something different.
Re: Re: Re:
“The “platforms” therefore loudly insist they’re not this”
And that is why truth social is allowed to contain so many lies.
Cuts both ways
Re: Re: Re:
“Publishing a newspaper full of the words of other people means having to take legal liability for those words.”
And this is why the likes of infowars, breitbart, twitter, truth social, etc. are website-based, so they can lie out their ass with impunity.
Re: Re: Re:2
Those sites are still liable for first-party speech, which 230 doesn’t cover. Alex Jones/InfoWars knows this all too well.
Re:
I would argue the other way around: it’s sad that a Chinese-owned company argues in favour of free speech, and a US court doesn’t care.
this isn't about free speech!
the 1A is a cute touch! but the real reason for the ban is who owns it and who gets the data! so…. if we are going to play that card! then google, fakebook and the many other data brokers should be up next! because they all buy, sell and trade your data to anyone with a nickel! including china…..
SO! let’s get cracking on this “NATIONAL SECURITY” thing! and shut down the data brokers!
Re: 'Tainted meat is a health threat and needs to be stopped! ... In this ONE store.'
That’s really been the giveaway that the whole TikTok thing has been and is nothing more than a dishonest, bigotry- and corruption-fueled PR stunt from the outset: how, despite politicians claiming that it’s of national-security-level importance to protect user privacy and rein in rampant data collection, only one company/platform has been targeted, with all the others not even mentioned.
A Chinese lab has an accident where a pathogen escapes, killing more than a million Americans, and the U.S. government is fine with that.
A platform that includes political speech that is resistance to a hostile U.S. business environment because of foreign ownership – big problem.
This reads like a cheap novel. The U.S. wants all political speech under control by U.S. businesses so that the government can coerce censorship by creating a hostile business environment for those not censoring as the government wants.
Re:
Remains science fiction.