What State Action Doctrine? Biden Administration Renews Push For Deal With TikTok, Where US Government Would Oversee Content Moderation On TikTok

from the that's-not-how-any-of-this-works dept

So, for all of the nonsense about what level of coercive power governments have over social media companies, it’s bizarre how little attention has been paid to the fact that TikTok is apparently proposing to give the US government control over its content moderation setup, and the US government is looking at it seriously.

As you likely know, there’s been an ongoing moral panic about TikTok in particular. The exceptionally popular social media app (that became popular long after we were assured that Facebook had such a monopoly on social media no new social media app could possibly gain traction) happens to be owned by a Chinese company, ByteDance, which has resulted in a series of concerns about the privacy risks of using the app. Some of those concerns are absolutely legitimate. But many of them are nonsense.

And, for basically all of the legitimate concerns, the proper response would be to pass a comprehensive federal data privacy law. But no one seems to have the appetite for that. You get more headlines, and silly people on social media cheering you on, by claiming you want to ban TikTok (this is a bipartisan moral panic).

Instead of recognizing all of this and doing the right thing after Trump’s failed attempt at banning TikTok, the Biden administration has… simply kept on trying to ban TikTok or force ByteDance to divest. That’s a repeat of another bad Trump idea, which ended not in divestiture, but in Trump getting his buddy Larry Ellison’s company, Oracle, a hosting deal for TikTok. And, of course, TikTok and Oracle now insist that Oracle is reviewing TikTok’s algorithms and content moderation practices.

But, moral panics are not about facts, but panics. So, the Biden administration did the same damn thing Trump did three years earlier, demanding that TikTok be fully separated from ByteDance or else be banned in the US. Apparently negotiations fell apart in the spring, hopefully because TikTok folks know full well that the government can’t just ban TikTok.

However, the Washington Post says that they’re back to negotiating (now that the Biden administration is mostly convinced a ban would be unconstitutional), and the focus is on a TikTok-proffered plan to… wait for it… outsource content moderation questions to the US government. This plan was first revealed in Forbes by one of the best reporters on this beat, Emily Baker-White (whom TikTok surveilled to try to find out where she got her stories from…). And it’s insane:

The draft agreement, as it was being negotiated at the time, would give government agencies like the DOJ or the DOD the authority to:

  • Examine TikTok’s U.S. facilities, records, equipment and servers with minimal or no notice,
  • Block changes to the app’s U.S. terms of service, moderation policies and privacy policy,
  • Veto the hiring of any executive involved in leading TikTok’s U.S. Data Security org,
  • Order TikTok and ByteDance to pay for and subject themselves to various audits, assessments and other reports on the security of TikTok’s U.S. functions, and,
  • In some circumstances, require ByteDance to temporarily stop TikTok from functioning in the United States.

The draft agreement would make TikTok’s U.S. operations subject to extensive supervision by an array of independent investigative bodies, including a third-party monitor, a third-party auditor, a cybersecurity auditor and a source code inspector. It would also force TikTok U.S. to exclude ByteDance leaders from certain security-related decision making, and instead rely on an executive security committee that would operate in secrecy from ByteDance. Members of this committee would be responsible first for protecting the national security of the United States, as defined by the Executive Branch, and only then for making the company money.

For all the (mostly misleading) talk of the US government having too much say in content moderation decisions, this move would literally put US government officials effectively in control of content moderation decisions for TikTok. Apparently the thinking is “welp, it’s better than the Chinese government.” But… that doesn’t mean it’s good. Or constitutional.

“If this agreement would give the U.S. government the power to dictate what content TikTok can or cannot carry, or how it makes those decisions, that would raise serious concerns about the government’s ability to censor or distort what people are saying or watching on TikTok,” Patrick Toomey, deputy director of the ACLU’s National Security Project, told Forbes.

The Washington Post has even more details, which don’t make it sound any better:

A subsidiary called TikTok U.S. Data Security, which would handle all of the app’s critical functions in the United States, including user data, engineering, security and content moderation, would be run by the CFIUS-approved board that would report solely to the federal government, not ByteDance.

CFIUS monitoring agencies, including the departments of Justice, Treasury and Defense, would have the right to access TikTok facilities at any time and overrule its policies or contracting decisions. CFIUS would also set the rules for all new company hires, including that they must be U.S. citizens, must consent to additional background checks and could be denied the job at any time.

All of the company’s internal changes to its source code and content-moderation playbook would be reported to the agencies on a routine basis, the proposal states, and the agencies could demand ByteDance “promptly alter” its source code to “ensure compliance” at any time. Source code sets the rules for a computer’s operation.

Honestly, what this reads as is that the moral panic over China and TikTok has so eaten the brains of US officials that, rather than saying “hey, we should have privacy laws that block this,” they thought instead “hey, it would be cool if we could just do all the things we accuse China of doing, but where we pull the strings.”

Now, yes, it’s true that an individual or private company can voluntarily choose to give up its constitutionally protected rights, but there is no indication that any of this is even remotely close to voluntary. If the 5th Circuit found that the government simply explaining what counts as misinformation about COVID was too coercive an influence on social media companies’ moderation decisions, then how is “take this deal or we’ll ban your app from the entire country” not similarly coercive?

Furthermore, it’s not just the rights of TikTok to consider here, but the millions of users on the platform, who have not agreed to give up their own 1st Amendment rights.

Indeed, I would think there’s a very, very high probability that if this deal were to be put in place, it would backfire spectacularly, because anyone who was moderated on TikTok and didn’t like it would actually have a totally legitimate 1st Amendment complaint that it was driven by the US government, and that TikTok was a state actor (because it totally would be under those conditions).

In other words, if the administration and TikTok actually consummated such a deal, the actual end result would be that TikTok would effectively no longer be able to do much content moderation at all, because it would only be able to take down content that was not 1st Amendment protected.

So, look, if we’re going to talk about US government influence over content moderation choices, why aren’t we talking much more about this?


Comments on “What State Action Doctrine? Biden Administration Renews Push For Deal With TikTok, Where US Government Would Oversee Content Moderation On TikTok”

Anonymous Coward says:


It is seen as a zero-cost punching bag. They are not significant campaign contributors, there is no measurable voting bloc that supports them, and they don’t offer high-paying jobs to former legislators. On the other hand, pointing and screaming ‘TikTok bad’ curries favor with the news media that is jealous of social media’s profits and reach, and distracts the electorate from the real problems that the politicians refuse to take on.


T.L. (profile) says:

If the Biden administration recognizes that an outright ban would be unconstitutional, why would they move onto an idea that is also unconstitutional?

The administration has asked the Supreme Court to rule on the constitutionality of Florida and Texas’s respective social media moderation laws (both of which have been found unconstitutional), in light of the Fifth Circuit Appeals Court’s ruling allowing the Texas law to go into effect (a decision led by Trump-appointed Andy Oldham, who appears to have almost as much misunderstanding of how the constitution works as the man who appointed him does). And the administration is OPPOSING those laws.

So, they really should look at the optics: even negotiating an arrangement like this gives the impression that it’s only OK for the government to decide how a social media platform moderates content if one party does it. That’s a concept Republicans both decry when they think Democrats do it (Republican AGs sued the administration for contacting platforms about moderation decisions regarding COVID disinformation and misinformation) and hypocritically do themselves (as with the Texas and Florida moderation laws, designed to prevent “conservatives” who spout bigotry and lies from being subject to social platforms’ terms of service).

They should also recognize this would ultimately put them in the same legal boat as Abbott, DeSantis and their GOP brethren, because of the same lack of constitutional understanding, and risk sinking the administration’s case against those states’ social media laws.

Anonymous Coward says:

Still with the “concerns” about privacy/security somehow leading to control of moderation. How the hell does this work?

The TikTok thing, while being a bunch of xenophobic security theatre, is also a bunch of bafflegab to get a foot in the door. “Well, TikTok is controlled by the government, so shouldn’t all the things be controlled by the government?”

Kevin P. Neal (profile) says:

Where's the law when you need it?

Can someone point me at the statute or other source of law that gives the federal government the right to even attempt to force these sorts of terms onto a company that has not lost a lawsuit justifying them?

I can’t figure out if TikTok has bad lawyers or if they’re just overly used to operating the Chinese way.

T.L. (profile) says:


The White House’s lawyers should probably have researched whether this could pass legal muster, ‘cause having the government oversee moderation decisions would risk being a First Amendment violation, in the same manner that got Texas and Florida sued (I explained this in more detail in another comment).

NetChoice, which has TikTok as a member and filed the suits over the Texas and Florida moderation laws, could easily sue the Biden administration (even if TikTok is voluntarily offering to allow the government to regulate its content moderation practices), because of the precedent it would set for other social platforms, especially under a GOP president.

That One Guy (profile) says:

From flawed moderation to no moderation

The irony, of course, is that if any of these government attempts to put social media under their control ever actually succeeded, the first amendment (which doesn’t bind private entities but does bind the government) would force them to take a near-completely hands-off approach, quickly turning any platform into an unusable cesspit filled with the worst legal content possible.

Great if you want to pull an Elon and destroy a social media platform and/or show why moderation is necessary but not so great if the goal is to clean up the platform into something ‘better’.

Nemo_bis (profile) says:

Takings Clause

[the company] would be run by the CFIUS-approved board that would report solely to the federal government, not ByteDance

Almost like being in administrative receivership? I see they’re copying ideas from Russia too (where companies from “unfriendly countries” get new administrators appointed by the government if they’re insufficiently aligned, while nominally retaining their owners).
