What State Action Doctrine? Biden Administration Renews Push For Deal With TikTok, Where US Government Would Oversee Content Moderation On TikTok
from the that's-not-how-any-of-this-works dept
So, for all of the nonsense about what level of coercive power governments have over social media companies, it’s bizarre how little attention has been paid to the fact that TikTok is apparently proposing to give the US government control over its content moderation setup, and the US government is looking at it seriously.
As you likely know, there’s been an ongoing moral panic about TikTok in particular. The exceptionally popular social media app (which became popular long after we were assured that Facebook had such a monopoly on social media that no new social media app could possibly gain traction) happens to be owned by a Chinese company, ByteDance, which has resulted in a series of concerns about the privacy risks of using the app. Some of those concerns are absolutely legitimate. But many of them are nonsense.
And, for basically all of the legitimate concerns, the proper response would be to pass a comprehensive federal data privacy law. But no one seems to have the appetite for that. You get more headlines, and silly people on social media cheering you on, by claiming you want to ban TikTok (this is a bipartisan moral panic).
Instead of recognizing all of this and doing the right thing after Trump’s failed attempt at banning TikTok, the Biden administration has… simply kept on trying to ban TikTok or force ByteDance to divest. That’s a repeat of a bad Trump idea, which ended not in divestiture, but in Trump getting his buddy Larry Ellison’s company, Oracle, a hosting deal for TikTok. And, of course, TikTok and Oracle now insist that Oracle is reviewing TikTok’s algorithms and content moderation practices.
But, moral panics are not about facts, but panics. So, the Biden administration did the same damn thing Trump did three years earlier in demanding that TikTok be fully separated from ByteDance, or said the company would get banned in the US. Apparently negotiations fell apart in the spring, hopefully because TikTok folks know full well that the government can’t just ban TikTok.
However, the Washington Post says that they’re back to negotiating (now that the Biden administration is mostly convinced a ban would be unconstitutional), and the focus is on a TikTok-proffered plan to… wait for it… outsource content moderation questions to the US government. This plan was first revealed in Forbes by one of the best reporters on this beat: Emily Baker-White (whom TikTok surveilled to try to find out where she got her stories from…). And it’s insane:
The draft agreement, as it was being negotiated at the time, would give government agencies like the DOJ or the DOD the authority to:
- Examine TikTok’s U.S. facilities, records, equipment and servers with minimal or no notice,
- Veto the hiring of any executive involved in leading TikTok’s U.S. Data Security org,
- Order TikTok and ByteDance to pay for and subject themselves to various audits, assessments and other reports on the security of TikTok’s U.S. functions, and,
- In some circumstances, require ByteDance to temporarily stop TikTok from functioning in the United States.
The draft agreement would make TikTok’s U.S. operations subject to extensive supervision by an array of independent investigative bodies, including a third-party monitor, a third-party auditor, a cybersecurity auditor and a source code inspector. It would also force TikTok U.S. to exclude ByteDance leaders from certain security-related decision making, and instead rely on an executive security committee that would operate in secrecy from ByteDance. Members of this committee would be responsible first for protecting the national security of the United States, as defined by the Executive Branch, and only then for making the company money.
For all the (mostly misleading) talk of the US government having too much say in content moderation decisions, this move would put US government officials effectively in control of content moderation decisions for TikTok. Apparently the thinking is “welp, it’s better than the Chinese government.” But… that doesn’t mean it’s good. Or constitutional.
“If this agreement would give the U.S. government the power to dictate what content TikTok can or cannot carry, or how it makes those decisions, that would raise serious concerns about the government’s ability to censor or distort what people are saying or watching on TikTok,” Patrick Toomey, deputy director of the ACLU’s National Security Project, told Forbes.
The Washington Post has even more details, which don’t make it sound any better:
A subsidiary called TikTok U.S. Data Security, which would handle all of the app’s critical functions in the United States, including user data, engineering, security and content moderation, would be run by the CFIUS-approved board that would report solely to the federal government, not ByteDance.
CFIUS monitoring agencies, including the departments of Justice, Treasury and Defense, would have the right to access TikTok facilities at any time and overrule its policies or contracting decisions. CFIUS would also set the rules for all new company hires, including that they must be U.S. citizens, must consent to additional background checks and could be denied the job at any time.
All of the company’s internal changes to its source code and content-moderation playbook would be reported to the agencies on a routine basis, the proposal states, and the agencies could demand ByteDance “promptly alter” its source code to “ensure compliance” at any time. Source code sets the rules for a computer’s operation.
Honestly, this reads as the moral panic over China and TikTok having so eaten the brains of US officials that, rather than saying “hey, we should have privacy laws that block this,” they thought instead “hey, wouldn’t it be cool if we could just do all the things we accuse China of doing, but where we pull the strings.”
Now, yes, it’s true that an individual or private company can voluntarily choose to give up its constitutionally protected rights, but there is no indication that any of this is even remotely close to voluntary. If the 5th Circuit found that the government merely explaining what counts as misinformation about COVID was too coercive an influence on social media companies’ moderation decisions, then how is “take this deal or we’ll ban your app from the entire country” not similarly coercive?
Furthermore, it’s not just the rights of TikTok to consider here, but the millions of users on the platform, who have not agreed to give up their own 1st Amendment rights.
Indeed, I would think there’s a very, very high probability that if this deal were put in place, it would backfire spectacularly: anyone who was moderated on TikTok and didn’t like it would have a totally legitimate 1st Amendment complaint that the moderation was driven by the US government, and that TikTok was a state actor (because it totally would be under those conditions).
In other words, if the administration and TikTok actually consummated such a deal, the actual end result would be that TikTok would effectively no longer be able to do much content moderation at all, because it would only be able to take down content that was not 1st Amendment protected.
So, look, if we’re going to talk about US government influence over content moderation choices, why aren’t we talking much more about this?