Congress Pretends It’s Fixed All The Problems With KOSA; It Hasn’t

from the you-can't-fix-what's-fundamentally-broken dept

On Wednesday, the Senate revealed an amended version of the Kids Online Safety Act (KOSA) ahead of today’s hearing on the bill. One would hope that, with so much public pushback, the Senators might do something crazy like actually try to fix it.

That is, apparently, way too much to ask.

Earlier today, the Senate held its markup on the bill, in which numerous Senators from both parties pretended they had listened to the concerns people had about KOSA and “fixed” them. They did not. There was misleading talk about “protecting the children” (the bill will put them at greater risk) and a bunch of other nonsense. Lots of Senators then tried to amend KOSA further to add their own pet (usually unconstitutional) “save the children” bills to it.

Then they moved forward with the bill so that it might come to a floor vote if Chuck Schumer decides that it’s really time to destroy the internet.

TechFreedom’s Ari Cohn has an incredibly thorough breakdown of the current version of the bill and its myriad problems. For example, in response to concerns that KOSA would require age verification, the amended bill basically adds a “nuh uh, we have no such requirement” disclaimer. But then it creates an impossibly (and unconstitutionally) vague standard:

As we wrote last year, KOSA’s original language would have effectively required covered platforms to verify the age and thus the identity of every user. KOSA’s revised text attempts to avoid this First Amendment problem by requiring covered platforms to protect only those users that it has “actual knowledge or knowledge fairly implied on the basis of objective circumstances” are minors. Furthermore, new rules of construction say that the bill does not require platforms to collect age-related data or perform age verification. While doubtless well-intentioned, these changes merely trade a clear, explicit mandate for a vague, implicit one; the unconstitutional effect on anonymous expression will be the same.

It is entirely unclear what constitutes “knowledge fairly implied” that a particular user is a minor. In an enforcement action, the Federal Trade Commission must consider the “totality of the circumstances,” which includes, but is not limited to, “whether the operator, using available technology, exercised reasonable care.” Vague as this provision is, it apparently does not apply to civil suits brought by state attorneys general, which could give them even more unpredictable discretion.

Basically, the new KOSA softens the language that previously appeared to force companies into adopting age verification, and instead says that the FTC gets to set the standard for what “knowledge fairly implied” means when it comes to whether there are children on a site. That hands a massive, and easily abused, power to the FTC.

As Cohn notes, this provision might also be a ticking time bomb for encryption, allowing the FTC to declare the use of encryption an evasion tool and to extend its authority way beyond what’s reasonable:

Thus, one can only speculate as to how this key term would be interpreted. This uncertainty alone makes age verification the most risk-averse, “reasonable” course of action for platforms—especially with respect to end-to-end-encrypted services. Both the FTC and state attorneys general will likely draw interpretive cues from COPPA, which requires parental consent for the “collection, use, or disclosure of personal information from children” when a service has actual knowledge that a user is under 13 years old or when the service is “directed to” children under 13—effectively, the service has constructive knowledge that its users are highly likely to be minors. To date, COPPA has had negligible effects on adults because services directed to children under 13 are unlikely to be used by anyone other than children due to their limited functionality, effectively mandated by COPPA. But extending COPPA’s framework to sites “directed to” older teens would significantly burden the speech of adults because the social media services and games that older teens use are largely the same ones used by adults.

The FTC recently began to effectively extend COPPA to cover teens. Whether or not the FTC Act gives the Commission such authority, this example illustrates what the FTC—and state attorneys general—might do with the broad language of KOSA. In a 2022 enforcement action, the FTC alleged that Epic Games had committed an unfair trade practice by allowing users of its popular Fortnite game to chat with other users, despite knowing that “a third of Fortnite players, based on social media data, are teens aged 13-17.” While the complaint focused on Epic Games’ use of its audience composition for marketing purposes, its logic could establish “knowledge fairly implied” under KOSA. This complaint was remarkable not only for extending COPPA to teens but also because the FTC had effectively declared a threshold above which it would consider a site “directed to” them—something the FTC had never done for sites “directed to” minors under COPPA.

The revised bill also fails to fix KOSA’s biggest, worst, and most central problem: the “duty of care” concept. As we’ve been explaining for years, the “duty of care” concept is dangerous. It is, effectively, how the Chinese Great Firewall originally worked. Rather than being told what couldn’t be posted, websites were simply told that if they allowed anything later deemed to be bad, they would get fined. The end result? Massive overblocking.

The “duty of care” concept is effectively the same mechanism. If anyone using your site somehow gets harmed (loosely defined), blame can flow back to the site for not preventing it, even if the site had no visibility into, understanding of, or direct connection to that harm. Instead, it will be argued that the site had a magical “duty of care” to see into the future, figure out how some random usage might result in harm, and then go back in time to prevent it.

This is nothing but scapegoat bait. Anyone does anything bad and talks about it on the internet? You get to sue the internet companies! The end result, of course, is that internet companies are going to sterilize the shit out of the internet, taking down any conversation about anything controversial, or anything that might later be tied to some sort of random harm.

As Cohn explains:

This duty of care directly requires platforms to protect against the harmful effects of speech, the overwhelming majority of which is constitutionally protected. As we explained in last year’s letter, courts have held that imposing a duty to protect against harms caused by speech violates the First Amendment.

The unconstitutionality of KOSA’s duty of care is highlighted by its vague and unmeetable nature. Platforms cannot “prevent and mitigate” the complex psychological issues that arise from circumstances across an individual’s entire life, which may manifest in their online activity. These circumstances mean that material harmful to one minor may be helpful or even lifesaving to another, particularly when it concerns eating disorders, self-harm, drug use, and bullying. Minors are individuals, with differing needs, emotions, and predispositions. Yet KOSA would require platforms to undertake an unworkable one-size-fits-all approach to deeply personal issues, thus ultimately serving the best interests of no minors.

This is such a key point. Studies have shown, repeatedly, that content around eating disorders and mental health impacts people in very different ways. For some people, certain kinds of content can be helpful. But the very same content can be harmful to others. How do you prevent that? You block all “controversial” content, which does significant harm to the people who were getting help through it.

For example, within the eating disorder world, it’s common for services to intersperse content about healthy eating practices or to guide people toward resources where they can find help. But some argue that even seeing that content could trigger dangerous behavior in certain users. So now even something as simple as a PSA about eating properly could fail KOSA’s test.

And, of course, the whole thing is wildly unconstitutional:

The Supreme Court has upheld the FCC’s authority to regulate indecency on broadcast media, reasoning that children have easy access to broadcasts, and the nature of the medium makes it impossible to “completely protect . . . from unexpected program content.” But even so, the courts have consistently held that imposing a duty of care on broadcasters to protect minors would violate the First Amendment. There can be no doubt that imposing a duty of care against online platforms, over which the government has far less regulatory authority, is still more obviously unconstitutional.

I know that the Senators insist this amended version of KOSA magically “fixed” all the problems, but the problems seem fundamental to the bill — and to the grandstanding politicians pushing it.


Comments on “Congress Pretends It’s Fixed All The Problems With KOSA; It Hasn’t”

Anonymous Coward says:

Re:

… the open web really is doomed, isn’t it.

Only those portions of the open web that the government can access. The rest of us know how to “darken things up” a bit, if you get my drift…

I think what we’re really seeing here is a re-enactment of the Old West and how it was tamed, so to speak. (Using the obligatory car analogy would require far too much verbiage, with side detours every few years over the past century and a half.) In short, while we’re no longer armed to the teeth morning, noon, and night, we are still experiencing the same idiocy of under-developed-brain little children running around shooting off both their weapons and their mouths, sometimes together.

IOW, meet the new boss, same as the old boss. (copyright 1971, Pete Townshend)
