Congress’ Kids Online Safety Act Puts Kids At Risk Through Fuzzy Language
from the will-this-protect-or-harm-children? dept
The Kids Online Safety Act (KOSA) was voted out of committee with a long list of amendments. Advocates had been warning about some severe unintended consequences that could arise out of this bill, the most concerning of which was forcing tech companies to out LGBTQ+ minors to their parents — potentially against their wishes. The amendments were supposed to fix these issues and more. But did they?
The short answer is that there was an honest attempt, but I believe it falls short, and it falls short for a specific reason.
The background of the bill
In order to understand why this bill has significant problems, we first have to cover some basics and separate the intended and unintended harms from the bill.
Let’s start with what the bill wants to do, which is set a floor of protections for minors. It does that by creating a duty of care to act in the best interest of the minor. The bill then loosely defines what that means and what categories of harm online platforms need to shield minors against, and requires the creation of certain tools parents can use to monitor their kids, etc. It also gives platforms plenty of homework, like creating an annual report identifying the risks they think minors will encounter on their platform and what they are doing to mitigate those harms.
So why did I say this bill has intended harms? Well, drafting a bill is hard: you have to describe what you mean when you say a company “shall act in the best interests of a minor” and “take reasonable measures” to “prevent and mitigate mental health disorders” or “addiction”. The more granular you get, the more confusing it gets; the more broadly it’s stated, the harder it is to apply to specific facts.
Let’s say I’m playing a game with VoIP and someone calls me a slur. Was that because the game company failed to take reasonable measures? If I want to play a game during all my free time, is it because the game is really good or because it was intentionally made to provoke “compulsive usage”? What even are “reasonable measures”? Especially when many of the things the bill describes impact people differently.
KOSA’s intentional fuzzy language
KOSA’s authors are basically outsourcing to courts how to apply the bill’s fuzzy language to actual facts. Practically, this means that if the bill is passed all platforms will attempt to comply with what they think the text means. Then at least one of the platforms will almost certainly get sued for falling short. Those companies will then have to go through a lot of discovery and judges will just muddle through it.
This will be a lengthy, painful, and expensive process. But I think it’s intentional. Many in Congress think that platforms are not doing enough to protect kids, even though they should have the resources to do so. They either don’t see, or don’t care about, the large amount of resources already going into trust and safety divisions to protect all users, including minors. They see a problem that needs to be immediately solved, and believe a strong regulatory response will give platforms enough of a kick in the pants to figure it out. This is the famous “nerd harder” complaint that often gets leveled at Silicon Valley.
If you look at KOSA through this lens, everything kind of makes sense. It doesn’t matter that it sets up a bunch of expensive new compliance efforts that may or may not be productive. It doesn’t matter that it may kill off some companies or force consolidation. It doesn’t even matter that some platforms will try to bar minors from their platform completely (of course we all know that kids will figure out how to get on the platforms anyways). It’s a big extrinsic shock that they hope will shake things up enough so that platforms will finally nerd hard enough.
After all, enforcement of KOSA is limited to the FTC and state AGs. We can trust them to only bring cases that will advance the welfare of children, right?
KOSA’s extremely bad unintended harms
In Normal Times™, this is how the debate on whether to pass KOSA would go: this bill is a mess and will be too painful (and expensive) to sort out — vs. — we really don’t care, the platforms can afford it, and we think it will do something to at least make the world slightly better.
But these aren’t normal times, and advocates have been warning that not only will this bill be painful to sort out, it provides an avenue of attack from ideologues using the legal system to go after marginalized communities. This is a real threat that no lawmaker (especially Democrats) should be complicit in, especially considering that the overturning of Roe has become a starter pistol for using the legal system towards culture wars and extreme ideological ends.
The main avenue of attack built into the original KOSA was aimed at the LGBTQ community, and the feedback given at the time was that it would out kids to parents who might not be tolerant, which could result in minors being thrown out of their homes or sent to conversion therapy. This is what advocates warned the drafters about, and what the new language sought to fix.
So was this fixed? Sort of. They added a provision saying that the bill shouldn’t be interpreted to require disclosing to parents things like browsing behavior, search history, messages, or the content of communications. The tools that platforms are required to provide to parents now seem solely directed at high-level things like time used, purchases, etc. But there is a sort of dangling requirement that there be “control options that allow parents to address the harms described in” the big section describing the harms they want to stop. What option stops bullying? I’d like to know (maybe it will allow me to stop being T-bagged in multiplayer games).
Sorting that out may or may not sweep some sensitive data back in and expose kids. Sometimes kids keep secrets to protect themselves from their parents. That makes sense to me; growing up, I had a friend who was sent to one of the reform schools Paris Hilton warned us about. Still, I’m overall less concerned about forced outing than I was before the amendments.
I’m now more concerned this bill invites a broad attack against platforms allowing a kid to see any pro-LGBTQ content. The culture wars’ Eye of Sauron has turned to harassment and vile behavior towards this community, especially trans persons, and they are doing so under the banner of protecting children.
Unfortunately, the language these people are using to vilify the LGBTQ community is everywhere in the bill. Being trans has been called a mental health disorder, and this bill says platforms are required to protect minors from that. Seeing a drag queen, period, has also been described as sexual exploitation, grooming, and sexual abuse. Again, barred in KOSA. Gender affirming care has been referred to as self-harm, which again platforms are required to protect against under KOSA.
The bill’s fuzzy language, which may have been seen by the drafters as an asset, is now a huge liability. And it’s not just limited to anti-LGBTQ content. For example, a minor seeking information about how to receive a safe abortion could also be described as self-harm.
The bill’s authors might think that they are safe from their bill being used in these culture wars because enforcement is limited to the FTC and state attorneys general. While I worry less about the FTC (at least the current one), it’s easy to imagine certain state AGs getting before the right judge and successfully barring minors from access to basic information they need to understand what they are going through and how to receive help if they need it. Just look to Florida, where Governor DeSantis has filed a complaint against a restaurant and bar that allowed kids at a drag brunch, and has said that parents who allow their children to see a drag performance could be targeted by child protective services.
This bill is throwing a hand grenade into the middle of a particularly dark moment of our legal system. I don’t think that’s wise, or very smart politically when the odds are actually quite high someone decides to take this bill up on its offer.
Matthew Lane is a Senior Director at InSight Public Affairs.