Congress’ Kids Online Safety Act Puts Kids At Risk Through Fuzzy Language
from the will-this-protect-or-harm-children? dept
The Kids Online Safety Act (KOSA) was voted out of committee with a long list of amendments. Advocates had been warning about some severe unintended consequences that could arise out of this bill, the most concerning of which was forcing tech companies to out LGBTQ+ minors to their parents — potentially against their wishes. The amendments were supposed to fix these issues and more. But did they?
The short answer is that there was an honest attempt, but I believe it falls short, and it falls short for a specific reason.
The background of the bill
In order to understand why this bill has significant problems, we first have to cover some basics and separate the intended and unintended harms from the bill.
Let’s start with what the bill wants to do, which is set a floor of protections for minors. It does that by creating a duty of care to act in the best interest of the minor. The bill then goes on to loosely define what that means and what categories of harms online platforms need to shield minors against, requiring the creation of certain tools parents can use to monitor their kids, and so on. It also gives platforms plenty of homework, like creating an annual report identifying the risks they think minors will encounter on their platform and what they are doing to mitigate those harms.
So why did I say this bill has intended harms? Well, drafting a bill is hard: you have to describe what you mean when you say a company “shall act in the best interests of a minor” to “take reasonable measures” to “prevent and mitigate mental health disorders” or “addiction”. The more granular you get, the more confusing it gets; the more broadly it’s stated, the harder it is to apply to specific facts.
Let’s say I’m playing a game with VoIP and someone calls me a slur. Was that because the game company failed to take reasonable measures? If I want to play a game during all my free time, is it because the game is really good or because it was intentionally made to provoke “compulsive usage”? What even are “reasonable measures”? Especially when many of the things the bill describes impact people differently.
KOSA’s intentional fuzzy language
KOSA’s authors are basically outsourcing to courts how to apply the bill’s fuzzy language to actual facts. Practically, this means that if the bill is passed all platforms will attempt to comply with what they think the text means. Then at least one of the platforms will almost certainly get sued for falling short. Those companies will then have to go through a lot of discovery and judges will just muddle through it.
This will be a lengthy, painful, and expensive process. But I think it’s intentional. Many in Congress think that platforms are not doing enough to protect kids, even though they should have the resources to do so. They either don’t see, or don’t care about, the large amount of resources already going into trust and safety divisions to protect all users, including minors. They see a problem that needs to be solved immediately, and believe a strong regulatory response will give platforms enough of a kick in the pants to figure it out. This is the famous “nerd harder” complaint that often gets leveled at Silicon Valley.
If you look at KOSA through this lens, everything kind of makes sense. It doesn’t matter that it sets up a bunch of expensive new compliance efforts that may or may not be productive. It doesn’t matter that it may kill off some companies or force consolidation. It doesn’t even matter that some platforms will try to bar minors from their platform completely (of course we all know that kids will figure out how to get on the platforms anyways). It’s a big extrinsic shock that they hope will shake things up enough so that platforms will finally nerd hard enough.
After all, enforcement of KOSA is limited to the FTC and state AGs. We can trust them to only bring cases that will advance the welfare of children, right?
KOSA’s extremely bad unintended harms
In Normal Times™, this is how the debate on whether to pass KOSA would go: this bill is a mess and will be too painful (and expensive) to sort out — vs. — we really don’t care, the platforms can afford it, and we think it will do something to at least make the world slightly better.
But these aren’t normal times, and advocates have been warning that not only will this bill be painful to sort out, it provides an avenue of attack from ideologues using the legal system to go after marginalized communities. This is a real threat that no lawmaker (especially Democrats) should be complicit in, especially considering that the overturning of Roe has become a starter pistol for using the legal system towards culture wars and extreme ideological ends.
The main avenue of attack built into the original KOSA was aimed at the LGBTQ community, and the feedback given at the time was that it would out kids to parents who might not be tolerant, which could result in things like minors being thrown out of their homes or sent to conversion therapy. This is what advocates warned the drafters about, and what the new language sought to fix.
So was this fixed? Sort of. They added a provision saying that the bill shouldn’t be interpreted to require disclosing to parents things like browsing behavior, search history, messages, or the content of communications. The tools that platforms are required to provide to parents now seem solely directed at high-level things like time used, purchases, and so on. But there is a sort of dangling requirement that there be “control options that allow parents to address the harms described in” the big section describing the harms they want to stop. What option stops bullying? I’d like to know (maybe it will allow me to stop being T-bagged in multiplayer games).
Sorting that out may or may not sweep some sensitive data back in and expose kids. Sometimes kids keep secrets to protect themselves from their parents. This makes sense to me: I had a friend growing up who was sent to one of the reform schools Paris Hilton warned us about. However, I’m overall less concerned about forced outing than I was before the amendments.
I’m now more concerned this bill invites a broad attack against platforms allowing a kid to see any pro-LGBTQ content. The culture wars’ Eye of Sauron has turned to harassment and vile behavior towards this community, especially trans persons, and they are doing so under the banner of protecting children.
Unfortunately, the language these people are using to vilify the LGBTQ community is everywhere in the bill. Being trans has been called a mental health disorder, and this bill says platforms are required to protect minors from that. Seeing a drag queen, period, has also been described as sexual exploitation, grooming, and sexual abuse. Again, barred in KOSA. Gender affirming care has been referred to as self-harm, which again platforms are required to protect against under KOSA.
The bill’s fuzzy language, which may have been seen by the drafters as an asset, is now a huge liability. And it’s not just limited to anti-LGBTQ content. For example, a minor seeking information about how to receive a safe abortion could also be described as self-harm.
The bill’s authors might think that they are safe from their bill being used in these culture wars because enforcement is limited to the FTC and state attorneys general. While I worry less about the FTC (for now), it’s easy to imagine certain state AGs getting before the right judge and successfully barring minors from access to basic information they need to understand what they are going through and how to receive help if they need it. Just look to Florida, where Governor DeSantis has filed a complaint against a restaurant and bar that allowed kids at a drag brunch, and said that parents who allow their children to see a drag performance could be targeted by child protective services.
This bill is throwing a hand grenade into the middle of a particularly dark moment of our legal system. I don’t think that’s wise, or very smart politically, when the odds are actually quite high that someone decides to take this bill up on its offer.
Matthew Lane is a Senior Director at InSight Public Affairs.
Filed Under: congress, fuzzy language, kosa, lgbtq, nerd harder, parents, protect the children, secrecy, trust and safety
Comments on “Congress’ Kids Online Safety Act Puts Kids At Risk Through Fuzzy Language”
Online Safety Act
“But these aren’t normal times.”
I FULLY agree. And I don’t think it will change any time soon…
And this is why I really wish that after a bill has been voted on by Congress and signed by the President it immediately goes before SCOTUS for a yay/nay on it being constitutional. Waiting till someone has been harmed by a bill is already too late, “making good the damage dealt” is never as good an option as preventing the damage in the first place.
You know what Else puts kids at risk? Abortion doctors
Re:
They all are running around, kidnapping children or something?
Off topic and citation needed, motherfucker.
Re:
They actually save kids’ lives.
It’s the forced-birthers putting kids at risk.
Nerd Harder: The bill
‘Platforms have to keep “dangerous” stuff, as defined by whatever AG wants to define that word, away from children, and allow parents to know when their children are exposed to “dangerous” stuff, even if their knowing might be more dangerous for the kid than the content in question.’
Ah yes, nothing like a For The Children(tm) bill supported and pushed by politicians striving for Look At Me Doing Something(tm) soundbites to leave kids significantly worse off than they were before.
Honestly, at this point, any time I hear a politician talking about how Something Must Be Done and how they’re proposing a bill For The Children, my immediate first assumption is twofold: it will leave kids, if not everyone, worse off, and it is being proposed in bad faith. Bills like this really aren’t helping to counter that assumption.
Old days.
This is asking you to be like the old small stores.
There used to be the common stores and then a few stores for Adults, Some stores had a section for adults.
But you generally had a person that could see most of the store, and they would keep the kids out.
The Problem here is you have a Blind Owner, watching over the store, that May have an adult section.
Might as well be a Mail order catalog (a book of products for sale)(for those too young). A younger person could order anything, as long as they sent the money/Money order. Anyone got an idea on how to use a Single cow milking machine?? I bet the kid does.
These conservatives
I am going to love it when Democratic attorneys general in left-wing states go after Breitbart and Infowars, saying that their sites are harming children because they have no age verification software to prove their readers are over 18. They will have to block their sites in certain states, and then you will hear them complain about censorship.
Hey Breitbart, blame your buddy Senator Blackburn of Tennessee for the fact that your site will not be allowed in that Democratic state because it harms children. Guaranteed we will hear plenty of complaints if this Kids Online Safety Act ever becomes law.
Nothing says “for the kids” like a bill that’s for the politicians (so, all of them really).
Excuse me? The bill says platforms are required to protect children from “mental health disorders”? The best possible interpretation I have of this sentence is that they’re required to protect children from people with mental disorders, and that still says some bad things about how we treat mental health in this country.
Re:
The bill says platforms are required to protect children from “mental health disorders”?
Yes. Platforms will be required to protect children from developing mental illnesses, and since such conditions often run in families, that requires platforms to fund eugenicist abortions. /s