from the this-is-a-bad,-bad-idea dept
Senator Richard Blumenthal is apparently a bottomless well of terrible internet regulation ideas. His latest is yet another “for the children” bill that will put children in serious jeopardy. This time he’s teamed up with the even worse Senator Marsha Blackburn to introduce the Kids Online Safety Act, which as the name suggests is full of a bunch of overbearing, dangerous nonsense that will not protect children at all, but will make them significantly less safe while giving clueless, authoritarian parents much more power to spy on their kids.
About the only “good” part of the bill is that it doesn’t attack Section 230. But the rest of it is nonsense, based on a terrible misunderstanding of how, well, anything works. The bill doesn’t just take its name from the UK’s Online Safety Bill, it also takes a similar “duty of care” concept, which is a nonsense way of saying “if you make a mistake, and let undefined ‘bad stuff’ through, you’ll be in trouble.” Here’s how the duty of care is self-contradictory nonsense:
BEST INTERESTS.—A covered platform has a duty to act in the best interests of a minor that uses the platform’s products or services.
How the hell is a website going to know “the best interests of a minor” using its platform? That’s going to vary — sometimes drastically — from kid to kid. Some kids may actually benefit from learning about controversial topics, while others may get dragged down into nonsense. There is no one way to have “best interests” for kids, and it’s a very context-sensitive question.
PREVENTION OF HARM TO MINORS.—In acting in the best interests of minors, a covered platform has a duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform, including—
(1) promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor;
(2) patterns of use that indicate or encourage addiction-like behaviors;
(3) physical harm, online bullying, and harassment of a minor;
(4) sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
(5) promotion and marketing of products or services that are unlawful for minors, such as illegal drugs, tobacco, gambling, or alcohol; and
(6) predatory, unfair, or deceptive marketing practices.
So, so much of this is nonsense, disconnected from the reality of how anything works, but let’s just focus on the requirement that a covered platform “prevent and mitigate” risks associated with “eating disorders.” Last year we had a content moderation case study all about the very difficult and nuanced questions websites face in dealing with content around eating disorders. Many of them found that trying to ban all such conversations actually backfired and made the problem worse, while allowing conversations about eating disorders often helped steer people away from them. In fact, much of the evidence showed that (1) people didn’t start developing eating disorders from reading about others who had them, and (2) people writing about their eating disorders made it easier for others to step in and help them find the resources they needed to get healthy again.
In other words, it’s not a matter of telling websites to block information about eating disorders — as this Blumenthal and Blackburn bill would do. That will often just sweep the issue under the rug, and kids will still have eating disorders, but not get the help that they might have otherwise.
Once again, a Blumenthal bill is likely to make the problem it ostensibly tries to solve worse. There is similar evidence that suicide prevention is an equally fraught area, and it’s not nearly as simple as saying “no discussions about suicide,” because forums for discussing suicide are often where people get help. But under this bill those forums would be shut down.
This bill takes extremely complex, nuanced issues, which often need thoughtful, context-based interventions, and reduces them to “block it all.” Which is just dangerous. Because kids who are interested in suicide or eating disorders… are still going to be interested in those things. And if the major websites, with big trust and safety teams and more thoughtful approaches to all of this, are forced to take down all that content, kids are still going to go looking for it, and they’re going to end up on sketchier and sketchier websites, with fewer controls and fewer thoughtful staff, where worse outcomes are far more likely.
Honestly, this approach to regulating the internet seems much more likely to cause serious, serious problems for children.
Then there’s the terrible, terrible parental surveillance section. The bill would mandate that websites provide “parental tools” that are “readily-accessible and easy-to-use” so parents can spy on their kids’ activities online. Now, to avoid the problems of surreptitious surveillance, which would be even worse, the bill does note that “A covered platform shall provide clear and conspicuous notice to a minor when parental tools are in effect.” That’s certainly better than the opposite, but all this is doing is teaching kids that constant surveillance is the norm.
This is not what we should be teaching our kids.
I know how tempting it is for parents to want to know everything their kids are doing online. I know how tempting it is to be afraid of what kids are getting up to online, because we’ve all heard various horror stories. But surveilling kids of all ages, all the time, is a stupid, dangerous idea. First of all, the kinds of tools a parent of a six-year-old might need are drastically different from those a parent of a 16-year-old needs. But the bill treats everyone 16 and younger the same.
And there are already lots of tools parents can use — voluntarily — to restrict the behavior of their kids online. We don’t need to make it the expected norm that every website gives parents tools to snoop on their kids. Because that alone can do serious damage to kids. Just a few months ago there was an amazing article in Wired about how dangerous parental surveillance of kids can be.
Constant vigilance, research suggests, does the opposite of increasing teen safety. A University of Central Florida study of 200 teen/parent pairs found that parents who used monitoring apps were more likely to be authoritarian, and that teens who were monitored were not just equally but more likely to be exposed to unwanted explicit content and to bullying. Another study, from the Netherlands, found that monitored teens were more secretive and less likely to ask for help. It’s no surprise that most teens, when you bother to ask them, feel that monitoring poisons a relationship. And there are very real situations, especially for queer and trans teens, where their safety may depend on being able to explore without exposing all the details to their family.
And yet, this bill requires the kind of situation that makes teenagers less safe, and pushes them into more risky and dangerous activity.
Why is it that every Blumenthal bill “for the children” makes children less safe?
And just think about how this plays out for an LGBTQ child, brought up in a strictly religious family, who wants to use the internet to find like-minded individuals. Under this bill, that information gets reported back to the parents — which seems way more likely to lead to distress, harm, and possibly even suicidal ideation.
In other words, this bill tries to prevent suicide by forcing websites to take down information that might help prevent suicides, and then forces vulnerable kids in dangerous home environments to share data with their parents, which seems more likely to drive them towards suicide.
It’s like the worst possible way of dealing with vulnerable children.
There are, of course, other problems with the bill, but the whole thing is based on a fundamental misunderstanding of how to raise resilient kids. You don’t do it by spying on their every move. You do it by giving kids the freedom to explore and learn, equipped with the knowledge that not everything is safe, and not every idea is a good one. You teach them to recognize that the world can be dangerous, and that they need to learn how to deal with that. Obviously, the best strategies will differ at different ages and for different children. But assuming that all children up to age 16 must be surveilled by their parents, and that websites should be forced to block information that many kids will want to explore, seems like a recipe for a horrifically bad result for many, many children — including the most vulnerable.
It’s truly incredible how many horrible, horrible laws about the internet one man can sponsor, but Senator Blumenthal really has become a one-man “terrible bill idea” shop. People of Connecticut: do better. As for Blackburn, well, she’s always been terrible, but I find it amusing to remind people she put out this video a decade ago, screaming about how the internet should never be regulated. And now look at her.
Filed Under: children, duty of care, eating disorders, for the children, harm prevention, marsha blackburn, online safety, parental tools, richard blumenthal, suicide, surveillance