Common Sense Media Has No Common Sense When It Comes To Internet Laws

from the common-sense-is-in-short-supply-here dept

Common Sense Media provides some really useful tools if you’re a parent looking to see if certain content is age appropriate. I’ve used it for years. But… also, for years, the organization has been way out over its skis in supporting all sorts of absolutely horrible laws that would do real damage to the internet, to privacy, and to free speech. Over and over again, if there’s a bad internet law, Common Sense Media has probably supported it.

For years, it has railed against Section 230, including filing one of the most ridiculous amicus briefs out of the dozens filed in the Gonzalez v. Google case (all of which the Supreme Court effectively ignored while punting the case). It has supported many laws that attack free expression, including the Age Appropriate Design Code laws in both the UK and California. It regularly pushes for ridiculous laws, including blatantly unconstitutional “protect the children!” laws around the US.

This is somewhat frustrating, because outside of its advocacy for bad laws, Common Sense Media’s actual content offering shows that there’s a better way: providing parents, teachers, and even kids themselves with more information about how to handle different types of content and what’s appropriate and what’s not. Its main product shows us how empowering people with information is often a much better solution than laws that strip away people’s rights. But then… it advocates for stripping away rights.

The latest such case is in California, where the legislature has just been non-stop pushing terrible, terrible internet laws “for the children.” We recently talked about SB 680 (which Common Sense Media supports!), which was pushed based on a Senator completely misreading an already junk science study. It appears that 680 may actually be dead in California, but instead, the legislature is pushing another problematic bill, and Common Sense Media has lost every last bit of common sense about it.

AB 1394 is yet another “protect the children on the internet” bill that again rushes to broadly attack the internet, creating all sorts of problems without understanding any of the issues, or how the bill will make things way worse, not better. You can think of AB 1394 as California’s special version of FOSTA: it says that if a website “knowingly facilitates, aids, or abets” child sexual abuse material (CSAM), there’s a private right of action allowing people to sue the company for statutory damages of $1 million to $4 million.

There are all sorts of issues with this, the first being that anyone who knows anything about how the internet works will explain to you that any website that allows user-generated content will, at some point, be used for CSAM. It’s a constant fight. Of course, federal law already has provisions on how to deal with this, requiring companies to report such content, as they become aware of it, to NCMEC (the National Center for Missing & Exploited Children) and follow some pretty specific procedures to allow law enforcement to do its thing (which it rarely seems to actually do) to try to help victims.

But, once you put massive civil liability on top of that for “knowingly” doing something, YOU ARE TELLING COMPANIES TO STOP TRYING TO HELP. Because the way you avoid “knowing” something is to never, ever look, and never do anything to try to learn about CSAM on the platform. This bill literally incentivizes companies to do way less in the fight against CSAM, because the more they do, the more they risk “knowing” that CSAM is on their platform, and the greater the liability.

And the law is even worse than that. It says that any “system, design, feature, or affordance” on a website that is a “substantial factor” in enabling sexual exploitation of minors counts as “facilitating, aiding, or abetting.” And that’s true even if that feature has perfectly legitimate purposes. I’m reminded of various cases where websites were accused of “aiding and abetting” or inducing law-breaking activity because they had a search engine on their website. That was part of what got early file-sharing services killed. Of course, in the case of Megaupload, the company was attacked for not having a search engine, with the claim being that this was how it tried to “hide” its allegedly illegal activity. In other words, having a feature and not having that same feature can both be used by creative lawyers to attack basically any company. This is exceptionally broad and will lead to all sorts of ridiculous lawsuits from lawyers eager to get that multi-million dollar statutory payout.

We’re already seeing that happen with FOSTA, and AB 1394 is way broader than FOSTA.

And… the law gets worse. Rather than just mimicking the already problematic parts of FOSTA, AB 1394 also takes some of the very worst ideas from Congress’s EARN IT bill (which, thankfully, still isn’t law), which had the problem of potentially making websites state actors by demanding that they search for CSAM. As many legal experts explained with EARN IT, this is a massive problem, because once a platform becomes a state actor, the 4th Amendment applies, and now you need a warrant for many of the searches that a private platform would otherwise have been able to do on its own.

What that means is that, in practice, a ton of CSAM evidence will not be usable in court, because it will have been obtained in violation of the 4th Amendment.

Once again, the bill would make the fight against CSAM significantly more difficult. Defenders of AB 1394 will claim that it doesn’t require proactive scanning for all CSAM, but that’s wrong. What it does require is scanning for and blocking all instances of any reported image. So once a site learns of an image, it has to find and block all copies of it, or violate the law. And, in doing so, it introduces that massive 4th Amendment problem, making it much more difficult to obtain useful evidence against those creating and distributing CSAM.

And that’s doubly stupid, because basically any of the sites that actually matter already use tools like PhotoDNA and others to find, block, and report known CSAM. When they do it by themselves, it’s not a 4th Amendment issue. As soon as the law requires it, it becomes one, and screws up the ability to make use of that evidence.

There’s more to it, but this is yet another terrible bill, put together by people who think they’re saving the children, when they don’t understand how anything actually works, and how their bill interacts with the real world.

And, of course, Common Sense Media supports it. Not only does it support it, but it’s gone on the offensive, trying to drum up a nonsense story about how big tech supports “pedophilia, bestiality, trafficking, and child sex abuse.” Really. Common Sense Media founder and CEO Jim Steyer has been claiming that any effort to block or even fix this problematic bill means you support those things.

The press release Common Sense put out is ridiculously misleading, to the point of being just out and out disinformation. Here’s how it starts:

The Wall Street Journal recently reported that Instagram, owned by Meta (a member of TechNet), is actively connecting a vast online network of pedophiles and human traffickers to disturbing imagery and videos of child sex abuse. TechNet is an accomplice in Meta’s attempt to gut AB 1394, a bill that aims to hold social media companies accountable for commercial sexual exploitation of children.

It’s interesting that they point to the Wall Street Journal article to make those claims, rather than the underlying report it’s based on. Perhaps because that report is actually way more nuanced than Common Sense wants to admit. The report is not about known instances of CSAM, but rather about the much more difficult to detect “self-generated” CSAM (i.e., newly generated CSAM that is created and then posted by the child themselves, which is a horrific situation, and a massive challenge). The report actually highlights the difficulty of dealing with this issue on Instagram, and nowhere among its recommendations on how to deal with this is anything associated with AB 1394. Because AB 1394 would make the problem significantly worse, not better. It would make it more difficult for Instagram to find and remove this content, and it would make it far more difficult for law enforcement to use such material as evidence.

But Common Sense connected the two anyway, and then sent this obnoxiously stupid letter to TechNet members:

You are a member of TechNet. And TechNet, on your behalf, is lobbying to stop a bill in Sacramento, AB 1394, that would specifically crack down on child sex trafficking, child pornography, and child sexual abuse, including bestiality, to engage sexual predators and pedophiles online. This is being done in your companies’ names.

Why do innovative and thoughtful leaders like yourselves want to attach your brand and reputation with a company like Meta in opposing a bill to stop child sex trafficking? We are calling on you to renounce TechNet’s opposition and amendments that aim to gut this incredibly important bill for children and families authored by Asm Buffy Wicks. We are calling on you to either leave TechNet or publicly denounce TechNet’s duplicitous behavior and partnership with Meta.

Except, again, AB 1394 would not, in fact, crack down on those things. As explained above, it would make it way more difficult to do so, and create some huge legal problems for both the platforms and law enforcement seeking to actually address the problems.

Once again, Common Sense Media has no common sense.

TechNet responded, and it’s a pretty masterful letter, highlighting not just the bill’s problems in actually achieving its stated goals, but noting that TechNet has actually been working to fix those problems, while Common Sense Media and Jim Steyer just post stupid shit online.

Our commitment, and our member companies’ commitment, to fighting back against sexual predators is crystal clear: the internet, and any platforms on it, should not be a safe haven for these activities and criminals should be prosecuted to the fullest extent of the law.

Since TechNet has spent the last several months working in good faith in pursuit of these objectives by negotiating amendments to this bill with the author and sponsors, we will assume your press release, letter, and social media post were the product of ignorance rather than malice.

Allow us to help you catch up on the latest with respect to your co-sponsored legislation:

• On June 26, TechNet approached Assemblymember Wicks with good faith amendments that would 1) make AB 1394 workable from both a legal and policy perspective and 2) result in the removal of more child pornography. We asked to sit down and negotiate, expressed a clear goal of negotiating industry-wide neutrality, and proposed a series of meetings to work in that direction.

• Since that time, we’ve had more than a dozen discussions and meetings with the author, sponsors, and other key legislative personnel, and those meetings have resulted in substantial progress. We know your organization didn’t participate in those negotiations, so perhaps you’re unaware of the details.

• During that process we have offered amendments that, if accepted, would result in the strongest piece of legislation in the country related to the removal of child pornography from the internet. We’ve made numerous concessions and focused our efforts on providing sound policy alternatives that will result in more child pornography being removed from the internet. For example, despite our strong opposition to increased civil liability, our amendments do not aim to strike the two private rights of action against platforms that fail to comply with the bill.

• Importantly, our amendments attempt to protect AB 1394 from likely First and Fourth Amendment challenges that could invalidate the bill or help perpetrators keep evidence out of court to avoid prosecution for their abhorrent crimes. The last thing TechNet or our members want is for a criminal defendant to be able to overturn their conviction based on evidence collected as a result of this bill, and we hope Common Sense would agree.

• As of last week, we were a few minor details away from an agreement that would remove our opposition. Unfortunately, organizations like yours have decided to upend major points of agreement and are knowingly pushing away from collaboration and toward litigation.

So Common Sense wants a bill that will create massive 1st and 4th Amendment problems and push companies to look the other way to avoid “knowing” about CSAM on their platforms… and when TechNet tries to fix those issues, Common Sense and its CEO go public, attacking them as supporters of CSAM.

Once again, Common Sense Media has no common sense.

Companies: common sense media


Comments on “Common Sense Media Has No Common Sense When It Comes To Internet Laws”

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Anonymous Coward says:

It says that any “system, design, feature, or affordance” on a website that is a “substantial factor” in enabling sexual exploitation of minors, counts as “facilitating, aiding, or abetting.”

Allowing people to communicate with each other is now illegal, as it enables people to groom children!

This comment has been deemed insightful by the community.
Anonymous Coward says:

I know the article had more important things to focus on, but I couldn’t help but notice that Steyer’s tweet claims that these nebulous ‘Tech CEOs’ he speaks of are zoophilic pedophiles. Not even a ‘they support/condone this’, but a straight-up accusation of them being consumers and/or distributors. I can’t help but feel like that’s dancing dangerously close to some potential legal fire.

This comment has been deemed insightful by the community.
Anonymous Coward says:

The Wall Street Journal recently reported that Instagram, owned by Meta (a member of TechNet), is actively connecting a vast online network of pedophiles and human traffickers to disturbing imagery and videos of child sex abuse. TechNet is an accomplice in Meta’s attempt to gut AB 1394, a bill that aims to hold social media companies accountable for commercial sexual exploitation of children.

This quote is great. If you read it carefully, it’s literally saying “Meta objects to being held liable for crimes committed by others independently of Meta”.

As far as I know, child abuse is still illegal. If Meta (or any other company) were an actual participant in it… they would have already been ripped apart (figuratively) in court. A “YOU are guilty!” bill is… never a good idea.

But if we have to have bad ideas, can we make it a crime to be a CEO and look[1] like you’re lying about a bill? Or better yet, give people (including those in other states) a private right of action to sue for damages, ranging from $10 million to $50 million.

[1] If we are going with bad ideas, I see no reason not to fry the CEO, even if someone else posted the tweet without his supervision or knowledge.

This comment has been deemed insightful by the community.
Mr. Blond says:

Common Sense Media has always been a prohibitionist entity, being one of the strongest lobbyists for restrictions on violent video games down to the day of the Brown v. EMA decision. They think their moral judgments on TV, movies, games, etc. are the way everyone should parent their children.

Common Sense Media has always adopted the same view as other busybody entities like the Parents Television Council, the Lion & Lamb Project, and the National Institute on Media and the Family. They are just better at putting on a show of respectability and moderation as they avoid some of the more extreme positions and rhetoric (for the most part; this is obviously an exception), and have managed to obtain some clout due to their founder being the brother of Tom Steyer, a major political player in California.

This comment has been deemed insightful by the community.
Cat_Daddy (profile) says:

You know, a lot of these non-profit organizations are kinda like the worst aspects of your family. If the Heritage Foundation is like the openly racist MAGA grandpa from the south, and if NCOSE is the homophobic, puritanical aunt, then CSM is the helicopter mom who is also a Karen who reads nothing but opinion columns.

Probably the least of the three evils mentioned, but still an evil and still perplexing nonetheless.

Cat_Daddy (profile) says:

Re:

Stopping bad legislation at the legislative level is pretty much a no-go. It’s the courts where it (usually) matters. After all, both the Arkansas and Texas laws targeting children were taken down on the same day, and the chances of California’s own Age Appropriate Design Code aren’t looking good in its own court challenge. So even if AB 1394 makes it through the legislative process, it’s more likely to die during the judicial process.

That One Guy (profile) says:

Protecting the ones behind the cameras rather than the ones in front of them

First and foremost that response letter… just perfection. ‘We were trying to do something about the problem and were making good progress on it until some jackass bumbled into the process and decided to make a mess of the whole thing, putting all our collaboration at risk.’

That out of the way…

So, funny thing about those accusations of supporting that content: much like the only ones who benefited from FOSTA were the politicians who boasted about Doing Something(tm) and sex traffickers, the only people I can see benefiting from FOSTA 2.0 (Now For CSAM) are politicians who will boast about Doing Something and those creating and sharing CSAM, for the same reasons that applied to FOSTA.

Online platforms are actually trying to do something about the problem because surprise surprise they don’t want that content on their platforms, the likes of Common Sense Media just seem to be trying to brush it under the rug and/or make it harder to find and prosecute.

So… who was it that was supporting “pedophilia, bestiality, trafficking, and child sex abuse” again?

Anonymous Coward says:

Re:

Children should be introduced to kissing other children of the same gender. It’s only fair. Japan already does that, teaching grade school girls to date each other. Progressiveness!

Steyer is a priest who desperately wants to fiddle children, because that’s what all straight men want. Drag queens on the other hand are entirely wholesome and beyond any sort of blame or guilt.
