The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense

from the a-good-bill?-for-the-children? dept

For years and years, Congress has been pushing a parade of horrible “protect the children online” bills that seem to somehow get progressively worse each time. I’m not going through the entire list of them, because it’s virtually endless.

One of the most frustrating things about those bills, and the pomp and circumstance around them, is that they ignore the simpler, more direct things Congress could do that would actually help.

Just last week, we wrote about the Stanford Internet Observatory’s big report on the challenges facing the CyberTipline, run by the National Center for Missing & Exploited Children (NCMEC). We wrote two separate posts about the report (and also discussed it on the latest episode of our new podcast, Ctrl-Alt-Speech) because there was so much useful information in there. As we noted, there are real challenges in making the reporting of child sexual abuse material (CSAM) work better, and it’s not because people don’t want to help. It’s actually because of a set of complex issues that are not easily solvable (read the report or my articles for more details).

But there were still a few clear steps that could be taken by Congress to help.

This week, the REPORT Act passed Congress, and it includes… a bunch of those straightforward, common sense things that should help improve the CyberTipline process. The key bit is allowing the CyberTipline to modernize a bit, including allowing it to use cloud storage. To date, no cloud storage vendors could work with NCMEC, out of a fear that they’d face criminal liability for “hosting CSAM.”

This bill fixes that, and should enable NCMEC to make use of some better tools and systems, including better classifiers, which are becoming increasingly important.

There are also provisions letting victims, and parents of victims, report CSAM involving the child directly to NCMEC, which can be immensely helpful in trying to stop the spread of some content (and in focusing some law enforcement responses).

There are also some technical fixes that require platforms to retain certain records for a longer period of time. This was another important point that was highlighted in the Stanford report. Given the flow of information and prioritization, sometimes by the time law enforcement realized it should get a warrant to get more info from a platform, the platform would have already deleted it as required under existing law. Now that time period is extended to give law enforcement a bit more time.

The one bit we’ll have to see play out is that the bill extends the reporting requirements for social media to include violations of 18 USC 1591, the law against sex trafficking. Senator Marsha Blackburn, a co-author of the bill, claims this means that “big tech companies will now be required to report when children are being trafficked, groomed or enticed by predators.”


So, it’s possible I’m misreading the law (and how it works with existing laws…) but I see nothing limiting this to “big tech.” It appears to apply to any “electronic communication service provider or remote computing service.”

Also, given that Marsha Blackburn appears to consider “grooming” to include things like LGBTQ content in schools, I worried that this would be a backdoor to making every internet website “report” such content to NCMEC, flooding its systems with utter nonsense. Thankfully, 1591 includes some pretty specific definitions of sex trafficking that do not match up with Blackburn’s definition. So she’ll get a PR victory among the nonsense peddlers by pretending the law will lead to the reporting of the non-grooming that she insists is grooming.

And, of course, while this bill was actually good (and it’s surprising to see Blackburn on a good internet bill!) it’s not going to stop her from continuing to push KOSA and other nonsense moral panic “protect the children” bills that will actually do real harm.



Comments on “The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense”

18 Comments
Anonymous Coward says:

Re:

It’s true

https://thehill.com/policy/transportation/4636676-faa-senate-vote/

“One way to advance those priorities could be a manager’s package that would attach items that have wide support across the chamber.”

“Among those is the Kids Online Safety Act, a bill backed by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), which Schumer also supports.”

I’m so fucking tired of this. First they passed the TikTok ban under the cover of helping Ukraine, and now they’re doing this crap again under another “must pass” bill. There goes my YouTube account (I’m not giving any of these sites my ID), unless using a VPN to appear outside America is a solution.

TDestroyer209 says:

Re: Re:

I find it kind of funny, and a bit wtf, when the “think of the children” senators act like their bill would protect kids when it doesn’t; plus, having to merge it into a must-pass bill is so ridiculous, like jfc.

The fact that KOSA needs to be added into a must pass bill when it has 60+ cosponsors is just so goddamn stupid.

Starting to wonder, hopefully, if this is Blumenthal’s last chance to ram KOSA through, considering his old age, but that remains to be seen.

This comment has been flagged by the community.

Rocky says:

Re:

You are conflating two different things. Detecting and reporting CP to NCMEC also means the provider will remove the content at the first opportunity, because knowingly hosting CP is illegal in almost all circumstances. I.e., they don’t really work together, since the provider only notifies NCMEC.

Allowing providers to secure and store the content as evidence means they can now work together with NCMEC to trace and combat CP much more easily. It also allows smaller players to secure evidence of CP without exposing themselves to the insane legal risks associated with handling CP, regardless of intent.

Currently, it all comes down to the fact that detecting, reporting and removing CP (i.e., destroying evidence) carries less legal risk than detecting, reporting and storing CP as evidence for NCMEC/legal authorities.

Anonymous Coward says:

Re:

If your context makes no sense, try a different context.

You took
“no cloud storage vendors could work with NCMEC”
and read it as
“no cloud storage vendors could work with NCMEC to report CSAM.”

NCMEC does more than simply act as a database of CSAM hashes. The actual context is that by the time a cloud provider contacts NCMEC, the CSAM, and any valuable data it could have provided to locate the abuser creating it and the child being exploited, has been deleted under existing CSAM laws. All the cloud provider saves is a hash, which is used to identify the photo in the future, assuming no one changes a single pixel.

The actual context is that “no cloud storage vendors could work with NCMEC efforts to identify perpetrators and bring a stop to the actual criminal abusers who make CSAM.”
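The “single pixel” fragility mentioned above is easy to demonstrate. A minimal sketch, using synthetic bytes rather than a real image: a cryptographic hash like SHA-256 produces a completely different digest if even one bit of the file changes, which is why exact-match hashes alone can’t track edited copies (industry matching systems instead use perceptual hashes, such as PhotoDNA, designed to tolerate small edits).

```python
import hashlib

# Stand-in for an image file: 4096 bytes of synthetic "pixel" data.
original = bytes(range(256)) * 16

# Make a copy with a single bit flipped in one "pixel".
tweaked = bytearray(original)
tweaked[0] ^= 1

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# The two digests share no meaningful similarity despite the
# near-identical inputs (the avalanche effect).
print(h1 == h2)  # False
```

This is why retaining the underlying records, not just a hash, matters for follow-up investigation: once the file is gone, an exact hash only matches byte-identical copies.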
