The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense
from the a-good-bill?-for-the-children? dept
For years and years, Congress has been pushing a parade of horrible “protect the children online” bills that somehow get progressively worse each time. I’m not going through the entire list, because it’s virtually endless.
One of the most frustrating things about those bills, and the pomp and circumstance around them, is that they ignore the simpler, more direct things Congress could do that would actually help.
Just last week, we wrote about the Stanford Internet Observatory’s big report on the challenges facing the CyberTipline, run by the National Center for Missing & Exploited Children (NCMEC). We wrote two separate posts about the report (and also discussed it on the latest episode of our new podcast, Ctrl-Alt-Speech) because there was so much useful information in there. As we noted, there are real challenges in making the reporting of child sexual abuse material (CSAM) work better, and it’s not because people don’t want to help. It’s actually because of a set of complex issues that are not easily solvable (read the report or my articles for more details).
But there were still a few clear steps that could be taken by Congress to help.
This week, the REPORT Act passed Congress, and it includes… a bunch of those straightforward, common sense things that should help improve the CyberTipline process. The key bit is allowing the CyberTipline to modernize a bit, including allowing it to use cloud storage. To date, no cloud storage vendors could work with NCMEC, out of a fear that they’d face criminal liability for “hosting CSAM.”
This bill fixes that, and should enable NCMEC to make use of some better tools and systems, including better classifiers, which are becoming increasingly important.
There are also provisions letting victims and parents of victims report CSAM involving the child directly to NCMEC, which can be immensely helpful in trying to stop the spread of some content (and in focusing some law enforcement responses).
There are also some technical fixes that require platforms to retain certain records for a longer period of time. This was another important point that was highlighted in the Stanford report. Given the flow of information and prioritization, sometimes by the time law enforcement realized it should get a warrant to get more info from a platform, the platform would have already deleted it as required under existing law. Now that time period is extended to give law enforcement a bit more time.
The one bit that we’ll have to see how it works is that it extends the reporting requirements for social media to include violations of 18 USC 1591, which is the law against sex trafficking. Senator Marsha Blackburn, who is the co-author of the bill, is claiming that this means that “big tech companies will now be required to report when children are being trafficked, groomed or enticed by predators.”
So, it’s possible I’m misreading the law (and how it works with existing laws…) but I see nothing limiting this to “big tech.” It appears to apply to any “electronic communication service provider or remote computing service.”
Also, given that Marsha Blackburn appears to consider “grooming” to include things like LGBTQ content in schools, I worried that this was going to be a backdoor bill to making all internet websites have to “report” such content to NCMEC, which would flood their systems with utter nonsense. Thankfully, 1591 includes some pretty specific definitions of sex trafficking that do not match up with Blackburn’s definition. So she’ll get a PR victory among nonsense peddlers by pretending it will lead to the reporting of the non-grooming she insists is grooming.
And, of course, while this bill was actually good (and it’s surprising to see Blackburn on a good internet bill!) it’s not going to stop her from continuing to push KOSA and other nonsense moral panic “protect the children” bills that will actually do real harm.
Filed Under: csam, cybertipline, jon ossoff, marsha blackburn, modernization, report act, sex trafficking
Companies: ncmec
Comments on “The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense”
There are some excellent tools for matching duplicate reports that use sophisticated probabilistic techniques to find matches that aren’t obvious. Reducing the duplication would make the task less daunting.
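For a rough sense of how that kind of deduplication can work, here’s a minimal, purely illustrative Python sketch using the standard library’s difflib (the real tools are far more sophisticated; the report strings and the 0.85 threshold here are made up for the example). It flags report descriptions that are near-duplicates rather than exact matches:

```python
from difflib import SequenceMatcher

def is_probable_duplicate(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag two report descriptions as likely duplicates when their
    similarity ratio clears the threshold, catching non-obvious matches
    (different punctuation, casing, small wording changes)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

reports = [
    "CSAM found on example.com/page1, uploaded by user123",
    "csam found at example.com/page1 uploaded by user123",   # near-duplicate
    "Unrelated report about a different site entirely",
]

# Compare each new report against the earlier ones before queuing it
for i, report in enumerate(reports[1:], start=1):
    for earlier in reports[:i]:
        if is_probable_duplicate(report, earlier):
            print(f"Report {i} looks like a duplicate of an earlier report")
```

The design point is simply that exact string matching misses most real duplicates, so anything useful has to score similarity and pick a cutoff.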
Speaking of KOSA, they are trying to add it to the FAA Reauthorization Bill (Federal Aviation Administration). There’s debate over whether they’ll be able to, seeing as the House Speaker opposes adding unrelated stuff to it.
Re:
It’s true
https://thehill.com/policy/transportation/4636676-faa-senate-vote/
“One way to advance those priorities could be a manager’s package that would attach items that have wide support across the chamber.”
“Among those is the Kids Online Safety Act, a bill backed by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), which Schumer also supports.”
I’m so fucking tired of this. First they passed the TikTok ban under the cover of helping Ukraine, now they’re doing this crap again under another “must pass” bill. There goes my YouTube account (I’m not giving any of these sites my ID), unless using a VPN to appear outside America is a solution.
Re: Re:
Actually, they passed the legislation to help Ukraine by agreeing to vote in the TikTok ban.
Re: Re:
I do want to point out that it won’t fully come into force until 18 months after the bill has passed, and it will likely face a constitutional challenge right away.
Re: Re:
I find it kind of funny and a bit wtf when the “think of the children” senators act like their bill would protect kids when it doesn’t. Plus, when you have to merge it into a must-pass bill, it’s so ridiculous, like jfc.
The fact that KOSA needs to be added into a must pass bill when it has 60+ cosponsors is just so goddamn stupid.
Starting to wonder (hopefully) if this is Blumenthal’s last chance to ram KOSA through, considering his age, but that remains to be seen.
Re: Re:
If KOSA weren’t made by child abusers for child abusers, they wouldn’t have pulled this cowardly move, and KOSA would have been able to stand on its own merits.
As reasonable as this bill seems, there should be, like, a double interrobang at the end of the department heading.
Finally some good news
While it’s necessary to report on the bad things happening in the world, credit must be given when it is due.
I like this approach to reporting, highlighting both the good and the bad on a few key issues.
I’m not sure what you mean by “no cloud storage vendors could work with NCMEC”… when for example Google Photos and Amazon Photos are cloud storage and they report to NCMEC…
Re:
You are conflating two different things. Detecting and reporting CP to NCMEC also means the provider will remove the content at the first opportunity, because knowingly hosting CP is illegal in almost all circumstances. I.e., they don’t really work together, since the provider only notifies NCMEC.
Allowing providers to secure and store the content as evidence means they can now work together with NCMEC to trace and combat CP much more easily. It also allows smaller players to secure evidence of CP without exposing themselves to the insane legal risks associated with handling CP, regardless of intent.
Currently, it all comes down to the fact that detecting, reporting and removing CP (i.e., destroying evidence) carries less legal risk than detecting, reporting and storing CP as evidence for NCMEC/legal authorities.
Re: Re:
In this context, what are specific examples of “cloud storage vendors”? Are they organizations like Google Photos or Amazon Photos?
Re: Re: Re:
Cloud storage vendors generally refer to companies/services like Google Drive, Dropbox, OneDrive etc but also includes companies/services tailored towards specific content, like those you mentioned.
Re: Re: Re:
AWS, Google Cloud, Azure. Not “product built for consumer-facing cloud-based services”.
Re:
If your context makes no sense, try a different context.
You took
“no cloud storage vendors could work with NCMEC”
and read
“no cloud storage vendors could work with NCMEC to report CSAM”.
NCMEC does more than simply act as a database of CSAM hashes. The actual context is that by the time a cloud provider contacts NCMEC, the CSAM, and any valuable data it could have provided to locate the abuser creating it and the child being exploited, has been deleted under existing CSAM laws. All the cloud provider saves is a hash, which is used to identify the photo in the future, assuming no one changes a single pixel.
The actual context is that “no cloud storage vendors could work with NCMEC efforts to identify perpetrators and bring a stop to the actual criminal abusers who make CSAM.”
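The “single pixel” caveat is the core limitation of exact cryptographic hashes. As a purely illustrative Python sketch (real CSAM-matching systems rely on perceptual hashes such as PhotoDNA, which tolerate small edits; this example does not replicate those), flipping a single byte of a file produces a completely different SHA-256 digest:

```python
import hashlib

# Simulated file contents (a stand-in for real image bytes)
original = bytes(range(256)) * 16

# A copy with one byte flipped -- the "one pixel" edit
tampered = bytearray(original)
tampered[0] ^= 0x01
tampered = bytes(tampered)

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tampered).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # False: an exact hash no longer matches after any change
```

That avalanche behavior is exactly why an exact-hash database only catches bit-identical copies, and why perceptual hashing matters for this workload.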
Re: Re:
Sorry for the misunderstanding and thank you for the clarification.
Unless the cloud provider actually did save the hash and other user data needed to trace the offender, which would be the ideal situation, since now they will be able to work with NCMEC.
Re: Re:
More specifically, NCMEC could not farm out their storage and processing to a third party, including cloud computing vendors.