EU Commissioner Pens Barely Coherent Defense Of Spying On Everyone, For The Children

from the hey-eu,-what-are-you-doing? dept

You may recall back in May we wrote about a batshit crazy proposal out of the EU Commission to “protect the children” by mandating no encryption and full surveillance of all communications. Those behind the proposal would argue it’s not technically surveilling all messages, but all messages have to be surveillable, should the government decide that a company is not doing enough to “mitigate” child sexual abuse material (CSAM).

These are the kinds of “solutions” that very silly politicians come up with when they don’t understand the nature of the problem, nor anything about technology, but feel the need to “do something.” No one denies that CSAM is an issue, but it’s an issue that many experts have been working on for ages, and they recognize that the “easy” solutions that foolish people come up with often have tradeoffs that are actually much, much worse. And that’s absolutely true in this case with this proposal.

It would actually put children at much greater risk by removing their ability to communicate privately — including to alert others that they need help. And the proposal wasn’t just about actual CSAM, but it talked about surveilling messages to detect the more broadly defined “grooming.” Again, there are real problems with grooming that have been widely studied and reported upon, but “grooming” has now become a slur used by Trumpists and others to attack anyone who believes that LGBTQ+ people have the right to exist. In other words, it’s the kind of term that can be abused — and the EU wants to wipe out encryption and enable governments to force internet services to spy on anyone who might be “grooming” despite the fact that these days, that term is almost meaningless.

Thankfully, cooler heads in the EU, namely the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), released a report a few weeks ago absolutely trashing the proposed rule. Well, at least as far as one can “trash” a proposal using typical EU bureaucratic speech:

The EDPB and EDPS stress that the Proposal raises serious concerns regarding the proportionality of the envisaged interference and limitations to the protection of the fundamental rights to privacy and the protection of personal data. In that regard, the EDPB and EDPS point out that procedural safeguards can never fully replace substantive safeguards. A complex system of escalation from risk assessment and mitigation measures to a detection order cannot replace the required clarity of the substantive obligations.

The EDPB and EDPS consider that the Proposal lacks clarity on key elements, such as the notions of “significant risk”. Furthermore, the entities in charge of applying those safeguards, starting with private operators and ending with administrative and/or judicial authorities, enjoy a very broad margin of appreciation, which leads to legal uncertainty on how to balance the rights at stake in each individual case. The EDPB and EDPS stress that the legislator must, when allowing for particularly serious interferences with fundamental rights, provide legal clarity on when and where interferences are allowed. While acknowledging that the legislation cannot be too prescriptive and must leave some flexibility in its practical application, the EDPB and EDPS consider that the Proposal leaves too much room for potential abuse due to the absence of clear substantive norms.

As regards the necessity and proportionality of the envisaged detection measures, the EDPB and EDPS are particularly concerned when it comes to measures envisaged for the detection of unknown child sexual abuse material (‘CSAM’) and solicitation of children (‘grooming’) in interpersonal communication services. Due to their intrusiveness, their probabilistic nature and the error rates associated with such technologies, the EDPB and EDPS consider that the interference created by these measures goes beyond what is necessary and proportionate.

Trust me, in EU bureaucratese, that’s pretty harsh. It’s basically saying this proposal is a dumpster fire that attacks the privacy rights of tons of people, without an understanding of how poorly the scanning technology proposed actually works, and without a real understanding of the nature of the problem — combined with broad and vague terminology that is wide open to abuse.
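To see why the “error rates” point the EDPB raises matters in practice, here’s a back-of-the-envelope sketch. The numbers below are purely illustrative assumptions chosen to make the arithmetic easy — they are not figures from the proposal or the report — but they show the base-rate problem: even a scanner that is wrong only 0.1% of the time, applied to every message in the EU, buries the real hits under millions of false flags.

```python
# Back-of-the-envelope illustration of the base-rate problem with mass scanning.
# All numbers below are assumptions chosen for the arithmetic, not real figures.

messages_per_day = 10_000_000_000   # assumed: messages scanned across the EU per day
false_positive_rate = 0.001         # assumed: scanner wrongly flags 0.1% of innocent messages
prevalence = 0.000001               # assumed: 1 in a million messages is actually illegal
detection_rate = 0.9                # assumed: scanner catches 90% of actual illegal material

illegal = messages_per_day * prevalence
innocent = messages_per_day - illegal

false_positives = innocent * false_positive_rate
true_positives = illegal * detection_rate

print(f"Flagged innocent messages per day: {false_positives:,.0f}")
print(f"Flagged illegal messages per day:  {true_positives:,.0f}")
# Under these assumptions: roughly 10,000,000 innocent messages flagged every day
# for about 9,000 real ones -- and every one of those flags is a human being whose
# private communications get pulled out for inspection.
```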

Now, one of the main backers of the proposal, the European Commissioner for Home Affairs, Ylva Johansson, has responded to the report in an almost incomprehensible blog post. It does not actually address the many detailed technical and legal issues raised by the EDPB report. Instead, it’s just a performative “but think of the children” screed.

Sexual abuse of a child is a horrific act. It can destroy people’s lives, their sense of self. When images circulate online for years after, the psychological effects on the person can be catastrophic. This right to not have images circulating, this right to privacy, is entirely absent in the opinion.

Again, no one denies that CSAM is a horrible problem. But what the actual experts are trying to explain is that you don’t solve it by spying on everyone, taking away their privacy rights, and breaking the technology that protects all of us. Johansson insists that there’s no technological issue here because tech platforms will have their choice of which way they wish to destroy encryption and spy on all communications:

The legislation leaves to the provider concerned, the choice of the technologies to be operated to comply effectively with detection orders, provided that the technologies meet the requirements of the Regulation.

Yes, but all of those options are bad. That’s what the EDPB is trying to explain. All of those options involve fundamentally breaking the technology that keeps us secure, and keeps our data private. And you’re just saying “eh, it’s no problem to destroy that.” And, you’re also insisting that destroying the technology that keeps all of us safe will magically keep kids safer, when all of our historical evidence from the people who actually study this stuff says the exact opposite — that it will put them at greater risk and greater danger because those who are looking to control them and abuse them will have even greater control over their lives.

Johansson waves away the privacy concerns again, by noting tech platforms should choose the “least privacy-intrusive” method of destroying privacy. That’s like saying, “our plan to blow up the sun won’t consume the Earth because we’re asking the sun to be blown up with the least planet destroying method available.” If all of those methods destroy the Earth, the problem is with the plan, and providing “options” doesn’t fix anything.

Over and over again, Johansson brushes off the actual detailed concerns about how this proposal will be abused to destroy people’s privacy, by doing the equivalent of saying “but don’t do that part.” I mean, just for example, here’s her response to the dangers of client-side scanning that would be mandated under this regulation — which creates all sorts of privacy concerns, and concerns about how that data can and will be misused. Johansson basically says “well, we won’t allow the data to be misused”:

Detection systems are only to be used for the sole purpose of detecting and reporting child sexual abuse, and strict safeguards prevent use for other purposes.

The proposal provides for judicial redress, with both providers and users having a right to challenge any measure affecting them in Court. Users have a right of compensation for any damage that might result from processing under the proposal.

Sounds great in the theoretical world of Perfectistan, which does not exist in reality. Once you enable things like client-side scanning, it becomes way too tempting for everyone, from law enforcement on down, to gradually (and not so gradually) seek to expand access to that data. The idea that you can just say “well, don’t misuse it and put in place strict safeguards” ignores basically all of human and technological history.
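For readers who want the mechanics spelled out, here’s a minimal, deliberately simplified sketch of what “client-side scanning” means. The function names, the blocklist, and the use of an exact hash are all illustrative assumptions — real proposals lean on fuzzier perceptual hashing or machine-learning classifiers, which is exactly where the EDPB’s “probabilistic nature and error rates” concern comes from. The point the sketch makes is structural: the scan runs on plaintext, on your device, before any end-to-end encryption happens.

```python
# Deliberately simplified, illustrative sketch of client-side scanning.
# Nothing here is any provider's actual implementation; names are hypothetical.

import hashlib

# Hypothetical blocklist the provider would be ordered to detect against.
# Real systems use perceptual hashes or classifiers, not exact SHA-256,
# which is where false positives and "probabilistic" matching come in.
BLOCKED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256(b"test")
}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the plaintext matches the blocklist (would trigger a report)."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKED_HASHES

def encrypt_end_to_end(plaintext: bytes) -> bytes:
    """Stand-in for real E2E encryption (e.g. something like the Signal protocol)."""
    return plaintext[::-1]  # NOT real encryption -- just a placeholder transform

def send_message(plaintext: bytes) -> None:
    # The crucial structural point: the scan sees the *unencrypted* content.
    # Whoever controls the blocklist controls what gets flagged, and the hook
    # that reads plaintext exists on every device, regardless of encryption.
    if client_side_scan(plaintext):
        print("match: message would be reported")
    ciphertext = encrypt_end_to_end(plaintext)
    print(f"sending {len(ciphertext)} 'encrypted' bytes")

send_message(b"test")         # matches the hypothetical blocklist
send_message(b"hello world")  # does not match
```

Once that hook exists on every device, “strict safeguards” reduce to a promise about what goes on the blocklist and who receives the reports — which is precisely the expansion risk described above.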

Actual “strict safeguards” are that you keep encryption and you don’t allow client-side scanning. The fact that there is judicial redress isn’t that useful for most people: if your data has been leaked or otherwise abused thanks to this process, having to then go through years of expensive litigation to get “redress” is no answer at all.

Johansson, like way too many EU bureaucrats, seems to think that law enforcement and governments would never abuse their authority to snoop on private messages. This is naïve in the extreme, especially at a time of creeping authoritarianism, including across the EU. Opening up everyone’s private messages is madness. But Johansson isn’t concerned, because the government will have to approve the snooping.

Only where authorities are of the opinion that there is evidence of a significant risk of misuse of a service, and that the reasons for issuing the detection order outweigh negative consequences for the rights and legitimate interests of all parties affected, would they announce their intent to consider a detection order on child sexual abuse material or grooming targeted to the specific risks identified by the service provider.

Look, we’ve been here before. EVERY SINGLE TIME that some sort of mandated access to communications is granted to law enforcement, we’re told it’s only going to be used in special circumstances. And, every single time, it is widely abused.

And, not surprisingly, Johansson cites the ridiculous paper that recently came out from two Government Communications Headquarters (GCHQ) employees pushing for client-side scanning. It’s the same paper that we ripped apart for all its many flaws, but to Johansson, it’s proof that you can do client-side scanning on an encrypted system without violating privacy. That’s wrong. It’s simply untrue.

Incredibly, Johansson concludes her blog post with pure, unadulterated projection:

What frustrates me most is that there is a risk of advocates creating opposition based only on abstract concepts. Abstract notions based on misunderstandings.

That’s the summary of this entire nonsense proposal: it’s all abstract concepts based on misunderstandings about technology, security, privacy, and even how CSAM works and how best to stop it.

It’s great that the EDPB carefully detailed the problems with this proposal. It’s laughable that Johansson’s hand-waving is considered an acceptable response to this ridiculously dangerous proposal.



Comments on “EU Commissioner Pens Barely Coherent Defense Of Spying On Everyone, For The Children”

Samuel Abram (profile) says:

Now, one of the main backers of the proposal, the European Commissioner for Home Affairs, Ylva Johansson, has responded to the report in an almost incomprehensible blog post. It does not actually address the many detailed technical and legal issues raised by the EDPB report. Instead, it’s just a performative “but think of the children” screed.

Mike, I believe you once said that it was the American politicians whose posture towards tech policy was performative grandstanding, whilst the European politicians were serious about tech policy but had bad ideas. Is it the case now that European politicians have a grandstanding posture as well?

Nemo_bis (profile) says:

Substance

I can’t find a single mention of the word “substantive” in the whole “response” to the argument that

procedural safeguards can never fully replace substantive safeguards

so I must conclude that the argument is accepted. Translated, it means it doesn’t matter what the proposed law says, it matters more what it actually does.

If you pass a law mandating that every child have a hand grenade under their bed, “but they must be used very carefully”, it doesn’t matter how many words on a piece of paper claim everything will be fine.

Cowardly Lion says:

Johansson’s response is titled “Children deserve protection and privacy”.

It’s just staggering that she thinks this proposal will protect children’s privacy by absolutely taking it away. I question whether this is doublespeak, or if she’s pushing some hidden agenda of authoritarian types. Then again maybe she’s just a clown…
