from the these-are-my-feelings,-he-said,-presenting-them-as-facts dept
Since its inception, the UK’s latest attempt to directly regulate the internet has been a disaster. Once dubbed the “Online Harms Bill,” it has since been rebranded to make it appear less harmful to the internet. The bill hasn’t gotten any better, but it does have a less alarmist name, even if everyone pushing hard for its passage tends to align themselves with alarmists.
Directly, the legislation targets harmful content, focusing on the most harmful (and most capable of swaying public and legislative sentiment): child sexual abuse material (CSAM). “For the children” is the pitch, but the proposal affects far more than those abusing children.
Indirectly, the bill attempts to criminalize end-to-end encryption. Client-side scanning is mandated, which means providers of encrypted services would need to scan content on the user’s device before it’s encrypted and transmitted. Those who fail to perform this scanning would be held criminally and civilly liable, which pretty much criminalizes the same encryption the UK government itself has acknowledged actually makes children safer.
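For readers unfamiliar with the term, here’s a deliberately simplified, hypothetical sketch of what “client-side scanning” means in practice: the app checks each message against a list of flagged content hashes before encryption ever happens. The function names and the hash list below are invented for illustration, not anything Meta or the UK government has actually specified, and real proposals generally rely on perceptual image hashing rather than exact cryptographic hashes.

```python
# Hypothetical illustration only: how a client-side scanning mandate sits
# "in front of" end-to-end encryption. The names and the hash list are
# invented for this example; real systems use perceptual hashing, not SHA-256.
import hashlib

FLAGGED_HASHES: set[str] = set()  # stand-in for a government/NGO-supplied blocklist


def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    """Scan the plaintext on the user's device, then encrypt and send it."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in FLAGGED_HASHES:
        report(digest)  # a match is reported before any encryption happens
    transmit(encrypt(plaintext))  # "end-to-end" encryption only applies after the scan
```

The point of the sketch is the ordering: whatever encryption happens afterwards, the content has already been inspected (and potentially reported) in the clear on the sending device, which is why critics argue client-side scanning and end-to-end encryption can’t meaningfully coexist.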
Since the proposed mandates not only conflict with common sense but would also leave UK residents with fewer internet communication services to choose from, those willing to talk past these implications just keep getting louder.
The latest to say dumb things about a bad law is Security Minister Tom Tugendhat, who offered up a nonsensical defense of the Online Safety Bill during his appearance at the Policing Institute for the Eastern Region (PIER) conference earlier this month.
It’s easy to overestimate a problem when your sole source for stats is the largest communications service in the world. But that’s what Tugendhat does, leading off a sales pitch for increased government surveillance and fewer privacy protections for UK residents.
In 2022 NCMEC received over 32 million reports of suspected child sexual exploitation and abuse.
21 million of these came from Facebook alone, which not only speaks to the severity of the issue they face.
It also leads me to suspect that other companies are significantly under-reporting.
But how does that lead him to suspect this? That goes unexplained. It’s not unusual that the world’s largest social media platform would submit the most reports to NCMEC. But that doesn’t mean others are under-reporting. It may simply be that other platforms aren’t nearly as popular with child abusers.
Even if they are just as popular, it doesn’t follow that they’re under-reporting. And, as for Facebook’s 21 million reports, there’s no evidence offered by either this minister or NCMEC that millions of reports are having any impact on stemming the flow of CSAM, much less deterring those creating this illegal content.
NCMEC receives roughly 30 million reports a year. UK politicians like Tugendhat seem to believe the minimal effort the UK government makes on its end justifies broad legislation that criminalizes the security protections online users expect (i.e., encryption) and makes it that much easier for the government to demand that service providers become active participants in broad, unending surveillance of their users.
It’s clear, then, that this is a threat of immense scale and complexity, and I’m grateful for the valiant efforts of our law enforcement agencies.
Every month UK law enforcement agencies arrest 800 people and safeguard 1200 children.
Arrests are not convictions. And “safeguarding children” is a meaningless phrase without more context. All this really tells us is that some cops are earning their paychecks. Beyond that, the stat means nothing more than that politicians and law enforcement officials have decided CSAM offenders exploit 1.5 children apiece (1,200 divided by 800), a number that sounds suspiciously like the old cliché of the nuclear family and its 1.5 children. It certainly doesn’t sound like solid facts.
But facts aren’t important here. This is legislation driven by emotion and powered by guilt: if you believe the government shouldn’t have these powers, then (the argument goes) you’re obviously in favor of child exploitation. People want their representatives to be rational. But they’re no better than the rest of us, as Tugendhat demonstrates in this presentation, which is based on things he’s seen and heard, and not much more than that.
He lists three high-profile arrests of sex offenders over the past several years as evidence of the need to expand the government’s power. But three arrests against millions of reports a year is hardly evidence the government knows what to do with the powers it already has, much less that it would be that much more effective if allowed to criminalize encryption and mandate client-side scanning of content.
The argument made is this: the government should be able to prevent you from utilizing end-to-end encryption. If you question the government’s reason for doing so, please re-read the assertion above.
The UK government is in favour of protecting online communications.
And it is possible to offer your customers the privacy they expect…while also maintaining the technical capabilities needed to keep young people safe online.
Meta are just choosing not to, many others have already taken the same path.
In other words, when the government says people should not have access to encrypted communications services, it is well and right and good. When companies decide users should have access to encrypted communications, they are wrong and greedy and evil. It’s apparently just that simple.
Making things even stupider is this clumsy analogy, in which the Minister decides he’s only responsible for some of the parenting.
My children love going to a playground near where we live.
While they’re there it’s clear who’s responsible for their safety.
Me of course, as their parent – but also the council, who have a duty to ensure the environment is safe and well-maintained, and our local police force, who have a duty to make sure nothing dangerous or illegal is taking place.
Both have clear lines of accountability to me and to our local community.
[…]
But what happens when they do go online?
Who’s responsible for their safety?
And is anyone accountable to them – or to me?
In my view it’s clear.
Companies like Meta enjoy vast power and influence over our lives.
With that power should come responsibility.
Fantastic. Tugendhat says he will abdicate his responsibility when his children go online. And he apparently believes this refusal to be responsible should be a problem for online services, rather than something he should have to deal with.
And that pretty much sums up the UK government’s approach to “online harms.” It’s everyone’s fault but the legislators who feel you shouldn’t be permitted to use secure messaging services just because there are apparently thousands of sexual predators the government somehow hasn’t managed to round up despite receiving more than 30 million alerts from NCMEC a year.
And where does this all end up? With a government figure turning a single company into a villain, strongly suggesting future legislative proposals will be crafted not with deterring CSAM in mind, but with inflicting as much financial and civil damage on a single US company as possible:
Some will have heard the words I have used today to be particularly critical of one company, they are right, I am speaking about Meta specifically and Mark Zuckerberg’s choices particularly. These are his choices, these are our children. He is not alone in making these choices, other companies have done too.
Let me be clear again: this government will not look away.
We will shortly be launching a campaign. A campaign to tell parents the truth about Meta’s choices, and what they mean for the safety of their children.
This isn’t about fighting CSAM. This is about the UK government’s antipathy towards Meta for daring to roll out end-to-end encryption.
It’s not that CSAM isn’t a problem. It’s that the legislative responses are consistently terrible. And what Tugendhat is pitching is nothing more than an extremely petty form of revenge, but one that’s backed by considerable government power.
Filed Under: client side scanning, csam, encryption, end-to-end encryption, moral panic, online safety bill, tom tugendhat, uk