from the in-return,-politicians-promise-to-provide-more-bad-legislation dept
In the face of “extremist” content and other internet nasties, British PM Theresa May keeps doing something. That something is telling social media companies to do something. Move fast and break speech. Nerd harder. Do whatever isn’t working well already, but with more people and processing power.
May has been shifting her anti-speech, anti-social media tirades towards the Orwellian in recent months. Her speeches and platform stances have tried to make direct government control of internet communications sound like a gift to the unwashed masses. May’s desire to bend US social media companies to the UK’s laws has been presented as nothing more than a “balancing” of freedom of speech against some imagined right to go through life without being overly troubled by social media posts.
Then there’s the terrorism. Terrorists use social media platforms to connect with like-minded people. May would like this to stop. She’s not sure how this should be accomplished but she’s completely certain smart people at tech companies could bring an end to world terrorism with a couple of well-placed filters. So sure of this is May that she wants “extremist” content classified, located, and removed within two hours of its posting.
May’s crusade against logic and reality continues with her comments at the Davos Conference. Her planned speech/presentation contains more of her predictable demand that everyone who isn’t a UK government agency needs to start doing things better and faster.
Although she is expected to praise the potential of technology to “transform lives”, she will also call on social media companies to do much more to stop allowing content that promotes terror, extremism and child abuse.
She will say: “Technology companies still need to go further in stepping up to their responsibilities for dealing with harmful and illegal online activity.
“These companies simply cannot stand by while their platforms are used to facilitate child abuse, modern slavery or the spreading of terrorist and extremist content.
“We need to go further, so that ultimately this content is removed automatically. These companies have some of the best brains in the world. They must focus their brightest and best on meeting these fundamental social responsibilities.”
“Go further…” but to what point? This is all May has said for years. Social media companies continue to struggle with moderating content, but it’s not for a lack of trying. They’re dealing with contradictory demands from multiple world governments, each of them declaring different types of speech to be unacceptable. The pressure isn’t imaginary. Twitter has taken proactive measures in response to Germany’s new hate speech law, resulting in some spectacular collateral damage. Other platforms are doing the same thing, even if the damage hasn’t been as ironically glorious.
May wants harder nerding, up to and including all-knowing bots that kill objectionable content before it reaches human eyeballs. She wants the impossible. Even if it were theoretically possible to police speech better with AI, that’s still years away from being deployed at scale. Efforts that have been deployed have been routinely disastrous. Ask anyone how YouTube’s Content ID is doing at handling copyright infringement and you’ll get a general idea of just how well algorithms police content.
For now, the problem is handled by a mixture of algorithms, human moderators, and crowdsourcing. The algorithms can’t reliably target unwanted content. The humans are, well, human — prone to error and bias. The last part — reporting functions for users — basically gives every heckler a veto button, resulting in abuse of the system to bury content certain users don’t want to see. All these efforts work well for the governments demanding them — and these governments are the entities most likely to abuse them to silence dissent.
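To see why the automated layer fails so predictably, consider a toy sketch of the kind of keyword filtering that automated moderation often reduces to. This is an invented illustration, not any platform's actual system; the block list and sample posts are made up:

```python
# Toy example (not any real platform's system): a naive keyword filter.
# Hypothetical block list for illustration only.
BLOCK_LIST = {"attack", "bomb"}

def flag(post: str) -> bool:
    """Flag a post if any blocked keyword appears as a substring."""
    text = post.lower()
    return any(word in text for word in BLOCK_LIST)

posts = [
    "Planning a terror attack tonight",         # the intended target
    "The film was a bombshell at Cannes",       # false positive
    "Heart attack symptoms you should know",    # false positive
]

for p in posts:
    print(flag(p), "|", p)
# Flags all three posts, two of them wrongly.
```

All three posts get flagged, and two of them are entirely innocuous. Real systems are more sophisticated than this, but the same basic trade-off holds: tune a filter aggressively enough to catch "extremist" content within two hours and it buries legitimate speech along with it.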
This is what the argument has been reduced to: calls for “more” without any interest in determining whether “more” will be helpful or even possible. The result will be the suppression of speech, rather than a victory over terrorism.