UK Parliament Takes First Step Towards Making Google & Facebook Censor Everything

from the this-is-a-bad-idea dept

Look, let’s just start with the basics: there are some bad people out there. Even if the majority of people are nice and well-meaning, there are always going to be some people who are not. And sometimes, those people are going to use the internet. Given that as a starting point, at the very least, you’d think we could deal with that calmly and rationally, and recognize that maybe we shouldn’t blame the tools for the fact that some not very nice people happen to use them. Unfortunately, it appears to be asking a lot these days to expect our politicians to do this. Instead, they (and many others) rush out immediately to point the fingers of blame for the fact that these “not nice” people exist, and rather than point the finger of blame at the not nice people, they point at… the internet services they use.

The latest example of this comes from the UK Parliament, which has released a report on “hate crime” that effectively blames internet companies and suggests they should be fined because not nice people use them. Seriously. From the report:

Here in the UK we have easily found repeated examples of social media companies failing to remove illegal content when asked to do so – including dangerous terrorist recruitment material, promotion of sexual abuse of children and incitement to racial hatred. The biggest companies have been repeatedly urged by Governments, police forces, community leaders and the public, to clean up their act, and to respond quickly and proactively to identify and remove illegal content. They have repeatedly failed to do so. That should not be accepted any longer. Social media is too important to everyone – to communities, individuals, the economy and public life – to continue with such a lax approach to dangerous content that can wreck lives. And the major social media companies are big enough, rich enough and clever enough to sort this problem out – as they have proved they can do in relation to advertising or copyright. It is shameful that they have failed to use the same ingenuity to protect public safety and abide by the law as they have to protect their own income.

Social media companies currently face almost no penalties for failing to remove illegal content. There are too many examples of social media companies being made aware of illegal material yet failing to remove it, or to do so in a timely way. We recommend that the Government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.

This is the kind of thing that sounds good to people who (a) don’t understand how these things actually work and (b) don’t spend any time thinking through the consequences of such actions.

First off, it’s easy for politicians and others to sit there and assume that “bad” content is obviously bad. The problem here is twofold: first, there is so much content showing up that spotting the “bad” stuff is not nearly as easy as people assume, and second, because there’s so much content, it’s often difficult to understand the context enough to recognize if something is truly “bad.” People who think this stuff is obvious or easy are ignorant. They may be well-meaning, but they’re ignorant.

So, for example, they say that these are cases where such content has been “reported,” on the assumption that this means the companies must now “know” that the content is bad and should remove it. The reality is much more difficult. Do they recognize how many such reports these companies receive? Do they realize that before companies start taking down content willy-nilly, they have to actually understand what’s going on? Do they realize that it’s not always easy to figure out what’s really happening?

Let’s go through the examples given: “dangerous terrorist recruitment material.” Okay, seems obvious. But how do you distinguish terrorist recruitment videos from documenting terrorist atrocities? It’s not as easy as you might think. Remember how a video of a European Parliament debate on anti-torture was taken down because the system or a reviewer thought it was promoting terrorism? People think this stuff is black and white, but it’s not. It’s all gray. And the shades of gray are very difficult to distinguish. And the shades of gray may differ greatly from one person to another.
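The gray areas are easy to see even in a toy version of the problem. Here is a minimal sketch (the keyword list and titles are hypothetical) of keyword-based flagging, which trips over exactly the kind of anti-terrorism debate mentioned above:

```python
# Hypothetical watchlist; real systems are far more elaborate, but the
# failure mode is the same: words alone carry no context.
TERROR_KEYWORDS = {"terrorism", "attack", "recruitment", "torture"}

def naive_flag(title: str) -> bool:
    """Flag a video if its title contains any watchlisted keyword."""
    words = {w.strip(".,:;!?").lower() for w in title.split()}
    return bool(words & TERROR_KEYWORDS)

# Both titles get flagged, even though only one promotes anything:
print(naive_flag("Glorious attack: a recruitment video"))           # True
print(naive_flag("European Parliament debate on banning torture"))  # True
```

Context, not vocabulary, is what separates the two, and context is precisely what an automated first pass lacks.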

Sexual abuse of children. Yes, clearly horrible. Clearly things need to be done. There are already systems for government-associated organizations and social media platforms to share hashes of photos deemed to be problematic, and these are blocked. But, again, edge cases are tricky. Remember, it wasn’t that long ago that Facebook got mocked for taking down the famed Napalm Girl photo. Here’s a situation that seems black and white: no naked children. Seems reasonable. Except… this naked child is an iconic photo that demonstrates the horrors of war. That doesn’t mean we should let all pictures of naked children online — far from it, obviously. But the point is that it’s not always so black and white, and any policy proposal that assumes it is (as the UK Parliament seems to be suggesting) has no idea what a mess it’s causing.
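The hash-sharing approach mentioned above can be sketched in a few lines. This is a simplification: production systems reportedly use perceptual hashes (e.g. PhotoDNA) that tolerate re-encoding, where this sketch uses a plain cryptographic hash, but the lookup idea is the same, and so is the key limitation that the list only covers content a human has already reviewed and classified.

```python
import hashlib

BLOCKED_HASHES: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Reduce content to a fixed-size hash for fast lookup."""
    return hashlib.sha256(data).hexdigest()

def add_to_blocklist(data: bytes) -> None:
    """Called after a human reviewer confirms the content is illegal."""
    BLOCKED_HASHES.add(fingerprint(data))

def is_blocked(data: bytes) -> bool:
    return fingerprint(data) in BLOCKED_HASHES

original = b"<image bytes>"
add_to_blocklist(original)
print(is_blocked(original))            # True: exact copies are caught
print(is_blocked(original + b"\x00"))  # False: one altered byte evades it
```

And no hash, exact or perceptual, can by itself tell the Napalm Girl photo apart from abuse imagery; that classification call stays human.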

Next on the list: “incitement to racial hatred.” This would be so-called “hate speech.” But, as we’ve noted time and time again, this kind of thinking always ends up turning into authoritarian abuse. Over and over again we see governments punish people they don’t like, by claiming what they’re saying is “hate speech.” But, you say, “incitement to racial hatred” is clearly over the line. And, sure, I agree. But be careful who gets to define both “incitement” and “racial hatred.” You might not be so happy. Here in the US, there are people who (ridiculously, in my opinion) argue that groups like Black Lives Matter are a form of “incitement to racial hatred.” Now, you might think that’s crazy, but there are lots of people who disagree with you. And some of them are in power. Now are you happy about handing them the tools to demand that all social media sites take down their content or face fines? Or, how do you expect Google and Facebook to instantly determine if a video is a clip from a Hollywood movie, rather than “incitement to racial hatred?” There are plenty of powerful scenes in movies that none of us would consider “polite speech,” but we don’t think they should be taken down as “incitement to racial hatred.”

Then the report notes that “the major social media companies are big enough, rich enough and clever enough to sort this problem out.” First off, that’s not true. As noted above, companies make mistakes about this stuff all the time. They take down stuff that should be left up. They leave up stuff that people think they should take down. You have no idea how many times each and every day these companies have to make these decisions. Sometimes they get it right. Sometimes they don’t. Punishing them for a mistake in being too slow is a near guarantee that they’ll be taking down a ton of legit stuff, just to avoid punishment.

Separately, who decides who’s a “major social media company” that has to do this? If rules are passed saying social media companies have to block this stuff, congrats, you’ve just guaranteed that Facebook and Google/YouTube are the last such companies. No new entrant will be able to take on the burden/liability of censoring all content. If you try and somehow, magically, carve out “major” social media companies, how do you set those boundaries, without creating massive unintended consequences?

The report falsely claims that these companies have successfully created filters that can deal with advertising and copyright, which is laughable and, once again, ignorant. The ad filter systems on these platforms are terrible. We use Google ads for some of our ad serving, and on a near-constant basis we’re weeding out terrible ads, because no filter catches everything and awful people are getting their ads into the system all the time. And copyright? Really? If that’s the case, why are the RIAA/MPAA still whining about Google daily? These things are much harder than people think, and it’s quite clear that whoever prepared this report has no clue and hasn’t spoken to anyone who understands this stuff.

Social media companies currently face almost no penalties for failing to remove illegal content.

What a load of hogwash. They face tremendous “penalties” in the form of public anger. Whenever these stories come out, the companies in question talk about how much more they need to do, and how many people they’re hiring to help and all that. They wouldn’t be doing that if there were “no penalties.” The “penalties” don’t need to be legal or fines. It’s much more powerful when the actual users of the services make it clear what they don’t like and won’t stand for. Adding an additional legal threat doesn’t change or help with that. It just leads to more problems.

And that’s just looking at two awful paragraphs. There’s much more like that. As Alec Muffett points out, the report has some really crazy ideas, like saying that the services need to block “probably illegal content” that has “similar names” to illegal content:

Despite us consistently reporting the presence of videos promoting National Action, a proscribed far-right group, examples of this material can still be found simply by searching for the name of that organisation. So too can similar videos with different names. As well as probably being illegal, we regard it as completely irresponsible and indefensible.

So, not only do the authors of this report want Google to remove any video that is reported, no questions asked (despite a long history of such systems being widely abused), it wants them to magically find all “similar” content that is “probably illegal” even with “different names.” Do they have any idea what they’re asking for? And immediately after that they, again, insist that this must be possible because of copyright filters. Of course, these would be the same copyright filters that tried to take down Cory Doctorow’s book Homeland because it had a “similar name” to the Fox TV show “Homeland.” “Similar names” is a horrific way to build a censorship system. It will not work.
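The “similar names” idea fails in both directions, which a few lines using Python’s standard difflib make concrete (the strings are illustrative):

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1], as a 'similar names' filter might use."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# False positive: Doctorow's novel and the Fox series share an identical
# title, so any similarity threshold blocks an unrelated, lawful work.
print(name_similarity("Homeland", "Homeland"))  # 1.0

# False negative: a trivially leet-speak-renamed re-upload of the same
# banned video scores well below identical, slipping under the threshold.
print(name_similarity("National Action rally", "N4ti0nal Acti0n rally"))
```

Tighten the threshold and the re-uploads slip through; loosen it and unrelated works get swept up. There is no setting that does what the report demands.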

What’s so frustrating about this kind of nonsense is that it keeps popping up again and again, often from people with real power, in large part because they simply do not comprehend the actual result of what they’re saying or the nature of the actual problem. There are not nice people doing not nice things online. We can all agree (hopefully) that we don’t like these not nice people and especially don’t like the not nice things they do online. But to assume that the answer to that is to blame the platforms they use for not censoring them fast enough misses the point completely. It will create tremendous collateral damage for tons of people, often including the most vulnerable, while doing absolutely nothing to deal with the not nice people and the not nice things they are doing.

Companies: facebook, google, twitter, youtube


Comments on “UK Parliament Takes First Step Towards Making Google & Facebook Censor Everything”

69 Comments
Ninja (profile) says:

This will keep popping up until it is successfully implemented and the inevitable chaos ensues. Remember when the Govt offered money to people who killed some rodent/pigeon because it was a plague, and people started breeding more to get more money? Yeah. Fun times. Don’t mention the after-effects of revoking the rewards program, when people simply dumped the useless animals on the streets.

Humans learn by pain. Expect such idiocy to be implemented to some degree, and politicians to be left wondering what to do when it fails.

Anonymous Coward says:

Re: Re:

I agree with your first paragraph.

But your second paragraph, humans learn by pain? What? Are you serious? That is a very dumb and careless thing to say. You are basically doing the brainwashing work for tyrants.

In your world, you will have to kick the crap out of your kid so he/she learns, right? What a stupid comment.

Sure, in your world there is no such thing as actually CONVINCING people and TEACHING people by means of education, information, data, understanding, empathy.

Sure, that is why innocent people had to be nuclear bombed in Hiroshima and Nagasaki, right, so they learn a lesson!

Speak for yourself, but don’t project your stupidity on others.

Sure, the first man on the moon got there by being whipped and punished all day. Nothing to do with training, data, technology, etc. You are a fool. And a fool with draconian views.

Anonymous Coward says:

Re: drop out

Then where do they go? The UK wants them to police hate speech (from the sound of this article). China wants them to block anti-government speech. The US wants them to block “copyrighted material”. The EU wants them to remove everything on the request of someone whose name is mentioned (right to be forgotten). Australia seems to want all of the above.

Anonymous Coward says:

A bit off topic but I have a question.

Let’s say I live outside of the UK and I offer a social media service that becomes popular in the UK. And as a result, they decide I need to police the content. How are they going to enforce this? I’m not in the UK. I’m not subject to their laws. If my hosting is in the UK and they push that aspect, my response would be to switch to a non-UK hosting provider.

DannyB (profile) says:

Re: A bit off topic but I have a question.

In the UK, information that the government doesn’t like has the right to be forgotten. Information that rich or powerful people don’t like also has the right to be forgotten.

Your site could be censored if you publish hate speech such as a political opinion that does not favor the right people, or facts contrary to the interests of politicians or their friends.

Anonymous Coward says:

Google and Facebook ARE major "social media" as everyone but Masnicks know.

Just take that as a given instead of looking like an idiot arguing semantics. Sheesh.

The key fact is that those companies refuse to take down content EVEN WHEN INFORMED. What’s clearly outside common law can’t be allowed to operate openly. We’re NOT better off when, say, drug dealers and prostitutes advertise openly. We’re not better off because murderers and rapists now have a worldwide “platform”. That’s just silly libertarianism, also known as childish nihilism. You’re past 40 now. Grow up.

You manifestly want corporate giants to operate without any responsibility to the public. — But you don’t object over their chosen target of censorship, such as withdrawing all advertising from Infowars. Clearly, there’s a category of speech that you won’t defend.

Your schtick of wailing about slippery slope imminent is a constant among soi-disant elites for at least a century now, and yet society clearly prospers when SOME measures are taken against simply “going too far”. If you won’t stop where we can clearly see the abyss, when will you?

Anonymous Coward says:

Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.

Yes, because the majority of requests for censorship of an item deals with “drug dealers and prostitutes … murderers and rapists”.

I must not be reading the same publications that you do. Many requests are to remove damaging info about the person making the request, be it a direct request or via a third party business. Also many requests are for the removal of infringing art (photos, painting, music, books) and the requester thinks they have the copyright … or at least they are supposed to – many do not but make the request anyway. How do these requests against drug dealers, prostitutes, murderers and rapists rank compared to the two above categories?

Anonymous Coward says:

Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.

We’re NOT better off when, say, drug dealers and prostitutes advertise openly

And your proof of this is… what, exactly?

> We’re not better off because murderers and rapists how have a worldwide “platform”.

And your proof of this is… what, exactly?

> If you won’t stop where we can clearly see the abyss

And your proof that there is an abyss, that “we can clearly see”, is… what, exactly?

PaulT (profile) says:

Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.

“The key fact is that those companies refuse to take down content EVEN WHEN INFORMED”

Citation please.

“We’re NOT better off when, say, drug dealers and prostitutes advertise openly.”

I disagree. If someone is advertising an illegal service openly, then the open advertisement is clearly visible to the police and they can investigate and prosecute far more easily than when those services are hidden. Stopping the open advertising would make police investigation more difficult; it would not stop the illegal activity.

“You manifestly want corporate giants to operate without any responsibility to the public.”

Only if you lie about what’s being written.

“But you don’t object over their chosen target of censorship, such as withdrawing all advertising from Infowars”

That’s not censorship. The advertisers have as much ability to exercise free speech as Infowars does to spread their bullshit, and that includes deciding not to advertise on that site. Infowars are free to fund their site in ways other than 3rd party advertising, the advertisers cannot be compelled to advertise on their site.

Anonymous Coward says:

Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.

not really a slippery-slope fallacy when it is demonstrably proven that government will abuse any and all power given to (or taken by) them. if you actually believe this then you should be careful you don’t end up sounding like a parody of your argument.

Anonymous Coward says:

Re: Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.

hey, corporations will also abuse any power given…

if not, go ask Mike Masnick and all those Techdirt articles about the abuses done to the market (high prices, bad service, monopoly, etc) by Comcast and the pack.

So, government sucks balls, but corporations do too. And more often than we know, they collude, for profit and money, against people and their rights.

And that is precisely the problem. If you think corporations are any better than any government, you are blind.

Anonymous Coward says:

Re: By the way, appears that Masnick so believes in free speech that he's again pre-approving all comments.

Can’t get a word in edgeways, he says, while spamming the thread. The poor advocate for free speech lets you display your speech, multiple times.

out_of_the_blue hates it when due process is enforced. And logic.

PaulT (profile) says:

Re: By the way, appears that Masnick so believes in free speech that he's again pre-approving all comments.

My last reply to you was held for moderation. I didn’t whine about it like a little girl, though, I’ll just wait rather than try to spam the comments and whine again when the next comment is held. Feel free to read it when it’s approved.

Anonymous Coward says:

Re: Blame the tools.

Russians are the enemy, Chinese are the enemy, Muslims are the enemy, Immigrants are the enemy, Drugs are the enemy, Mexicans (all the way from Guatemala to Argentina) are the enemy, Communists are the enemy, the Reds are the enemy, the yellow are the enemy, Islam is the enemy, Speech is the enemy, Terrorism is the enemy, etc etc etc etc etc

Yours truly,
Your USA government and junior aka UK government
“For your safety”

Shilling says:

The trifecta. Child abuse, terrorism and discrimination. So useful when they only try to fight the symptoms but never the cause. And this will only make the problem less visible to the general public, but I assume they will still be pretty mad when their bus blows up or their child gets abused.

I wish they would start using the appropriate reasons for why they want something. For example: this needs to be changed because cows can’t see into the earth’s core, my fingers aren’t circles and those letters are arranged in such a way that when I change them to numbers it still does not make any sense.

Rocky says:

Shifting the blame...

Since the politicians don’t take responsibility, they shift the blame to the platforms that indirectly expose their shortcomings. If they did what they are supposed to do, foster a government and society that produce well-educated citizens, there would be no need for censoring the internet.

Anonymous Coward says:

Re: Re: Shifting the blame...

That’s not education. That is information.

People with more information create more censorship. FTFY

Education has nothing to do with schools, information or anything else. EDUCATION comes from home, from family values, personal values, kindness to others, etc. Not prestigious schools, ivy league bullshit and all that crap; that would be just information or prestige, but NOT education.

Schools DON’T educate people. Families educate people.

Anonymous Coward says:

so, how long before the UK government actually stops everything on the internet except what IT wants there and/or is paid to have there (like the crap put out by the entertainment industry that has its own UK Police Force, paid for by the taxpayer, and its own laws, locking people up for 10 years for daring to copy a music disk or download a movie, while joe bloggs, serial rapist, gets sentenced to 3 years and gets time off for good behaviour!)? it is getting worse than China and is definitely being run by the USA! it has to be, because as soon as there is something the USA can’t or refuses to do, the UK idiots jump straight in and do it instead!! talk about moronic fucking puppets!!

Machin Shin says:

I really think trying to compare this to the filtering things for copyright is drastically over simplified.

Kind of like saying “We have put a man on the moon so why don’t we put a man on Pluto?”

Sure, if you don’t know much about space this seems perfectly reasonable. We built a rocket and lander and sent people to the moon. Pluto is just another rock out there so shouldn’t be that different. Then once you talk to someone at NASA you quickly find out it is a VERY different problem.

Anonymous Coward says:

Re: Re:

Isn’t that what the UK (and USA) governments have been doing for a long time? Oversimplifying everything? (for your convenience and fear, I mean entertainment)???

“Its all terrorism”

“All Muslims are evil” … “Islam is evil”

“Drugs are bad”

“Restore peace”

“All immigrants are bad”

etc etc etc

SoWhat says:

It’s a non-issue for speech

Facebook and Google are businesses.

Some look at this as speech. Except the speech isn’t generated by these companies; it’s externally generated under rules these companies have created for their userbase, rules they have been ineffective at monitoring or enforcing per their own terms of service.

Some look at this as influence peddling (election meddling), aka Facebook or Google not censoring known false articles or criminal activities carried out via their tools. This again is on the shoulders of both Facebook and Google, to monitor and block content that violates their own terms of service, which they have been ineffective at doing.

Some look at this as individuals’ rights to speech, including forms of art or commentary that a majority may find inappropriate; while some get off on seeing the purge played out live using the tools Facebook and Google provide.

Facebook and Google could shut down tomorrow and no one’s speech will stop. The methods for that speech will change, but speech isn’t defined by the tools, e.g. an artist using a paintbrush doesn’t stop being an artist because they switched to felt-tipped markers. The form may change, but the art, the speech, remains.

Look, Google and Facebook break the rules, change their terms of service to benefit shareholders, and have done so with little regard to their impact on individuals. Individuals are the product, and as long as people keep using these tools they will have little say in how the tools use them.

Governments in my opinion are doing their jobs, struggling to come up with solutions to problems these companies are both creating/enabling and ignoring in order to further generate click revenue.

Governments don’t always get it right (laugh), or take a long time to get it right (laugh), and in this case could be overreacting, but in their attempts they ARE putting pressure on these companies to do more about owning the tools they provide to individuals and businesses, and the content stored on those tools. That’s called regulation, a function of governments around the world.

I for one am not worried about infringements of speech. Speech will continue in all its forms, perhaps with less reliance or dependence on either of these companies, which in my view would be a good thing for speech… by that I mean less clickbait, less forwarding of someone else’s viewpoints, and perhaps more real-life, in-person conversations, which are much healthier for our species.

I refuse to fight on behalf of these multi-billion-dollar entities just because they have cool tools. Their lobbies have already effected legislation in multiple countries that benefits them; these aren’t charities or benevolent citizens. Paraphrasing Warren Buffett, these are money-generating machines.

Their management created these problems by ignoring the reality that making money isn’t the only responsibility these mega corporations have. Time for them to face the fire and grow as entities, or die out one country at a time.

Anonymous Coward says:

Re: Its a non issue for speech

I agree with you: Facebook and Google are nefarious companies that regularly break their own TOS, and regularly change them too, all for profit; they don’t care about the individual. It is all money, money, money.

On the other hand I disagree with you: governments are not doing their job, they are just hungry for more power. They are, as the article says, criminalizing the tool and not the INDIVIDUALS who make these nefarious “expressions”.

You contradict yourself: Google and Facebook are not enabling anything, because, as you said, the art/the speech was already there, be it hate speech, CP, or whatever.

So all in all, corporations don’t care, they are all about money; governments don’t care, they are all about power and control (with little understanding of technology). The solution will NEVER come from government nor from corporations. The only real solution is organized civil society, forming non-profit, non-governmental organizations whose primary goal is human rights, civil liberties, etc., not elections, not money.

Jeffrey Nonken (profile) says:

And then there was AOL, who banned the word “breast” and pissed off
– a generation of breast-feeding mothers
– several generations of breast cancer victims and survivors
– their families
– their friends
…not to mention anybody cooking fowl and trying to avoid the dark meat.

Porn producers LOVE to use punny names based on popular media or literature. I’m dying to see how the “similar names” rule works with that.

Anonymous Coward says:

ONE MORE ANALOGY

Yes, on repeated occasions the population has urged the government to clean up its act and take care of those evil people who use the National Power Grid to commit their horrendous crimes, yet the government has repeatedly failed to do so.

In particular the USA has failed to remove crime from streets, police killing innocent people, racism, elitism, corruption, etc etc etc.

In the UK we just need to talk about those horrendous shoe boxes where they force people to live. That is just a crime.
