Contrary To The Claims Of Grandstanding Politicians, Child Porn Is Very Difficult To Stumble Onto Accidentally

from the so-omnipresent-hardly-anyone-ever-sees-it dept

Google has decided to be even more “proactive” in fighting child pornography, crafting a database of flagged images that will be made available to law enforcement, investigators and even its own competitors. Somehow the company plans to make it searchable while simultaneously deleting the offending items from the web.

There’s no reason for Google to be doing this other than as a response to the UK government’s consensus that Google = Internet, and is therefore responsible for policing everything it crawls. Unfortunately, many offending images will remain beyond the reach of Google. Additionally, turning the hunt for child porn into an algorithmic search will lead to false positives and deletions, as anyone familiar with YouTube’s Content ID can readily attest.

The politicians crusading for a child porn-free internet will be satiated. Google’s new offensive plays to their strengths, namely:

1. Proclaiming something must be done.
2. Allowing someone else to do that “something.”

The UK’s current porn-blocking efforts (of regular, legal porn) are a comedy of errors. “Child safety” filtering on mobile networks has already resulted in the mistaken blocking of YouTube, Orange, and The Jargon File. With these filters becoming mandatory next year, more and more sites will find themselves cut off from their users due to the general ineptness of blocking software crafted at the behest of hand-wringing bureaucrats.

Child porn, however, remains the true enemy, especially in Britain, where its profile is heightened due to recent events. In the oft-echoed call for someone (namely, Google) to do something about child porn, a rather startling statistic was quoted. According to the Internet Watch Foundation (IWF — an industry-funded group that compiles lists of keywords and illegal abuse sites for subsequent banning by Google, et al), “more than 1.5 million internet users in the UK mistakenly viewed child abuse images last year.” (Only 40,000 were reported to the IWF, a point which is left open to speculation.)

It’s a rather alarming number. But is it accurate? UK website Ministry of Truth went digging into the math behind this “statistic.” The “1.5 million” quote above was pulled from an IWF press release that offered no citations. Perusing the IWF’s site itself, MoT found another press release that applied a bit of hedging to the claim.

New study reveals child sexual abuse content as top online concern and potentially 1.5m adults have stumbled upon it.

Note the one word that changes everything.

Hang on a second, we’ve just gone from “1.5 million adults have stumbled across” child porn to “potentially 1.5 million adults have stumbled upon it”, which rather starts to suggest that the IWF’s “study” might not be quite what they’re making it out to be and, sure enough, a little further down the page we hit paydirt:

The ComRes poll conducted among a representative sample of 2058 British adults for the Internet Watch Foundation (IWF) shows the vast majority of people in Britain think that child sexual abuse content (“child pornography”) (91%) and computer generated images or cartoons of child sexual abuse (85%) should be removed from the internet.

Riiiiight… so it’s not actually a study, it’s an opinion poll; a grade of evidence that generally sits just above the story you heard from a bloke down the pub who swears blind that his cousin’s boyfriend knows a bloke who knows the bloke that it actually happened to.

Long story short (although the long story is a very interesting read), the poll used skewed demographics (weighted heavily towards the 55-and-older set) to produce this meaningless percentage:

“- 3% have seen/encountered ‘Child pornography'”

According to the 2011 Census the adult population of Great Britain is just over 48.1 million, and 3% of that is roughly 1.44 million people, which the IWF has rounded up to 1.5 million (ignoring the usual rules on rounding) for its press releases.
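The IWF’s extrapolation is easy to reproduce. A quick sketch (the population figure and the 3% poll result are the ones quoted above; the rounding behaviour shown is just standard rounding):

```python
# Reproduce the IWF's extrapolation from the ComRes poll figure.
adult_population = 48_100_000  # 2011 Census, Great Britain (as quoted above)
poll_share_pct = 3             # 3% "have seen/encountered 'Child pornography'"

estimate = adult_population * poll_share_pct / 100
print(f"{estimate:,.0f}")               # 1,443,000
print(f"{round(estimate / 1e6, 1)}m")   # 1.4m under normal rounding, not 1.5m
```

Rounding 1.44 million to one decimal place gives 1.4 million; getting to the headline 1.5 million requires rounding up past the nearest tenth.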

The problems with accepting this at face value (and then attaching it to multiple press releases) are numerous. For starters, as many as 1 in 7 UK citizens have never used a computer, much less had internet access. For another, one person’s “child porn” is another person’s “adult film starring consenting, paid adults.” One need look no further than the Daily Mail’s disastrous attempt to show how easy it was to find child porn simply by using the same search terms as those found in a convicted child killer’s internet history.

The Mail’s Amanda Platell claimed to have taken a journey to the “hell known as internet child porn.” Unfortunately, her only souvenir from the trip was a misidentified clip from a 13-year-old (adult) porn film. True, the content of the film would be repulsive to many (simulated sexual assault), but the film was made and distributed legally.

Not only is the number of “potential” child porn viewers lower than the IWF claims, but the number of readily accessible pages containing child porn images on the internet is more “rounding error” than panic-worthy.

Here are the numbers the IWF came up with in its 2012 report.

In total, the IWF found 9,550 web pages that hosted child sexual abuse content, spread across 1,561 internet domains in 38 different countries. 60% of the child sexual abuse content identified by the IWF was found on ‘one click hosting’ websites, i.e. file hosting services/cyberlockers which, for reasons known only to itself, the IWF insists on referring to as ‘web lockers’ despite the fact that no one else seems to use that particular phrase.

A brief glance at that total should readily tell you the percentage is insignificant. And this is a number compiled by a group tasked with hunting down child pornography, an entity that would have a much higher hit rate than the average person browsing the web. Here’s how it stacks up to the whole of the internet.

Out of an estimated 14.8 billion indexed web pages, the British public reported just 9,696 web pages (0.000065%) containing child pornography to the IWF in the whole of 2012.

In that same year, just 1,561 internet domains (0.001%) were reported to the IWF that were found to contain child pornography out of a minimum of 145.5 million registered domains (and that’s just for five gTLDs and one country specific domain).

In fact, on a single ordinary day in May 2013, 92 times as many new domains were registered across just the six TLDs we have figures for, than were reported and found to be hosting child porn by members of the UK general public in the whole of 2012.

How hard would it be to access child porn if you weren’t looking for it specifically? The Ministry of Truth puts your odds at 1 in 2.6 million searches. (MoT points out the odds will fluctuate depending on search terms used, but for the most part, it’s not the sort of thing someone unwittingly stumbles upon.)
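The back-of-envelope percentages above can be checked directly (all figures are the ones quoted in the passage):

```python
# Check the Ministry of Truth's back-of-envelope percentages.
reported_pages = 9_696            # pages reported to the IWF in 2012
indexed_pages = 14_800_000_000    # ~14.8 billion indexed web pages (estimate)

reported_domains = 1_561          # domains found to host such content
registered_domains = 145_500_000  # minimum registered domains (six TLDs)

print(f"{reported_pages / indexed_pages:.7%}")          # 0.0000655%
print(f"{reported_domains / registered_domains:.3%}")   # 0.001%
```

Both figures come out as quoted: roughly 0.000065% of indexed pages and 0.001% of registered domains.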

All those demanding Google do more to block child porn fail to realize there’s not much more it can do. The UK already has an underlying blocking system filtering out illegal images at the ISP level, and Google itself runs its own blocker as well.

The above calculations should put the child porn “epidemic” in perspective. As far as the web that Google actively “controls,” it’s doing about as much as it can to keep child porn and internet users separated. There are millions of pages Google can’t or doesn’t index, and those actively looking for this material will still be able to find it. Google (and most other “internet companies”) can’t really do more than they’re already doing. But every time a high-profile, child pornography-related crime hits the courtroom (either in the UK or the US), the politicians instantly begin pointing fingers at ISPs and search engines, claiming they’re not doing “enough” to clean up the internet, something that explicitly isn’t in their job description. And yet they do more, in an attempt to satiate the ignorant hunger of opportunistic legislators.

If Google is “the face of the internet” as so many finger pointers claim, then the “internet” it “patrols” is well over 99% free of illegal images, according to a respected watchdog group. But accepting that fact means appearing unwilling to “do something,” an unacceptable option for most politicians.



Comments on “Contrary To The Claims Of Grandstanding Politicians, Child Porn Is Very Difficult To Stumble Onto Accidentally”

Ninja (profile) says:

Even if you do look for child porn actively you’ll find it very hard to find. Most likely you’ll find pics of adult actresses that look very young. There’s this Chinese mangaka that’s in her 30s who looks like she is 14. Besides it’d take an incredibly dumb person to host such material with open access. I’d guess child porn is kept in private servers very heavily protected.

That’s not how you deal with child porn. You get the police to do their job, infiltrate the networks where it’s distributed, and make hell break loose.

But something must be done. Except that we want to make the least effort possible and we are not really worried if it will be effective, eh?

Not an Electronic Rodent (profile) says:

Re: Re: Re:3 Re:

Pretty sure it used to be 16 but went up, so stuff that used to be legal now isn’t.

It’s 16, but you can’t look at or appear in “adult images” until 18… so looking at real naked 16-year olds is apparently fine but not a picture of one… now that makes sense…
The argument is I believe that “looking at drawings might lead to a thirst for more ‘real’ things”. No-one ever seems to consider or even acknowledge the flip-side to that coin where drawings might assuage the urge and prevent something actually bad…

nasch (profile) says:

Re: Re: Re:4 Re:

The argument is I believe that “looking at drawings might lead to a thirst for more ‘real’ things”.

Which are actually legal to do. Makes perfect sense. facepalm I think a more compelling (less nonsensical) argument would be that it’s OK for a 16 year old to have sex with an adult, but that society (and the individuals) is better off if 16 year olds aren’t working in porn or prostitution. That is, it’s not worse to look at a picture of a naked 16 year old than it is to have sex with one, but it is worse to pay a 16 year old to take his or her clothes off than to do it for non-monetary reasons. That sort of sidesteps the issue of naked photos for fun, though, whether selfies or taken by a boyfriend or what have you.

Zakida Paul (profile) says:

I love all these calls for Google to do more to stop child porn.

Contrary to popular belief, paedophiles are not getting this material from Google. They get it from the darknet via peer to peer sharing networks; and they use techniques to hide what they are doing. Anything Google (or even ISPs, for that matter) do will be totally ineffectual.

The problem of child porn will not be solved with a technological solution because it is a social problem and, as such, requires a social solution.

PaulT (profile) says:

Re: Re: Re:

The problem is that in order for child porn to be produced, a child has to be abused. So, even if he’s just looking at a collection from home, child abuse and possibly rape has taken place somewhere in order for him to do so. That’s a social problem, no matter how you spin it – which is why we need to go after those who produce it, not Google or people who stumble across it.

Michael (profile) says:

I'm not so sure...

It seems to me that any time I hear about someone finding child porn on someone’s computer they immediately say: “That just ‘popped’ up!”, “That was an accident!”, or “It wasn’t me, it must have been a popup that downloaded the 451 movies!”

Yeah. A bit like your wife finding a pile of singles and an ATM receipt from a gentleman’s club balled up in your pocket, and suddenly that kind of thing apparently appears in pockets all the time.

Ninja (profile) says:

Re: I'm not so sure...

Well, it worked well when mom found my porn stash. I told her those pics were cached from not-so-nice advertisements on sites*. Apparently it works with law enforcement then?

* I think she pretended to be fooled and I pretended it was a good excuse but that’s another story.

EVIL CACHES, SUE MOZILLA FOR ALL THE CHILD PORN.

Anonymous Coward says:

whenever there is something on the ‘net that someone doesn’t like, they always make a big play of Google. the thick fuckers in the UK, who think it is so easy to stop this stuff and that there will be no harm done except to the parts of the ‘net that deserve it, are about as informed as my big toe is on mars exploration. if it were so easy, then things could perhaps be achieved. however, the entertainment industries have been saying for years how easy it is to identify and block infringing material. we have seen how easy it is. they couldn’t even identify their own stuff, let alone block what should have been. anyone with a mouth as big as Perry’s can lay blame at the door of whoever. when in a ‘discussion’ (and i use the term loosely) with someone on TV, she won the whole thing hands down. the thing was, she just wouldn’t let the other side say anything! anyone can win a debate if the other side is never allowed to say anything! her biggest problem, however, is not the mouth that just won’t keep closed, it’s the stupidity of her whole idea. all that is going to happen is that the issue will be forced underground, making it all but impossible for the police to catch those responsible! that, however, doesn’t matter to people like her! being in the limelight and having her name associated with an ‘unsavory internet process’ is what matters!

mik (profile) says:

realy is it soo hard

?child pornography / if you looking for ascitently hit the 1 one/ real fing go use kazza-lite download some of games movies and you may hit it or use emule same time you may allways hit cop honypot
/
computer generated images or cartoons of child sexual abuse
2(toes not take a sec to find stuff) is ewen easier use google tipe lol con shearch use google corecsion pics /
1.3 use tgp press reapeadetly pics that open to oder legal tgp’s and you may soon be in illigal waders no looking needed
/
1.4 find a rank list japans and you have acsess to all gainda stuff
/
1.5look for tinami.com use link surf ower links

+ getchu
http://www.google.com/search?client=opera&q=getchu&sourceid=opera&ie=utf-8&oe=utf-8&channel=suggest
dlsite/
Cosplay (105)
Fantasy (799)
Heartwarming (1004)
Hilarious (1018)
Magical Girl (203)
Moe (332)
Nekomimi (Catgirl) (176)
Puni (195)
Robot (124)
School (461)
SF (256)
Uniform (571)
Yaoi (189)
Yuri/Girls Love (196)
/
and so on and on
if you stuppid you simply look for stuff if you have brains you have vpn

Violated (profile) says:

Let us be quite clear in that our fascist UK Government is only using Child Porn (and other lawful but distasteful porn) in order to block access to standard adult porn.

Not to overlook that just about every Internet using UK citizen is totally hating them for doing this.

Had they really wanted to attack child porn, then they are over a decade too late to join that party. The internet population chose to self-censor, and with the aid of law enforcement the web was cleaned. So let us be clear about what they find now: namely, one questionable image on an otherwise adult site, where even that image is simply a lawful-age model who looks younger than her true age.

So the home of CP now are the dark nets like Tor

Violated (profile) says:

Re: Re:

Damn a cut off malfunction. No edit so to continue…

So the home of CP is now on dark nets like Tor, but even there you won’t find CP unless you go through many sites and links in an active hunt for it.

I have seen a UK adult site filter in action before on a mobile SIM, but I gave up all hope on it when many adult sites were not blocked because they were unknown to it. Then many innocent sites did get blocked, like the ASCII archive. Blocking ASCII art, really?

So this mandatory policy only makes the situation worse by fooling parents into a false sense of security.

To top this off, it is well known that easy porn access has led to a large drop in sexual crimes. So to save us from standard porn would mean more people getting molested and raped.

Then all this in a country where national newspapers read by all ages print naked women, proving to all what topless women look like.

Anonymous Coward says:

What is child porn?

A point mentioned but not explored is: what is defined as CP? Ask 10 different people and you’ll get 10 different answers. Simple nudity should not be conflated with hardcore porn, yet I’m sure it regularly is. In some US states a wet t-shirt can define an image as CP, but in most it does not. Age of the “victim” makes a big difference too. A 17 yo posing topless is in a completely different category than a 12 yo being forced to perform penetrative sex acts, and there is a whole range therein. To conflate the two is beyond ridiculous. And drawings and cartoons? How are you even going to begin to logically classify those?

I will say that, if you search for porn a lot, it is not hard to stumble across an image of child porn, especially if we are talking softcore “porn”. Although I have not tried (for obvious reasons -how many people are really willing to test the truth of the alarmist’s assertions?), my sense is it is rather difficult to find large quantities of “quality” CP.

A very great danger is the use of CP images to blackmail/shakedown people, including people in positions of power. It is all too easy for a few images of softcore borderline porn to get mixed in with legal material. It’s far too easy to plant this kind of stuff on someone’s computer. That is why possession of CP should be decriminalized. Not legalized, it would still be contraband, and possession would be no more than a violation. The distinction between hardcore and softcore should be taken into account as well. Distribution and obviously production would still be crimes of graduating severity.

Anonymous Coward says:

false positives not a problem

The recent media reports of Google’s pro-active “effort to eradicate child abuse imagery online” are all based on a blog entry that was clearly written by someone at Google who is non-technical.
http://googleblog.blogspot.com/2013/06/our-continued-commitment-to-combating.html
It is hard to tell from this what exactly Google has done on its own in this effort apart from providing money, software and hardware to other groups trying to identify and maybe filter out child porn from the internet.

A few basics. Google uses a database of hashes to identify copies of known child porn images. It is not clear that Google itself has added to the databases that have been created through law enforcement efforts. The hashes traditionally have used MD5, which has cryptographic weaknesses related to someone designing a file that, when hashed, will match a target hash value. This is still very hard to do, so the problem of false positives is pretty much nonexistent. This issue may become important if there is an attempt to use hash values of encrypted files in court to prove possession of child porn. I must emphasize that with a 16 byte (128 bit) hash, collisions are exceedingly unlikely. Unlike the algorithms used for Content ID, there will not be a problem with false positives.
The Google blog says:
“Recently, we’ve started working to incorporate encrypted ‘fingerprints’ of child sexual abuse images into a cross-industry database.”
I believe that should be corrected to say that they are incorporating fingerprints of encrypted child sexual abuse images into a cross-industry database. That is, they are trying to identify encrypted child porn by hashing those files as well. This is potentially useful but assumes that pedophiles don’t re-encrypt images for storage or further distribution.
It is not clear, from the blog, whether Google actually filters out search results for cp files, or web pages containing such files. Nor is it clear that Google uses any other methods to identify such files (e.g. searching gmail accounts for matches).
It is fairly easy to defeat identification through a hash database by altering the image file in absolutely any minimal way. Apparently, law enforcement has had pretty good success in identifying known porn images through this method, so the usual conventional wisdom that most criminals are stupid seems to be true. My impression of Google’s blog is that it is a feel-good PR piece that makes it seem like Google is doing a lot when it really isn’t. Not that they should have to. Law enforcement should welcome the status quo, as they can identify and track pedophiles because they are using the internet to exchange files that are not uniquely encrypted. If such exchanges become impossible due to filtering from companies such as Google, then exchanges will be pushed further underground with the use of cryptography, so that even law enforcement will have a hard time identifying and tracking pedophiles.
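The hash-database approach described above is simple to sketch: fingerprint each file’s bytes and look the fingerprint up in a set of known values. A minimal illustration (SHA-256 is used here in place of MD5, and the “known bad” entry is an invented placeholder, not real data); it also shows why changing a single byte defeats exact-hash matching:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of a file's bytes (SHA-256 here;
    the databases described above have traditionally used MD5)."""
    return hashlib.sha256(data).hexdigest()

# A toy "cross-industry database": a set of fingerprints of known files.
# The entry below is a made-up placeholder for illustration only.
known_bad = {fingerprint(b"example-flagged-file-contents")}

def is_flagged(data: bytes) -> bool:
    """Check a file's bytes against the database of known fingerprints."""
    return fingerprint(data) in known_bad

original = b"example-flagged-file-contents"
altered = original + b"\x00"   # append one byte...

print(is_flagged(original))    # True  -- exact copy matches
print(is_flagged(altered))     # False -- any alteration changes the hash
```

This is why, as the comment notes, exact-hash matching only catches unmodified copies; perceptual-hashing schemes exist precisely to tolerate minor alterations, but they reintroduce the false-positive problem.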

John85851 (profile) says:

Because blaming Google is easy

Obviously it’s easier for politicians to deal with this issue by telling Google to simply take down the images (if they even could) rather than go after the websites hosting the images (which is a crime) and the people producing the images (which is an even bigger crime). I would think politicians would be thanking Google for acting as a billboard directing law-enforcement directly to the criminal sites.

But, nope. As usual, they think it’s better to put a band-aid on the issue to cover it up rather than actually dealing with it, because dealing with it is hard. Plus, tracking down the creators and owners of the websites takes time, and the arrests could occur after the next DA takes office, so they’ll count on his record instead.

And how many of these websites are located in Russia, China, or some other country where the UK politicians can’t easily arrest someone? It’s much easier to go after a large US corporation which can be “persuaded” to cooperate under threat of not being able to do business in the country.

Anonymous Coward says:

Re: Because blaming Google is easy

Obviously it’s easier for politicians to deal with this issue by telling Google to simply take down the images

When you are in a position to be able to order someone to do something, without having to tell them how, all problems are easy to solve. When asked how, you can simply say that is up to the questioner to find the solution.

Failure to solve the problem is not your problem, but rather a failure of people to do what you told them to do.
Further, people will report that they have done as requested when they have passed the problem down the food chain.

This is how large bureaucracies end up lying to themselves.

Anonymous Coward says:

There is a simple reason that it’s impossible to get rid of Child Porn completely. AFAIK, the legal definition is porn of a minor. That is to say, somebody under the age of consent. The age of consent is different in different countries. Ergo, a picture could be legal porn in one country, but illegal child porn in another.

th (profile) says:

I would guess that stat is about right

It’s way, way too easy to click on a link in, say, Tumblr and come across an image that makes you back out as quickly as possible, as if you’d stepped on a hot coal. I am not sure if they’re actually classifiable as kiddie porn, but that seems to be the effect they’re going for, and anyway I don’t stay and ponder the issue either… they’re clearly pictures of people of questionable age or made to look as such. Who needs these landmines lying about? Good riddance.

As far as reporting them goes, that would require me to look at them for longer than it takes to find my browser’s back button. The general fear, I think, is that a crusading AG (I am in the US) would see you as an easy target since, in theory at least, that image could be cached somewhere on your computer, thus you *have* it and thanks for reporting yourself, sucker. Who wants to invite that wolf to your door? Who needs the feds kicking down your door at 3 a.m., ransacking your house, filing charges even if it all gets sorted out later? Try *rehabilitating* yourself after something like that.

The lack of reporting represents one very dysfunctional thing, I am quite sure. It’s silent testimony to people’s lack of faith in their attorneys general’s sincerity, trustworthiness, honest intentions and good judgement. Maybe some of them have common sense, but what if you have one who doesn’t? How do you know? In the US more than a few appear to be careerist opportunists and even likely sociopaths who have wormed their way into positions of power and will take any innocent, low hanging fruit they can get, charge the shit out of it, force a plea bargain on the properly terrorized citizen then use the whole affair in their next election commercial in order to show that they’re “tough on crime”.

I say go Google go. I wish Tumblr and some of the other mainstream image sharing sites would police themselves a lot better. Are you really saying it costs too much or you’re worried about *free speech*? Give me a break. These sites, which are raking in cash from their free user-generated content, would, if they had a freaking conscience, jump over each other to get the chance to target this crap for the bit bin and just eat the cost, figuring that along with making money, you’re on earth to do some good where you can.

Guess they don’t see things that way.

Beauty in the breakdown says:

I won’t lie. I’m into kink porn. It’s kinda my thing. Been watching it for years. Today I stumbled across a new site, and when I clicked on it, up popped CP. I was so disgusted I closed it right away. In hindsight I wish I hadn’t, because then I could have reported it. Anyway, point being, it’s out there! It’s never been a problem or happened before; now I don’t even want to turn my computer back on.
