Filters, Laws Won't Clean Up Net

from the whatever-happened-to-responsibility dept

The latest column at Wired from Lauren Weinstein does an excellent job making a point that seems obvious to so many people, but doesn’t seem to get through to those who write the laws or the filtering software. These filters and laws simply don’t work. They produce too many errors of both kinds: blocking material that shouldn’t be blocked, and letting through material that (by the filters’ own rules) should be blocked. Instead of focusing on such useless laws and filters, wouldn’t it make a lot more sense to rely on education and personal responsibility? Teach children how to use the internet responsibly, and why certain things aren’t appropriate, instead of lying to them and pretending those things don’t exist. Kids are smart, and they’ll figure out that they do exist. Wouldn’t it be better if they learned how to deal with these things rather than denying their existence?



Comments on “Filters, Laws Won't Clean Up Net”

16 Comments
dorpus says:

what if education/responsibility doesn't work?

You could tell kids they shouldn’t go to adult chat rooms because there are bad people who will abduct them, but that will just excite the kids and make them want to go there even more. You could tell girls they shouldn’t visit pro-anorexia web sites, you could tell kids they shouldn’t visit incest, bestiality, or S&M sites, but… you get the idea.

Mike (profile) says:

Re: Re: Re: what if education/responsibility doesn't work?

What is the price of teaching responsibility for their own actions? Kidnapping, starvation, trauma.

You say this as if the only options are either kids being lied to or kids being kidnapped, starved and traumatized. It must be fun to live in your world of polar extremes. Kids are much smarter than you think. It’s kind of sad that you have so little faith in your own children.

Anyway, if you want to lie to your kids, you’re absolutely free to do so. However, if I want to teach my kids responsible behavior, I don’t think the government should get to force filters on me. There is a huge difference between parents intervening and governments intervening.

dorpus says:

Re: Re: Re:2 what if education/responsibility doesn't work?

It’s not a question of polar extremes. It’s a question of risk. You can tell kids about the consequences of playing with fire, but they will probably play with fire anyway. The internet poses a similar risk — they might end up being found as a decapitated body next to the highway, spend the rest of their lives in a wheelchair, or require years of expensive psychotherapy to recover.

In such a context, it is not unreasonable to pass laws governing kids’ use of the internet. We do not allow our kids to drive automobiles just because we lectured them on responsibility.

bbay says:

Re: Re: Re:3 what if education/responsibility doesn't work?

Ok, I have to add my $0.02 here.

When calculating risk, the severity of the consequences is not the only factor you have to look at. To make an informed decision you must also look at the probability of occurrence.

The implication that any exposure of a child to the unfiltered internet will result in his or her decapitation is ridiculous. Your risk assessment IS a polar extreme, because it presumes the occurrence and doesn’t take into account its small likelihood.

Also, with regard to fire and driving, you are conflating a number of developmental stages here. We DO let children drive, at 16 around here. And I’m certainly not afraid of a 12 year old finding matches, especially if I know that they’re familiar with the possible dire consequences. Exactly what age of child are you imagining here?

Anonymous Coward says:

Re: Re: Re:4 what if education/responsibility doesn't work?

Admittedly, decapitation is an extreme example with a lower probability. But then there are higher probabilities of damaging encounters that result in rape, attempted rape, kidnapping, trauma, or the triggering of anorexia. A great many such incidents do not get publicized either, because families avoid the unwanted publicity.

We let kids drive at 16, but we would not think of allowing a 7 or 12 year old to drive. A 7 year old around matches is trouble. There are quite a few 10 or 12 year old girls who go into chat rooms and make sexually provocative comments at men, because they think it’s “funny”. Some of them are, in fact, dumb enough to also say where they live.

But yeah, if you want your kids to learn life’s lessons the hard way, that is up to you.

Mike (profile) says:

Re: Re: Re: what if education/responsibility doesn't work?

Certainly a valid question, and the answer is no, I don’t have any kids. If that means that my opinion on this subject isn’t valid to you, so be it. I think my points still make sense. Clearly, it’s worth revisiting the subject in the future when I do have children of my own. However, that doesn’t mean I can’t think about what I would do with my children. At the same time, I was raised in just such an open environment, so I can speak with some authority about the effect it has on children.

Let’s be clear, though. As I said, a parent is free to do as they want. If they want to block their kids from accessing stuff, that’s completely fine. My problem is if the government decides what should be blocked. Isn’t that the parents’ job?

Furthermore, I’ve seen enough people who were raised in extra-sheltered homes to know that they tend to have more problems when they’re suddenly unleashed on the real world. Denying that these things exist doesn’t help people learn responsibility or how to deal with them.

My parents raised me to question things, and take responsibility for my actions. They reasoned with me, and told me when they thought I was making a mistake, but they also trusted me to make my own mistakes and learn from them. I am grateful to them for that learning experience, and I hope I can do as good a job with my kids.

A Parent says:

Re: Re: Re:2 what if education/responsibility doesn't work?

Thank you for your clarifying comments. I agree that it is perfectly legitimate for you to comment on this issue whether as a parent or not. I was, as I said, just curious. And I would/will be interested to see if and how your position might evolve once you become a parent yourself.

I am the father of four boys. The oldest is 14; the youngest is 3. I wrestle with these issues on a daily basis. While I want my children to, as you say, “question things and take responsibility,” I also recognize that they need to be protected from things which they are not yet ready to handle. Sheltering vs. not sheltering your children is not a binary decision. It is a continuum where the correct position is usually somewhere in the middle. And, of course, the position shifts as they get older. Establishing this position and communicating it to your children, in my view, is one of the most challenging responsibilities that any parent has.

We have filtering software installed on our computers at home. Not because we don’t trust our children, but because we believe that it’s too easy to come across content that would be emotionally damaging (especially to the younger children) without even meaning to. We communicate to our children that we are doing this, and that we will look at the sites that they are visiting. The software is not perfect, but combined with our vigilance, it has done the job so far.

As for libraries and other public places, I am, quite frankly, torn. I hate the idea of someone not getting to a legitimate site that they need because of inadvertent filtering. And I’m not at all comfortable with the government or a filtering company making decisions in the public square. But I also recognize that outside the confines of my own home, I lose control, not of my own children but of those whose shoulders they may be peering over. And librarians make decisions not to carry (or at least not to make available to children) inappropriate print content. Why should Internet access be any different?

I guess where I come down on this is that there shouldn’t be any overriding government regulation, but that individual libraries should be free to install filtering software of their own accord if they feel that it is necessary. And I, for one, would encourage my local library to do so.

dorpus says:

Re: Re: Re:3 what if education/responsibility doesn't work?

Or to take a less dramatic example, there are predators who do not seek to physically harm children, but get off on making them feel uncomfortable and mentally traumatizing them. Kids who are victimized as such may feel too ashamed to tell their parents what happened.

For example, your 14 year old son could meet someone in a game room who says he is 14, then asks him questions like:

– how long is your penis?
– does the sight of a man’s penis make you uncomfortable?
– why?
– what color are your nipples?

Despite all the posturing about modern maturity, kids are quite gullible and easily taken in by a skilled predator.

bbay says:

Re: Re: Re:4 what if education/responsibility doesn't work?

AC said:

We let kids drive at 16, but we would not think of allowing a 7 or 12 year old to drive. A 7 year old around matches is trouble. There are quite a few 10 or 12 year old girls who go into chat rooms and make sexually provocative comments at men, because they think it’s “funny”.

I agree. This is the point I was making. “Children” are not a homogeneous population. And age alone is no indicator either; parents must take an active role in determining what each of their kids is capable of. I’m not pretending that this is easy, but it is certainly not a job to delegate to a piece of software (that doesn’t work).


But yeah, if you want your kids to learn life’s lessons the hard way, that is up to you.

Just to be clear, I could not be more against this. The school of hard knocks only produces more teachers of hard knocks.

dorpus said:


Or to take a less dramatic example, there are predators who do not seek to physically harm children, but get off on making them feel uncomfortable and mentally traumatizing them. Kids who are victimized as such may feel too ashamed to tell their parents what happened.

[…]
Despite all the posturing about modern maturity, kids are quite gullible and easily taken in by a skilled predator.

dorpus, you have spoken reasonably and with clarity. After so many inflammatory comments, I’m somewhat taken aback. I agree that the danger is real, and that children need protection from it.

We speak often about the unreasonable collateral cost that filtering exacts on the rights of adults. What we often fail to make clear is that the collateral cost is so unreasonable largely because filtering doesn’t work. If filtering were technically feasible, the debate would be a matter of weighing costs in a particular situation (e.g., a public library or the home).

Parents especially must know this. To represent filtering software as effective to any degree is highly irresponsible, and puts at risk any number of children whose parents are ill-informed.

Because filtering doesn’t work, it’s a devil’s bargain. “For the small price of some liberties, you will receive … nothing. Plus, act now and we will include this false sense of security for no extra charge.”

Ok, enough deconstruction, now for something constructive.

Some friends of mine have, as a means of mitigating these risks, placed their computer in a highly trafficked area of the house. They don’t allow the kids to use the computer in their home office, and the kids are certainly not going to have a computer in their own room for the foreseeable future. This makes it nearly impossible for any child to use the internet while alone. Additionally, it should be noted that this couple is highly technical and has a firewall in place that basically shuts the internet completely off when they’re not around. This seems to me to be a good solution; it’s much harder to get into troublesome situations when your mom can see the screen.

Admittedly, they also have filtering software installed, but according to them the number of sites that should be filtered but aren’t is astonishingly large.

Also, I myself have no children, in case anyone was wondering.

dorpus says:

Re: Re: Re:5 what if education/responsibility doesn't work?

BTW, the “anonymous coward” was me, I just forgot to put my name in.

People have bashed filtering software, but I think it is quite possible to build a next generation that does more than just keep a list of forbidden addresses. One could build AI capabilities to assess the decency of web pages, or scan images for nudity. Perhaps there are hidden patterns to indecent sites that only a computer would notice. I do not think it is a lost cause.

bbay says:

Re: Re: Re:6 what if education/responsibility doesn't work?

That’s quite an optimistic prediction, as far as AI is concerned. Marvin Minsky considers computers which can make even the simplest judgements about morality to be decades away. Though he does seem to believe that they are definitely possible to build.

“Next generation” are not the words I would choose. Such a thing may not occur in our lifetimes, in fact.

So, until a computer is as thoughtful as you are, don’t trust your kids to them. That’s what I say.

dorpus says:

Re: Re: Re:7 what if education/responsibility doesn't work?

I don’t know; I think AI pessimism is its own kind of hype. AI is already better than humans at a growing number of tasks beyond just chess. Technical documents can often be machine-translated as well as an average human translator could manage, in a fraction of the time. The human is useful only for reality checks.

The key to future growth in AI will be finding patterns where humans don’t expect them. An AI filter for porn sites might find patterns in the HTML code, the words used in the address, whatever.
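
To make that idea concrete, here is a minimal sketch of the kind of pattern-based scoring being described, assuming nothing more than a weighted keyword count over a page’s URL and visible text. The signal words, weights, and threshold below are made up purely for illustration; a real filter would have to learn its patterns from data rather than hard-code a handful of words.

# Minimal sketch of a pattern-based page scorer (illustrative only).
# The signal words, weights, and threshold are hypothetical assumptions.
import re
from html.parser import HTMLParser

SIGNALS = {"xxx": 3, "porn": 3, "adult": 1}   # hypothetical signal words and weights
THRESHOLD = 4                                 # hypothetical cutoff for flagging a page

class TextExtractor(HTMLParser):
    """Collect the visible text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def score_page(url, html):
    """Count weighted signal words in the URL and the page's visible text."""
    extractor = TextExtractor()
    extractor.feed(html)
    text = (" ".join(extractor.chunks) + " " + url).lower()
    return sum(weight * len(re.findall(re.escape(word), text))
               for word, weight in SIGNALS.items())

def should_block(url, html):
    return score_page(url, html) >= THRESHOLD

if __name__ == "__main__":
    sample = "<html><body><h1>Free xxx adult pics</h1></body></html>"
    print(should_block("http://example.com/xxx-gallery", sample))              # True
    print(should_block("http://example.com/roses", "<p>Roses need sun.</p>"))  # False

Even this toy version illustrates both failure modes the article complains about: it overblocks any page that merely mentions the signal words, and underblocks anything that avoids them.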

Steve says:

Long story short on the issue

As always, it’s up to parents to raise their children, not the government. If I want to use filtering software (which is ineffective at best now, but at some point will be adequate), that’s my prerogative. If I want to give them free rein because I trust them completely, that’s my call too. Or anywhere in between. The government has no business getting involved unless there is plenty of probable cause. As always, if a child is clearly showing signs of emotional or physical problems, there are accepted processes for a child services agency to get involved.

Being a parent is about protecting, educating, and, most importantly, loving a child. It’s the parents’ right and responsibility to use their best judgment and values to raise their child. In many cases, parents need to be educated about the internet and what kinds of things are available so they can make an intelligent, informed decision on how to deal with it. But the government should never make that decision for us.

This post is already too long to get into the very sticky library filtering issue.
