We Should Probably Stop Blaming Technology For The Failings Of Human Beings

from the technology-is-a-mirror dept

I’ve been thinking a lot lately about how so many of the “problems” that people bring up with regards to the internet these days — many of them having to do with disinformation, misinformation, propaganda, etc. — are really manifestations of the problems of people in general, perhaps magnified by technology. At some point I’ll likely write more on this concept, but for now it’s difficult to see how blaming the technology solves the underlying problems of humanity. It’s treating the symptoms, not the disease. Or, worse, in many cases it seems like an attempt to brush the disease under the rug and hope it disappears (to mix metaphors a bit). Alicia Wanless has written a long and interesting post that makes a similar, though slightly different point.

She also notes that blaming technology seems unlikely to solve the underlying issues.

And yet, in the fear that is spreading about the future of democracy and threats of undue influence (whatever that might mean), there is no shortage of blame and simplified calls to action: “the governments should regulate the tech companies!”; “the tech companies should police their users!”; “we should do back whatever is done against us — so, more propaganda!” In all of this, I cannot help but think we might be missing something; are we fundamentally approaching the problem in the wrong way? …

Technology might have brought us to this point where we now must look at ourselves, our society, in this digital mirror. But it is not to blame. These issues are not new. Persuasion, manipulation, deception, abuse — none of these are new. Humans have been doing and suffering from these things forever. What is new is the scale, speed and reach of such things. If anything, ICTs have only amplified our pre-existing issues — and as such, no technical solution can truly fix it.

She further notes that, since technology is changing so rapidly, any attempt to solve the “problems” discussed above by targeting tech platforms is unlikely to have much long term impact, as the purveyors of disinformation will just move elsewhere. Now that people know how to leverage technology in this manner, it’s not like they’re just going to stop. Furthermore, she notes that merely censoring and brushing content under the rug can often have the opposite of the intended effect (something we’ve talked quite a bit about here):

Filtering and blocking people into protection is not just impractical in a digital age, it is dangerous. This is the informational equivalent of obsessive disinfectant use — what will happen to that bubbled audience when “bad” information inevitably comes through? To say nothing of the consequences for democracy in such a managed information environment. After all, blocking access to information only makes it more coveted through the scarcity effect. And given that the people exposed to misinformation are seldom those who see the corrective content, questions remain about the utility of efforts to dispel and counter such campaigns. If we want to protect democracy, we must make citizens more resilient and able to discern good information from bad.

That last line is important, but I’ve seen a number of people pushing for more tech regulation mock this idea, rolling their eyes and saying “media literacy is not the answer.” I’m not convinced of that, in general, but Wanless has a slightly different approach. She’s not just talking about media literacy, but about the fact that the western world seems blind to the fact that we have spent decades pushing our own propaganda on the rest of the world, and they find it laughable when we complain about them doing the same, using the technology that we developed.

Of course, to do this, we in liberal democracies will have to come to terms with something we have avoided for a while: how much persuasion is acceptable and where are the lines? What is good versus bad information and who decides that? Who can legitimately use persuasion and when? … This is the elephant in the room — persuasion. And it is not enough to say that when we use persuasion it is acceptable, but when an adversary does so it is not.

Until we in the West come to terms with our awkward relationship with propaganda and persuasion, we cannot effectively tackle issues associated with the manipulation of the information environment. For far too long our aversion to persuasion has made us hypocrites, trying to disguise attempts at influencing other populations with various euphemisms (which also might explain why words are failing us now in describing the situation).

As she notes, we in the west can argue that US and western influence campaigns around the world were different from, say, Russian or Chinese influence campaigns these days, but it’s a distinction that doesn’t much matter to those pushing disinformation campaigns today. They see it all as the same thing.

She ends her piece with some suggestions on what to do — and I recommend going there to read them — but I’m still thinking a lot about how the internet has really held up a mirror to society, and we don’t really like what we see. But rather than recognizing that we need to fix society — or some of our political corruptions — we find it easier to blame the messenger. We find it easier to blame the tool that held up this mirror to society.

We can’t fix the underlying problems of society — including over-aggressive tribalism — by forcing tech companies to be arbiters of truth. We can’t fix underlying problems of society by saying “well, this dumb view should never be allowed to be shared” no matter how dumb. We fix the underlying problems in society by actually understanding what went wrong, and what is still going wrong. But fixing society is hard. Blaming the big new(ish) companies and their technology is easy. But it’s not going to fix anything at all if we keep denying the larger fundamental flaws and problems in society that created the conditions that resulted in the tech being used this way.



Comments on “We Should Probably Stop Blaming Technology For The Failings Of Human Beings”

Anonymous Anonymous Coward (profile) says:

Society, open or closed?

The symptom of ‘do something, anything, even if it’s wrong’, as displayed by politicians in search of votes rather than actual solutions to the problems, seems to be infectious. I don’t know whether they are infecting us or we infected them, but the ‘solutions’ presented often don’t resolve the intended issues.

Our government has been pushing agendas for many decades, often through the CIA, and often via incredibly bad notions of what might work, that not only fail in the long run but set examples of how others should go about pushing their agendas. We, at least so far as our government goes (and likely the rest of us as a whole), have a terrible time looking in the mirror for the causes of the effects we find not to our liking. And it is not just in accepting responsibility for our actions, but in identifying the sources of the newly created or exacerbated problems.

The response of hiding speech we don’t like is like security through obscurity, and those who take security seriously, and not just as a slogan, know that that method does not work. Not only is it likely that the information being blocked will be found and disseminated (it might no longer be on Facebook or Twitter or elsewhere public, but it is likely still on someone’s hard drive), but the long-term repercussions from that blocking will have a significant impact on how things are viewed in the future.

For us, at least in our democratic republic, change will come with difficulty, as it will be necessary to change parts of our current election and accountability model in order to achieve any substantive results. That won’t come easily, as those in power will wish to retain it in a manner that ensures ongoing power to those who have it, whereas the needed changes require that power be mitigated and accountability raised: an untenable result for the power hungry who tend to dominate the political arena.

Setting such an example, over unfortunately much too much time, might go a long way toward helping others in the world realize that the ideals of open vs. closed societies, with governments that work for the people rather than for their own retention, are better for the world in general, including their own parts of it. Even if it doesn’t do much to feed their ego trips.

Rastus Iconoclastus says:

Re: Re: Society, open or closed?

One key to an open society is open, transparent government. That would be a big improvement.

And HOW are you going to achieve that, "Gary"? In face of opposition by entrenched Rich and their corporations that employ masnicks by the hundreds to support their arbitrary power.

You need specific goals, kid, else are just whining.

The solution is to prevent growth of centralized power. Yes, even if "private corporation".

Break up corporations simply because TOO LARGE.

Confiscate "money" because TOO MUCH. (It’s not actually money or even income, just a notion, largely based on central bank making ever-bigger numbers. Nor will confiscating 99% of Bill Gate’s numbers reduce his personal luxury, only limit ability to do evil.)

The inevitable result of letting size and power grow is that The Rich enslave the rest.

Anonymous Coward says:

Re: Re: Re: Society, open or closed?

Break up corporations simply because TOO LARGE.

And HOW are you going to achieve that, MORON? When doing so requires actions by the same government that you just said is going to encounter "opposition by entrenched Rich and their corporations". Hm?

Also, how large is TOO large?

You need specific goals, kid, else are just whining.

Well, same to you. None of your solutions are any more specific than his.

Confiscate "money" because TOO MUCH.

How much is too much?

(It’s not actually money or even income, just a notion, largely based on central bank making ever-bigger numbers. Nor will confiscating 99% of Bill Gate’s numbers reduce his personal luxury, only limit ability to do evil.)

Oh so you can’t even confiscate it because it doesn’t exist in the first place. Do you even think before you roll your face on the keyboard? If it doesn’t exist and they don’t actually have it, how can they use it to do anything, good or evil?

Anonymous Coward says:

Re: Re: Re: CAPTCHA

Really? We’ve reached the point where it’s actually a big deal whether you have to do CAPTCHAs in certain scenarios?

Is it really so hard for you to click a check box? Or identify a few cars in pictures? Do you not know how important it is to protect your site/service against bots?

Also, I DON’T get a CAPTCHA when clicking on the link. So either the OP is lying or he’s behind a VPN or some other software/device that is doing out of the ordinary things with his web traffic, thereby causing him to be prompted for human verification. In either case, he has no reason to complain about it and should just suck it up and stop whining about it.

Rastus Iconoclastus says:

Done. Now stop letting a few failed humans inflict technology

to control the rest. That is: surveillance capitalism, every gadget spying, all giving NSA "direct access" as Snowden stated.

Whoever has power will abuse it. Period.

Just look at the weenies here at Techdirt gleefully exercising power to censor every opinion that they can’t bear, on a blog that solicits opinion! Sheesh. Those with any degree of power can always find excuse for its exercise.

The hint here of doing good is just a veil. Masnick wishes "technology" corporations to operate with NO control or taxation, without regard to The Public’s good. They know what’s best. They’re wise and benevolent. Let Them Rule.

sumgai (profile) says:

Re: Done. Now stop letting a few failed humans inflict technolog

Ya know, blueBalls, Techdirt does not solicit opinions, it merely allows them to be expressed – that’s quite a difference.

And to be sure, the corollary is that Techdirt does not require anyone to respect the expressed opinions of other posters. In fact, this place is a microcosm of a democratic society. We vote, and we live with the results – just like democracy is defined in the (any) dictionary. But feel free to express your dissent. Just keep in mind that the chances of your opinions swaying the rest of us are infinitesimal at best. (That last was my opinion, BTW – YMMV.)

Anonymous Cowherd says:

Media literacy is the answer for the people. Censorship is the answer for governments, who want to minimize rival propaganda without rendering the populace resistant to their own.

Inherent in the concept of censorship is the idea that someone gets to not only decide what the "truth" is, but also to dictate it to others.

Anonymous Coward says:

The "never-regulate" crowds strikes again, with their favorite tactic of misrepresenting the issue, and cherry-picking the argument they want to have rather than address the actual problem.

Social media technology does not cause bad actors or bad actions, and nobody is seriously promoting that argument– but that won’t stop the "never-regulaters" from claiming that’s what their opponents are doing, because it’s an argument they can beat. (Why look for your wallet where you dropped it, when the light is so much better under the street lamp?)

What social media actually does is amplify and intensify the harm caused by bad actors, effectively making bad actors more powerful and efficient at hurting people. The existence of a movement that insists we must do nothing to mitigate this amplification effect is baffling to me, all the more so because the only move in their playbook seems to be to insist that we must solve the problem of human nature once and for all before we should even consider regulations. Yes, why pursue harm reduction when we could insist that others perform the impossible first?

This is rather like refusing to pay one’s bills until after one wins the lottery. You might just as well insist that we not regulate the social media industry until after we’ve found a magic lamp and exhausted our three wishes first.

Roy Rogers says:

Re: Re: Re:

While politicians rail against "terrorist content," encryption, and the right for people to remain generally unmolested by their governments, they’re leaning hard on social media platforms to eradicate this content ASAP.
https://www.techdirt.com/articles/20190701/07060242499/removing-terrorist-content-isnt-helping-win-war-terror.shtml

By using your preferred method as opposed to actual regulation

Anonymous Coward says:

Re: Re:

The "never-regulate" crowds strikes again

No one anywhere has said that. Please do explain, though, how the government dictating what people can or cannot say, or can or cannot allow people to say on the internet does not directly and explicitly violate the First Amendment of the Constitution. Hm?

misrepresenting the issue

How so?

cherry-picking the argument they want to have rather than address the actual problem

Oh, so the problem is not that some people are jerks, not just IRL but also online? Because I thought that was what the problem was: a few people online are being jerks and making other people online uncomfortable. Is that not what started all this?

Social media technology does not cause bad actors or bad actions

Correct.

nobody is seriously promoting that argument

No, but they are promoting the argument that tech companies should be responsible for the bad actors and bad actions of others. Now who is misrepresenting the issue and changing the argument they want to have?

that won’t stop the "never-regulaters" from claiming that’s what their opponents are doing

No one is claiming that, not even in this article. In fact the article pretty explicitly states that they want to hold tech companies responsible, not that they are the cause of the issue. It’s just easier for them to do that than treat the actual issue.

because it’s an argument they can beat

Well since no one is arguing that then how do you know?

What social media actually does is amplify and intensify the harm caused by bad actors, effectively making bad actors more powerful and efficient at hurting people.

And obviously you didn’t read the article, since that is not far from what it says.

The existence of a movement that insists we must do nothing to mitigate this amplification effect is baffling to me

So do you support registration for all authors then? Only government-approved authors can write and publish books? Do you also support banning door-to-door campaigning for special interest issues, sidewalk protests, email campaigns, bullhorns, roadside billboards, phone calls, etc.? Because all of those are amplifying methods. Yes, they aren’t on the same scale as the internet, but they still amplify speech, and you are apparently opposed to all amplification methods, so therefore you are against all these other methods too, right?

all the more so because the only move in their playbook seems to be to insist that we must solve the problem of human nature once and for all before we should even consider regulations.

When those regulations infringe on the First Amendment rights and freedoms of speech for every single other person, then yes, abso-frigging-lutely.

Yes, why pursue harm reduction when we could insist that others perform the impossible first?

Why must that harm reduction ONLY be to quash everyone else’s freedom of speech?

This is rather like refusing to pay one’s bills until after one wins the lottery

And your suggestion is like saying the person who won the lottery not only has to keep paying their bills but also pay the bills of everyone in the country or have the money taken away from them.

You might just as well insist that we not regulate the social media industry until after we’ve found a magic lamp and exhausted our three wishes first.

And you might as well just admit that you are either ignorant of the First Amendment (it applies only to the government, and forcing tech companies to tell people what they can or cannot say online is the DEFINITION of government-regulated speech, which it expressly prohibits), or you are intentionally and deliberately misrepresenting the issues and facts at hand for whatever reason.

People are jerks and say some pretty awful things. The solution is not to run around and throw duct tape over everyone’s mouths. Nor does it follow that, in a room full of people, we all have to be forced to listen to the idiot in the corner yelling expletives and derogatory remarks at everyone, rather than booting them out the door.

Anonymous Coward says:

Re: Symptoms vs. root-cause

The thing you’re missing here is that regulating the Internet is but a palliative, aimed at the symptoms (speech folks "don’t like", disinformation) of deeper-seated social problems (polarization, radicalization, our unhealthy relationship with both ends of the word "propaganda", lack of transparency, lack of the education and understanding needed for that transparency to be useful).

While we certainly won’t solve the problems of human nature in one go, there are tools available to address facets of those problems at a human level. It is now a matter of having the political will to apply those tools effectively instead of reaching for an illusory "Easy" button that will simply blow up in the face of those who push it…

Mike Masnick (profile) says:

Re: Re:

The "never-regulate" crowds strikes again

Where? I am not part of the "never regulate" crowd, so if you’re implying I am, well, you’re wrong. I am, however, a part of the "regulations, poorly implemented, for bad reasons, probably do more harm than good" crowd, and because of that tend to think that when we do regulate, we should look closely at what is the actual harm being dealt with and whether or not the regulatory proposals will solve it. So perhaps I’m in the "when we regulate, we should do it in an evidence-based fashion" crowd. Sue me.

favorite tactic of misrepresenting the issue, and cherry-picking the argument they want to have rather than address the actual problem.

What is the actual problem and how will regulations solve it?

What social media actually does is amplify and intensify the harm caused by bad actors, effectively making bad actors more powerful and efficient at hurting people.

That is the theory. Can you support it? The evidence on this seems mixed. There are some cases of this happening. There are also cases of social media being used to stop bad actors. And I’m curious to see how any regulation can stop one, but not the other, and how it can be done in a way that doesn’t stifle other useful speech. If you have an actual plan for that, let me know.

The existence of a movement that insists we must do nothing to mitigate this amplification effect is baffling to me

I’m not saying do nothing. I’m saying if you want to "stop bad people from doing bad stuff because it will be amplified" you should at least be able to show (1) that the bad stuff is actually amplified by social media, (2) that stopping it is possible, (3) that stopping it can be done in a way that doesn’t have greater negative consequences, and (4) that it can be done in a way that doesn’t stifle innovation or other speech. Help me out here.

Yes, why pursue harm reduction when we could insist that others perform the impossible first?

I’m not asking for the impossible. I’m asking to weigh the trade-offs. That never seems to happen.

This is rather like refusing to pay one’s bills until after one wins the lottery.

Wait. Asking for evidence to support that (a) there is a problem, (b) that this solution will solve that problem and (c) that this solution won’t create worse problems is like what now? Uh… no. It’s not.

Anonymous Coward says:

Re: Re:

Prove how social media amplifies the (claimed) harm of bad actors. Then feel free to create and assess some evidence-based regulation, removing it if it doesn’t fill its purpose or creates more harm than it supposedly fixes.

Then take your regulation to meatspace, where the sort of harms supposedly amplified by social media actually take place. Have fun with that.

michael (profile) says:

Information literacy

What both this author and Wanless are talking about here is "information literacy," a concept that goes back to the ’70s and has been a major talking point in higher ed — particularly in information science — for over a decade.

Perhaps more people would pay attention to it if those newcomers espousing its virtues bothered to use the correct terminology.

https://en.wikipedia.org/wiki/Information_literacy

ECA (profile) says:

Wow, really?

"I’ve been thinking a lot lately about how so many of the "problems" that people bring up with regards to the internet these days — much of them having to do with disinformation, misinformation, propaganda, etc. — are really manifestations of the problems of people in general,"

Nice comment..
The human condition is absurd.
We believe everything we read is ‘LOCAL’, not just Over There.
We try to believe everyone is like us, and telling us the truth..
We can’t understand the difference between concepts/ideals/reality or ‘what ifs’..
The more we see something on TV, the more we absorb it over time, and it’s NOT REAL.. There are no people inside your TV.
The thought that an Actor ISN’T what he/she is representing on screen is a bit alien to some people.
Unless shown How to Think and criticize, we tend to believe everything. Out-of-the-box, sideways thinking just doesn’t happen.

Don’t get upset with me.. IMHO..
I’ve told others how I treat people, unless they show me better.. I treat persons like I would an animal, and I love animals. But I understand that I have to do things simply. I will pat them on the head when they do well, and I will tap them on the nose if they do something wrong. It’s the thought that everyone wants to feel good, and always looks for acceptance and that little bone we give our dogs for being good. This really works with more people than I ever thought.
If a person shows me they can think A BIT, I will (tend to) add more to the way I treat them. I will see if they can create NEW thoughts and ideas, and I will LISTEN to them and see how they relate.
Then those that do really well, and that I see are smarter than usual.. I make my friends..
I like discussions and comments, and ideas, but I need them to explain their sides.
Many people have never needed to THINK about what they think. Too many believe what they are told, like an abused dog; that’s mean… A few pats on the head and a bone and they will follow you forever..

Anonymous Coward says:

Re: Re:

Technology is a failing of humanity…

So all the advances in medicine, communication, science, astronomy, etc… that have greatly improved life for humanity are all worthless, and we should all go back to the stone age?

Please, just stop.

to be happy. Go outside. Enjoy the world. Disconnect. It’s within your grasp and needs no IP.

Going outside is not the only thing that can make people happy. Many people derive happiness from lots of technology related things, like air conditioning, central heating, video games, hobby electronics, etc…

Not everyone is like you, no matter how much you want it to be true.

Lawrence D’Oliveiro says:

You Can’t Ignore Technology

Technology is very much a part of just about every problem we face today. It can also be part of the solution. When a technology is bad, we fix it or replace it with something better. For example, clamping down on the spread of guns, or polluting and unsafe cars, or lead paint, or any other harmful product. It’s just the smart thing to do.

When Internet companies become part of the problem, then yes, we need to make them modify their behaviour, so they instead become part of the solution. If they can’t or won’t do that, then they need to go.

Anonymous Coward says:

Re: You Can’t Ignore Technology

You haven’t explained how internet companies are part of the problem. The spread of guns, polluting/unsafe cars, lead paint, and other harmful products are harmful intrinsically. Internet companies are not, so why should they be regulated as such?

Gerald Robinson (profile) says:

Right on, old, old problems

I totally agree with "If we want to protect democracy, we must make citizens more resilient and able to discern good information from bad." ! We have a bad systemic problem, so regulate the thing! Don’t try to change/regulate people; tech regulation is the answer! This showed up with the early automobile and continues today. Dishonest hoplophobes push for "gun control" that won’t work. Around 90% of mass facility shootings occur in gun-free zones (FBI data), so eliminating gun-free zones will prevent mass shootings!? Then, when the penny broadsheets became newspapers, there were efforts to regulate them; those efforts continue today with the shutdown of ‘Backpage’. Any democratic form of government relies on a discerning, intelligent population! In the U.S. and Europe that has been destroyed by an education system which demands conformity, not critical thought. Regulating tech is futile; just look at the auto death rate, and cars have been regulated heavily for over a century.
