Most Information About Disinformation Is Misinformation

from the disinfo-about-disinfo dept

Reporter Joseph Bernstein recently published a fantastic cover story in Harper’s all about “disinformation” and “fake news” — but not in the way you normally think about it. Unlike most such articles, which decry just how much disinformation is flowing out there, it takes a very critical eye to how we (especially the media) talk about such things. The piece is thought-provoking and well worth reading, and I’ve spent the last week or so letting it sit and percolate in my head before writing up this post about it.

Right after the 2016 election, there was a flurry of hand-wringing from the media, trying to understand how what they were positive would happen (a Hillary Clinton victory and a Donald Trump loss) didn’t actually happen. A convenient scapegoat for this surprising turn of events was… Facebook. The narrative took hold that it was “fake news” on Facebook that convinced a bunch of gullible people to support a clearly unqualified candidate. That this convenient scapegoat also happened to be successfully siphoning advertising dollars away from some traditional media organizations only made pointing fingers at it feel even better. However, as we warned at the time, focusing on social media and “fake news” was not just silly, but potentially counterproductive. Indeed, within weeks, authoritarians around the world started adopting the term “fake news” as a convenient excuse for censoring the media. And, obviously, it became a key part of Donald Trump’s stump speech as well.

It wasn’t long until “fake news” was used against any content someone in power didn’t like, and it became a key tool to push for censorship of those who were actually exposing malfeasance.

Bernstein’s article highlights how this same sort of thinking is happening with the term “disinformation.” Of course, disinformation doesn’t have a clear definition, and often it’s in the eye of the beholder (like “fake news” before it). But, the media (and many politicians) have become so obsessed with “disinformation” that, once again, we’ve turned it into a kind of moral panic — and a convenient one for censors around the globe. As Bernstein notes, the hue and cry over “disinformation” has made many Americans think it’s one of the biggest threats around:

Everyone scrounges this wasteland for tainted morsels of content, and it’s impossible to know exactly what anyone else has found, in what condition, and in what order. Nevertheless, our American is sure that what her fellow citizens are reading and watching is bad. According to a 2019 Pew survey, half of Americans think that “made-up news/info” is “a very big problem in the country today,” about on par with the “U.S. political system,” the “gap between rich and poor,” and “violent crime.” But she is most worried about disinformation, because it seems so new, and because so new, so isolable, and because so isolable, so fixable. It has something to do, she knows, with the algorithm.

An important (and often overlooked!) point that Bernstein makes in the piece is that the big internet companies were pretty quick to embrace this idea that “disinformation” is a problem. It is true that Mark Zuckerberg initially pushed back on the idea, but after basically everyone attacked him for that, he quickly began his apology tour. And, why not? Even as it makes the company look bad in the narrative, at its core, the idea that disinformation on Facebook impacted the American election can be spun to be positive for Facebook. After all, if Facebook is so powerful, shouldn’t you be advertising on it, Mr. Toilet Paper maker? If Facebook can help get Trump elected because of some memes posted by idiots, just think how much beer and nachos it can sell as well. Facebook has a vested interest in having people believe that disinformation works on its platform, because that’s literally something the company can profit from.

Denial was always untenable, for Zuckerberg in particular. The so-called techlash, a season of belatedly brutal media coverage and political pressure in the aftermath of Brexit and Trump’s win, made it difficult. But Facebook’s basic business pitch made denial impossible. Zuckerberg’s company profits by convincing advertisers that it can standardize its audience for commercial persuasion. How could it simultaneously claim that people aren’t persuaded by its content? Ironically, it turned out that the big social-media platforms shared a foundational premise with their strongest critics in the disinformation field: that platforms have a unique power to influence users, in profound and measurable ways. Over the past five years, these critics helped shatter Silicon Valley’s myth of civic benevolence, while burnishing its image as the ultra-rational overseer of a consumerist future.

Behold, the platforms and their most prominent critics both proclaim: hundreds of millions of Americans in an endless grid, ready for manipulation, ready for activation. Want to change an output—say, an insurrection, or a culture of vaccine skepticism? Change your input. Want to solve the “crisis of faith in key institutions” and the “loss of faith in evidence-based reality”? Adopt a better content-moderation policy. The fix, you see, has something to do with the algorithm.

But… is it actually true? Are most Americans actually gullible suckers who will fall for any piece of made-up nonsense that gives them a dopamine hit? As Bernstein’s piece explores, the American advertising industry has spent decades pushing this very notion — even putting forth various industry-supported studies to “prove” it. Whether those studies are accurate portrayals of reality is another question, but if they’re not, that creates quite a paradox: the disinformation from questionable studies, about how easy it is to manipulate people into believing things, itself worked. So even if the studies are inaccurate, there’s at least one example of their own “disinformation” having an impact. And, as Bernstein does highlight, much of that early research on how easy it is to manipulate people with ads was… highly questionable.

The profitable relationship between the ad industry and the soft sciences took on a dark cast in 1957, when the journalist Vance Packard published The Hidden Persuaders, his exposé of “motivation research”—then the bleeding edge of collaboration between Madison Avenue and research psychology. The alarming public image Packard’s bestseller created—ad men wielding some unholy concoction of Pavlov and Freud to manipulate the American public into buying toothpaste—is still with us today. And the idea of the manipulability of the public is, as Arendt noted, an indispensable part of the product. Advertising is targeted at consumers, but sold to businesses.

Packard’s reporting was based on what motivation researchers told him. Among their own motivations, hardly hidden, was a desire to appear clairvoyant. In a late chapter, Packard admits as much:

Some of the researchers were sometimes prone to oversell themselves—or in a sense to exploit the exploiters. John Dollard, [a] Yale psychologist doing consulting work for industry, chided some of his colleagues by saying that those who promise advertisers “a mild form of omnipotence are well received.”

So, the public (and the media and politicians) have been primed — with disinformation — to believe that disinformation works. And the big internet companies have their own vested interest in continuing that belief, even if it’s not quite true.

The media narrative of sinister digital mind control has obscured a body of research that is skeptical about the effects of political advertising and disinformation. A 2019 examination of thousands of Facebook users by political scientists at Princeton and NYU found that “sharing articles from fake news domains was a rare activity”—more than 90 percent of users had never shared any. A 2017 Stanford and NYU study concluded that

if one fake news article were about as persuasive as one TV campaign ad, the fake news in our database would have changed vote shares by an amount on the order of hundredths of a percentage point. This is much smaller than Trump’s margin of victory in the pivotal states on which the outcome depended.

But, of course, part of the problem is that — like “fake news” before it — there is no easy definition of “disinformation,” and thus the research on it isn’t always talking about the same thing.

The most comprehensive survey of the field to date, a 2018 scientific literature review titled “Social Media, Political Polarization, and Political Disinformation,” reveals some gobsmacking deficits. The authors fault disinformation research for failing to explain why opinions change; lacking solid data on the prevalence and reach of disinformation; and declining to establish common definitions for the most important terms in the field, including disinformation, misinformation, online propaganda, hyperpartisan news, fake news, clickbait, rumors, and conspiracy theories. The sense prevails that no two people who research disinformation are talking about quite the same thing.

This will ring true to anyone who follows the current media discussion around online propaganda. “Misinformation” and “disinformation” are used casually and interchangeably to refer to an enormous range of content, ranging from well-worn scams to viral news aggregation; from foreign-intelligence operations to trolling; from opposition research to harassment. In their crudest use, the terms are simply jargon for “things I disagree with.” Attempts to define “disinformation” broadly enough as to rinse it of political perspective or ideology leave us in territory so abstract as to be absurd.

Another key point that Bernstein highlights is why “disinformation” is such a convenient term for the traditional media. After all, for decades they’ve pushed the idea that they’re the ones pushing “objective truth” and taking a view from nowhere. So they’re the ones who are supposed to save us from disinformation. When you’ve positioned yourself as the gatekeeper for truth, it helps to play up the constant threat of untruths. As Bernstein notes:

A quick scan of the institutions that publish most frequently and influentially about disinformation: Harvard University, the New York Times, Stanford University, MIT, NBC, the Atlantic Council, the Council on Foreign Relations, etc. That the most prestigious liberal institutions of the pre-digital age are the most invested in fighting disinformation reveals a lot about what they stand to lose, or hope to regain. Whatever the brilliance of the individual disinformation researchers and reporters, the nature of the project inevitably places them in a regrettably defensive position in the contemporary debate about media representation, objectivity, image-making, and public knowledge. However well-intentioned these professionals are, they don’t have special access to the fabric of reality.

And they sure do seem eager to put in place official arbiters of truth:

Still, Big Disinfo can barely contain its desire to hand the power of disseminating knowledge back to a set of “objective” gatekeepers. In February, the tech news website Recode reported on a planned $65 million nonpartisan news initiative called the Project for Good Information. Its creator, Tara McGowan, is a veteran Democratic operative and the CEO of Acronym, a center-left digital-advertising and voter-mobilization nonprofit whose PAC is funded by, among others, Steven Spielberg, the LinkedIn co-founder Reid Hoffman, and the venture capitalist Michael Moritz. The former Obama campaign manager David Plouffe, currently a strategist at the Chan Zuckerberg Initiative, is an official Acronym adviser. Meanwhile, a February New York Times article humbly suggested the appointment of a “reality czar” who could “become the tip of the spear for the federal government’s response to the reality crisis.”

If you step back, you can absolutely see the thinking here, though it’s difficult to see any path by which it is actually effective. It only makes sense if it really is true that false info on Facebook fried many people’s brains — rather than a much larger, much more complex set of variables all contributing to large segments of society being open to crazy ideas and conspiracy theories. What’s notable in all of this is how ready many people are to believe that a meme on Facebook magically convinced their aunt to believe in the dumbest conspiracy theories, while they, themselves, are somehow miraculously immune to such things.

None of this is to argue that conspiracy theories and those who push them are not a problem. Clearly they’re part of the problem, but we’re so focused on the symptoms we can see, rather than the underlying causes, that we get wrapped up in bad ideas that sound good as a narrative but will have no real impact as a solution.

For years I’ve tried to highlight that what we see — what Facebook and other social media have exposed — is often the consequence of huge societal failings. There are problems with education, with social safety nets, with healthcare (especially mental healthcare). There are problems with income inequality and corruption. There are tons of problems out there, and many of them manifest themselves through false information that people share online. But saying that the “disinformation” is the problem — rather than a way in which the underlying problems show themselves — misses the point entirely.

And Bernstein’s article really does a great job calling this out. Is disinformation real? Sure, though what it actually is remains amorphous. But the focus on treating disinformation as the problem, rather than simply an exposed symptom of a much deeper, much more complex, much more troubling underlying societal ill, is missing the point.

There’s a lot more in Bernstein’s piece, and I really recommend reading the whole thing. He doesn’t necessarily come to the same conclusion I come to here, but it really includes a lot of useful thinking about how we’ve over-indexed on disinformation as the problem when it’s not at all clear that it really is the problem. Indeed, it’s easy to come away from the article and realize that there’s an awful lot of, er… disinformation about disinformation. The problem is not Facebook. The problem is that Facebook is shining a light on a whole bunch of other terrible shit.

Only certain types of people respond to certain types of propaganda in certain situations. The best reporting on QAnon, for example, has taken into account the conspiracy movement’s popularity among white evangelicals. The best reporting about vaccine and mask skepticism has taken into account the mosaic of experiences that form the American attitude toward the expertise of public-health authorities. There is nothing magically persuasive about social-media platforms; they are a new and important part of the picture, but far from the whole thing. Facebook, however much Mark Zuckerberg and Sheryl Sandberg might wish us to think so, is not the unmoved mover.

For anyone who has used Facebook recently, that should be obvious. Facebook is full of ugly memes and boring groups, ignorant arguments, sensational clickbait, products no one wants, and vestigial features no one cares about. And yet the people most alarmed about Facebook’s negative influence are those who complain the most about how bad a product Facebook is. The question is: Why do disinformation workers think they are the only ones who have noticed that Facebook stinks? Why should we suppose the rest of the world has been hypnotized by it? Why have we been so eager to accept Silicon Valley’s story about how easy we are to manipulate?

None of this means that manipulation never works, or that people are not persuadable. Of course people are persuadable. But not in the way many people think. It’s not a random meme that makes people embrace crazy ideas about vaccines or 5G. There needs to be much deeper groundwork in place for people to be susceptible. And, as Bernstein notes, perhaps the most manipulated are those who feel they need to believe in this story of “disinformation” to explain away all of their other failures to build a better society:

Indeed, it’s possible that the Establishment needs the theater of social-media persuasion to build a political world that still makes sense, to explain Brexit and Trump and the loss of faith in the decaying institutions of the West. The ruptures that emerged across much of the democratic world five years ago called into question the basic assumptions of so many of the participants in this debate—the social-media executives, the scholars, the journalists, the think tankers, the pollsters. A common account of social media’s persuasive effects provides a convenient explanation for how so many people thought so wrongly at more or less the same time. More than that, it creates a world of persuasion that is legible and useful to capital—to advertisers, political consultants, media companies, and of course, to the tech platforms themselves. It is a model of cause and effect in which the information circulated by a few corporations has the total power to justify the beliefs and behaviors of the demos. In a way, this world is a kind of comfort. Easy to explain, easy to tweak, and easy to sell, it is a worthy successor to the unified vision of American life produced by twentieth-century television.

But I’d take that even a step further. Many in the “Establishment” who are pushing this are also the people whose previous policies failed. The reason we have so many societal problems is that their beliefs about their own policy powers, and what they could accomplish, did not work out the way they expected. The world did not progress the way they planned. And thus, pushing the “oh, it’s social media and disinformation” button not only gives them a convenient story, it also absolves them of their own failures.

I recognize that this can be read as saying that Facebook isn’t a problem, or that “disinformation” and false info aren’t a problem. But that’s not what I’m saying at all (and I’m pretty sure it’s not what Bernstein is saying either). It’s just that the world is a lot more complicated than that. What we’re seeing on Facebook and the flow of disinformation is a problem — but it’s not a problem you solve by sweeping it under the rug. It’s a mirror on the real underlying societal problems the world faces — which we should be talking about and trying to come up with better solutions for, rather than insisting that Facebook could make it all go away if only it had a better algorithm or better employees.



Comments on “Most Information About Disinformation Is Misinformation”

29 Comments

This comment has been flagged by the community.

Koby (profile) says:

It Can All Make Sense

Of course, disinformation doesn’t have a clear definition

Disinformation is any political opinion that you would prefer to censor rather than explain.

but it’s not a problem you solve by sweeping it under the rug. It’s a mirror on the real underlying societal problems the world faces — which we should be talking about and trying to come up with better solutions for, rather than insisting that Facebook can make it all go away

You’re definitely correct, and it should. But some folks don’t have good answers or solutions. Particularly if previously respected institutions have been given a chance, but lost credibility. So the fallback is to create gatekeepers and stamp out competing opinions.

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Anonymous Coward says:

Re: It Can All Make Sense

Disinformation is any political opinion that you would prefer to censor rather than explain.

Then please, for the love of god, tell us what are all these conservative political opinions that are being censored which are also the strongest opinions! Does that mean lower taxes? Smaller government? Just tell us what conservative opinions are being censored.

Also, tell us who is being banned from social media for espousing those opinions! You have unequivocally stated that conservatives are being banned for their conservative opinions, but have yet to provide ONE SINGLE EXAMPLE!

You love to talk the talk, but you never show us anything other than the most vague talk of ‘conservative opinions’.

Why do you refuse to provide any detailed examples of this censorship you are so convinced is happening? At least that would give us a starting point for a discussion, but nope, nothing but silence from you!

Makes one believe that you are just making stuff up to fit some narrative about how persecuted you conservatives have become. Always playing the victim.

Sara says:

Re: Re: It Can All Make Sense

Out the gate you make sure you let everyone know your political identity and a worldview of victimhood, with an intellectual basis of us vs. them and the trust of politicians or political parties. You don’t see a world of diversity affected by millions of nuances and experiences unique to individuals, based on infinite environmental and genetic factors. Your statement dehumanizes a large demographic of our country. And almost every partisan motivation from either side is the promise that 50% of the country is oppressed by one of the two political regimes. It’s promised to the salivating constituents on both sides of the political duopoly.

Are you capable of any independent thought?

CQ says:

Re: Re: It Can All Make Sense

…DISINFORMATION is synonymous with PROPAGANDA.

Propaganda is a very useful and widely understood term that’s been around for 3 centuries.
There was no need to invent a new replacement word for it.
This new trendy word DISINFORMATION obviously just confuses people.

Most people believe their viewpoints are factual information, not opinion. It’s human nature: people overestimate their intellect.
Politicians and media types way overestimate their intellect and knowledge.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: It Can All Make Sense

Disinformation is any political opinion that you would prefer to censor rather than explain.

Funny that it’s coming from you, of all people: despite repeated requests to explain which opinions are being censored, you refuse to engage with the question, forcing ALL of us to infer from your past comments. You disingenuous neo-Nazi.

Particularly if previously respected institutions have been given a chance, but lost credibility. So the fallback is to create gatekeepers and stamp out competing opinions.

So who are these "respected institutions"? Rupert Murdoch’s media empire, inclusive of HarperCollins, Zondervan, and National Geographic in the US, Sky News and The Sun in the UK, and pretty much the entire Australian media? Koch Industries and their associated trusts, charities, and think tanks? Turning Point USA? The Ku Klux Klan?

If you can’t tell us who, then don’t try to pretend to be reasonable. Stop engaging in bad faith.

Mark Gisleson (profile) says:

You can't sell ads anymore

If disinformation worked, newspapers would still be able to sell advertising. Newspapers are still delivering eyeballs, but advertisers have finally figured out that they weren’t getting squat back on their investment.

2016 made it convenient for the neoliberals to agonize over fake news while at the same time spearheading insanely aggressive warmongering lies about Russia electing Trump.

The Durham Report is coming out. It’s been delayed because the establishment has stalled, refusing to be held accountable for their grotesque lies. If you bought into Russiagate, you don’t get to point fingers at others for getting suckered by the fast talking liars who dominate the news cycles these days.

Both sides lie. A lot. It’s time to let go of the things of your youth, and the duopoly should be at the top of that list.

Anonymous Coward says:

One of the biggest problems with the disinformation angle is the odd refusal to look anywhere outside the contemporary internet social media, and maybe occasionally traditional media. People have been making shit up since forever, and large swathes of other people choose to believe it wholeheartedly. (While still others may use a phenomenon cynically.) The US has historically been pretty damn good at (relatively) rapid-fire bullshit belief system creation.

No one mentions this stuff.

ECA (profile) says:

Everything.

First I look for a point of view: how, what, and why they would think that.
Then I look at the situation, and how they got to this conclusion.
Then I look for key words: they might have, they could have, it was probably, on and on.
Then I ask: who told you that, where did you see that, where did you READ that?

If any of it sounds a bit off, I try to look it up and find a source.
FB had an advert(?) and I followed it and it was very interesting, but it linked back to RT.COM.
And I saw all these conspiracies and posts from other conspiracy sites. I then reported it TO FB as fake news. Tons of it.

Opinion is nice, but TELL me it’s opinion. Then we can debate the issues. If you just post something as FACT(?), I want a bit of substantiation. Following a line of who told who what and passed it on? There is a game like that.

Pat Aufderheide (profile) says:

Disinformation

I thought Cory Doctorow did a good job of addressing some structural issues on this with How to Destroy Surveillance Capitalism. And I thought Bernstein’s piece really played up the contrarian “disinformation isn’t what you think it is, and not as bad as you think it is” angle, while barely addressing the way in which social media affordances fan the flames, to an unprecedented degree, of the grievance-and-distrust mindset that has been set in motion for so many other reasons, and how those affordances have been actively leveraged by all the forces trying to capitalize on that mindset for their own purposes. I hope this piece doesn’t get repurposed to dismiss the poisonous pollution of discourse with tools never before available. And I hope people pay attention to key lessons from Benkler, Faris, and Roberts’ Network Propaganda. Fox News is an analog, old-fashioned mass-media major actor here.

This comment has been flagged by the community.

Fearless Zombie Hunter says:

Re: Re: Been looking for NINE YEAR GAP ZOMBIES! FOUND ONE!

The mysterious unexplained and indeed inexplicable gaps of the obvious astro-turfing with ancient "accounts" here seem to top out at 8 except for two with 12 years. But here’s the missing zombie! — Now if can just find ten- and eleven-yearers, my collection will be complete.

Pat Aufderheide: 13 (1), NINE YEAR GAP! Nov 9th, 2009 https://www.techdirt.com/user/paufder — 3 of its comments have same title and mostly same text!

Anonymous Coward says:

Re: Re:

The problem is that people can’t properly think in the first place.

As for the distrust of official information (rather selective distrust, I might add): that depends on who is in charge of an office, and on whether the official information is accurate, bad, or deliberate lies.

Unfortunately, this colors the perception for some of all official or expert information. And then they just substitute their own "experts".

Anonymous Coward says:

Re: Re: Re:

The problem is that people can’t properly think in the first place.

The problem is that people don’t have critical thinking skills.

When they see something posted on Facebook, no matter how conspiratorial, outlandish, beyond reason, or just plain fucking crazy talk, if it’s somebody they know and trust who posts it, then they are apt to believe it without question, disregarding what the experts or otherwise authoritative outlets state.

ECA (profile) says:

Re: Re: Re: Re:

Not quite.
It’s all the stuff.
It’s the 1st Amendment without any amendments for truth, facts, or anything else that could make sense.
How can a person judge when you can’t get facts, and then you get BS piled as high as it can go?
It’s the strangeness of TV: sit in front and BELIEVE that every crook is in jail. Don’t look at the dates and times things happened; it was yesterday. Everything happens NOW.
Like reading only one newspaper: all you get is what you get. Don’t look at other critical papers or news. Don’t compare and think about something. It’s SUPPOSED to be true if it’s published.

The problem ends up being facts-optional opinions. Ask someone if Trump got a shot; they don’t really understand that HE DID, after he went to the hospital. Listen to Fox, and don’t ASK if they had their shots.
If we were as critical about the politics in this nation as we are about who we watch or read for news/info/entertainment, that little old lady who thinks soap operas are REAL might understand things a bit better.

Anonymous Coward says:

Re: Re: Re: Re:

You’ve got that backwards. People don’t come to believe crazy things when someone they trust says it. Instead, they come to trust sources that say things they already believe or want to believe.

It’s easy to affirm people’s existing beliefs. Challenging them is much harder, and almost always elicits a negative reaction.

Anonymous Coward says:

It’s irritating how much good news has come to cost, while the low quality sites that churn out garbage quickly remain free. Just clicking random sources on Wikipedia, there are so many paywalls to go with the dead links. It’s like college. Want to learn? Bust out that credit card. I understand why this has happened and that the alternative is potentially even further consolidation of news. But being informed costs a fair bit of money or a lot of effort to freeload. It’s an odd moral debate over whether or not it’s right to take news for free if you truly can’t afford it. I’d say if I subbed to every site I use at least weekly I’d probably be over $300 a month. That’s just not reasonable. I pay for the local paper, but not national sources.

@b says:

I think we are talking about metacognition, not information

I see many great points made under this article according to my current feelings on this topic.

Are they really great points? Who can say. Who to trust.

Plenty of information out there. Tomorrow you will see a “meme about a vaccine.” That info is merely the outcome of primate thinking. What we would love to know is whether that thought was rational or madness. Probably in between. But any meme that suggests it’s up to us to “do the research” is madness. Instead, go ask somebody you trust.

This is why religious nonsense clusters in families. Primates trust their parents and neighbours and holymen.

Anonymous Coward says:

There is great exceptionalism in people claiming others are gullible morons unable to think for themselves, being controlled by magic words on the internet while they themselves are immune to this by virtue of their superior intellect or something.

But no, we’re all only human. If the internet could control people’s thoughts, it would’ve got you too.

Facebook wants to act like targeted advertising can MAKE people buy whatever it is their clients are selling, because that’s what pays their bills. In reality, it’s merely finding (or trying to find) people who already want to buy those things.

nasch (profile) says:

Re: Re:

There is great exceptionalism in people claiming others are gullible morons unable to think for themselves, being controlled by magic words on the internet while they themselves are immune to this by virtue of their superior intellect or something. But no, we’re all only human.

It is also not reasonable to suppose that nobody is more or less gullible than anyone else. Or that everyone is equally prone to believe in conspiracy theories. And so on.

Anonymous Coward says:

Perhaps the answer lies in the organization of the process. Religions developed a very interesting process to gather people, deliver content, and modify the thoughts and behaviour of the masses. So have political parties, though less efficiently (except for some, like the Nazis, who in recent times convinced a lot of people that mass genocide is OK). My question is about the organized view of the process run by Facebook, and in particular about the measurable inputs/outputs describing how people’s behaviour changes over time. How can you dismiss the possibility that Facebook is running an organized behavioral experiment if you cannot see the data? Facebook shuts down access for researchers in this domain whenever they try to use legal tools, with the consent of the public, to shed some light. Isn’t that automatically indicative that something is going on, and that the only open questions are its magnitude and purpose?
