Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Can An Open Encyclopedia Handle Disinformation? (2005)

from the disinfo-on-an-open-platform dept

Summary: Wikipedia was founded in 2001, and the open encyclopedia that anyone could edit grew much faster than most people (including its founders) expected. In 2005, one of the first big controversies over disinformation on Wikipedia arose when journalist and political figure John Seigenthaler wrote an article in USA Today calling out claims on Wikipedia that, among other things, he was involved in the assassinations of both John F. Kennedy and Robert F. Kennedy (for whom he worked for a time).

The entry in question read:

John Seigenthaler Sr. was the assistant to Attorney General Robert Kennedy in the early 1960s. For a short time, he was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.

John Seigenthaler moved to the Soviet Union in 1972, and returned to the United States in 1984.

He started one of the country’s largest public relations firms shortly thereafter.

Two months before publishing the USA Today article, Seigenthaler reached out to Wikipedia cofounder Jimmy Wales and requested that his entry be deleted. It was, but other Wikipedia users soon brought it back, leading Seigenthaler to publish the story.

Seigenthaler referred to the entire incident as "character assassination," and it suddenly put Wikipedia's fact-checking under intense scrutiny, leading some to question whether the site could even survive.

Seigenthaler's article raised many questions about Wikipedia, how it was edited, and how reliable it was as a research tool. For years, Seigenthaler insisted he wouldn't edit his own entry, because doing so would lend legitimacy to a service he wanted no part of.

The controversy received widespread media coverage and raised many questions about how Wikipedia could and should deal with misinformation on its platform while staying true to its principles and its concept as an encyclopedia built on anyone's contributions.

Decisions to be made by Wikipedia:

  • Should it step in and edit Seigenthaler's entry?
  • Should it block attempts to restore the false information?
  • Should it block further edits and lock down Seigenthaler's page?
  • Should it change its overall policies regarding open editing?

Questions and policy implications to consider:

  • Are there ways to limit "vandalism" on an open system?
  • Were certain pages at greater risk of false information than others? If so, how should that difference be handled?
  • Will any changes decrease the trust people have in Wikipedia?
  • Will any changes increase the trust people have in Wikipedia?

Resolution: In some ways the controversy solved the "problem" itself, in that the widespread attention led many Wikipedia editors and users to make sure Seigenthaler's entry was accurate. However, it also prompted Wikipedia to implement some wider changes, including restricting the ability to create new pages to registered users and creating a new editing policy on "biographies of living persons." A few years later, it tightened the policy further to push for more thorough review of edits to pages about living individuals.
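
To make the mechanics of those changes concrete, here is a minimal sketch, in Python, of how a registered-only page-creation rule and a "pending changes"-style review gate for biographies of living persons might look. The names, the 50-edit threshold, and the logic are all invented for illustration; Wikipedia's actual MediaWiki implementation works differently and is far more involved.

```python
# Hypothetical sketch only: names, thresholds, and logic are invented for
# illustration and do not reflect MediaWiki's actual implementation.
from dataclasses import dataclass

@dataclass
class Editor:
    registered: bool
    edit_count: int  # crude proxy for an editor's standing

@dataclass
class Page:
    title: str
    is_blp: bool  # page is a biography of a living person

def can_create_page(editor: Editor) -> bool:
    # Post-Seigenthaler change: only registered users may create new pages.
    return editor.registered

def edit_publishes_immediately(editor: Editor, page: Page) -> bool:
    # "Pending changes"-style gate: on BLP pages, edits by unregistered or
    # very new editors are held for review rather than going live at once.
    if page.is_blp and (not editor.registered or editor.edit_count < 50):
        return False  # queue the edit for reviewer approval
    return True

# An anonymous editor touching the Seigenthaler biography would be gated:
anon = Editor(registered=False, edit_count=0)
blp_page = Page(title="John Seigenthaler Sr.", is_blp=True)
assert not can_create_page(anon)
assert not edit_publishes_immediately(anon, blp_page)
```

The point of the sketch is only that the post-2005 rules moved a narrow class of pages from "publish first, fix later" to "review first."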

Just weeks after the incident and controversy, Nature published a study claiming that Wikipedia was about as reliable as the esteemed Encyclopedia Britannica.

Seigenthaler continued to question the entire Wikipedia concept for years. Separately, while Seigenthaler had detailed in the USA Today piece his failed efforts to track down whoever had made the edits (including how he considered possible legal action), the person who created the page, Brian Chase, came forward a few weeks after the story went viral, saying it was just a joke that he regretted.

Companies: wikipedia


Comments on “Content Moderation Case Study: Can An Open Encyclopedia Handle Disinformation? (2005)”

This comment has been deemed insightful by the community.
Upstream (profile) says:

Timely topic

This problem has surfaced again recently with the back-and-forth editing of the Kamala Harris Wikipedia page for clearly political reasons, as rumors spread that she might become Biden's running mate. Apparently, what you see on that page may depend on which side of the net the ping-pong ball is on when you happen to view the page.

There have apparently been several other instances of this problem over the years. I seem to remember Hillary Clinton’s Wikipedia page being the target of some politically motivated edits at some point.

And, as was pointed out in the article, it may be something of a self-fixing problem, perhaps due to some corollary of the Streisand Effect.

In general, though, Wikipedia seems to be one of the big success stories, to at least some small extent fulfilling the early visions of the open Internet becoming a grand repository of all human knowledge.

Gregory Kohs (profile) says:

Re: Re: Re:

One example of how even with references and citations, information is still filtered through a "house" lens:

When CNN and the Atlanta Journal-Constitution both documented Ahmaud Arbery’s past criminal history, some editors tried to recognize that documentation in Wikipedia. I know this is a sensitive subject to even address. But the editors were quickly reverted by administrators (without any community discussion), and if they persisted in trying to add back in that documentation, they were threatened with being indefinitely blocked from editing, and I believe some were in fact blocked.

It was only permissible to present Arbery’s past in as favorable a light as possible. I understand the "BLP – Biographies of Living Persons" policy on Wikipedia is intended to respect the human dignity of living people, the recently deceased, and the families thereof. But in a criminal case, shouldn’t there at least be some allowance for discussing what the reliable news media have said in terms of background of both the alleged perpetrators and the victim — especially if the two had a potential previous history with one another?

All this while the accused killer(s) of Arbery were described on Wikipedia’s talk pages as "rednecks", "murderers", "hillbillies", and the like. They were not protected by the BLP policy, it seems.

Again, I am not saying this is an easy situation to resolve. It simply seems clear that the Wikipedia administrative authority had one (and only one) way of seeing that it was resolved according to their house point of view.

Anonymous Coward says:

Yup, I remember hearing of that controversy; it's among the more prominent anti-Wikipedia examples out there today. Wikipedia is largely accurate, and you can count on it most of the time for reliable information. More often than not, the issue is that the data is outdated, not necessarily inaccurate or, worse, intentionally false or defamatory. I remember someone editing Damon Dash's Wikipedia page to say that he was "broke"; apparently the claim was so hurtful to Mr. Dash's feelings that he contacted the legal department of the Wikimedia Foundation, and they had the article locked down from any non-administrator edits for a solid two years. So much for community moderation.

Moderation of users' conduct on Wikipedia is a whole other story. Anyone familiar with the English Wikipedia community may know of the administrator Fram and how his blunt enforcement of Wikipedia rules ruffled a few feathers at the Foundation that owns Wikipedia. From what I remember, he lost his administrator status and was banned, but successfully appealed after the community protested the Foundation's actions as an overreach. Fram was apparently a bit of a stern administrator; he edited according to the rules but wasn't nice, and that's what got him banned.

My account is banned indefinitely for a similar reason. One insult to some frilly editor, and edits I made YEARS ago got taken out of context and put on display to brand me an anti-semite. You get a limited amount of time to make your case so as not to get banned, and if you miss that opportunity, adios. Sucks that I have to evade the ban, but I still rise above the community and appreciate Wikipedia for what it's best known for: its articles.

This comment has been deemed insightful by the community.
MathFox says:

More interesting questions

Does Wikipedia handle disinformation better than other media?

How does Wikipedia compare to a paper encyclopedia with respect to objectivity?
Is Wikipedia better at containing and correcting misinformation than Twitter, Facebook, or YouTube?
Is Wikipedia better at mentioning all sides of a political story than Fox, CNN, the Washington Post, or The Times?

Anonymous Coward says:

Does Wikipedia handle disinformation better than other media?

I think most would say yes. Their moderator / contributor / editor … etc. system, while clearly not perfect, seems to generally do fairly well.

How does Wikipedia compare to a paper encyclopedia with respect to objectivity?

I believe most sources would say they are comparable, with Wikipedia being much quicker to weed out biases. Hardcopy encyclopedias, when they were a thing, could take decades to make changes.

Is Wikipedia better at containing and correcting misinformation than Twitter, Facebook, or YouTube?

Yes.

Is Wikipedia better at mentioning all sides of a political story than Fox, CNN, the Washington Post, or The Times?

Yes.

MathFox says:

Re: Re:

May I add [Citation Required] to your answers 😉

I know that Wikipedia aims for a high standard of objectivity and reliability; I observe that on these aspects it matches highly respected professional publications.
I don't think you can demand perfection from any organization; you'd do better to look at its track record and decide how big a grain of salt to take its publications with.

Gregory Kohs (profile) says:

Re: Re: Re:

I just clicked "Random article", and I got the page about James Lionel Michael. Some sentences from that short article:

  • After visiting Europe, Michael was articled to his father and began to mix in artistic and literary society.
    — I understand that "articled" as a verb here means to bind by an article of covenant or stipulation, as with an apprenticeship; but will most Wikipedia readers have any clue what that sentence means?
  • Sheridan Moore states that Michael became friendly with Millais and Ruskin…
    — Who is Sheridan Moore, and why do we care what he stated?
  • He became friendly with Joseph Sheridan Moore who introduced him to Henry Kendall, whom he afterwards took into his office and "treated as an affectionate elder brother would a younger one".
    — Completely unsourced information, but we’ll trust whoever wrote (or plagiarized) the Wikipedia article, I suppose.
  • Michael married in 1854…
    — This sentence comes after the sentence telling us how Michael died, some 14 years after his marriage. It’s information, but confuses the reader by being out of chronological order.
  • His long poem, John Cumberland, contains some good passages, however has many patches of prose.
    — That seems like an opinion. Whose, we cannot know, because the claim is not cited to any source.

It’s not terrible, by any stretch… but I would hardly say that this "matches" the highly respected professional publications on which the Wikipedia rewrite is based. Agreed?

Anonymous Coward says:

Re: Re: Re: Re:

Wikipedia is not trying to be a professional journal, but rather an encyclopedia for everybody. As an encyclopedia, it gives an introduction and usually references to more detailed and advanced resources. Wikipedia does well as a source of general knowledge, and it is not aiming to be a specialist-level source the way professional journals are.

Gregory Kohs (profile) says:

One experiment in Wikipedia misinformation

When I set out a few years ago to test how resilient Wikipedia would be to subtle misinformation (and by subtle, I mean deliberately incorrect information, formatted plausibly, not things like "MIKE HAZZ THREE HEADZ! ! !", the sort of vandalism most people agree Wikipedia handles relatively well), I was rather surprised that fully 60% of my deliberately misinformed pieces of content went uncorrected not just for days, but for weeks or months.

My research was given a mention in the Washington Post, and my own write-up of the experiment can be found here: https://wikipediocracy.com/2015/04/13/experiment-concludes-most-misinformation-inserted-into-wikipedia-may-persist/

Gregory Kohs (profile) says:

Re: Re: One experiment in Wikipedia misinformation

It’s been a few years since I wielded an axe in my perspective toward Wikipedia, and the link you provided is no more a "business" for me than a lemonade stand a child might run on a hot summer day. It costs me more to run that website (out of respect for the contributors who published their content there) than I earn from leads generated by it.

But, fair enough — it’s worth pointing out to strangers that I’m not just some "random" participant on Wikipedia.

Anonymous Coward says:

Wikipedia is garbage

Several years ago I pointed to Wikipedia’s entry for Margret Sangor. It covered her racist views and belief in eugenics. Now when you look at her entry, it paints her as a saint. And sadly, leftists will not look further, to her book and her own writings, to see the truth. So Wikipedia is worthless when it comes to anything that is even remotely controversial.

Gee Kay (profile) says:

Re: Re: Wikipedia is garbage

Looking at Encyclopedia Britannica, I can see that on July 21, 2020 (only three days ago), EB’s senior editor of biomedical sciences, Kara Rogers, added the following paragraph about Margaret Sanger (note, not spelled "Margret Sangor"):

Sanger’s legacy has been complicated by her support of eugenics, the idea that selective breeding for desired heritable characteristics could improve future generations of humans—an idea that was popular in the early 20th century (though it was later debunked). At the time Sanger began her work with birth control, eugenics was championed by well-known and respected scientists. It is unclear how extensively Sanger was involved in the eugenics movement, though she did believe that birth control could be used to prevent the breeding of unfit individuals. In addition, through the “Negro Project,” working closely with NAACP leader W.E.B. Du Bois, Sanger brought birth control to African American communities. As a consequence of these actions, critics have described Sanger as racist. However, as with her work in white communities, Sanger emphasized the importance of giving African Americans choices about parenthood and the number of children they wished to have. It is generally accepted that Sanger’s notions were no more racist than those found in society in general at the time.

I don’t know if this is "better" than Wikipedia in terms of content, but I do know that I am more at ease knowing the identity and credentials of the person who added the content. That is almost always missing from Wikipedia’s content — you don’t know the real name or credentials of (probably about) 90% of the content contributors; and even if you do have that information about the User, it’s still a pain in the neck to look up and investigate who added which specific piece of text.

