Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Can An Open Encyclopedia Handle Disinformation? (2005)

from the disinfo-on-an-open-platform dept

Summary: Wikipedia was founded in 2001 and the open encyclopedia that anyone could edit grew much faster than most people (including its founders) expected. In 2005, one of the first big controversies concerning disinformation on Wikipedia arose, when journalist and political figure John Seigenthaler wrote an article in USA Today calling out claims on Wikipedia that, among other things, he was involved in the assassinations of both John F. Kennedy and Robert F. Kennedy (for whom he worked for a time).

The entry in question read:

John Seigenthaler Sr. was the assistant to Attorney General Robert Kennedy in the early 1960s. For a short time, he was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.

John Seigenthaler moved to the Soviet Union in 1972, and returned to the United States in 1984.

He started one of the country's largest public relations firms shortly thereafter.

Two months before publishing the USA Today article, Seigenthaler reached out to Wikipedia cofounder Jimmy Wales and requested that his entry be deleted. It was, but other Wikipedia users soon brought it back, leading Seigenthaler to publish the story.

Seigenthaler described the entire incident as “character assassination,” and it suddenly put Wikipedia’s fact-checking under intense scrutiny, leading some to question whether the site could even survive.

Seigenthaler’s article raised many questions about Wikipedia, how it was edited, and how reliable it was as a research tool. For years, Seigenthaler insisted he wouldn’t edit his own entry because doing so would lend legitimacy to a service he wanted no part of.

The controversy received widespread media coverage, and raised many questions about how Wikipedia can and should deal with misinformation on its platform while staying true to its principles and concept as an encyclopedia built on anyone’s contributions.

Decisions to be made by Wikipedia:

  • Should it step in and edit Seigenthaler’s entry?
  • Should it block attempts to restore the false information?
  • Should it block further edits and lock down Seigenthaler’s page?
  • Should it change its overall policies regarding open editing?

Questions and policy implications to consider:

  • Are there ways to limit “vandalism” on an open system?
  • Were certain pages at greater risk of false information than others? If so, how should that difference be handled?
  • Will any changes decrease the trust people have in Wikipedia?
  • Will any changes increase the trust people have in Wikipedia?

Resolution: In some ways the controversy itself solved the “problem,” in that the widespread attention resulted in many Wikipedia editors/users making sure that Seigenthaler’s entry was accurate. However, it also prompted Wikipedia to implement some wider changes, including restricting the ability to create new pages to registered users, and creating a new editing policy covering “biographies of living persons.” A few years later, it further adjusted that policy to require more thorough review of edits to pages about living individuals.

Just weeks after the incident and controversy, Nature published a study finding that Wikipedia was roughly as reliable as the esteemed Encyclopedia Britannica.

Seigenthaler continued to question the entire Wikipedia concept for years. In the USA Today piece, Seigenthaler had detailed his failed efforts to track down who had made the edits (including explaining that he had considered possible legal action). A few weeks after the story went viral, the person who created the page, Brian Chase, came forward, saying it was just a joke that he regretted.



Filed Under: case study, content moderation, disinformation, john seigenthaler
Companies: wikipedia


Reader Comments



Anonymous Coward, 22 Jul 2020 @ 6:50pm

    Re:

    Not sure of the specific rules for that stuff, but even when you have a bunch of truth, with reliable citations, what gets included versus what is not included can make a huge difference, particularly if you are trying to influence people's opinion of someone.

