Wikipedia To Experiment With Color-Coded Warnings On Quality

from the good-ideas dept

It always seems misguided when people complain about quality problems in Wikipedia while ignoring identical quality problems in other media, along with the fact that it’s easier and faster to correct errors in Wikipedia once they’re discovered. One thing defenders of Wikipedia often point out is that it’s easy to check the history page of any entry to get a sense of whether a particular tidbit of info has survived the test of time or was just recently dumped on the page (or whether there’s been any controversy over it). The truth, however, is that not many people actually bother to check the history page (even among those who bring it up in Wikipedia’s defense).

It appears that Wikipedia may start experimenting with a creative idea to help deal with this: color coding sections of Wikipedia entries. If a change was made by a new or untrustworthy user, Wikipedia could color code it red so readers would know to be even more skeptical than usual about that information. As the information survives the test of time, it would fade to black (so to speak). At the same time, users with a long history of trustworthy edits would have their edits “trusted” more quickly within the color-coded system.

It’s a creative idea that seems to make a lot of sense for improving the overall quality of Wikipedia; it’s almost a shame we can’t do the same thing with other forms of media as well. The plan is apparently to test the system on the smaller Wikia community before rolling it out on Wikipedia, but it seems like an experiment worth following.
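The fading-trust heuristic described above can be sketched roughly in code. To be clear, every name and threshold below is hypothetical; this is a toy illustration of the idea, not Wikipedia's actual algorithm:

```python
# Toy sketch of the color-coded trust idea: a text fragment's trust comes
# from its author's reputation and from how many revisions it has survived.
# All function names and thresholds are illustrative assumptions.

def trust_score(author_reputation: float, revisions_survived: int) -> float:
    """Combine author reputation (0..1) with survival count into a 0..1 score."""
    survival_bonus = min(revisions_survived / 10.0, 1.0)  # saturates after 10 revisions
    # A trusted author starts high; surviving edits raises any fragment toward 1.0.
    return max(author_reputation, survival_bonus)

def trust_color(score: float) -> str:
    """Map a trust score to a display color: red (new/untrusted) fades to black."""
    if score < 0.3:
        return "red"
    if score < 0.7:
        return "orange"
    return "black"
```

With this sketch, a fresh edit by an unknown user starts red, and the same text turns black either because the author is already reputable or because the fragment outlives enough later revisions.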

Companies: wikia, wikipedia


Comments on “Wikipedia To Experiment With Color-Coded Warnings On Quality”

MrBobDobalina says:

A far simpler alternative

A less ambitious but still useful idea would be to indicate the volatility of an article (or section) based on the number of edits in the past x days in proportion to the number of views. This wouldn’t provide a measure of accuracy, but would instead be an indication of how widely accepted the article’s contents are. More edits in proportion to views indicates that the contents are either in dispute or still emerging (new articles).

I suggested this to Wikipedia and was informed that they don’t keep records of how many users visit a page. Wouldn’t tracking page views (and calculating the volatility from them) be far easier than analyzing every edit ever made?

MrBobDobalina says:

Re: Re: A far simpler alternative

Because Wikipedia is edited by people and the information is never verified, it can only be considered a representation of what people believe, not what is true. Therefore, even by analyzing all the edits as proposed, they’ll only have a measure of how a user’s edits contrast with the wider user base. “Facts” that don’t get replaced aren’t necessarily true; they’re just not widely refuted.

In response to your overly sarcastic comment, a wiki article stating that ‘Toyotas have the best quality’ would be edited by numerous people, possibly sparking an edit war. Therefore, the number of edits in proportion to the visitors would be higher than normal and would signal that the information wasn’t widely accepted. So it would work in that instance. As for your iPod example, that’s an opinion and therefore not capable of being a fact.

Your comment wasn’t productive and doesn’t foster problem solving. Try contributing something worthwhile. Instead of bashing someone’s idea, suggest how it could be improved.

Clueless citizen says:

Re: Wait a minute

The two statements in your comment you’re using to make a point are both false. There are over 350 external sites that don’t have “nofollow” on Wikipedia… Wikia is only one of many… probably because Wikia is open source and hosts wikis… like many of the others on the list. Also, Amazon did not invest in Wikipedia (they only take donations, so check again), and Amazon is not one of the 350.

gettit (user link) says:

Re: Wait a minute

Why is that? Because those are businesses they are directly affiliated with. Wow.

I think a cool thing would be to actually change the darkness of the text as it remains unchanged over time. Probably unusable and annoying, but it seems like I could get a decent sense of the stability of a sentence that way. Between the number of edits and the similarity after changes, it would be pretty easy to do.
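The darkness idea amounts to mapping time-since-last-edit to a grayscale value. A rough sketch, with a made-up 30-day window for fully “stable” text:

```python
def text_darkness(days_unchanged: int, full_dark_after: int = 30) -> int:
    """Return a grayscale value 0..255: light gray for fresh text, black once stable.

    The 30-day default is an arbitrary illustrative choice, not a real spec.
    """
    fraction = min(days_unchanged / full_dark_after, 1.0)
    return round(200 * (1 - fraction))  # 200 = light gray, 0 = black
```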

Beefcake says:

Consistent Voice

My caution with Wikipedia isn’t that it’s more or less reliable than “other media” overall, but that the information on any given topic comes from inconsistent sources and you never know what you’re going to get. Anyone can post anything on any topic on Wikipedia, and they stand to lose nothing personally if they are inaccurate. Maybe it was fact-checked by the site, maybe not. Conversely, only those who meet the standards of (for example) the BBC are allowed to provide information via that channel, and if they screw up they have their jobs, personal credibility, and/or careers at stake.

Of course, even the BBC occasionally puts a cab-driver on the air, which proves that even reliable, consistent sources can be fallible. But that fact actually provides a strong telling point — if the trained media with something to lose and who are committed to upholding standards can fail, then a website with anonymous posters and anonymous fact-checkers for all topics is a whole wide universe of failure.

Wikipedia is a tool that provides a good entry point for basic facts on basic topics. If I want to know Justin Timberlake’s birthday, I’ll use Wikipedia. If I need to know which pitcher gave up Hank Aaron’s record home run, yeah, Wikipedia is probably fine. But if I need a summation of the history of Algeria, I’ll find another source I’ve come to trust more (such as the Country Profiles on the BBC).

If a site needs color-coding to identify worthless or unsubstantiated content, it’s probably not worth using very much.

kweeket says:

Re: Consistent Voice

As an occasional editor of Wikipedia, I wanted to point out that one of the goals of the project is to provide a citation for any stated fact. That way, a reader can trace that statement back to a (presumably) reputable source, such as the New York Times or the BBC, which do employ fact-checkers.

Wikipedia isn’t verified – how could it be, when it is constantly changing? – but it’s supposed to be VERIFIABLE. It’s up to the reader to check the citations and decide if the source is trustworthy or not.

Calix says:

The veracity of information

I recently deeply annoyed a good friend of mine by asserting that the Internet in general (and Wikipedia in particular) is a much more reliable source of factual information than the New York Times. (I picked on the NYT simply because he was a reporter there for six years, which also explains his deep annoyance!)

If I create a web site, for example, stating that “the Empire State Building is constructed primarily of cheese,” it is my experience that some structural engineer will soon put up a competing web site explaining why this couldn’t be true because of the laws of physics, and an architect will put up yet another web site referencing historical documents (carefully scanned and posted) showing how it was actually built, etc.

If I posted to Wikipedia that “the Empire State Building is constructed primarily of cheese,” then those same experts will quickly correct the information.

If the NYT, however, states in a front-page article that “the Empire State Building is constructed primarily of cheese”, then at best a week later I might get a correction on page B-17 that no one reads. Of course, we all know the NYT would NEVER print something that wasn’t factually correct, either by mistake, bias or intent. My friend tried to assure me that such a thing could never happen at the NYT, because of tradition and fact-checking and the like, but there simply isn’t a mechanism for outside review.

The Internet is nothing BUT outside review! The fact that any whacko can post any old thing on the Internet (such as I’m doing right now) means that there is no “filter” keeping any and all information from being accessible – or keeping someone from correcting a whacko post such as I’m currently typing (like the post you’re about to type in response to my whacko comments herein).

Sure, trust the NYT (or the BBC) – but even they get it wrong sometimes, and it’s fun to see how ticked off they get when the blogosphere catches them on it!

And that is the beauty – and the power – of the Internet, as exemplified by Wikipedia.
