Good To See: Wikipedia Moves Forward With Color Coding Less Trustworthy Text

from the teaching-people-to-be-skeptical dept

More than two years ago, we talked about a great idea to deal with the (somewhat misleading) question of the trustworthiness of Wikipedia: color-code new edits from untrustworthy editors. Not only would this alert people to at least double-check that particular info, but it would also remind people that Wikipedia is a constantly changing site. To be honest, I was a bit disappointed that I hadn’t heard much about this idea since that summer of 2007. However, apparently, it’s been gaining in popularity, and now Wikipedia is set to start using it across the site. Here’s how it works:

Based on a person’s past contributions, WikiTrust computes a reputation score between zero and nine. When someone makes an edit, the background behind the new text gets shaded orange depending on their reputation: the brighter the orange, the less “trust” the text has. Then when another author edits the page, they essentially vote on the new text. If they like the edit, they’ll keep it, and if not, they’ll revert it. Text that persists will become less orange over time, as more editors give their votes of approval.
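
To make that mechanic a little more concrete, here is a minimal sketch of how a reputation score might translate into an orange shade that then fades as the text survives later edits. The mapping and the decay rate are invented purely for illustration; they are not WikiTrust's actual formulas.

```typescript
// Hypothetical sketch of the shading idea described above (not WikiTrust's
// actual algorithm). Reputation runs 0 (untrusted) to 9 (fully trusted), and
// each later revision that keeps the text intact counts as an implicit vote.

/** Map a 0-9 reputation score to an orange intensity between 0 and 1. */
function shadeForReputation(reputation: number): number {
  const clamped = Math.max(0, Math.min(9, reputation));
  return 1 - clamped / 9; // low reputation: bright orange; high: no shading
}

/** Fade the shade as the text survives subsequent edits unchanged. */
function fadeWithSurvival(initialShade: number, survivingRevisions: number): number {
  const decayPerRevision = 0.25; // assumed rate, purely illustrative
  return initialShade * Math.pow(1 - decayPerRevision, survivingRevisions);
}

// Example: a score-2 contributor's text, left untouched through three later edits.
const shade = fadeWithSurvival(shadeForReputation(2), 3);
console.log(`orange intensity: ${shade.toFixed(2)}`); // roughly 0.33
```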

While there are some concerns about how well this will work (and how much processing power it will take), it seems like a worthwhile experiment.



Comments on “Good To See: Wikipedia Moves Forward With Color Coding Less Trustworthy Text”

NullOp says:

Color Coding

Color coding is the first thing people jump to when they want to ‘flag’ something or bring notice to it. It’s a bad idea. Color coding can confuse and frustrate millions of people who are color blind, like me, for instance. It would be much better to use a pattern, symbol or number to denote trust in a document.

Christopher (profile) says:

Re: Color Coding

It’s a great idea for most; it’s just a bad idea *for you*. It could confuse and frustrate that unfortunate subset of color-blind individuals, so don’t use it. It’s probably going to be a toggled setting.

Even if it’s not, don’t be the kind of person who takes a useful function away from the majority; code up a Greasemonkey script that detects the colors and turns them into a pattern (a rough sketch follows).
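
For anyone who does want to go that route, here is a rough sketch of what such a userscript’s core logic could look like, written in TypeScript (compile it to plain JavaScript for Greasemonkey/Tampermonkey). It assumes, purely as a guess, that the trust shading is applied as an inline orange background colour on spans; the real WikiTrust markup may differ.

```typescript
// Illustrative userscript logic: swap orange "low trust" shading for a
// visible pattern (a wavy underline) so colour-blind readers can see it.
// The detection of the shading is a guess, not WikiTrust's documented markup.
function replaceOrangeShadingWithUnderlines(): void {
  const candidates = document.querySelectorAll<HTMLElement>("span[style]");
  candidates.forEach((el) => {
    const bg = el.style.backgroundColor; // inline styles serialize as rgb(...)
    const match = bg.match(/rgb\((\d+),\s*(\d+),\s*(\d+)\)/);
    if (!match) return;
    const [r, g, b] = match.slice(1).map(Number);
    // Very rough test for "orange-ish": lots of red, some green, less blue.
    if (r > 200 && g > 100 && b < g && g < r) {
      el.style.backgroundColor = "transparent";
      el.style.textDecoration = "underline";
      el.style.textDecorationStyle = "wavy"; // pattern instead of colour
    }
  });
}

replaceOrangeShadingWithUnderlines();
```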

Marcus Carab (profile) says:

Re: Color Coding

That’s why I love this site:
http://colorschemedesigner.com/

Standard colour-scheme-builder sort of thing, but it has a drop-down menu that lets you choose from 8 different vision disorders (and shows you the percentage of the population with each) so you can see what the colours would look like to each group.

Nick Coghlan says:

Still a restricted experiment at this stage

Reading the Wired article, it looks like they’re just implementing it for logged-in users initially, and even for them it will only be displayed if they click on a specific “Trust info” tab at the top of the screen.

Still, at least it gives them a chance to see how it goes with the full data set and real-time updates as entries change.

I wouldn’t be surprised if this then became a new editorial option in the future – instead of locking a page, editors may be given the ability to flag a page as one that should display trust info to all users, which would also enable an info box at the top of the page explaining what all the orange colouring is about.

Anonymous Coward says:

So what happens when a not-so-liked editor updates a page with something verifiable and accurate, but other editors who don’t care too much for that person coat the update in orange or revert to something they prefer instead?

Hopefully there’s some sort of system to avoid this – besides being extra special nice.

Free Capitalist (profile) says:

Re: Re:

So what happens when a not-so-liked editor updates a page with something verifiable and accurate, but other editors who don’t care too much for that person coat the update in orange or revert to something they prefer instead?

You forgot that after they color the unpopular (but verifiable) opinion orange, they may/will ultimately remove it.

I look at Wikipedia in the same light as I look at ‘scientific’ documentaries on TV. In order to reach a broader audience, TV docs will leave out much of the debate around a subject (such as the Big Bang theory) and espouse a single theory as being ‘as good as fact’.

The producers of a TV show get the last say on which theory will be supported in a documentary. Same goes for Wikipedia in the end, despite the ideal.

I’m not completely trashing Wiki here; I will still use it for what it’s often good for (a jumping-off point).

Marcus Carab (profile) says:

Re: Re: Re:

The nice thing about Wikipedia, though, is that you can go to the discussion page on a topic with more than one side to it and usually see every view presented, dissected and refuted/accepted, and then decide for yourself if you agree.

The fact is, no process is going to produce perfect results, which is why transparency in the process itself is so important.

Still, I agree it’s not perfect. But as a jumping-off point (indeed its best role) it is a *godsend*.

MRK says:

@11
I was thinking the exact same thing. Wikipedia (and really most wiki sites) has a problem where a few select editors (who make a lot of edits) consider themselves the most trusted editors. Those editors frequently revert changes by new users, even if those edits are easily verifiable fact. I suspect this new system will be used as a tool to exert control over the site.

I hope I am wrong, but human nature is quite predictable in this regard.

edgebilliards (profile) says:

i hate when i can't find sources...

two recent studies have shown that the nature of wikipedia has changed drastically since its inception. (there’s also a long, labored discussion on…slashdot, i think?) long story short, these “trusted editors” just sit on the recent changes page and revert entries by “non-trusted editors” to get their edit counts higher (and thus move up the ranks.)

wikipedia doesn’t need color-coded pages, it needs to do away with the rank system entirely. it was useful in getting people interested and contributing when wikipedia needed contributors. but reverting honest changes doesn’t add any value to the project and their voices shouldn’t take priority over the democratic whole.

[as long as i’m on the subject, and you can ignore this off-topic rant, i don’t see why any entries should be rejected for being too obscure. honestly, how much server space is needed for short entries on, for instance, phds working on important research projects or minor characters from the star wars universe.]

long story short, wikipedia is mounted on a high horse, and that is exactly how dysfunctional oligarchies get started.

nasch (profile) says:

Re: i hate when i can't find sources...

[as long as i’m on the subject, and you can ignore this off-topic rant, i don’t see why any entries should be rejected for being too obscure. honestly, how much server space is needed for short entries on, for instance, phds working on important research projects or minor characters from the star wars universe.]

I agree! The only criterion for notability should be whether someone is interested enough to write a Wikipedia article about it. If it meets all the other criteria (NPOV, references, etc.) leave it up, for crap’s sake!

random (profile) says:

I don’t think this is going to work at all.

Based on a person’s past contributions, WikiTrust computes a reputation score between zero and nine.

This new system creates a public flag based on a ranking produced by the WikiTrust system. I spent hours updating and redoing a page only to have someone revert the page in its entirety. Rather than edit my new content, the lazy editor, who may see this page as his personal hangout, has done the best thing to make sure the layman sees only his information.

What does color coding do to prevent this?

I think color coding will only make it harder to become trusted, because now a new editor will be consistently flagged while an older editor receives no such penalty, regardless of his personal stake in a page or the accuracy of his information.

This new feature is nothing more than some foolish attempt to make Wikipedia seem more trustworthy to the general public. It does nothing to address the deeper issues plaguing Wikipedia as a whole.

irv (user link) says:

depends on how you look at it

The theory of having some kind of automated trust measure is very good. True, in practice it can be subverted by an irresponsible editor. But everything about Wikipedia can be (and to some extent has been) subverted by editors.

The question is, is there some way of automating ratings for editors? Someone (as mentioned in a post above) who routinely reverts edits instead of judging them carefully might be suspect – unless the page is one prone to vandalism.

It’s a difficult question.
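
Just to sketch what irv’s idea might look like in practice, here is one entirely hypothetical heuristic: score editors by their revert ratio, discounting reverts made on pages known to attract vandalism. Nothing here reflects anything Wikipedia or WikiTrust actually implements.

```typescript
// Entirely hypothetical editor-rating heuristic, for illustration only.
interface EditRecord {
  isRevert: boolean;
  pageVandalismProne: boolean; // e.g. inferred from past vandalism frequency
}

/** Score an editor from 0 (revert-happy) to 1 (mostly substantive edits). */
function editorTrustHeuristic(history: EditRecord[]): number {
  if (history.length === 0) return 0.5; // no data: neutral prior
  // Reverts on vandalism-prone pages are expected, so they count for less.
  const suspicion = history.reduce((sum, edit) => {
    if (!edit.isRevert) return sum;
    return sum + (edit.pageVandalismProne ? 0.2 : 1.0);
  }, 0);
  return Math.max(0, 1 - suspicion / history.length);
}

// Example: 10 edits, 4 of them reverts on ordinary pages, scores 0.6.
console.log(
  editorTrustHeuristic([
    ...Array.from({ length: 6 }, () => ({ isRevert: false, pageVandalismProne: false })),
    ...Array.from({ length: 4 }, () => ({ isRevert: true, pageVandalismProne: false })),
  ])
);
```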
