from the teaching-people-to-be-skeptical dept
More than two years ago, we talked about a great idea to deal with the (somewhat misleading) question of the trustworthiness of Wikipedia: color code new edits from untrustworthy editors. Not only would this alert people to at least double-check that particular info, it would remind people that Wikipedia is a constantly changing site. To be honest, I was a bit disappointed that I hadn't heard much about this idea since that summer of 2007. However, apparently, it's been gaining in popularity, and now Wikipedia is set to start using it across the site. Here's how it works:
Based on a person's past contributions, WikiTrust computes a reputation score between zero and nine. When someone makes an edit, the background behind the new text gets shaded orange depending on their reputation: the brighter the orange, the less "trust" the text has. Then, when another author edits the page, they essentially vote on the new text: if they like the edit, they'll keep it; if not, they'll revert it. Text that persists becomes less orange over time, as more editors give their votes of approval.

While there are some concerns about how well this will work (and how much processing power it will take), it seems like a worthwhile experiment.
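Just to make the mechanism concrete, here's a rough sketch of how that scheme could look in code. This is purely illustrative, not WikiTrust's actual algorithm or API: the function names, the linear 0-9 reputation mapping, and the endorsement weight are all assumptions made up for the example.

```python
# Hypothetical sketch of the shading scheme described above: an editor's
# 0-9 reputation sets the starting trust of new text, later editors who
# keep the text raise its trust, and lower trust means brighter orange.
# All names and constants here are illustrative, not from WikiTrust.

def initial_trust(author_reputation: int) -> float:
    """New text starts with trust proportional to its author's 0-9 score."""
    return author_reputation / 9.0

def endorse(trust: float, editor_reputation: int, weight: float = 0.2) -> float:
    """An editor who keeps the text implicitly approves it: nudge its
    trust upward in proportion to that editor's own reputation, capped at 1.0."""
    return min(1.0, trust + weight * (editor_reputation / 9.0))

def orange_shade(trust: float) -> str:
    """Lower trust -> brighter orange. Blend from bright orange (#ff8800)
    at trust 0.0 to plain white (#ffffff) at trust 1.0."""
    g = int(0x88 + (0xFF - 0x88) * trust)
    b = int(0xFF * trust)
    return f"#ff{g:02x}{b:02x}"

# A low-reputation editor's new text starts bright orange...
trust = initial_trust(2)
# ...and fades toward white as trusted editors leave it in place.
for reputation in (7, 8, 9):
    trust = endorse(trust, reputation)
```

The "vote" here is just survival: each subsequent editor who doesn't revert the text bumps its trust, which is what drives the fade from orange to white.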