
Truth or Truthiness? How Wikipedia Decides.


What’s the Big Idea?


What Wikipedia is not, according to Wikipedia: a paper encyclopedia, a dictionary, a publisher of original thought. A soapbox. A means of self-promotion. A link dump, social network, blog, textbook, manual, guidebook, newspaper, or crystal ball. It is also not a bureaucracy, a democracy (which may come as a shock to techno-optimists who insist the web is intrinsically democratic), or a battleground. It is a community.

And like any community – nation-state or neighborhood – it has its own set of rules, the Five Pillars, which define standards of conduct on the site. True to form, these pillars are less a binding code than a set of principles that convey the ethos of the enterprise. Chief among these virtues is accessibility, followed by respect and civility. More interesting, though, is the non-profit’s definition of neutrality:

We strive for articles that document and explain the major points of view in a balanced and impartial manner. We avoid advocacy and we characterize information and issues rather than debate them. In some areas there may be just one well-recognized point of view; in other areas we describe multiple points of view, presenting each accurately and in context, and not presenting any point of view as “the truth” or “the best view.” 

There’s a subtle distinction here, which is the subject of a recent article by Timothy Messer-Kruse, “The Undue Weight of Truth on Wikipedia.” Messer-Kruse is a historian who has spent the last ten years of his life studying the Haymarket riot and trial of 1886. One day, he decided to correct what he thought of as a “misleading assertion” in a Wikipedia article on the subject: 

The description of the trial stated, “The prosecution, led by Julius Grinnell, did not offer evidence connecting any of the defendants with the bombing. … ” Coincidentally, that is the claim that initially hooked me on the topic. In 2001 I was teaching a labor-history course, and our textbook contained nearly the same wording that appeared on Wikipedia. One of my students raised her hand: “If the trial went on for six weeks and no evidence was presented, what did they talk about all those days?” I’ve been working to answer her question ever since.

After an exhaustive study of primary source material, Messer-Kruse was able to confirm definitively that the Wikipedia account was inaccurate: the prosecution had in fact offered plenty of evidence. Somewhere along the line, a mistake had been made, catalogued as part of the story of the trial, and passed down from historian to historian.

The problem was that 80% of historians still accepted this mistake as fact. Messer-Kruse tried several times to edit the erroneous information on the page. Each time, his changes were deleted. When he contacted a Wikipedia gatekeeper, he was reminded of the site’s “undue weight policy,” which holds that the viewpoint supported by the majority of sources should never be deleted and replaced with a minority viewpoint. 

Messer-Kruse waited two years, until his book on the subject was published, before making another edit to the page, banking on the fact that he would then have more credibility to argue his case. Again he was contacted and asked to stop making changes until he had familiarized himself with Wikipedia’s policies.

Read the full article here.

What’s the Significance?

Messer-Kruse’s story raises interesting questions about community and governance in the digital age, questions that happen to reach all the way back to Plato’s Republic.

Slate columnist Chris Wilson has pointed out that, while sites like Wikipedia and Digg are celebrated “as shining examples of Web democracy, places built by millions of Web users who all act as writers, editors, and voters,” they actually run on “the wisdom of chaperones.” In the case of Wikipedia, 1% of users are responsible for about 50% of the edits made to the site. Wikipedia also “deploys bots [to]… standardize format, prevent vandalism, and root out folks who flood the site with obscenities.”

Of course, accuracy and accessibility need not be in opposition, so long as readers remember that Wikipedia is a starting point, not a panacea — and that, Web 2.0 or no, the truth is something we will always have to work out for ourselves.

