Opinion is more compelling than fact. That's tearing society apart.
- Basic facts are up for debate, especially in the realm of science and politics. So which facts can you trust? Start by looking at trusted sources like Wikipedia, Snopes, and factcheck.org.
- "If people with money don't start supporting fact-checking systems, then fact-checking systems will become increasingly rare," says Dreger.
- Digital audiences are in the habit of sharing and reposting op-eds that agree with their existing opinions, rather than seeking out factual reporting. Opinion journalism makes money. Factual reporting makes less. That's a problem.
Can algorithms use collective knowledge to make us all internet explorers?
- Google has come under scrutiny lately for its dominance over the flow of information on the internet.
- TagTheWeb is researching a method to allow the "wisdom of the crowd" to categorize the internet more effectively.
- With or without Google, the internet looks to change significantly in the future, in ways we may not be ready for.
Crowdsourcing as an idea isn't anything new, says historian and sex researcher Alice Dreger. She tells us about the history of publicly gathered information from the medieval era to today. The Enlightenment was a boon to the arts and sciences, but an even bigger one to how knowledge is organized and distributed. Is Wikipedia, with its checks and balances and its appeal to honesty, closer to the founding fathers' idea of America than the overtly libertarian Wild West the rest of the internet has turned into? It might seem like a leap, but Dreger's argument is full of provocative suggestions like that. Her new book is Galileo's Middle Finger: Heretics, Activists, and One Scholar's Search for Justice.
What information can we trust? Truth isn't black and white, so here are three requirements every fact should meet.
The chances are good that you've used Wikipedia to define or discover something in the last week, if not the last 24 hours. It's currently the 5th most-visited website in the world. The English-language Wikipedia averages 800 new articles per day, but 1,000 articles are deleted per day, the site's own statistics page reports. That fluctuation is probably partly the result of mischievous users, but it is also an important demonstration that Wikipedia treats knowledge as being in motion. "As the world's consensus changes about what is reliable, verifiable information, the information for us will change too," says Katherine Maher, executive director of the Wikimedia Foundation. Maher is careful to distinguish between truth and knowledge. Wikipedia isn't a jury for truth; it's a repository for information that must be three things: neutral, verifiable, and determined by consensus. So how do we know what information to trust, in an age flooded with access, data, and breaking news? By explaining how Wikipedia editors work, and the painstaking detail and debate that go into building an article, Maher offers a guide to separating fiction from fact that can be applied more broadly to help us assess the quality of information in other forums.
We don't need better media literacy to beat fake news. We need social media to be frank about its commercial interests.
Wikipedia has come a long, long way. Back when teachers and educational institutions were banning it as an information source for students, did anyone think that by 2017 "the encyclopedia that anyone can edit" would earn global trust? Wikipedia had a rough start and some very public embarrassments, explains Katherine Maher, executive director of the Wikimedia Foundation, but it has been a process of constant self-improvement. Maher attributes its success to the Wikimedia community, which is doggedly committed to accuracy and genuinely thankful to find errors, both factual and systemic, so it can evolve past them. So what has Wikimedia gotten right that social platforms like Facebook haven't yet? "The whole conversation around fake news is a bit perplexing to Wikimedians, because bad information has always existed," says Maher. The current public discourse focuses on the age-old problem of fake news rather than on the root cause: the commercial interests that create a space where misinformation doesn't just thrive; it's rewarded. Why doesn't Facebook provide transparency and context for its algorithms? An answer to "why am I seeing this news?" could let users make informed decisions based on where that information comes from and what its motive is. "We [at Wikimedia] hold ourselves up to scrutiny because we think that scrutiny and transparency creates accountability and that accountability creates trust," says Maher.