The truth is a messy business, but an information revolution is coming. Danny Hillis and Peter Hopkins discuss knowledge, fake news and disruption at NeueHouse in Manhattan.
- In 2005, Danny Hillis co-founded Freebase, an open, collaborative knowledge base that was acquired by Google in 2010. Freebase formed the foundation of Google's famous Knowledge Graph, which enhances its search engine results and powers Google Assistant and Google Home.
- Hillis is now building The Underlay, a new knowledge database and future search engine app that is meant to serve the common good rather than private enterprise. He calls it his "penance for having sold the other one to Google."
- Powerful collections of machine-readable knowledge are becoming exceedingly important, but most are privatized and serve commercial goals.
- Decentralizing knowledge and making information provenance transparent would be a revolution in the so-called "post-truth" age. The Underlay is being developed at MIT by Danny Hillis, SJ Klein, and Travis Rich.
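The core idea behind a provenance-transparent knowledge base is that every claim travels with a record of where it came from, so readers and machines can audit it. A minimal sketch in Python, assuming a simple record format (the field names here are illustrative, not The Underlay's actual schema):

```python
# Hypothetical sketch of a machine-readable assertion with provenance.
# Field names are illustrative only, not The Underlay's actual schema.

def make_assertion(subject, predicate, obj, source, retrieved):
    """Bundle a claim with the provenance needed to audit it."""
    return {
        "claim": {"subject": subject, "predicate": predicate, "object": obj},
        "provenance": {"source": source, "retrieved": retrieved},
    }

assertion = make_assertion(
    "Freebase", "acquired_by", "Google",
    source="Google press announcement", retrieved="2010-07-16",
)

# A consumer can inspect the claim and its provenance separately.
print(assertion["claim"]["predicate"])
print(assertion["provenance"]["source"])
```

The design point is that the claim and its provenance are structurally inseparable: a downstream aggregator that drops the `provenance` field is visibly discarding information, rather than silently presenting a bare fact.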
Pope Francis’ 2018 World Communications Day message explains the dangers of fake news and what journalists and the public must do to combat it.
In his message for 2018 World Communications Day, Pope Francis rallied to the defense of journalism—if not the news media—comparing fake news to the snake in the Garden of Eden for its misleading, destructive power. This is a valuable, clear-eyed message for anyone, regardless of belief system. And to be clear, the pope’s not talking about fake news as defined in White House press briefings or presidential tweets. He has something very different—nearly the opposite—in mind:
Here's why your brain’s biases are a win for fake news, and a pay day for Facebook.
Even if you think of yourself as a human lie detector, some untruths will slip past you. For that, you can thank your brain and its absolute adoration of all things familiar, says Derek Thompson, senior editor at The Atlantic. One of the oldest findings in the history of psychology is the "mere exposure effect," in which merely being exposed to something biases you toward it—parents influence their children by playing certain music around the house that they will love their whole lives, or instill a political preference in them from an early age. You are drawn to what you know, and that bias really matters when it comes to digital media and the fake news phenomenon. Once something becomes memorable, we tend to conflate familiarity with fact. "This is one of the big reasons why it’s difficult to myth-bust on television or myth-bust in journalism, because sometimes the mere repetition of that myth biases audiences toward thinking that it’s true..." says Thompson. "The mere exposure of news to us biases us toward thinking that that news item is true." Facebook has an enormous ethical responsibility in this, he says, because it is the world's largest and most influential news outlet—whether it intended to be or not. Thompson believes there is no algorithmic fix for fake news that spreads via Facebook, only a human one: "The answer to a problem of a lack of human ethics in information markets is the introduction of more humans and more ethics," he says. Derek Thompson's latest book is Hit Makers: The Science of Popularity in an Age of Distraction.
Middle America is tired of those latte-sipping liberals and their "elite media" hanging out in New York City, but Ariel Levy makes the case that Americans aren't as different from one another as they'd like to think.
Middle America is tired of those latte-sipping liberals and their "elite media" hanging out in New York City, but author and New Yorker staff writer Ariel Levy makes the case that Americans aren't as different from one another as they'd like to think—and in fact they are all bound by one thing: truth. "No little falsehood is okay, ever, and we take that very seriously," says Levy, speaking of the allegiance to truth and extreme fact-checking that happens at The New Yorker. Journalists are human, and therein lie inevitable errors, but to claim that fake news is coming from the liberal media or that climate science is liberal propaganda is very much off base, she says. Here she delves into what the journalist's mandate is, and why there's no point making up facts: reality gets you in the end. Ariel Levy's memoir, The Rules Do Not Apply, is out now.
The Internet is all shadows and mirrors—but what if it were the central source of truth? Thanks to blockchain technology, it's a future that's possible.
Imagine a world where facts rule the Internet, and lies and rumors are stripped of their disguises before they can do damage. That's actually possible, explains tech expert Brian Behlendorf, the executive director of the Hyperledger Project (who was also a primary developer of the Apache HTTP Server, the most widely used web server in the world). Although a completely truthful Internet might be dull, and a little totalitarian, it would be sweet relief for all digital citizens if someone could end fake news. Distributed ledger technology like blockchain could do that, says Behlendorf, by changing the way organizations collect and store data. If data were decentralized and transparent on an immutable blockchain, it would be almost impossible to attack the source or integrity of someone's data on that open ledger. "I view distributed ledger technology as the closest thing we have in the technology field to being able to say something is a fact," says Behlendorf. A distributed ledger system could also be used to help us check our confirmation biases in response to fake news. Currently, central providers like Facebook and Google can alert you to news sources that may be less than factual, but imagine a decentralized version, like a Yelp for news media, with experts who score platforms on their integrity, as well as crowd-contributed ratings. In the future, what if the Internet helped resolve controversy instead of cranking the rumor mill?
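The tamper-evidence Behlendorf describes comes from a simple mechanism: each entry in the ledger cryptographically commits to everything before it, so editing any past record invalidates every later hash. A minimal sketch in Python (a toy hash chain, not Hyperledger's actual implementation):

```python
# Toy hash chain illustrating why an immutable ledger is tamper-evident.
# This is a simplified sketch, not Hyperledger's actual data model.
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link records so each block commits to everything before it."""
    chain, prev = [], GENESIS
    for record in records:
        digest = block_hash(record, prev)
        chain.append({"record": record, "prev": prev, "hash": digest})
        prev = digest
    return chain

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["story A published", "correction issued"])
print(verify(chain))   # True: the untouched chain checks out
chain[0]["record"] = "story A retracted"
print(verify(chain))   # False: the edit no longer matches the stored hashes
```

In a real distributed ledger the chain is also replicated across many independent parties, so an attacker would have to rewrite the history on most copies simultaneously—which is what makes "attacking the source" of the data so hard.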