As we all look nervously toward Thanksgiving tomorrow, never have the two sides of the dinner table -- left and right -- seemed so far apart as in the days since the 2016 election. This is in no small part because each side is increasingly relying on its own, separate set of facts.
Producers and consumers of media--especially digital media--have witnessed this descent into separate realities as traditional media outlets have downsized and new outlets, operating under very different sensibilities, have risen to compete alongside them. In a world where what’s “new” is valued above all else, and getting you to click is the ultimate goal, there is ever greater incentive for each new outlet to outdo the last in shock and sensationalism. In this system, there is increasingly little reward for telling the truth.
Much has been made of the role of social media, and Facebook in particular, in allowing fake stories to proliferate and go largely unchecked, fanning the flames of our political discontent. Indeed, a recent Buzzfeed analysis found that fake election stories on Facebook greatly outperformed stories from mainstream outlets like The New York Times and The Washington Post over the last three months of the campaign.
Despite long-running internal debate within Facebook about its role in swaying the election, Mark Zuckerberg has, until recently, maintained that the idea that fake news promulgated on the platform influenced the outcome, one way or the other, is “highly unlikely.” Even with Zuckerberg’s newfound commitment to eradicating fake news, many observers note that it may not even be within Facebook’s capabilities to ever address the problem fully.
When and how Facebook chooses to address the problem of fake and misleading news is beside the point. All algorithmically driven content platforms, from Facebook to Google, will invariably give you more of what you want. If you demand biased or even fake news--as measured in your clicks, likes and comments--that’s what you’ll get.
Despite the instant, perverse gratification of having your worst societal fears and neuroses constantly validated in a nightmarish echo-chamber, few reasonable people would willingly choose to place themselves into such a ‘filter bubble’. Fortunately, informed media consumers can choose to remove themselves from their filter bubbles by rethinking their relationship with the platforms that inform them and assuming some of the roles once ceded to professional journalists in the era before social media.
Above all, we should proceed with the assumption that there is no such thing as an unbiased information source, period. Much as corporate America and certain law enforcement agencies have begun to embrace training for “implicit bias”--the understanding that we all, in one way or another, exhibit unconscious biases--the media consumer must operate under a similar premise. Whether the bias is explicit or implicit, or occurs through choices of what is published or what is not, it exists everywhere. Identifying the biases in the outlets you follow and like is the first step in counter-balancing them.
Second, we must achieve ideological balance across sources, not rely on inherent balance within them. This means curating your feeds through addition rather than subtraction. Don’t angrily unfollow Aunt Susie because she posts things you disagree with. Swallow your intellectual pride and follow at least a few of her favorite sources, or your least favorite ones, using your level of disgust as an indicator of where you might benefit most from some new perspective. The idea isn’t that you’ll agree with these perspectives--rather, simply knowing these ideas exist, and are genuinely held by your fellow humans, will serve as a counter-weight to throwing yourself headlong into outrage.
You must now be your own fact checker--and Aunt Susie’s, too--at least superficially. With just a few seconds of due diligence, you’ll quickly discover all may not be as it seems. For instance, despite a deliberately similar appearance, ABCNEWS.com.co turns out to be a very different source than ABCNEWS.com, with likely very different standards. And, even for President-elect Trump’s most ardent supporters, DONALDTRUMPNEWS.co may not meet the standard for reliable sources. What’s more, the next time Aunt Susie shares one of those sources, leave a comment calling her on it (as a good fact-checker should do) and skip the family-rending digital diatribes.
Lastly, you should actively seek and follow outlets that explicitly, or implicitly, reject “news value”--the newness of information--as the basis of their editorial point of view. At Big Think, we think of ourselves as a “knowledge forum” rather than a news site. Instead of newness, we try to measure how “big” ideas are: their overall significance (impact over time), their relevance (how many people they directly touch or relate to) and their actionability (how immediately they can be translated from thought into action). Some of the most valuable information on the web can be found far outside the newsphere--in TED talks, long-form explanatory essays on Wait But Why and incisive commentary on sites like Brainpickings, to name a few.
Social media, and algorithmic platforms generally, are driven by the bottom-up inputs of the human beings participating in them. So the solutions to systemic problems like fake news should work best in the same direction--from the bottom up. In other words, this is as much your problem, and my problem, as it is Mark Zuckerberg’s. So, this Thanksgiving, take the long view in how you use social media, bite the bullet, and re-follow Aunt Susie. In the end, you might just “like” the outcome.
Peter Hopkins, President & Co-founder, Big Think
This was originally delivered, earlier this month, as an address on the role media plays in impeding collaboration at the Unlikely Collaborators conference hosted by Elizabeth Koch.