Can We Think Critically Anymore?
In A Field Guide to Lies, neuroscientist Daniel Levitin explains how to wade through an endless sea of data and statistics to hone our critical thinking skills.
In a May 2015 New Yorker article, satirist Andy Borowitz warned of a “powerful new strain of fact-resistant humans who are threatening the ability of Earth to sustain life.” Although humans are endowed with an ability to “receive and process information,” he writes, these faculties have been rendered “totally inactive.”
Readers enjoy Borowitz because his writing is uncomfortably close to reality. While most of his articles are close enough to the ballpark that you can hear the game, this particular piece hardly seems satirical. The medium of the Internet, where most people get their information and news on a daily basis, is not designed for nuanced, critical thinking; it incites our brain’s reptilian response system: scan it, believe it, rage against it (or proudly repost it without having read the content).
Cognitive psychologist and neuroscientist Daniel Levitin would agree. In fact, he’s written an entire book on the subject. In A Field Guide to Lies: Critical Thinking in the Information Age, the author of the insightful This Is Your Brain on Music and The Organized Mind takes to task our seemingly growing inability to weigh multiple ideas in making informed decisions, relying instead on emotional reactivity clouded by invented statistics and murky evidence.
Misinformation has been a fixture of human life for thousands of years, and was documented in biblical times and classical Greece. The unique problem we face today is that misinformation has proliferated; it is devilishly entwined on the Internet with real information, making the two difficult to separate.
Instead of merely pointing out problems, Levitin offers solutions, stepping into a professorial role with three sections of evaluation: numbers, words, and the world. Through these sections he explores the ways that researchers and companies manipulate statistics, teaching us how to read studies without falling for the bias their authors intended.
For example, consider this headline: In the U.S., 150,000 girls and young women die of anorexia each year. This headline would quickly garner tens of thousands of shares, with few of those trigger-happy social media experts thinking the stat through. So Levitin does it for us. Each year roughly 8,500 American women between fifteen and twenty-four die of all causes; extend the range to age forty-four and you still only reach about 55,000. The above statistic is impossible, no matter how shareable.
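The sanity check here is simple arithmetic: a cause-specific death count can never exceed deaths from all causes in the same population. A minimal sketch of that check, using the approximate figures cited above (illustrative only, not official mortality data):

```python
# Sanity-check the headline's claim against total mortality from all causes.
# Figures are the approximate ones cited in the review; illustrative only.
claimed_anorexia_deaths = 150_000     # the headline's annual figure

deaths_all_causes_15_to_24 = 8_500    # approx. annual U.S. deaths, women 15-24
deaths_all_causes_15_to_44 = 55_000   # approx. annual total, extended to age 44

# A cause-specific count can never exceed deaths from all causes combined,
# so the headline fails the most basic plausibility test.
impossible = claimed_anorexia_deaths > deaths_all_causes_15_to_44
print(impossible)  # True: the claim is arithmetically impossible
```

The point is not the code but the habit: before sharing a number, compare it against a ceiling you already know.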
Throughout this section Levitin returned me to Intro to Logic at Rutgers in the early nineties. He discusses how corporations manipulate graphs to suit their needs, such as one used by Apple CEO Tim Cook. Rather than reporting on Apple’s sluggish iPhone sales in 2013, he showed a cumulative graph beginning in 2008. The line, which would show a lethargic ascent if it reflected the poor quarter, instead draws the eye to the Himalayan climb of the previous two years. You barely notice the leveling off, since your eye returns to his figure standing below the chart.
Another example is C-Span, which advertises that its network is available in 100 million homes. Of course, there might only be ten people watching, but that wouldn’t sit well. The same goes for polling results, some of the most widely skewed numbers currently in the media. He writes,
A sample is representative if every person or thing in the group you’re studying has an equally likely chance of being chosen. If not, your sample is biased.
Since most circulated polls are conducted on landlines, and the demographic that still uses these phones is older, no such poll would represent new voters, who probably have no clue what that curly cord at the end of the receiver is for.
Then there’s simple bias, a neurological habit fully on display this week regarding presidential health. Forget numbers, we’re a visual species. Hillary Clinton’s slip has been defined as everything from a minor tumble to an avalanche of skin, depending on the viewer’s political inclinations. Levitin explains the bigger picture:
We also have a tendency to apply critical thinking only to things we disagree with.
The Internet might very well have been designed for confirmation bias. If you have a theory, you’ll find some site claiming it’s true. (I’m constantly amazed at how many people post Natural News stories on my feed, as if anything on the site is valid.) Levitin notes that MartinLutherKing.org is run by a white supremacist group. Even experts get fooled: Reporter Jonathan Capehart published a Washington Post article “based on a tweet by a nonexistent congressman in a nonexistent district.”
In The Organized Mind, Levitin writes that the human brain can only process 120 bits of information per second—not exactly Intel. Nor does the brain merely process data: it constantly scans our environment for potential threats. Since we don’t have tigers to run from, and since we generally don’t commune in person (compared to time spent online), our emotional reactivity is directed at apparitions.
Add to this the fact that our attention is pulled in thousands of directions every day by advertisers purposefully falsifying information, eschewing traditional marketing under cover of ‘brand ambassadors’ and invented data. Taking the time to contemplate and comprehend what Nicholas Carr calls ‘deep knowledge’ is a forgotten art. Two thousand years ago people memorized the 100,000 shlokas (couplets) of the Mahabharata. Today we forget what we tweeted five minutes ago.
Memorization and critical thinking strengthen when we train the brain like a muscle, but it is exceptionally easy to forgo that effort when emotionally charged information is presented right before our eyes. As Levitin writes,
The brain is a giant pattern detector, and it seeks to extract order and structure from what often appear to be random configurations. We see Orion the Hunter in the night sky not because the stars were organized that way but because our brains can project patterns onto randomness.
Sadly, we’re victims of our patterns. Carr wrote The Shallows because, ironically, he could no longer finish reading an entire book. He wanted to know what technology was doing to his brain. Levitin made his own case for this in The Organized Mind. A Field Guide to Lies is an exceptional follow-up, not only describing the mechanisms for how we read and understand, but giving practical and essential advice on what to do about it.
Derek Beres is working on his new book, Whole Motion: Training Your Brain and Body For Optimal Health (Carrel/Skyhorse, Spring 2017). He is based in Los Angeles. Stay in touch on Facebook and Twitter.