One day in 1995, a large, heavy middle-aged man robbed two Pittsburgh banks in broad daylight. He didn't wear a mask or any sort of disguise, and he smiled at surveillance cameras before walking out of each bank. Later that night, police arrested a surprised McArthur Wheeler. When they showed him the surveillance tapes, Wheeler stared in disbelief. 'But I wore the juice,' he mumbled. Apparently, Wheeler thought that rubbing lemon juice on his skin would render him invisible to videotape cameras. After all, lemon juice is used as invisible ink, so, as long as he didn't come near a heat source, he reasoned, he should have been completely invisible.
Police concluded that Wheeler was not crazy or on drugs – just incredibly mistaken.
The saga caught the eye of the psychologist David Dunning at Cornell University, who enlisted his graduate student, Justin Kruger, to see what was going on. They reasoned that, while almost everyone holds favourable views of their abilities in various social and intellectual domains, some people mistakenly assess their abilities as being much higher than they actually are. This 'illusion of confidence' is now called the 'Dunning-Kruger effect', and describes the cognitive bias of inflated self-assessment.
To investigate this phenomenon in the lab, Dunning and Kruger designed some clever experiments. In one study, they asked undergraduate students a series of questions about grammar, logic and jokes, and then asked each student to estimate their overall score, as well as their relative rank compared with the other students. Interestingly, students who scored the lowest in these cognitive tasks consistently overestimated how well they did – by a lot. Students who scored in the bottom quartile estimated that they had performed better than two-thirds of the other students!
This 'illusion of confidence' extends beyond the classroom and permeates everyday life. In a follow-up study, Dunning and Kruger left the lab and went to a gun range, where they quizzed gun hobbyists about gun safety. Similar to their previous findings, those who answered the fewest questions correctly wildly overestimated their knowledge about firearms. Outside of factual knowledge, though, the Dunning-Kruger effect can also be observed in people's self-assessment of a myriad of other personal abilities. If you watch any talent show on television today, you will see the shock on the faces of contestants who don't make it past auditions and are rejected by the judges. While it is almost comical to us, these people are genuinely unaware of how much they have been misled by their illusory superiority.
Sure, it's typical for people to overestimate their abilities. One study found that 80 per cent of drivers rate themselves as above average – a statistical impossibility. And similar trends have been found when people rate their relative popularity and cognitive abilities. The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices but, also, they are robbed of the ability to realise their mistakes. In a semester-long study of college students, good students could better predict their performance on future exams given feedback about their scores and relative percentile. However, the poorest performers showed no recognition, despite clear and repeated feedback that they were doing badly. Instead of being confused, perplexed or thoughtful about their erroneous ways, incompetent people insist that their ways are correct. As Charles Darwin wrote in The Descent of Man (1871): 'Ignorance more frequently begets confidence than does knowledge.'
Interestingly, really smart people also fail to accurately self-assess their abilities. As much as D- and F-grade students overestimate their abilities, A-grade students underestimate theirs. In their classic study, Dunning and Kruger found that high-performing students, whose cognitive scores were in the top quartile, underestimated their relative competence. These students presumed that if these cognitive tasks were easy for them, then they must be just as easy or even easier for everyone else. This so-called 'imposter syndrome' can be likened to the inverse of the Dunning-Kruger effect, whereby high achievers fail to recognise their talents and think that others are equally competent. The difference is that competent people can and do adjust their self-assessment given appropriate feedback, while incompetent individuals cannot.
And therein lies the key to not ending up like the witless bank robber. Sometimes we try things that lead to favourable outcomes, but other times – like the lemon juice idea – our approaches are imperfect, irrational, inept or just plain stupid. The trick is to not be fooled by illusions of superiority and to learn to accurately reevaluate our competence. After all, as Confucius reportedly said, real knowledge is knowing the extent of one's ignorance.
This article was originally published at Aeon and has been republished under Creative Commons.
When it comes to climate change, gun control, and vaccinations, facts don’t change people’s minds—but there is one technique that might.
If you want someone to see an issue rationally, you just show them the facts, right? No one can refute a fact. Well, brain imaging and psychological studies are showing that, society-wide, we may be on the wrong path by holding evidence up as an ace card. Neuroscientist Tali Sharot and her colleagues have shown that reading the same set of facts polarizes groups of people even further, because of our in-built confirmation biases—something we all fall prey to, equally. In fact, Sharot cites research from Yale University that disproves the idea that the social divisions we are experiencing right now—over climate change, gun control, or vaccines—are somehow the result of an intelligence gap: smart people are just as illogical, and what's more, they are even more skilled at skewing data to align with their beliefs. So if facts aren't the way forward, what is? There is one thing that may help us swap the moral high ground for actual progress: finding common motives. Here, Sharot explains why identifying a shared goal is better than winning a fight. Tali Sharot's newest book is out now: The Influential Mind: What the Brain Reveals about Our Power to Change Others.
The Washington Post created a Twitter account that automatically retweets all the tweets from the people whom President Donald Trump follows.
Here’s a claim that seems rock solid in the “post-truth” era: People are conceptualizing reality in wildly different ways, especially since the election of President Donald Trump.
Part of the blame can be placed on “filter bubbles,” which form when website algorithms analyze a person's searches and personal information to deliver them news stories on which they’re likely to click—stories that affirm their existing beliefs. The result is a personalized media landscape that's ideologically homogenous.
“[Technologies such as social media] let you go off with like-minded people, so you're not mixing and sharing and understanding other points of view,” Bill Gates said to Quartz in 2017. “It's super important. It's turned out to be more of a problem than I, or many others, would have expected.”
Facebook CEO Mark Zuckerberg echoed this sentiment, calling filter bubbles one of the company’s two “most discussed concerns” of 2016, in addition to inaccurate news stories.
As it becomes clearer how filter bubbles are coloring our interpretations of people, events and ideas, an obvious question arises for America's first “Twitter President”: What does Trump’s filter bubble look like?
@trumps_feed, a Twitter account created by the Washington Post, provides a glimpse. The account retweets all the tweets sent out by people whom Trump follows, effectively replicating what he sees when he uses the app. See for yourself below:
Your brain stops at the most comforting thought. The truth is somewhere beyond that. Using scientific skepticism as a guide, astrophysicist Lawrence Krauss outlines the questions that critical thinkers ask themselves.
Strange answers aren’t inherently wrong, and satisfying answers aren’t inherently right, says Lawrence Krauss in this critical thinking crash course. The astrophysicist explains how principles of scientific skepticism can be applied beyond the laboratory; they can act as a filter for the nonsense and misinformation we encounter each and every day. Here, he establishes a handful of core questions that critical thinkers ask themselves, which can be used to challenge your misconceptions and sense of comfort, question inconsistency, and think past your brain's evolved biases. Piece by piece, you can systematically remove nonsense from your life. Lawrence Krauss' most recent book is The Greatest Story Ever Told -- So Far: Why Are We Here?
Is creativity a wild and free state of mind, or is it actually a pattern that others just can't recognize?
To ensure your survival, your brain evolved to avoid one thing: uncertainty. As neuroscientist Beau Lotto points out, if your ancestors had wondered for too long whether that noise was a predator or not, you wouldn't be here right now. Our brains are geared to make fast assumptions, and questioning them in many cases quite literally equates to death. No wonder we're so hardwired for confirmation bias. No wonder we'd rather stick to the status quo than risk the uncertainty of a better political model, a fairer financial system, or a healthier relationship pattern. But here's the catch: as our brains evolved toward certainty, we simultaneously evolved away from creativity—that's no coincidence; creativity starts with a question, with uncertainty, not with a cut-and-dried answer. To be creative, we have to unlearn millions of years of evolution. Creativity asks us to do that which is hardest: to question our assumptions, to doubt what we believe to be true. That is the only way to see differently. And if you think creativity is a chaotic and wild force, think again, says Beau Lotto. It just looks that way from the outside. The brain cannot make great leaps; it can only move linearly through mental possibilities. When a creative person forges a connection between two things that are, to your mind, so far apart, that's a case of high-level logic. They have moved through steps that are invisible to you, perhaps because they are more open-minded and well-practised in questioning their assumptions. Creativity, it seems, is another (highly sophisticated) form of logic. Beau Lotto is the author of Deviate: The Science of Seeing Differently.