
Learning From the Illusion of Understanding


The feeling of certainty might be our default setting. We spend most of our mental life confirming our opinions, even when those opinions involve complex issues. We believe we understand the world with detail and coherence, even though our folk theories are usually incomplete. The sad truth may be that rationality exists not to seek truth but to argue and persuade. We’re lawyers, not judges, and the jury is always on our side.


There are many psychological studies that illustrate this, but let me tell you about a particularly clever one. Imagine that an experimenter hands you a list of everyday devices – a piano key, a sewing machine, a zipper – and asks you to indicate how well you understand how each item works. Next you are tasked with writing a detailed, step-by-step causal description of four items from the list. After that you have to re-rate how well you understand how each item works. Here’s the question: do you think having to write those detailed descriptions will influence your ratings?

Years ago two psychologists, Leonid Rozenblit and Frank Keil, conducted this study to find out. In a series of 12 experiments they discovered that when participants tried to explain how an everyday device worked, they realized that they had no idea what they were talking about. Although it seems like we know how something like a zipper works – we use it every day, after all – when we stop to think about the details we realize our ignorance. (Rozenblit and Keil term this the illusion of explanatory depth.)

This bias is not as bad as it sounds. It suggests that despite our overconfidence, we adjust our beliefs after discovering an error. What’s concerning is what happens when you ask people to explain their opinions. Decades of psychological research demonstrate that when this occurs, certainty and confidence skyrocket. It’s easy to admit that we don’t know how a zipper works – admitting that our opinions are flawed is a different story.

This brings me to a recently published study by Professor of Marketing Philip Fernbach, Todd Rogers, Craig R. Fox and Steven A. Sloman. The team of psychologists was interested in what happens when you ask people to elaborate on political policies. Most voters maintain strong opinions about complex policies, even though they are relatively uninformed about the details of those policies. The question is whether we adjust our beliefs when we realize that we don’t understand a policy.

To find out, the team conducted three experiments. In the first, 198 participants (all U.S. residents) rated their positions on six policies (e.g., “How strongly do you favor sanctions against Iran?”) and quantified their level of understanding for each policy (e.g., “How well do you understand the impact of imposing unilateral sanctions on Iran for its nuclear program?”). The participants also generated a “mechanistic explanation” for two randomly assigned political policies. The purpose of this was to test how well they understood each policy. Finally, participants rerated their positions after providing an explanation.

The first finding was obvious enough: participants indicated that they understood the details of each policy better than they actually did (this is consistent with the research on the illusion of explanatory depth). Here’s the interesting part. When they became aware of their unawareness, they adopted more moderate positions and reported feeling less certain about how much they understood each policy. In fact, in the third experiment participants were less likely to donate to an advocacy group after realizing that they did not have a firm understanding of the policy associated with that group. The one pessimistic finding was that participants did not reconsider their position on a policy when they simply generated reasons for it.

Here’s Fernbach on the results:

Across three studies, we found that people have unjustified confidence in their understanding of policies. Attempting to generate a mechanistic explanation undermines this illusion of understanding and leads people to endorse more moderate positions. Mechanistic-explanation generation also influenced political behavior, making people less likely to donate to relevant advocacy groups. These moderation effects on judgment and decision making do not occur when people are asked to enumerate reasons for their position… 

One conclusion from this research is that we need to read up on the issues more. That goes without saying, yet it’s easy to forget that you are not an expert living in a world of idiots. Exposure to our own ignorance is an effective way to become more mindful of unjustified confidence.

There’s something more interesting though. It appears that different psychological mechanisms are active when we explain how we think something works versus when we explain an opinion. In the former case we think more scientifically. When we struggle to understand how the world works, we tend to seek out relevant information and change our beliefs appropriately. In the latter case we are more like lawyers. When we detect an incongruity in our mental life, we strive to make it consistent with what we believe instead of gathering more information and confessing our biases. 

Fernbach’s research suggests that when it comes to explaining an opinion, we should pretend that we are explaining the mechanics of something in the world – an everyday device or a political policy. That way, instead of wasting mental effort on preserving an ideology, we develop more accurate beliefs.

Two more ideas worth considering. First, the participants in this study stated their positions and provided their evaluations and judgments online, presumably alone. It would be interesting to see if the moderation effects hold when people are put in the spotlight. I imagine that in a more ecologically valid setting we are more reluctant to admit our ignorance – no one wants to be perceived as unintelligent. (For example, watch this YouTube clip of Chris Matthews pressing Kevin James to explain what Neville Chamberlain did to appease Hitler during the 1930s.)

Second, it would be interesting to test how different objects or opinions influence moderation effects. In general, we are more reluctant to budge from a position when it involves a sensitive subject. Over an enlightening phone conversation, Fernbach distinguished between value-based judgments and consequentialist-based judgments. The former involve polarizing topics with unclear solutions (abortion), while the latter involve topics with debatable solutions (what policy spurs economic growth?) but agreeable goals (that economic growth is good). It would be interesting to study the relationship between these different judgments and moderation effects. (To be sure, the distinction between the two is not absolute; it falls on a spectrum.)

Either way, unjustified overconfidence is probably an innate feature of the mind, but it is fixable if we think more scientifically. The optimistic news from this study is that participants not only recognized their overconfidence but also made appropriate adjustments. Let’s learn from them.

Image via Shutterstock/R-O-M-A
