Three cognitive biases that allow bad ideas to scale

It took a series of ingenious experiments in the 20th century to uncover some of our biggest cognitive biases.
Credit: Cristina Conti / Adobe Stock
Key Takeaways
  • In The Voltage Effect: How to Make Good Ideas Great and Great Ideas Scale, economist and author John A. List explores the common characteristics of scalable ideas, and why some seemingly great ideas fail to take off.
  • In this excerpt of the book, List covers a few cognitive biases that play a role in allowing bad ideas to scale.
  • Cognitive biases can lead us to ignore or misinterpret accurate data, putting us at risk of subscribing to bad ideas.

Excerpted from THE VOLTAGE EFFECT: How to Make Good Ideas Great and Great Ideas Scale by John A. List. Copyright © 2022 by John A. List. Published by Currency, an imprint of Random House, a division of Penguin Random House. All rights reserved.

In 1974, the psychologists Daniel Kahneman and Amos Tversky published an academic paper titled “Judgment Under Uncertainty: Heuristics and Biases.” If you ever need a counterexample to the argument that good ideas will fail to catch on without good branding, this is it. In spite of the article’s unsexy title—par for the course in academia—Kahneman and Tversky essentially launched a new field with its publication: the study of cognitive biases. With a series of ingenious experiments, they uncovered a constellation of hidden weaknesses in human judgment that steer us away from rational decision-making. 

Cognitive biases are distinct from computational mistakes and errors that result from misinformation. Mistakes rooted in misinformation can be corrected simply by providing more accurate information, but cognitive biases are “hardwired” into the brain, which makes them far harder to correct: the mind’s faulty interpretation of accurate information is precisely the problem. Kahneman and Tversky’s landmark collaboration has since been chronicled in multiple books—for instance, Kahneman’s Thinking, Fast and Slow, Dan Ariely’s Predictably Irrational, and Michael Lewis’s The Undoing Project—and some of the cognitive biases they studied have even found their way into the cultural lexicon. One of these is called confirmation bias, and it helps to explain why innocent yet avoidable false positives frequently occur.

In the most basic sense, confirmation bias prevents us from seeing possibilities that might challenge our assumptions, and it leads us to gather, interpret, and recall information in a way that conforms to our preexisting beliefs. The reason we have this trapdoor in our thinking is that when we are presented with new information, our brains are already filled with vast quantities of previously acquired knowledge, social context, and history that project meaning onto it. Because we have limited brainpower to process all of this, we use mental shortcuts to make quick, often gut-level decisions. One such shortcut is to filter out or ignore information that is inconsistent with our expectations or assumptions. Research has shown that reconciling new, contradictory information requires more mental energy than processing information that is consistent with what’s already in our heads, and our brains prefer the easier route.

This tendency might appear counter to our own interests, but in the context of our species’ distant Darwinian past, confirmation bias makes perfect sense. Our brains evolved to reduce uncertainty and streamline our responses. For our ancestors, a shadow might have meant a predator, so assuming it was one and running could have saved their lives. If they had stopped to gather more information and really think about it, they might have ended up as dinner.

While confirmation bias was useful for our species in the distant past and continues to be helpful in certain scenarios, for endeavors that require deep analysis and slow deliberation—like testing an innovative idea we hope to scale—it can be troublesome. It can hamper creativity and critical thinking, which are the pillars of both innovation and high-quality work. It can cause doctors to make lazy diagnoses and pursue the wrong treatment. It can drive policymakers, business leaders, administrators, and investors to pour massive amounts of resources into the wrong initiative or venture. And when it comes to interpreting information, whether in business or science, it can produce false positives.

The British psychologist Peter Wason’s classic Rule Discovery Test from the 1960s illustrates confirmation bias in action. He gave subjects three numbers and asked them to figure out the rule that governed the selection of those numbers. Given the sequence 2, 4, 6, for example, they typically formed a hypothesis that the rule was even numbers. The participants would then propose other sequences of even numbers, and the researchers would tell them whether or not those sequences conformed to the rule. Through this process, participants were tasked with determining whether their hypothesis was correct. After several correct tries, the participants believed that they had discovered the rule. But in fact they hadn’t, because the actual rule was much simpler: any sequence of increasing numbers.

The most interesting aspect of this study (and the many others like it) is that almost all subjects tested only number sequences that conformed to their personal hypothesis; very few tried a sequence that might disprove it. Wason’s experiment demonstrated that most people, regardless of intelligence, fail to examine their hypotheses critically. Instead, they try only to confirm them by “fast thinking,” using quick heuristics, or mental shortcuts.
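To see why confirming tests alone can never expose a wrong hypothesis, here is a minimal sketch of the logic of Wason’s task. The function names, the phrasing of the hypothesis, and the probe triples are illustrative assumptions, not details reproduced from the original study:

```python
# A minimal sketch of the logic behind Wason's 2-4-6 task (illustrative, not
# a reproduction of the original experiment).

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing triple."""
    a, b, c = triple
    return a < b < c

def even_numbers_hypothesis(triple):
    """A typical participant hypothesis: 'even numbers, increasing by 2.'"""
    a, b, c = triple
    return all(n % 2 == 0 for n in triple) and b - a == 2 and c - b == 2

# Confirming probes: triples the hypothesis predicts will fit.
# Every one of them passes, so the hypothesis *looks* correct.
for probe in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    print(probe, "fits the rule?", hidden_rule(probe))   # all True

# A disconfirming probe: a triple the hypothesis predicts should FAIL.
# It fits the hidden rule anyway, which falsifies the hypothesis.
probe = (1, 3, 8)
print(probe, "hypothesis predicts a fit?", even_numbers_hypothesis(probe))  # False
print(probe, "actually fits the rule?", hidden_rule(probe))                 # True
```

Every confirming probe passes, so the “even numbers” guess looks vindicated; only a probe that the hypothesis says should fail can reveal the true, simpler rule.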

Another mental shortcut that has a knack for producing false positives is bandwagon bias. Also known as “herding” or “cascades,” the bandwagon effect arises from social influences on our mental processes. Like confirmation bias, bandwagon bias interferes with our ability to accurately recall and evaluate information. But in this case, we are under the unconscious sway of the views and behaviors of others—the social side of decision-making. In 1951, the pioneering social psychologist Solomon Asch developed a now-famous laboratory experiment that can help us understand this kind of groupthink. He recruited students to participate in what they thought was a vision experiment. These students joined several other supposed participants—who were in fact experimental confederates, or scientists masquerading as participants—in a classroom.

Everyone in the room was shown a picture with three lines of varying lengths, in which one line was very obviously longer than the others. Each person in the room was asked to state out loud which was the longest line. After the early confederates all identified the wrong line, more than a third of the participants, on average, went along with the clearly incorrect answer; over the course of twelve trials, a whopping 75 percent went along with the obviously wrong answer in at least one instance. In contrast, when no confederates were present to tempt them to hop on their bandwagon, virtually all subjects chose the correct answer—demonstrating just how easily our independent judgment can be subsumed by our desire to “fit in” or be “one of the pack.” Not only is this a disturbing blow to one’s self-image as a freethinking individual, it also has unsettling implications for the science of scaling.

If you look at the bandwagon effect from the perspective of marketers, whose mandate is to create demand for products at scale, this quirk of the human mind is a godsend: the desire to conform that drives so many of our thoughts and actions can be converted into dollar signs. Indeed, there are mountains of research showing how the bandwagon effect shapes consumer choices, such as the clothing we buy (ever wonder why different colors and styles come into fashion every year?), the toys children ask their parents for (remember Tickle Me Elmo? For your sake I hope you don’t), and the sports teams we root for and whose apparel we purchase (the top-selling basketball jerseys in the United States historically correspond to the star players of teams that make it to the NBA finals in any given year). The bandwagon effect—or social contagion, as it is sometimes called—can even influence our political leanings, and thus electoral results. While this is all well and good for marketers and strategists hired to nudge people toward certain choices over others, for those creating and launching innovations to benefit society, it can create false positives and lead to the scaling of bad ideas.
