Being Rational About Irrationality

The human brain tends to jump to conclusions based on limited information.

Portions of the following were taken from an article I wrote for ScientificAmerican.com in April.


Reason appears to have fallen on hard times. Since the 1970s, psychologists have accumulated a long list of cognitive biases that illustrate all of the ways we screw up. More recently, behavioral economists have begun studying how we systematically deviate from standard economic models. Social scientists such as Jonathan Haidt point out that we are led by our passions, and that our reasons are puny post-hoc rationalizations. Thanks to Malcolm Gladwell and other popular science writers, going with your gut is in; thinking it through is out. And since we’re predictably irrational, as Dan Ariely points out, we may as well give up on curbing our built-in biases.

My fellow bloggers have given their two cents. Steven Mazie brought to life several classic studies originally conducted by Daniel Kahneman and Amos Tversky to explore the relationship between rationality, logic, and probability theory. David Berreby rightly reminded readers that “research into human irrationality… has the potential to cure some of our most important institutions of the habitual harms they inflict on us.” These comments stem from a post by Tauriq Moosa, who riffed on a recent Jonah Lehrer post about the relationship between intelligence and cognitive biases. All of the posts do a nice job of covering the major discussion points surrounding rationality. However, by popularizing the predominant research on judgment and decision-making, they inherently do more harm than good when it comes to understanding and avoiding cognitive biases.

One of the recurring points of Daniel Kahneman’s book Thinking, Fast and Slow is that it takes only a small amount of information to confidently form new worldviews that seem objective and accurate but are almost entirely subjective and inaccurate. That is, the human brain tends to jump to conclusions based on limited information.

The problem with blog posts on rationality and intuition is that readers seem to skim them uncritically and reduce human cognition to a monism (i.e., “go with your gut” or “think it through”). As a result, they ironically fall prey to the very bias they should be on the lookout for: jumping to conclusions based on limited information.

This cognitive tendency is a good thing most of the time. As cognitive scientists such as Gerd Gigerenzer point out, human rationality evolved to help us understand and organize the world by making it appear as simple as possible; knowledge of logic and probability wasn’t important for our hunter-gatherer ancestors. But when it comes to writing about rationality and intuition, we must remember that readers are going to jump to conclusions about how people jump to conclusions.

The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality; exposure to X is not the same as knowledge and control of X. Reading about cognitive biases, after all, does not free anyone from their nasty epistemological pitfalls.

Image credit: lyao / Shutterstock
