Dealing With the Danger of Cognitive Hubris
Reason itself is fallible, and this fallibility must find a place in our logic.
Nicola Abbagnano (1901–1990), Italian existential philosopher
The human mind can achieve fantastic things. One of them is “…our almost unlimited ability to ignore our ignorance,” as Daniel Kahneman notes in Thinking, Fast and Slow. Our cognitive hubris allows us to think that we’re smarter than we actually are, to stubbornly deny the overwhelming evidence that human cognition is in fact a messy subjective mix of facts and feelings, intellect and instinct, reason and gut reaction. Pure, objective, analytical ‘just-the-facts’ Cartesian reason is a wonderful goal – “God’s crowning gift to man,” as Sophocles put it – but it’s an unachievable myth. And believing in it is dangerous.
Such misplaced pride in human intellect leads to what in my book, How Risky Is It, Really?, I have labeled the Risk Perception Gap: when we are more afraid of some threats than the evidence warrants, or less afraid of some perils than the evidence warns. This gap, worrying too much or not enough, is risky all by itself. Smug confidence in human reason, and the belief that once fully educated and informed people will then make the objectively ‘right’ decision about risk, only widens the gap and increases the danger.
So in the name of our health and safety, as we head into another year, it is profoundly important that we heed all we have learned in the last couple decades about the limits of human reason, and apply that knowledge to the challenge of thinking more carefully about the risks we face. Fortunately there are signs that this more realistic acceptance of the limits to reason may be taking hold.
Leaders in the study of human cognition who have taught us so much about the limits of reason have recognized the threat of the Risk Perception Gap for decades. Mary Douglas, who helped develop the Cultural Theory of Risk, wrote in 1992: “We are said to be risk-aversive, but alas so inefficient at handling information that we are unintentional risk-takers; basically we are fools. The charge of irrationality has come home to roost.”
Nearly a decade earlier, psychologist Robert Zajonc described the challenge that the subjective, emotional nature of human cognition poses for rational government policy making. In a famous 1980 paper, “Feeling and Thinking,” Zajonc wrote: “…it is for this very reason that law, science, sports, education, and other institutions of society keep devising ever new means of making judgments ‘objective.’ We wish some decisions to be more independent of these virtually inescapable reactions.”
And as cognitive science has produced more and more evidence about the limits of reason and the subjective nature of risk perception, many leading thinkers have called for what cognitive scientist Gary Marcus calls “cognitive humility,” suggesting that “knowing the limits of our minds can help us to make better reasoners.” Which is pretty much what Abbagnano said in the epigraph above: “Reason itself is fallible, and this fallibility must find a place in our logic.”
But it’s one thing to understand the limits of human reason, and quite another to put our understanding of those limits to use. That’s the threshold at which we find ourselves as we face the threats of accelerating climate change, a global population of 7 billion going on 9–10 billion within a generation, and a long list of perils from the personal to the globally existential. In the name of our own health and safety, it is now time to apply our understanding of the psychology behind the Risk Perception Gap to the design of laws and regulations that account for the dangerous things people do when they are too afraid or not afraid enough, behaviors that endanger not only those individuals but the greater community as well. To maximize public and environmental health, we should apply our understanding of the psychology of risk perception to the establishment of economic incentives and disincentives, and to the design of physical and operational systems, that encourage (rather than mandate) healthier choices and behaviors (see Richard Thaler and Cass Sunstein’s Nudge).
We have learned a great deal in the past several years that not only teaches us that our reason is fallible, but that explains why and how reason fails. It is time to give our understanding of this fallibility a much more prominent place in the logic of how we make decisions to keep ourselves healthy and safe, both as individuals and as a society. But it must begin with a new, more humble post-Enlightenment attitude about the limits of what the human brain can accomplish. As Descartes himself said: “If you would be a real seeker after truth, it is necessary that at least once in your life you doubt, as far as possible, all things.” Including how smart you think you are, and how objectively rational you think you and people in general can ever be.