Dealing With the Danger of Cognitive Hubris

Reason itself is fallible, and this fallibility must find a place in our logic.

Nicola Abbagnano (1901–1990), Italian existentialist philosopher


The human mind can achieve fantastic things. One of them is “…our almost unlimited ability to ignore our ignorance,” as Daniel Kahneman notes in Thinking, Fast and Slow. Our cognitive hubris allows us to think that we’re smarter than we actually are, to stubbornly deny the overwhelming evidence that human cognition is in fact a messy subjective mix of facts and feelings, intellect and instinct, reason and gut reaction. Pure, objective, analytical ‘just-the-facts’ Cartesian reason is a wonderful goal – “God’s crowning gift to man,” as Sophocles put it – but it’s an unachievable myth. And believing in it is dangerous.

Such misplaced pride in human intellect leads to what in my book, How Risky Is It, Really?, I have labeled The Risk Perception Gap: when we are more afraid of some threats than the evidence warrants, or less afraid of some perils than the evidence warns. This gap, worrying too much or not enough, is risky all by itself. Smug confidence in human reason, and the belief that once fully educated and informed people will then make the objectively ‘right’ decision about risk, only widens the gap and increases the danger.

So in the name of our health and safety, as we head into another year, it is profoundly important that we heed all we have learned in the last couple of decades about the limits of human reason, and apply that knowledge to the challenge of thinking more carefully about the risks we face. Fortunately, there are signs that this more realistic acceptance of the limits of reason may be taking hold.

  • At the “Doomsday Clock Symposium,” the annual session that helps the Bulletin of the Atomic Scientists judge just how close to midnight/doom we are (it’s set at 11:55 p.m. at the moment), instead of just talking about nuclear war or climate change or pandemics, this year the participants considered “…the idea of risk, its meanings, rather than just…the physical aspect of the existential risks we face.” The keynote speaker was Paul Slovic, a pioneer in the psychological research of risk perception. Slovic didn’t recite the facts about nuclear weapons, or climate change, or biosecurity. He talked about how risk is more than just a matter of the facts. Risk also arises from how we feel about those facts, and from the choices and behaviors those feelings produce. Managing risk must take those emotions and behaviors, and the Risk Perception Gap, into account.
  • Governments have begun using behavioral science insights that explain why people make what seem like irrational choices, including potentially dangerous choices about risk, to encourage healthier behavior around smoking, childhood obesity, fuel economy, and energy efficiency. Harvard Law professor Cass Sunstein, who helped pioneer such efforts during three years in the Office of Management and Budget of the Obama administration, said recently, “We will uncover a lot more such opportunities in the future. Let’s take advantage of them.” The British government has established a Behavioural Insights Team to advise policy makers on specific issues.
  • Recognizing that the psychology of risk perception leads to a Risk Perception Gap in some parents who stubbornly deny the evidence about the safety of vaccines, California, Washington, and Vermont have passed laws making it harder for parents to opt out of vaccinating their kids, and other states are considering similar steps. (Forgive my immodesty, but I suggested just this idea in a July 2011 op-ed in the Los Angeles Times, “Not Vaccinated? Not Acceptable.”)
Leaders in the study of human cognition who have taught us so much about the limits of reason have recognized the threat of the Risk Perception Gap for decades. Mary Douglas, who helped develop the Cultural Theory of Risk, wrote in 1992, “We are said to be risk-aversive, but alas so inefficient at handling information that we are unintentional risk-takers; basically we are fools. The charge of irrationality has come home to roost.”

More than a decade earlier, psychologist Robert Zajonc described the challenge that the subjective, emotional nature of human cognition poses for rational government policy making. In his famous 1980 paper, “Feeling and Thinking,” Zajonc wrote, “…it is for this very reason that law, science, sports, education, and other institutions of society keep devising ever new means of making judgments ‘objective.’ We wish some decisions to be more independent of these virtually inescapable reactions.”

And as cognitive science has produced more and more evidence about the limits of reason and the subjective nature of risk perception, many leading thinkers have called for what cognitive scientist Gary Marcus calls “cognitive humility,” suggesting that “knowing the limits of our minds can help us to make better reasoners.” Which is pretty much what Abbagnano said at the outset: “Reason itself is fallible, and this fallibility must find a place in our logic.”

But it’s one thing to understand the limits of human reason, and quite another to put that understanding to use. That’s the threshold at which we find ourselves as we face the threats of accelerating climate change, a global population of 7 billion going on 9–10 billion within a generation, and a long list of perils from the personal to the globally existential. In the name of our own health and safety it is now time to apply our understanding of the psychology behind The Risk Perception Gap to the design of laws and regulations that account for the dangerous things people do when they are too afraid or not afraid enough, behaviors that endanger not only those individuals but the greater community as well. To maximize public and environmental health we should apply our understanding of the psychology of risk perception to establish economic incentives and disincentives, and to design physical and operational systems, that encourage (rather than mandate) healthier choices and behaviors (see Richard Thaler and Cass Sunstein’s Nudge).

We have learned a great deal in the past several years that not only teaches us that our reason is fallible, but that explains why, and how, reason fails. It is time to give our understanding of this fallibility a much more prominent place in the logic of how we make decisions to keep ourselves healthy and safe, both as individuals and as a society. But it must begin with a new, more humble, post-Enlightenment attitude about the limits of what the human brain can accomplish. As Descartes himself said: “If you would be a real seeker after truth, it is necessary that at least once in your life you doubt, as far as possible, all things.” Including how smart you think you are, and how objectively rational you think you, and people in general, can ever be.

