That Herman Cain allegedly had a long-term extramarital relationship, and deluded himself into believing he could keep that secret while running for President, raises once again that ever-puzzling question: how can seemingly bright people like Cain, or Tiger Woods, or Bill Clinton, or Eliot Spitzer, or Gary Hart, or countless other powerful people whose lives are subject to such scrutiny, make such stupid decisions about risk? And before you default to the simplistic assumption that testosterone was to blame, and that the brains below their belts were doing most of the thinking for these guys…there but for the grace of a few circumstances go you and I. This isn’t just about extramarital sex, or male ego. This is about how we all do risk perception. We all make mistakes about risk, sometimes in ways that raise new risks in and of themselves. There is a lesson here, and a warning, for all of us.
Set aside the morality and gender issues involved in the Cain affair and consider this: anyone who has a feeling of control, and who perceives a benefit from some behavior or choice that might also involve a risk, is likely to make that choice or engage in that behavior, because the benefit is perceived as outweighing the risk, and their sense of control allows them to think they won’t be a victim. People texting on their smart phones while driving are doing the same thing. So are people who drive after a few beers or glasses of wine. So are overweight people who continue to enjoy eating too much. Those choices may not involve issues of sex and ego, but they reflect the same process, a process of risk perception that is hardly a careful, conscious, coldly analytical, objective process of fact-based reason.
Seemingly smart people can make patently dumb choices because the brain is only the organ with which we think we think. Most of our risk-perception decision making happens subconsciously, relying on instinctive psychological cues and subconscious mental shortcuts we have evolved to help us quickly turn the few facts we have into a judgment about what feels safe and what feels dangerous. This instinctive system works wonderfully in many cases. It has, after all, gotten us this far through evolution’s challenging gantlet. But it can also get us into trouble. Right, Herman?
A pioneer in the study of risk perception, Paul Slovic, has written that when we face a choice that may involve risk, we instinctively use a tool he and his colleagues have labeled the Affect Heuristic, an instinctive subconscious mental process that allows us to quickly combine the facts we have with how those facts feel. Slovic and others have even teased out many of the specific psychological characteristics that make some circumstances feel scarier than others…like how much control we feel we have (more control = less fear), how the risk compares with the benefits (the greater the benefit, the smaller the risk feels), whether the risk is natural or human-made (natural risks scare us less), and more than a dozen other ‘fear factors’. (These risk perception factors are described in full in Chapter Three of “How Risky Is It, Really? Why Our Fears Don’t Always Match The Facts”, which is available free here. Enjoy.)
The Cain affair, and countless other examples of risk decision-making that so patently seem to fly in the face of common sense, offer us all a warning: our instinctive risk perception system, as reliable as it often is, can sometimes be a threat all by itself. The system we rely on to keep us safe sometimes causes us to worry more than the evidence says we need to, or worry less than the evidence says we should. As common as this phenomenon is, and as dangerous as it can sometimes be, it merits a name. I call it The Perception Gap: the occasional gap between our fears and the facts, caused by the instinctive way we perceive and respond to risk.
Sometimes The Perception Gap just poses a danger to us as individuals, in the choices we make that might feel right but might be harmful in and of themselves…extramarital affairs, weighing too much, smoking. Sometimes our judgments about risk put others at risk, like drunk drivers, or people who don’t vaccinate their kids and allow nearly eradicated diseases to re-emerge. Sometimes The Perception Gap is a societal threat. When we are more afraid than the evidence warrants, or less worried than the evidence warns, we push the government to protect us from what we fear more than from what actually threatens us, producing policies that may feel right but which might not be doing us the most good. Fear of nuclear power, for example, contributed to energy policy that led to more coal burning, the particulate pollution from which kills tens of thousands of people per year.
Recognizing that faulty risk decision making can be a risk all by itself (The Perception Gap so clearly demonstrated by candidate Cain) is the first step toward reducing the danger here. Understanding in detail the way risk perception psychology gives rise to The Gap is the next step. Cain reminds us that we should add The Perception Gap to the more familiar risks we already work so hard to manage, and apply what Slovic and others have taught us about how our own risk perception system works to the hard job of making smarter choices about how to keep ourselves safe.