The Smarter We Are, the Dumber About the Facts We Can Be
I'm an Instructor at Harvard, a consultant in risk perception and risk communication, author of How Risky Is It, Really? Why Our Fears Don't Always Match the Facts, and principal co-author of RISK, A Practical Guide for Deciding What's Really Safe and What's Really Dangerous in the World Around You. I run a program called Improving Media Coverage of Risk. I was the Director of Risk Communication at the Harvard Center for Risk Analysis, part of the Harvard School of Public Health, for 4 years, prior to which I was a TV reporter, specializing in environmental issues, for a local station in Boston for 22 years.
What a golden age these past few decades have been for learning about how human cognition works. And what a humbling age, as we discover the truth that satirist Ambrose Bierce perceived more than 100 years ago in The Devil’s Dictionary, that the brain is only the organ with which we think we think. With yet another shrewd experiment, Dan Kahan and colleagues have added more evidence to the now overwhelming case that humans are indeed quite smart and rational, but that ‘smart’ and ‘rational’ have less to do with making objective judgments and decisions that accurately align with the facts, and more to do with our ability to shape the facts into judgments and decisions that feel right and help us feel safe, even if those judgments and decisions fly in the face of the facts and they are bad for us…even if they are dangerous.
For this new study, Kahan, Ellen Peters, Erica Cantrell Dawson, and pioneer-in-the-study-of-risk-perception Paul Slovic, asked a group of people about the efficacy of a skin cream for dealing with a rash, based on this numerical display. Which is better, using the cream, or not using it?
|  | Rash got better | Rash got worse |
| --- | --- | --- |
| Patients who did use the cream | 223 | 75 |
| Patients who did not use the cream | 107 | 21 |
(They also showed a second group of subjects a similar chart, just with the columns reversed: 'Rash got worse' first, 'Rash got better' second.)
Seems like using the cream is better, right? But Kahan et al. also asked the study subjects some baseline questions to identify how 'numerate' they were to begin with…a measure both of how good they were with numbers and of how willing they were to go beyond the first, easy, seemingly obvious answer and make the extra mental effort to get number questions right. (At this point you may want to go back to the chart and do the same thing…checking your math and your thinking. Go right ahead. I only got it right after I read the study. Numbers, and mental effort about numbers, make my head hurt. In fact, this post has been amended to correct numbers, below, that I got wrong in the original version, mistakes brought respectfully to my attention by a few alert readers clearly more numerate than I am!)
That mental effort part of numeracy is important, because the answer to the skin cream question is not as obvious as it seems. The easy answer, that using the cream is better, is wrong. If you do the math, you see that the rash got worse in 25% of those who used the cream (75 of a total of 298) but only 16% of those who didn’t (21 of a total of 128). Not using the cream was the numerically correct choice.
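The ratio comparison above can be sketched in a few lines of code. This is just an illustration of the arithmetic described in the text, using the counts the article gives (75 of 298 cream users worsened; 21 of 128 non-users worsened); the function name is mine, not from the study.

```python
# Compare worsening rates rather than raw counts, which is the
# "extra mental effort" step the skin-cream question requires.

def worsening_rate(got_worse: int, total: int) -> float:
    """Fraction of patients whose rash got worse."""
    return got_worse / total

cream = worsening_rate(75, 298)      # rash worsened in ~25% of cream users
no_cream = worsening_rate(21, 128)   # rash worsened in ~16% of non-users

print(f"Used cream:  {cream:.0%} got worse")
print(f"No cream:    {no_cream:.0%} got worse")
# The lower worsening rate belongs to NOT using the cream,
# even though the raw count of worsened patients is higher for it.
print("Better choice:", "no cream" if no_cream < cream else "cream")
```

The raw counts point one way (75 worse with cream vs. 21 worse without), but the rates point the other: that gap between the easy answer and the correct one is exactly what the numeracy measure captures.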
Predictably, those with higher numeracy scores - better with numbers and more willing to do the heavier mental lifting required to figure out the not-so-obvious right answer - did better on the skin cream quiz. No surprise there. But then a third group of subjects saw similar charts asking a different hypothetical question, one freighted with ideological valence: Which is better for reducing urban crime, banning the carrying of concealed weapons, or not banning them?
|  | Decrease in crime | Increase in crime |
| --- | --- | --- |
| Cities that DID ban carrying concealed weapons in public | 223 | 75 |
| Cities that did NOT ban carrying concealed weapons in public | 107 | 21 |
(As with the skin cream charts, a separate group of subjects was presented with the chart with the columns reversed: Increase in crime first, Decrease in crime second.)
If our judgments and decisions were just about the numbers and the evidence, the results should have been the same as with the quiz about the skin cream. The less numerate subjects should have jumped to the obvious easier answer…banning concealed weapons is better for reducing urban crime…and the more numerate subjects should have done the extra mental work of calculating the ratios and seeing that crime rose in 25% of the cities that banned concealed carry, but in only 16% of the cities that didn't, so not banning made things safer.
But gun control is ideologically loaded (pun intended), and this time, the political views of the subjects overwhelmed their numeracy. The more strongly the subjects identified themselves as 'Conservative/Republican', the more likely they were to answer that NOT banning guns is better. The more strongly the subjects self-identified as 'Liberal/Democratic', the more they favored the gun ban. And here's the scary part: this effect got WORSE the higher their numeracy. The same highly numerate subjects who got things right when it was about a skin cream got things MORE wrong about gun control than the less numerate subjects. The smarter they were, the better they were at seeing the facts the way they wanted to see them! (Remember, these were the same numbers that they did just fine on when the question was about skin cream!)
Here’s what the effect looks like in charts that identify the groups by their ideological/political affiliations. Notice how in the top charts (about skin cream) the lines mostly match, but in the bottom charts (about the ideologically charged question of gun control), not only do the blue and red lines not match, but the gaps in the chart on the right get wider the smarter (the more numerate) the subjects supposedly are.
The study put it this way:
"Individuals high in science comprehension have a special resource to engage evidence in a manner calculated to generate ideologically congenial conclusions."
"More numerate individuals will use that ability opportunistically in a manner geared to promoting their interest in forming and persisting in identity-protective beliefs."
As Kahan and others have found repeatedly, we align our views to those in the groups with which we most closely identify, because being a member in good standing of our group is protective. This makes adaptive sense. As social animals, we rely on our group, our tribe, for our safety. Not only does agreeing with our tribe-mates mean they are more likely to support and protect us, but when everyone in our tribe agrees, that unity increases our tribe’s success in the ‘combat’ (i.e. politics) over society’s rules when battling it out with tribes whose members have a wider range of views.
So it makes sense to see the facts in ways that make us feel safe. And it makes sense that people who have more numerical ability, and the mental willpower to think things through more carefully, would be better at this. Seen this way, such seemingly irrational interpretation of irrefutable factual evidence is entirely rational.
The only problem is, many of the issues that take on ideological baggage (climate change, guns, genetically modified food) have to do with risks, and the best solution…the one that the evidence says will keep the greatest number of us safest…offends one side or the other. So we go to war (usually political, sometimes physical) over having our ideological way, the evidence be damned, and the solution that would do the most of us the most good loses out to less beneficial options imposed by the loudest/richest/most unified tribe. And that leaves us all, including the winning tribe, at greater risk. There could not be a more ominous example of this than climate change.
Such thinking doesn’t sound all that smart, or rational, does it? Except as the study of human cognition is teaching us, ‘smart’ and ‘rational’ don’t necessarily mean what we think they do. And the implications of that...about how human cognition actually works for solving the complex and immense risks modern society faces…is REALLY scary.