Thomas Nagel says that "devaluation of conscious reasoning" is a form of "moral and intellectual laziness," and that David Brooks is guilty of the same in his new book. Nagel's review made me wonder whether others among us partisans of post-rational research should feel tarred with this brush too. Maybe, in my enthusiasm for research into the non-conscious, inconsistent and inconstant drivers of behavior, I'm making a case against reason that does more harm than good.
What Nagel meant by "conscious reasoning" is, I think, a set of tools, invented over centuries, for reducing ignorance and avoiding errors. "Conscious reasoning" includes mathematics, logic and the sciences—enterprises in which statements are subject to explicit tests for accuracy and consistency, and the tests themselves are also constantly tested for adherence to consistent principles. These rigorous methods are supposed to be unaffected by emotions, social relationships or other aspects of our largely unconscious psychology.
There are people who claim that these tools are overrated, misguided, ungodly or inhuman. I don't think the researchers who turn up in this blog are guilty of that. They respect the tools of reason (they use them to do their research, after all). They just keep in mind that a tool is not a model. The human mind invented reason, but that doesn't mean the mind is shaped like its invention. Pointing out the differences between my mind and my computer doesn't imply disdain for the computer's admirable powers, nor does it commit me to throwing the computer away.
What worries me about some post-rationalist ideas is not the imaginary danger of a Cult of Unreason but the real danger of anti-democratic elitism. Maybe it's an occupational hazard: Contemplating irrationality in human affairs leads easily to the notion that "they," the unenlightened, are lost in their self-made darkness while "we," who have seen the light, know better. With that attitude, behavioral research looks less like an open-ended quest for understanding and more like a cookbook full of recipes for manipulation.
To the supposedly enlightened, this is easy to miss. Of course people should donate their organs, recycle and save for retirement! What could be wrong with wanting to correct the mind's natural "mistakes"? As the United Kingdom's coalition government declared last year, "Our Government will be a much smarter one, shunning the bureaucratic levers of the past and finding intelligent ways to encourage, support and enable people to make better choices for themselves."
British newspapers (The Guardian here and The Independent here) reported this winter that this declaration led to the creation of a "Behavioural Insights Team," whose goal is to tweak and trim government regulations to help people do the right thing. That is the strategy recommended by Richard Thaler and Cass Sunstein in Nudge. Yet one person's bagful of "nudges" is another's set of "psychological tricks to alter our behavior," as The Independent put it last January.
Behavioral economists often use homey metaphors of self-control (you know you shouldn't eat that brownie, you know you should buy a car that fits your dad-needs and not your cool-guy fantasies, let's help you). But the metaphor breaks down when controller and controlled are literally two different people. Refusing to acknowledge this turns a policy discussion of irrationality into a conversation about pension options and checkboxes on driver's licenses, which is tantamount to saying there is only one right way to behave and we policy-makers know what it is. Dissent is irrational, but don't worry, we'll fix it.
Nagel has noticed that this is a values problem, and that you can't marry post-rational research to policy without confronting it. Of Brooks' book, he observes:
Still, even if empirical methods enable us to understand subrational processes better, the crucial question is, How are we to use this kind of self-understanding? Brooks emphasizes the ways in which it can improve our prediction and control of what people will do, but I am asking something different. When we discover an unacknowledged influence on our conduct, what should be our critical response?
One possible response is that we all want, or should want, the same things, consistently, so let's bring on the nudges. That is, I think, what a lot of recent behavioral economics books are claiming, and what Thaler believes. Last year he called worries about elite paternalism "nudgephobia" and likened them to "the fear of being given helpful directions when lost" or "the fear of obtaining reliable medical advice when sick."
I think Nagel's response to Brooks might prove wiser. Post-rational research isn't a cookbook of administrative tweaks; it's a challenge to our notions of what people are, and how they should live with their human nature. Only that broader conversation offers a chance to transform society for the better.
Illustration: Excerpted from Goya, The Sleep of Reason Produces Monsters