Human Irrationality is a Fact, not a Fad
Once upon a time, we were taught that people are basically rational—at least when they have to be: at the stock market, the voting booth, the courtroom, the hospital, the school, the employment office and other important places. Economists, in particular, depended on their version (rationality = everyone is out for himself all the time), but the reliable rule of reason was important for politics, law, medicine and other fields as well. Then some economists changed their minds and put the word "behavioral" in front of their discipline. They reported abundant evidence that people don't know when they're being rational and can't decide to use reason at will. As these researchers pushed against the old assumption, they (and their popularizers even more) banged the We're-Irrational drum pretty hard. So now there's a backlash: people pointing out that, foolish as it may be to say that people always think straight, it is equally dumb to claim they never think straight.
As a complaint about what I call post-rational thinking, this is largely a straw man. I've never run across a neuroscientist, psychologist or economist who claimed that people are utterly incapable of reason (what would be the point of showing them evidence that they can't make sense of evidence?). Maybe Jon Haidt didn't acknowledge his dependence on reason enough in his book. Maybe David Brooks sentimentally devalued conscious reasoning in his, as Thomas Nagel pointed out. But neither denied the rational mind's abilities. And in Thinking, Fast and Slow, Daniel Kahneman, the world's most famous and most honored behavioral economist, continually reminds readers that people can and do reason well all the time. So I don't agree with my new fellow-blogger, Steven Mazie, that we're awash in "a faddish denial that human beings can think straight."
Indeed, the problems posed by post-rational research (and the reasons I'm interested in it) stem from the way human beings can "think straight." It's just that they can't tell when they are doing so, can't will themselves to do so, and often think they're being rational when they are not. As a result, there is a gap between the way our important institutions officially work and the way they really work, and this gap causes a great deal of harm.
Two examples: Officially, markets are efficient sorters of information that help all participants find the true prices of goods and services. In reality, markets aren't meeting places of rational beings, and hence are susceptible to runs, panics, bubbles and fraud. Officially, judges are trained experts who objectively apply the law. In reality, judges who have had to make a lot of decisions without a break are more severe than they are when they've just had one. And judges who have rolled a die and gotten a high number will choose a longer sentence for the same criminal than will judges who rolled a low number. We need to fix markets that rest on the false assumption of perfect rationality. We need to protect the justice system against the assumption that judges are consistent from 10 to 6. To ignore the evidence, by, say, dismissing it as a fad, is to let our institutions run badly in order to preserve the fiction of rationality on which they rest. That wouldn't be terribly rational of us, would it?
In between the absurd extremes of Perfect Rationality and Perfect Irrationality (which nobody believes in anyway) are important questions, all of them still open, and none simple: When is Reason actually engaged? How can we tell? What rules do we follow instead of logic? And, most importantly, what do we mean by "thinking straight"?
This last question is seldom addressed in books and articles on human irrationality. Instead, as Mazie notes, a lot of this material uses a thin and impoverished notion of human thought. (Deirdre McCloskey makes a similar point about the allied field of "happiness economics.") Often, its model of what thought is, and what it is for, is as bad as the old Rational Economic Man model. In fact, it often is the old REM model: After all, to say that people make systematic errors when they make choices is to say that we know what is correct, and that "correct" means doing what old-school economists would do. But perhaps those economists were wrong.
For example, people who have to choose between three options (call them A, B and C) will value them differently if they previously had to choose between A and B. Economists say this is faulty thinking, because the value of A and the value of B are not altered by the presence of C. But many creatures, including slime molds, are subject to this "error." So we should at least admit the possibility that it could be more appropriate for a living being to make the "error" than to act like a 20th-century economist. More broadly, we should realize that the end goal of a better understanding of human behavior isn't a few tweaks and nudges, but a better definition of what it means to think, and be, and be well.
Research into human irrationality, then, has the potential to cure some of our most important institutions of the habitual harms they inflict on us. And longer-term, it can contribute to a better understanding of what Reason is, and what we (sometimes) rational animals are. For both those reasons, I don't think this is just a passing fad.
Follow me on Twitter: @davidberreby
From "if-by-whiskey" to the McNamara fallacy, being able to spot logical missteps is an invaluable skill.
- A fallacy is the use of invalid or faulty reasoning in an argument.
- There are two broad types of logical fallacies: formal and informal.
- A formal fallacy is a flaw in the logical structure of a deductive argument, while an informal fallacy is an error in the content or context of the reasoning.
Appeal to privacy

When someone behaves in a way that negatively affects (or could affect) others, but then gets upset when others criticize their behavior, they're likely engaging in the appeal to privacy — or "mind your own business" — fallacy. Examples:

- Someone who speeds excessively on the highway, considering his driving to be his own business.
- Someone who doesn't see a reason to bathe or wear deodorant, but then boards a packed 10-hour flight.

Language to watch out for: "You're not the boss of me." "Worry about yourself."
Sunk cost fallacy

When someone argues for continuing a course of action despite evidence showing it's a mistake, it's often a sunk cost fallacy. The flawed logic here is something like: "We've already invested so much in this plan, we can't give up now." Examples:

- Someone who intentionally overeats at an all-you-can-eat buffet just to get their "money's worth."
- A scientist who won't admit his theory is incorrect because doing so would be too painful or costly.

Language to watch out for: "We must stay the course." "I've already invested so much...." "We've always done it this way, so we'll keep doing it this way."
If-by-whiskey

This fallacy is named after a speech given in 1952 by Noah S. "Soggy" Sweat, Jr., a state representative for Mississippi, on the subject of whether the state should legalize alcohol. Sweat's argument on prohibition was (to paraphrase):

    If, by whiskey, you mean the devil's brew that causes so many problems in society, then I'm against it. But if whiskey means the oil of conversation, the philosopher's wine, "the stimulating drink that puts the spring in the old gentleman's step on a frosty, crispy morning," then I am certainly for it.

The fallacy lies in endorsing both sides of a question at once, letting the answer turn on the emotional connotations of the terms rather than on the issue itself.
Slippery slope

This fallacy involves arguing against a position because you think choosing it would start a chain reaction of bad things, even though there's little evidence to support your claim. Examples:

- "We can't allow abortion because then society will lose its general respect for life, and it'll become harder to punish people for committing violent acts like murder."
- "We can't legalize gay marriage. If we do, what's next? Allowing people to marry cats and dogs?" (Some people actually made this argument before same-sex marriage was legalized in the U.S.)

Of course, sometimes decisions do start a chain reaction, which could be bad. The slippery slope device only becomes a fallacy when there's no evidence to suggest that chain reaction would actually occur.

Language to watch out for: "If we do that, then what's next?"
"There is no alternative"<p><span style="background-color: initial;">A modification of the </span><a href="https://en.wikipedia.org/wiki/False_dilemma" target="_blank" style="background-color: initial;">false dilemma</a><span style="background-color: initial;">, this fallacy (often abbreviated to TINA) argues for a specific position because there are no realistic alternatives. Former British Prime Minister Margaret Thatcher used this exact line as a slogan to defend capitalism, and it's still used today to that same end: Sure, capitalism has its problems, but we've seen the horrors that occur when we try anything else, so there is no alternative.</span><br></p><p>Language to watch out for: "If I had a magic wand…" "What <em>else</em> are we going to do?!"</p>
Ad hoc arguments

An ad hoc argument isn't really a logical fallacy, but it is a fallacious rhetorical strategy that's common and often hard to spot. It occurs when someone's claim is threatened with counterevidence, so they come up with a rationale to dismiss the counterevidence, hoping to protect their original claim. Ad hoc claims aren't designed to be generalizable. Instead, they're typically invented in the moment. RationalWiki provides an example:

    Alice: "It is clearly said in the Bible that the Ark was 450 feet long, 75 feet wide and 45 feet high."

    Bob: "A purely wooden vessel of that size could not be constructed; the largest real wooden vessels were Chinese treasure ships, which required iron hoops to build their keels. Even the Wyoming, which was built in 1909 and had iron braces, had problems with her hull flexing and opening up, and needed constant mechanical pumping to stop her hold flooding."

    Alice: "It's possible that God intervened and allowed the Ark to float, and since we don't know what gopher wood is, it is possible that it is a much stronger form of wood than any that comes from a modern tree."
Snow job

This fallacy occurs when someone doesn't really have a strong argument, so they just throw a bunch of irrelevant facts, numbers, anecdotes and other information at the audience to confuse the issue, making it harder to refute the original claim. Example:

- A tobacco company spokesperson who, confronted about the health risks of smoking, proceeds to show graph after graph depicting the many other ways people develop cancer, how cancer metastasizes in the body, and so on.

Watch out for long-winded, data-heavy arguments that seem confusing by design.
McNamara fallacy

Named after Robert McNamara, the U.S. secretary of defense from 1961 to 1968, this fallacy occurs when decisions are made based solely on quantitative metrics or observations, ignoring other factors. It stems from the Vietnam War, in which McNamara sought to develop a formula to measure progress in the war. He decided on body count. But this "objective" formula didn't account for other important factors, such as the possibility that the Vietnamese people would never surrender.

You could also imagine this fallacy playing out in a medical situation. Imagine a terminal cancer patient has a tumor, and a certain procedure helps to reduce the size of the tumor but also causes a lot of pain. Ignoring the patient's quality of life because it is harder to measure than tumor size would be an example of the McNamara fallacy.

Language to watch out for: "You can't measure that, so it's not important."