Why You'll Always Think Your Big Changes Are Behind You
Why can we face up to our inconsistencies in the past but not expect more in the future?
Think about your favorite singer or band. What would you pay me today for a ticket to one of their concerts ten years from now? OK, now think about a different group—the one that was your favorite ten years ago. (If your answer to both questions is the same act, just take your Dylan bootlegs and try another quiz, OK?) What would you pay to go hear this year's favorites, this week?
If you're like the respondents in a study in the current issue of Science, you'll be willing to pay more to see your current favorite in ten years than you are to see your old love right now. To be precise, 170 people were willing to pay an average of $129 to see today's favorite in 10 years, but only $80 to see their once-favorite band from a decade ago. That 61 percent difference is, in essence, a strange bet: knowing that they value the decade-old band less, people still think they will value the current band just as much when it, too, is ten years in the past. This hunch—"I've changed a lot to get here, but now I'm done"—is, as the rest of the paper describes, a strong and pervasive prejudice among the 19,000 people the researchers tested. The authors—Jordi Quoidbach, Daniel T. Gilbert and Timothy D. Wilson—have come up with a lovely name for it: the "end of history illusion."
The End of History, Francis Fukuyama's 1992 book, argued that Western-style liberal democracy might well represent the end-point of humanity's forward march, as it could prove the final, lasting form of government on which all nations would settle. (In the early 1990s, with the Soviet Union recently collapsed, a recession that seemed like a mild headcold compared to now and Islamic terrorism faint on the radar, this kind of thing seemed plausible.)
Twenty years on it looks foolish indeed to assume that social change would not continue into the future, as it always has in the past. But, as the Science paper nicely documents, "history is over" does seem to be the default setting for understanding one's personal life.
Via interactive Web surveys, Quoidbach et al. asked people (the vast majority women—do men not take Internet surveys?) to play through different versions of the 10-years-ago versus 10-years-hence procedure. They asked some respondents to estimate how their personalities had changed and would change; they asked others about basic values (pleasure, success, security); they asked others about preferences for music, food, vacation and friends. Consistently, they report, they saw the same effect. People reported significant change in the past, but expected little to no significant change in the future.
You can think of a lot of reasons why most people would like to preserve the illusion that they're coherent beings. Psychologically, it's hard to feel sure of yourself if you know your self could be a different person in a few years' time. Then, too, there are the practical and psychic pressures of a society whose institutions depend on consistency over time. My mortgage binds me to do, think and feel in 20 years just as I did when I signed it. So does a marriage vow, or any solemn promise. When voters keenly scrutinize political candidates' biographies for clues about how they will behave in office, it's with the presumption that people stay roughly the same throughout their lives.
Why, then, can we face up to our inconsistencies in the past but not expect more in the future? I suspect part of the answer lies with the human mind's great talent for spinning stories, especially about itself. No matter how much your past is full of zigs and zags as you changed in personality, values and tastes, you can always package it as a coherent narrative. I can describe what happened to me as a natural evolution (my principles drove me to rethink my stance on marriage equality) or I can describe the change as a dramatic break (my experience in the war really changed me). It doesn't matter which; when I tell the tale, it will have the reassuring coherence of story. In other words, all autobiographies are coherent—not because people are, but because stories are. The future, being unknown, cannot be shaped in this way. Quite the opposite: Sitting on a story of ourselves that makes sense of the past, we can't be open to the idea of future change. It would mess up the narrative. (Of course, once the change has occurred, we—artful storytellers that we are—will find a way to work it in, or to leave it out of the tale.)
According to Quoidbach et al., one possible explanation for their results is self-flattery: "most people believe that their personalities are attractive, their values admirable, and their preferences wise," they write, "and having reached that exalted state, they may be reluctant to entertain the possibility of change." But I think there's more to their second suggestion: "People also like to believe that they know themselves well, and the possibility of future change may threaten that belief."
POSTSCRIPT 1/8/13: Jordan Ellenberg, a mathematician, is skeptical about most of these experiments. (Hat tip Gary Marcus for this.)
Ellenberg notes that the experimenters asked people to make specific predictions about various traits in the future and then combined these into an estimate of overall future change. That's a mathematical mistake, he says: expecting that some general change will occur doesn't mean you can say where and how it will happen. So imagine a person who is asked "Do you expect to change a lot in the next ten years?" and then is asked about food, music, hobbies, friends and vacations. If that person says "yes!" to the general question but "no" to all five of the specifics, that person is doing the math correctly, and displaying no cognitive bias at all, Ellenberg argues. From an inability to predict five specific changes, you cannot infer a general refusal to predict change of some kind.
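Ellenberg's objection is easy to see with a toy simulation (the numbers below are invented for illustration, not taken from the study): suppose every respondent is certain they will change in exactly one of five domains over the decade, but has no idea which one. Then the rational answer to each specific question is "no" (each domain carries only a 1-in-5 chance), even though the rational answer to the general question is an emphatic "yes."

```python
import random

random.seed(42)

DOMAINS = ["food", "music", "hobbies", "friends", "vacations"]
N_PEOPLE = 10_000

# Each simulated person is certain they will change in exactly one
# domain, but has no idea which. For any single domain the rational
# prediction of change is therefore only 1 in 5 -- below even odds --
# so they answer "no" to every specific question while answering
# "yes" to the general one.
predicts_general_change = N_PEOPLE   # everyone says "yes" overall
predicts_specific_change = 0         # nobody bets on any one domain

# What actually happens: every person changes, in one random domain.
actual_changes = [random.choice(DOMAINS) for _ in range(N_PEOPLE)]

# Tallying the five specific "no" answers would suggest zero expected
# change, even though 100% of people both expected and underwent change.
per_domain_rate = {d: actual_changes.count(d) / N_PEOPLE for d in DOMAINS}
print(per_domain_rate)   # each domain changes for roughly 20% of people
```

Summing the five domain-specific rates recovers 100 percent actual change, which is exactly the gap between the specific predictions and the general one that Ellenberg says the paper's aggregation conflates.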
Only one of the studies in the paper asked people to compare apples to apples: the one in which volunteers were asked to compare past personality changes to their estimated future change. Here's hoping the authors (and other interested parties, since the data is freely available here) will weigh in.
Follow me on Twitter: @davidberreby
Quoidbach, J., Gilbert, D. T., & Wilson, T. D. (2013). The end of history illusion. Science, 339(6115), 96–98. DOI: 10.1126/science.1229294
From "if-by-whiskey" to the McNamara fallacy, being able to spot logical missteps is an invaluable skill.
- A fallacy is the use of invalid or faulty reasoning in an argument.
- There are two broad types of logical fallacies: formal and informal.
- A formal fallacy describes a flaw in the construction of a deductive argument, while an informal fallacy describes an error in reasoning.
Appeal to privacy<p>When someone behaves in a way that negatively affects (or could affect) others, but then gets upset when others criticize their behavior, they're likely engaging in the appeal to privacy — or "mind your own business" — fallacy. Examples:<br></p><ul><li>Someone who speeds excessively on the highway, considering his driving to be his own business.</li><li>Someone who doesn't see a reason to bathe or wear deodorant, but then boards a packed 10-hour flight.</li></ul><p>Language to watch out for: "You're not the boss of me." "Worry about yourself."</p>
Sunk cost fallacy<p>When someone argues for continuing a course of action despite evidence showing it's a mistake, it's often a sunk cost fallacy. The flawed logic here is something like: "We've already invested so much in this plan, we can't give up now." Examples:<br></p><ul><li>Someone who intentionally overeats at an all-you-can-eat buffet just to get their "money's worth"</li><li>A scientist who won't admit his theory is incorrect because it would be too painful or costly</li></ul><p>Language to watch out for: "We must stay the course." "I've already invested so much...." "We've always done it this way, so we'll keep doing it this way."</p>
If-by-whiskey<p>This fallacy is named after a speech given in 1952 by <a href="https://en.wikipedia.org/wiki/Noah_S._Sweat" target="_blank">Noah S. "Soggy" Sweat, Jr.</a>, a state representative for <a href="https://en.wikipedia.org/wiki/Mississippi" target="_blank">Mississippi</a>, on the subject of whether the state should legalize alcohol. Sweat's argument on prohibition was (to paraphrase):<br></p><p><em>If, by whiskey, you mean the devil's brew that causes so many problems in society, then I'm against it. But if whiskey means the oil of conversation, the philosopher's wine, the stimulating drink that puts the spring in the old gentleman's step on a frosty, crispy morning, then I am certainly for it.</em></p>
Slippery slope<p>This fallacy involves arguing against a position because you think choosing it would start a chain reaction of bad things, even though there's little evidence to support your claim. Example:<br></p><ul><li>"We can't allow abortion because then society will lose its general respect for life, and it'll become harder to punish people for committing violent acts like murder."</li><li>"We can't legalize gay marriage. If we do, what's next? Allowing people to marry cats and dogs?" (Some people actually made this <a href="https://www.daytondailynews.com/news/national/cats-marrying-dogs-and-five-other-things-same-sex-marriage-won-mean/dLV9jKqkJOWUFZrSBETWkK/" target="_blank">argument</a> before same-sex marriage was legalized in the U.S.)</li></ul><p>Of course, sometimes decisions <em>do </em>start a chain reaction, which could be bad. The slippery slope device only becomes a fallacy when there's no evidence to suggest that chain reaction would actually occur.</p><p>Language to watch out for: "If we do that, then what's next?"</p>
"There is no alternative"<p><span style="background-color: initial;">A modification of the </span><a href="https://en.wikipedia.org/wiki/False_dilemma" target="_blank" style="background-color: initial;">false dilemma</a><span style="background-color: initial;">, this fallacy (often abbreviated to TINA) argues for a specific position because there are no realistic alternatives. Former British Prime Minister Margaret Thatcher used this exact line as a slogan to defend capitalism, and it's still used today to that same end: Sure, capitalism has its problems, but we've seen the horrors that occur when we try anything else, so there is no alternative.</span><br></p><p>Language to watch out for: "If I had a magic wand…" "What <em>else</em> are we going to do?!"</p>
Ad hoc arguments<p>An ad hoc argument isn't really a logical fallacy, but it is a fallacious rhetorical strategy that's common and often hard to spot. It occurs when someone's claim is threatened with counterevidence, so they come up with a rationale to dismiss the counterevidence, hoping to protect their original claim. Ad hoc claims aren't designed to be generalizable. Instead, they're typically invented in the moment. <a href="https://rationalwiki.org/wiki/Ad_hoc" target="_blank">RationalWiki</a> provides an example:<br></p><p style="margin-left: 20px;">Alice: "It is clearly said in the Bible that the Ark was 450 feet long, 75 feet wide and 45 feet high."</p><p style="margin-left: 20px;">Bob: "A purely wooden vessel of that size could not be constructed; the largest real wooden vessels were Chinese treasure ships which required iron hoops to build their keels. Even the <em>Wyoming</em> which was built in 1909 and had iron braces had problems with her hull flexing and opening up and needed constant mechanical pumping to stop her hold flooding."</p><p style="margin-left: 20px;">Alice: "It's possible that God intervened and allowed the Ark to float, and since we don't know what gopher wood is, it is possible that it is a much stronger form of wood than any that comes from a modern tree."</p>
Snow job<p><span style="background-color: initial;">This fallacy occurs when someone doesn't really have a strong argument, so they just throw a bunch of irrelevant facts, numbers, anecdotes and other information at the audience to confuse the issue, making it harder to refute the original claim. Example:</span><br></p><ul><li>A tobacco company spokesperson who is confronted about the health risks of smoking, but then proceeds to show graph after graph depicting many of the other ways people develop cancer, and how cancer metastasizes in the body, etc.</li></ul><p>Watch out for long-winded, data-heavy arguments that seem confusing by design.</p>
McNamara fallacy<p>Named after <a href="https://en.wikipedia.org/wiki/Robert_McNamara" target="_blank">Robert McNamara</a>, the <a href="https://en.wikipedia.org/wiki/United_States_Secretary_of_Defense" target="_blank">U.S. secretary of defense</a> from 1961 to 1968, this fallacy occurs when decisions are made based solely on <em>quantitative metrics or observations,</em> ignoring all other factors. It stems from the Vietnam War, during which McNamara sought a formula to measure progress. He settled on body count. But this "objective" metric didn't account for other important factors, such as the possibility that the Vietnamese people would never surrender.<br></p><p>You could also imagine this fallacy playing out in a medical situation. Imagine a terminal cancer patient has a tumor, and a certain procedure reduces the size of the tumor but also causes a lot of pain. Recommending the procedure on tumor measurements alone, while ignoring the patient's quality of life, would be an example of the McNamara fallacy.</p><p>Language to watch out for: "You can't measure that, so it's not important."</p>
A new study looks at what would happen to human language on a long journey to other star systems.
- A new study proposes that language could change dramatically on long space voyages.
- Spacefaring people might lose the ability to understand the people of Earth.
- This scenario is of particular concern for potential "generation ships".
Generation Ships<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="a1e6445c7168d293a6da3f9600f534a2"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/H2f0Wd3zNj0?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span>