Question 1: At a dinner party this weekend, a friend introduces you to a woman named Genevieve. He tells you that Genevieve recently graduated from Bryn Mawr College with a B.A. in philosophy, where she was an active volunteer in an advocacy group for women's health and edited a literary magazine. You’re interested in talking to Genevieve about [Georg] Hegel, the subject of her senior thesis, but your friend jumps in and asks you to rank the following statements about Genevieve in order of their probability:
(1) Genevieve is a feminist.
(2) Genevieve is looking for a job as a sanitation worker.
(3) Genevieve is a feminist who is looking for a job as a sanitation worker.
Given what you know about Genevieve, rank the statements from most likely to least likely.
Question 2: Later that evening, your friend presents you with a deck of cards with a number on one side and a letter on the other. He deals you four cards from the deck. Here is what you see laid out before you on the four cards:
9 J U 2
Your friend then asks you which cards you will need to turn over in order to determine whether the following rule holds for the deck (assuming these four cards represent the rest of the deck):
If a vowel is printed on one side of the card, then an even number is printed on the other side.
Which cards do you turn over in order to test this rule?
Question 3: Genevieve offers you a bet. “Flip this quarter,” she says. “If it’s heads, I’ll give you $200. If it’s tails, you pay me $100.”
Should you take the bet?
Question 1: This is known in the literature as the “Linda” problem, or the “conjunction fallacy.” It tests how well individuals reason using probability theory. In Daniel Kahneman and Amos Tversky’s 1983 study, 85 percent of subjects got it wrong. Your answer was incorrect, too, if you ranked statement (3) in the first or second position. Logic dictates that (3) is the least likely scenario: two conditions being true (Genevieve is a feminist + Genevieve is looking for a job as a sanitation worker) can never be more probable than just one of them being true. If you got this one right — it doesn’t matter whether you put (1) or (2) first, just that you ranked (3) last — congratulations. If not, you’re in good company: only 15 percent of Stanford business school students who had received training in probability theory got it right. (For more on Linda/Genevieve, including an examination of criticism of the question, see chapter 15 of Kahneman’s Thinking, Fast and Slow.)
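The conjunction rule behind the Linda problem can be seen directly in a quick simulation. The probabilities below are made up purely for illustration; whatever values you pick, the share of people satisfying both conditions can never exceed the share satisfying either condition alone:

```python
import random

random.seed(0)

# Hypothetical population: each person is (is_feminist, seeks_sanitation_job).
# The 0.6 and 0.05 probabilities are invented for this sketch.
population = [
    (random.random() < 0.6, random.random() < 0.05)
    for _ in range(100_000)
]

p_feminist = sum(f for f, _ in population) / len(population)
p_both = sum(f and s for f, s in population) / len(population)

# A conjunction is never more probable than either of its conjuncts.
assert p_both <= p_feminist
print(f"P(feminist) = {p_feminist:.3f}, P(feminist and sanitation) = {p_both:.3f}")
```

Ranking statement (3) above (1) amounts to claiming `p_both > p_feminist`, which the assertion above shows is impossible.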
Question 2: The card question, first asked by Peter Wason in 1966, challenges your deductive reasoning skills. In his 1977 book, Wason (with co-author Philip Johnson-Laird) reports that only 5 percent of subjects answered questions like this correctly. The most common mistake is to turn over the U and 2 cards — an error that flows from the rule’s specification of a relationship between vowels and even numbers. You do need to flip over the U card to check whether an even number is on the other side, as the rule specifies. But you do not need to see what’s on the other side of the 2 card: the rule does not say that even numbers are always paired with vowels, only that there must be an even number opposite a vowel. You do need to flip over the 9 card, however: if there is a vowel on the other side, the rule is disproved. So the answer is: you must turn over exactly two cards, the U and the 9.
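The logic of the selection task reduces to a single question per card: could its hidden side falsify the rule? Here is a minimal sketch of that test, with the four cards from the quiz:

```python
VOWELS = set("AEIOU")

def must_flip(visible: str) -> bool:
    """A card must be flipped iff its hidden side could falsify
    'if a vowel is on one side, an even number is on the other'."""
    if visible.isalpha():
        # A visible vowel could hide an odd number, so flip it.
        # A visible consonant is irrelevant to the rule.
        return visible.upper() in VOWELS
    # A visible odd number could hide a vowel, so flip it.
    # A visible even number can never falsify the rule.
    return int(visible) % 2 == 1

cards = ["9", "J", "U", "2"]
print([c for c in cards if must_flip(c)])  # -> ['9', 'U']
```

The common error of flipping the 2 corresponds to testing a card whose hidden side cannot, under any outcome, contradict the conditional.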
Question 3: The bet question does not have a right or wrong answer, per se, but it highlights what Kahneman calls an irrational “loss aversion” everyone seems to suffer from, at least to some extent. Technically speaking, any bet where the payoff is greater than the loss, given an equal chance at either outcome, is a good one. And the prospect of earning $200 easily outweighs the $100 you’d have to pay Genevieve if you lose. Assuming the loss of $100 is tolerable — you know where your next meal is coming from, and you don’t need the money to pay the rent — you should, as a rational agent, accept the bet. The real-world problem with loss aversion isn’t that you’ll pass up great bets like this one — Genevieve would have to be crazy to offer it, after all. Loss aversion ends up costing you dearly when you spend too much time protecting your precious assets when you should be just as assiduous about prospecting for new ones. I once spent about three hours, over several weeks, making calls to a merchant who had charged me shipping for an item I purchased online with a free shipping coupon. I finally got my $8 back. But if someone had offered me a job calling up multiple customer service agents, waiting on hold, getting the runaround, etc., for a promise of $8 in compensation, there’s no way I’d accept it.
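The “technically a good bet” claim is just an expected-value calculation. A minimal sketch, using the numbers from Genevieve’s offer:

```python
def expected_value(p_win: float, win: float, loss: float) -> float:
    """Expected value of a bet that pays `win` with probability
    p_win and costs `loss` otherwise."""
    return p_win * win - (1 - p_win) * loss

# A fair coin: +$200 on heads, -$100 on tails.
ev = expected_value(0.5, 200, 100)
print(f"Expected value: ${ev:.2f}")  # -> Expected value: $50.00
```

A positive expected value ($50 per flip here) is what makes refusing the bet look irrational on paper; loss aversion is the name for the fact that most of us refuse it anyway.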
Interpreting the Results
So, how did you do? If you avoided the common errors of reasoning that led large majorities of subjects to do the irrational thing in repeated experiments, you may justly gloat a little. (But only a little: smarter people may have a particularly hard time talking themselves out of other biases.)
If you answered one or more of these questions incorrectly — and chances are very high that you did — the question is what this says about you individually and about humanity writ large. Do experiments like these belie the faith of philosophers and social scientists in baseline human rationality? Do these results show that only a select slice of humanity (somewhere between 5 and 15 percent, depending on the study) qualifies for the title “rational”? One way out of this mess is to deny that any of these experiments are really measuring rationality. But if we seek to disentangle rationality from deductive logic and probability theory, our account of reason gets messy. Rationality may be about more than logic alone, but without logic at its base, isn’t it one confused puppy? In his 1993 book, The Nature of Rationality, Robert Nozick sketched a concept of “symbolic utility” in which rational irrationality becomes a potential reality rather than an oxymoron:
Producing evident bad consequences, these apparently irrational actions and symptoms have a symbolic significance that is not obvious; they symbolize something else [which] has some utility or value ... for the person. (p. 26)
So refusing Genevieve’s bet may symbolize your lack of greed, your conservative nature, or your pride in protecting assets you have worked hard to earn. And you may benefit in various ways from having one or more of these self-conceptions. Nozick’s idea raises a host of questions and intellectual tangles, but at least it points a path around the faddish denial that human beings can think straight. As delicious as that idea seems to be.
Note to Praxis readers: a while back I challenged readers with a three-question quiz similar to the one above. If you're new to the quiz, have at it. If you took it back when I first published it, consider giving it another try. Perhaps the most disturbing message of Daniel Kahneman's now-classic Thinking, Fast and Slow (from which two of these questions are adapted) is that making people aware of their systematic irrational biases is highly ineffective as a cure for irrational thinking. Kahneman himself found that he would commit the same errors over and over again even after conducting studies in which he researched how people commit particular mistakes of logical reasoning. So readers who took the test a couple of years ago might see if their first experience has made any difference to their thinking today. (Sorry, this is low-tech — no interactive buttons here. Please get out a piece of paper and a pen to record your answers.)
Follow Steven Mazie on Twitter: @stevenmazie