For some time now, cognitive scientists have been sure that the mind is not made for logical reasoning. That ability is just a lucky side effect of the work brains actually evolved to do. For example, people can learn, with some difficulty, to solve a logic problem with abstract symbols. But it's much easier when turned into a question about who the bartender has to card tonight. Similarly, a purely logical mind would look at a collection of human beings—say, the citizens of Iran—and see an agglomeration of individuals, very few of whom have any say over whether their government develops nuclear weapons. But the human mind is tuned for signs of people operating together as a team. So we say things like "Iran is trying to obtain nuclear weapons" or "Iran deserves to be annihilated." In fact, according to this paper, the more we see people as part of a coherent group, the more harshly we judge their actions.
This is not exactly a secret to propagandists and other political types, who know that bombing Iran is a respectable subject for high-minded policy types, while possibly bombing Asghar Farhadi and his family just sounds like a Mafia hit. Yet, as people who actually have to drop bombs have said, those are two different descriptions of the same decision, so psychologists should be able to explain why our reaction depends on our description of the choice. That's what the new paper, which went online last month before print publication in a future issue of The Journal of Experimental Social Psychology, is about.
Its roots are in a powerful idea worked out in the middle of the last century by the philosopher and social scientist Donald T. Campbell. He proposed that the mind, when faced with any collection of people, can end up treating that group as if it were a coherent, single object that takes up a particular space and persists through time—in other words, as later work found, as if it were a single being, with thoughts, feelings, and intentions. What determined this perception, Campbell thought, were certain specific cues: For instance, if people are all dressed alike (as soldiers are), if they're close together (think soldiers again), or if it's obvious that they share the same fate (think sailors, all in the same boat), then the mind is more inclined to perceive them as a single entity than as a collection of different Bobs and Daves and Barbs and Donnas. (The actual traits of the group aren't particularly relevant, because any assemblage of people—China, Tea Partiers, Catholics, Red Sox fans—can be made to look either like a monotone forest or like a varied stand of individual trees. It's all about how the people are perceived, which is what this paper examines.)
Ever since Campbell, researchers have been studying the effects of group thinginess, refining and elaborating his theory. (Yes, "thinginess" is a weird, off-putting word for the idea, but I think Campbell's own term, "entitativity," is even worse.)
In this study, Anna-Kaisa Newheiser and her co-authors recruited 83 people to learn about the Greels (don't Google them; they're a made-up kind of person). Some of the volunteers learned that Greels are all the same color and do their Greel stuff all together, like a Boy Scout troop. Others were told that Greels came in all different colors and went about doing various separate things as individuals, like a feeble high-school garage band. The volunteers were then presented with various "behaviors unique to Greels" and asked to rate how morally acceptable each practice was. Some of these were designed to strike typical Americans as ethically dodgy (one, for instance, was that Greels force left-handed children to switch to their right hands). People who had been given cues that Greels behaved like a coherent, self-contained object were more severe in judging the same conduct than were people who had been encouraged to see Greels as a collection of individuals.
Now, researchers have found that people's judgments of individuals are less severe when there are mitigating circumstances ("he stole her purse" makes people suggest a more severe punishment than "he stole her purse to pay for his sick child's medicine"). But in a second experiment, Newheiser et al. found that this slack is much easier to give to individuals. Groups don't get the same break.
The researchers presented 319 people with the tale of four business executives who had embezzled hundreds of thousands of dollars from their company. In one version, they needed the money to pay off their loan sharks. In another, they stole to give money to underpaid workers in the company's Third World factory. Across this good-guy-versus-bad-guy difference, the story also had a team-versus-individual plot point: In some versions, the four execs were "a tightly knit team"; in others, they were "loosely connected individuals who barely knew each other."
Among other questions, volunteers were asked to recommend punishments for the perps. As you might expect, the average sentence suggested for the loan-shark clients was more severe than it was in the stories that made them out to be altruistic Robin Hoods. This contrast appeared in the data whether people perceived the foursome as a social unit or as loosely tied individuals. However, when the motive was described as helping the poor, people gave altruistic individuals more of a break than they gave the altruistic group: Those who were told the executives were loosely connected individuals recommended an average sentence of six months to a year in prison. For the same crime, same people, and same motive, but committed by a group that looked like a single entity, the average recommendation was between one and three years in prison.
In other words, Newheiser et al. argue, they have teased out a tendency toward harsher judgment and more aggressive punishment that has nothing to do with the nature of a group's actions. It's prompted instead by the way we perceive the group.
Now, it's not hard to notice that we tend to see nations—especially far-off, unfamiliar nations—as unitary creatures, with feelings, thoughts, and plans. It's embedded in our language about states: we unthinkingly use phrases like "France wants to get out of Afghanistan" or "China fears dissent," assuming this is just a kind of metonymy (like saying "the White House reacted to the charges" to save time). This study suggests the habit isn't just a bit of poetic license, but rather a dangerous penchant of the mind.
So as the war drums beat around Iran, it might be worth trying to correct for your built-in bias to be harsh toward entities made of people. The next time someone explains why the West might need to attack, try substituting "Farhadi and his family" for "Iran" and see how that feels. It could be a useful exercise for surfacing a hidden bias.
Newheiser, A.-K., Sawaoka, T., & Dovidio, J. F. (2012). Why do we punish groups? High entitativity promotes moral suspicion. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.02.013