This Simple Trick Will Help You Read People's Emotions More Accurately
It seems intuitive that the best way to interpret how others are feeling would be to both see and hear how they’re behaving. However, a new study suggests that’s dead wrong.
Want to really understand how other people are feeling? Close your eyes and listen.
That’s the takeaway of a new study published in American Psychologist that explored the empathic accuracy of various forms of communication. The results are some of the first to demonstrate that the primary way we convey emotions may be through the voice – not facial expressions or body language, as previously thought.
“Humans are actually remarkably good at using many of their senses for conveying emotions, but emotion research historically is focused almost exclusively on the facial expressions,” said Michael Kraus, a social psychologist at Yale University and author of the study, to The Guardian.
The paper detailed several experiments. In the first, researchers asked online participants to view videos showing a group of friends teasing each other about a nickname. Participants were shown the scene in one of three formats – audio only, audio plus video, or video only – and were then asked to interpret what the friends were feeling by rating emotions like amusement, embarrassment, or happiness on a scale of 0 to 8. Surprisingly, those who only heard the interaction – but never saw the video – were best able to interpret the emotions of the scene.
Another experiment involved undergraduate students gathering in a room to discuss their favorite TV shows, movies, food and beverages. One group had the conversation in a lit room, the other in a darkened room. Echoing the first experiment, the people whose vision was limited by the darkness more accurately interpreted the emotions of others.
Finally, the researchers took audio from the first experiment in which friends were teasing each other and had participants listen to one of two versions: the actual dialog from the friends, or a computerized voice reading the exact same words. Although you might expect to glean a similar amount of emotional information from the words alone, participants who interpreted the scene by listening to the digital voice fared far worse at interpreting emotions.
“The difference between emotional information in voice-only communication by a computer versus a human voice was the largest across all studies,” Kraus said to Yale Insights. “It’s really how you speak—not just what you say—that matters for conveying emotion.”
It seems intuitive that more information – both audio and visual – would better equip you to read the minds of other people, but the opposite seems true.
One explanation has to do with the limits of our cognitive power. When we’re taking in complex audio and visual input, it takes our brains more effort to process information. It’s similar to how a computer slows down when you have a bunch of different programs running simultaneously. Visual information is particularly costly to process, as Art Markman notes for Psychology Today:
Quite a bit of the brain is taken up with understanding what is going on in our sensory world. For example, if you clasp your hands behind your head, most of the area taken up by your hands reflects the amount of the brain that is devoted to making sense of the information coming in through your eyes.
These same brain regions are also responsible for recalling visual memory. That could explain why people tend to shut their eyes when trying to recall details or to concentrate on complex tasks. A 2011 paper published in the journal Memory & Cognition illustrates this idea quite nicely.
For the study, participants were instructed to watch a bit of a TV show, and later were asked to recall details about what occurred in the episode. The researchers separated participants into four groups, asking each to recall the show while they either: stared at a blank computer screen, closed their eyes, watched a computer screen as it randomly displayed nonsense images, or stared at a blank computer screen while they heard spoken words in a strange language.
As in the recent emotion study, the groups that received the least visual input – those who closed their eyes or stared at a blank computer screen – performed best. Interestingly, the group that stared at the screen displaying nonsense images fared worst at recalling visual details, while the group that heard random bits of a strange language did worst at recalling audio details from the show.
The other possible explanation has darker implications. People have a natural tendency to disguise their emotions, whether they're doing something as benign as forcing a smile when they're feeling down at work or something as malicious as trying to manipulate someone into a shady business deal. Because our voices seem to be the primary way we communicate our emotions, the addition of visual cues like body language and facial expressions hands people a whole toolset for disguising their true feelings – a deliberately thoughtful tilt of the head, a raise of the eyebrows, or any of those body language hacks written up in countless articles ever since that one TED talk.
Either way, the researchers suggest people pay more attention to what others are saying and how they’re saying it.
“There’s an opportunity here to boost your listening skills to work more effectively across cultures and demographic characteristics,” Kraus said. “Understanding other people’s intentions is foundational to success in the global and diverse business environment that characterizes both the present and the future.”