A new study from Ohio State University details implicit bias.
- New research from Ohio State claims we cannot separate how someone looks and sounds.
- Volunteers were asked to look at photos and listen to audio, and were told to ignore the face or the voice.
- "They were unable to entirely eliminate the irrelevant information," said associate professor Kathryn Campbell-Kibler.
Postmates is a way of life in Los Angeles. So when a young Black driver recently crossed paths with a woman outside her building while delivering food to another apartment, her response might initially shock you. The woman claims her reaction is not racist, yet she not only refuses him entry but, after he calls the apartment and speaks to the man on the other line, even denies that the man lives in the building. Her use of the term "boy" says it all.
Would she have reacted the same way if the driver were white? While no definitive answer can be given, a new study from Ohio State University suggests that not only is his race an issue, the woman would not have been able to ignore it even if she wanted to.
The distance between implicit and explicit bias has been studied for years. In this research, published in the Journal of Sociolinguistics, Associate Professor Kathryn Campbell-Kibler of the Department of Linguistics at OSU asked 1,034 volunteers to look at photos and listen to audio of people speaking, to determine whether they immediately judged someone by their looks or accents.
Almost across the board, they did.
In some cases, volunteers were told to evaluate how "good-looking" the people in the photos were; in others, they were asked to judge their accents. One cohort was not given guidance; they looked at a photo and listened to a voice. Others were told to ignore the face while listening, and vice-versa. Some were even told that the voice was not from the same person they were looking at.
It didn't matter. In most cases, volunteers' judgments of a face were colored by the voice, and vice versa. As Campbell-Kibler says,
"Even though we told them to ignore the voice, they couldn't do it completely. Some of the information from the voice seeped into their evaluation of the face."
Detaching face from voice is a difficult endeavor. The first time I heard Welsh actor Matthew Rhys' true accent was while watching "The Wine Show," which he filmed shortly after wrapping up work on "The Americans." It took me a few minutes to rationalize what I was seeing. Now I can't get his actual speaking voice out of my head while watching the drunken private investigator transform into the lawyer we knew Perry Mason would become.
Rhys is paid to speak English with an American accent. The stakes are low for me as a viewer. Out in the real world, where racism is as prevalent as ever, the situation is different. Implicit bias affects everyone, which means racism and xenophobia are conditions we have to work at correcting in ourselves. It won't come naturally. Campbell-Kibler continues,
"We found that people could exercise some control over what information to favor, the voice or the face, depending on what we told them to do. But in most cases, they were unable to entirely eliminate the irrelevant information."
She notes that even though most participants were white, they were careful not to racially stereotype. Volunteers told to ignore faces while listening to accents performed best for this reason, though some admitted they had to make a conscious effort to do so.
Volunteers had no qualms about judging the photos' subjects as good-looking, believing looks to be subjective. Campbell-Kibler plans to follow up this research using videos instead of photographs to observe the impact of watching others on screen.
The takeaway: we are influenced by all of the information available to us at all times. Our biases will make themselves apparent. Course-correcting is not natural, but thankfully, it is possible.
Do you know the implicit biases you have? Here are some ways to uncover them.
- A study finds that even becoming aware of your own implicit bias can help you overcome it.
- We all have biases. Some of them are helpful — others not so much.
When we talk about a bias, what we're talking about, as Harvard University social psychologist Mahzarin Banaji puts it, is a shortcut our brain has created so that we don't have to spend time and energy thinking about how we feel each time we encounter something: we already have an opinion formed and ready to use.
Many of these shortcuts are useful: a bias against hangovers, for example, has one refusing alcohol without having to think about it. The problem is that the brain does a lot of this shortcutting silently. What's more, it creates shortcuts for people different from ourselves, sometimes based on actual personal experience, but often based on incorrect information we've unknowingly absorbed: other people's opinions, media depictions, and cultural attitudes, for instance.
Worst of all, this kind of bias may be created and deployed without our even being aware of it — it's implicit in our actions in spite of ourselves and our conscious intentions.
Our brains don't always get things right. We make errors in judgement all the time. An accurate bias is a great time-saver; an inaccurate bias is a serious problem, especially if it causes us to unknowingly discriminate against others. Consider, for instance, the systemic assumptions about women that keep them from advancing in scientific fields.
How we can curb the effects of implicit biases
New research, published in Nature Human Behaviour on August 26, suggests that the gender bias that continues to prevent women from advancing in science has a lot to do with its hidden underbelly: human blind spots. During the study, French researchers discovered that more women were promoted after the scientists in charge of awarding research positions became consciously aware of the impact of their implicit bias.
Once the issue was no longer being highlighted, the biases' discriminatory effect reasserted itself, with award grants regressing to their traditional, pro-male pattern. Other research suggests that diversity training doesn't really help and may even exacerbate the problem it seeks to address.
We can glean a new approach, though — one that could result in better outcomes — from the new research.
About the study
What the new study encouragingly reveals is that a conscious awareness of one's own hidden bias can mitigate its effect. The mechanism, it would appear, is that awareness may not delete the bias so much as make it less implicit, or unconscious.
The study looked at the awards handed out during annual nationwide competitions for elite French research positions. There were 414 people on the committees altogether, assessing candidates' worthiness across a spectrum of research specialties, "from particle physics to political sciences." The study analyzed committee-level data without digging too deeply into whether a committee was internally gender-balanced. The assumption was that the consensus decision reached by a group represented the outcome of its internal makeup, whatever that may be.
The study took place over two years. In the first year, committee members were given Harvard's implicit association test (IAT), which established that there were significant implicit gender biases among them. Nonetheless, that year, the influence of those biases appeared to be significantly suppressed in the awards the committees handed out.
To the researchers, this outcome suggested that simply being aware of one's own implicit biases may take away their invisibility: the callout could make a bias more apparent and, therefore, something that can be more readily overridden.
The second year of the study, from the subjects' point of view at least, was quite silent. The researchers were still watching, but the issue of implicit bias wasn't called out. What ended up happening? The committee members returned to awarding more positions to men than women. A regression, it seemed.
It should be said that there are some possible flaws in the study: perhaps the committee members were simply on their best behavior the first time around, until they thought they were no longer being observed. Additionally, the study notes that there were more male submissions to the committees than female, which could skew the results. Further studies will be needed to get a more accurate picture.
Nonetheless, the study's authors do conclude that becoming aware of one's own implicit biases may be the first step — maybe the most essential step — needed to overcome them.
How do I know if implicit bias is affecting my judgement?
While the study looked at gender bias, it's not, of course, the only variety to be concerned about. Others pervade our culture: race bias, ethnicity bias, anti-LGBTQ bias, age bias, anti-Muslim bias, and so on. There are a couple of online methods available for sussing out our own. Note that if the researchers are correct, just making yourself aware of your implicit biases can help you combat them.
The IAT mentioned above is one widely used way to identify your own bias issues. Project Implicit — from psychologists at Harvard, the University of Virginia, and the University of Washington — offers a self-test you can take. Be aware, though, that the IAT requires multiple tests to produce a meaningful result.
If you're willing to invest a little time, there's also the "bias cleanse" offered by MTV in partnership with the Kirwan Institute for the Study of Race and Ethnicity. It's a seven-day program aimed at helping you sort out implicit gender, race, or anti-LGBTQ biases you may be harboring. Each day you receive three eye-opening email thought exercises, one for each type of bias.
Side note: Did you know that more people die in female-named hurricanes because they're typically perceived as less threatening? We didn't.
It's a well-worn bromide that simply acknowledging you have a problem is the first step to solving it, but the new study provides supporting evidence that this is especially true when dealing with implicit biases, a pernicious, stubborn problem in our society. Our brains are clever beasties, silently putting together shortcuts that reduce our cognitive load. We just need to be smarter about seeing and consciously assessing those shortcuts if we are ever to become the people we hope to be. That may mean, on occasion, being humble enough to receive feedback in the form of callouts.
What would it be like to live in the body of someone else? With VR, now you can actually find out.
What would it be like to live in the body of someone else? Since the dawn of mankind, people have imagined what it would be like to inhabit another body, just for a day or even for a few minutes. Thanks to the magic of VR, we can now do that. Jeremy Bailenson, the creator of the Virtual Human Interaction Lab, has designed a VR experience called 1000 Cut Journey that may change the way people see race: by experiencing it firsthand. Jeremy explains: "You start out as an elementary school child and you're in a classroom. You then become a teenager and you're interacting with police officers. You then become an adult who's going on a job interview, and what you experience while wearing the body of a black male is implicit bias that happens repeatedly and over time."
Businesses have been adopting more diversity programs since the 1990s, but do they actually work?
Diversity programs have become commonplace in the professional world, but do they actually work?
Not really, according to an award-winning report in the Harvard Business Review by Frank Dobbin, a professor of sociology at Harvard, and Alexandra Kalev, an associate professor at Tel Aviv University.
“It shouldn’t be surprising that most diversity programs aren’t increasing diversity,” wrote Dobbin and Kalev. “Despite a few new bells and whistles, courtesy of big data, companies are basically doubling down on the same approaches they’ve used since the 1960s—which often make things worse, not better.”
The Equal Employment Opportunity Commission reported that American companies with more than 100 employees barely increased their hiring of women and minorities between 1985 and 2014. According to the data, the share of black men in managerial roles rose from 3 to 3.3 percent over that period, while white women in management increased from 22 to 29 percent, a figure that has remained stagnant since 2000.
Diversity programs might also be creating a worse working environment for white men. In a study published in the Journal of Experimental Psychology, researchers compared the job interview performance of white men at companies with and without stated diversity programs.
“Compared to white men interviewing at the company that did not mention diversity, white men interviewing for the pro-diversity company expected more unfair treatment and discrimination against whites. They also performed more poorly in the job interview, as judged by independent raters. And their cardiovascular responses during the interview revealed that they were more stressed.”
Analyzing three decades’ worth of data and interviewing hundreds of managers and executives, Dobbin and Kalev identified some key aspects of diversity programs that make them ineffective, or worse, counterproductive. And perhaps more importantly, their research sheds light on diversity approaches that actually seem to work.
Why Diversity Programs Fail
Diversity training is used in about half of mid-sized companies and nearly all of the Fortune 500. But these programs fail for a number of reasons:
Many programs are mandatory. According to Dobbin and Kalev's research, companies that used mandatory diversity training ultimately employed fewer people of color in management over the five years analyzed.
Three-quarters of companies use negative language in their programs, sending the message: "Discriminate, and the company will pay the price."
“...threats, or 'negative incentives,' don't win converts.”
Employees might learn to answer a diversity program's questionnaire correctly, but they tend to forget the information after a few days.
“The positive effects of diversity training rarely last beyond a day or two, and a number of studies suggest that it can activate bias or spark a backlash.”
Some companies make it clear that diversity programs are remedial, particularly after harassment cases or complaints against managers.
“...singling them out implies that they’re the worst culprits. Managers tend to resent that implication and resist the message.”
Effective Ways to Promote Diversity
Dobbin and Kalev propose three principles for encouraging diversity in the workplace:
According to Dobbin and Kalev's findings, managers are happy to engage when programs are voluntary, and when they're asked to help in a positive way.
“When managers actively help boost diversity in their companies, something similar happens: They begin to think of themselves as diversity champions.”
Mentoring programs are a good way for managers to get involved, particularly when white male managers are assigned protégés to mentor; the research suggests these managers are hesitant to informally reach out to young women and minority men.
“Mentoring programs make companies’ managerial echelons significantly more diverse: On average they boost the representation of black, Hispanic, and Asian-American women, and Hispanic and Asian-American men, by 9% to 24%.”
College recruitment programs also seem to be effective.
“Five years after a company implements a college recruitment program targeting female employees, the share of white women, black women, Hispanic women, and Asian-American women in its management rises by about 10%, on average.”
The article cites a study that examined how racially integrated military units in World War II developed improved relationships over time. Dobbin explained the study's main idea in an interview with the global management consulting firm McKinsey & Company:
“People’s stereotypes go away as they get to know people from other groups, especially if they work side by side with them. If you are white and have not been exposed to African-Americans very much, we know from that natural experiment during World War II and subsequent studies that intense exposure through working side by side helps you to individualize people from a group that you are not familiar with and stop stereotyping them.
So if you want to change stereotyping at work, the best way to do it is not to try to train it away, but to expose people to people from other groups in their work lives. In effect, you have to start by integrating the workplace. That’s what’s going to diminish stereotypes.”
Holding managers socially accountable for how they treat employees is another way to promote diversity. In one study, a firm was shown to give smaller raises to black employees, even when they held identical positions and performance ratings to their white coworkers. Then the firm started publicly posting employees' performance ratings and pay raises.
“Once managers realized that employees, peers, and superiors would know which parts of the company favored whites, the gap in raises all but disappeared,” wrote Dobbin and Kalev.
You can check out more about Dobbin and Kalev's research in the video below:
Racism is the acting out of biases learned as early as preschool, research shows. If racism starts at three years old, so should science-backed strategies to reduce it.
There's no getting around it: we're all a little bit biased. But when do harmful implicit biases, like racial judgements, form? Developmental psychologist Lori Markson and her colleagues have identified racial bias in preschool children aged three to six years old. Despite learning that kids this age—both black and white—prefer white teachers, or that white kids trust black adults less, Markson is not pessimistic about the future of race relations—in fact she's the opposite. The more data we can collect on racial bias, the more information we have to develop strategies to close social divides. Based on the research she presents here, Markson outlines three strategies—diversity exposure, bias intervention, and cross-race friendships—that can help to end racist behavior in the next generation, and hopefully in the current one. This video was filmed at the Los Angeles Hope Festival, a collaboration between Big Think and Hope & Optimism.