Alton Sterling and Philando Castile were shot to death by police officers this week. Both were calm. Both were compliant. Both were African American. Underneath the ugliness and tragedy of their deaths is the fact that the officers who killed them made the decision to shoot in a split second -- a decision shaped, at least in part, by implicit bias.
We’ve touched on this before, but implicit bias -- the unconscious associations that shape our judgments and actions without our awareness -- causes more trouble than it sounds like it should. It comes from the part of your brain that assesses everything you absorb from the world around you -- smells, tastes, people, feelings -- and categorizes it into experiences -- good, bad, scary, happy -- for easy recall. For example, if you smell something yummy, see a chocolate chip cookie, eat the cookie, taste it, and realize it’s delicious, your brain shortcuts remembering all of those individual stimuli by saving the whole experience as “cookie = yummy.” That ingrained memory becomes a preference, and that preference lets you make future decisions when faced with another cookie much more quickly than going through the whole sensory information-gathering process again. Implicit bias is kind of like your brain’s autopilot for decision making.
That sort of shortcut in decision making was a strong survival tactic for early humans. We learned which plants were safe to eat and which animals would try to kill us, developing preferences for the ones that benefited us. Those preferences let us make similar decisions more and more quickly, saving precious seconds for escaping predators or chasing food. In today’s world, though, that kind of shortcut often produces faulty associations based on cultural and sociological influences. Chris Mooney at Mother Jones explains it this way: “racially biased messages from the culture around you have shaped the very wiring of your brain. It is not that we are either prejudiced or unprejudiced, period. Rather, we are more and less prejudiced, based on our upbringings and experiences but also on a variety of temporary or situational prompts.”
One way to measure the effect of those upbringings and experiences is the Implicit Association Test (IAT). Harvard’s Project Implicit defines the IAT as a test that “measures the strength of associations between concepts (e.g., black people, gay people) and evaluations (e.g., good, bad) or stereotypes (e.g., athletic, clumsy).” Participants have only a split second to make each choice before the test flags an error -- just enough time to respond instinctually.
Credit: Project Implicit
IAT scores reflect the difference in how quickly you sort words and images when concepts are paired one way (say, black faces with “good” words) versus the other. On average, most people show a slight or moderate bias, meaning that no matter how hard you try to make bias-free decisions, your brain will default to the preferences it learned when it was first classifying the world around you. You can take the test on Project Implicit and see for yourself (or take it again if you tried it when we shared it last year). The results may surprise you.
Police officers are trained to build these kinds of preferences for more situations than civilians encounter, particularly dangerous, life-threatening ones. But they are also making decisions based on those same racially biased messages. In a 2014 empirical review for Social and Personality Psychology Compass, scientists at the University of Colorado Boulder and California State University Northridge looked at 10 years’ worth of data and found that “police officers use greater force (both lethal and non-lethal) when the suspect is black rather than white” and that officers “were faster to shoot armed targets when they were black (rather than white), and they were faster to choose a don’t-shoot response if an unarmed target was white (rather than black).” The officers were displaying an implicit bias -- because their surroundings reinforced a primitive learned preference:
When confronted with a black target, officers may activate racial stereotypes related to threat. In line with this possibility, response time bias was greatest among officers whose districts were characterized by a large (urban) population, a high rate of violent crime, and a greater concentration of Blacks and other minorities – environments likely to reinforce racial stereotypes.
In another 2014 study, this one in the Journal of Personality and Social Psychology, 176 mostly white, male police officers were tested for “an unconscious dehumanization bias” against black people. Using a modified IAT, researchers asked the officers to match photos of people with photos of big cats or apes. Officers “commonly dehumanized black people,” the study reports, “and those who did were most likely to be the ones who had a record of using force on black children in custody.”
Thankfully, these kinds of biased responses can be trained away. Taking the IAT repeatedly (160 attempts, according to the Social and Personality Psychology Compass review) negates the initial bias. Training officers to focus on body-language cues instead of race negates the initial bias. Even interacting with the community in normal day-to-day situations negates the initial bias. As long as the brain is given a new positive association for something it previously attached a bias to, it can learn a new response. Or, as Mooney puts it, “our tribal instincts can actually be co-opted to decrease prejudice, if we are made to see those of other races as part of our team.”
This applies to all of us, not just police officers. But hopefully, building a real sense of shared team identity between police and African American communities can help end the ill effects of implicit bias in policing.
Feature Image Credit: Mother Jones