Doctors may be missing fatal illnesses because medical textbooks are biased toward white skin.
- A medical student in the UK recently created a handbook to help trainee doctors recognize life-threatening conditions on black and brown skin.
- "Mind the Gap" includes images that display how certain illnesses appear on both darker and lighter skin tones.
- The COVID-19 pandemic has exacerbated the problem, with suspected coronavirus patients being asked if they are "pale" or if their lips have "turned blue."
Look in a medical textbook for the symptoms of a rash and you'll probably find "red bumps." Look up oxygen deprivation and you'll get "blue lips" as a common sign. Melanin alters how those colors appear, so those diagnostics don't always apply to non-white skin — i.e., most of the world's population. And that can have fatal consequences.

After being taught clinical signs and symptoms only on white skin, Malone Mukwende, a Black medical student at St George's, University of London, recently created a handbook to help trainee doctors recognize life-threatening conditions on black and brown skin.
Mind the Gap
Photo Credit: St George's University of London
Mukwende has been working on the handbook "Mind the Gap" with Margot Turner, Senior Lecturer in Diversity and Medical Education, and Peter Tamony, Clinical Lecturer in Clinical Skills, as part of a student-staff partnership project looking at clinical teaching on black and brown skin.
According to the British Medical Journal, Mind the Gap aims "to teach medical students and other health professionals about the importance of recognizing how some conditions can present differently in darker skins."
The book includes images that display how certain illnesses appear on both darker and lighter skin tones. Additionally, it includes suggestions for appropriate phrases and vernacular for doctors to use with their patients.
"It is important that we as future healthcare professionals are aware of these differences so that we don't compromise our care for certain groups," said Mukwende in a St George's University press release, noting that medical textbooks contain a 'white skin bias,' which has put the health of those groups at risk.
Though Mind the Gap is not yet published or available for distribution, discussions with potential publishers are ongoing, according to St George's statement.
Coronavirus and the need for change
Mukwende explained that the COVID-19 pandemic has exacerbated the problem, with suspected coronavirus patients being asked if they are "pale" or if their lips have "turned blue."
"These are not useful descriptors for a black patient and, as a result, their care is compromised from the first point of contact," said Mukwende. "It is essential we begin to educate others so they are aware of such differences and the power of the clinical language we currently use. We will be hosting a training session for clinical skills peer tutors which will take place in July 2020."
He pointed out that conversations currently taking place regarding health disparities in the United Kingdom are pressuring universities to take real action to address those concerns. For example, at St George's there was a petition calling for teaching clinical skills on black and brown skin.
"The petition, Covid-19 pandemic and the Black Lives Matter movement all illustrate there is an urgent need for change," said Mukwende.
Monuments are under attack in America. How far should we go in re-examining our history?
- Historical American monuments and sculptures are under attack by activists.
- The monuments are accused of celebrating racist history.
- Toppling monuments is a process that often accompanies societal change, but it carries a danger of bias.
History is not only the stuffing of Wikipedia articles but a live process involving you right now. As 2020 bluntly demonstrates, history is an undeniable force, here to change our societies and make us re-examine everything we think and know before you can say "news cycle." So far we've had one of the worst pandemics of the modern era, with hundreds of thousands dead and economic livelihoods uprooted around the world. We've had the resurgence of the Black Lives Matter movement, spurred on by the murders of African-Americans by the police, unleashing pent-up frustrations at systemic injustice. We also find ourselves in an amazingly divisive election year, probably one of the worst periods of rancor in the life of the country. American "heroes" are getting re-examined left and right, and statues are getting torn down.
All the upheaval places focus on the role of history in our society. How much of it do we want to own up to? How much are current American citizens responsible for the sins of their ancestors? Which men (and yes, mostly these are men) are allowed to stay up as bronze reminders of some heroic past, and which ones need to finally go to the far reaches of our collective unconscious? Do Confederate monuments and statues deserve to stay as part of the legacy of the South, or does it make any sense that a period of history that lasted about four years and produced attitudes that were actually defeated in a bloody Civil War is allowed to percolate in the minds of the population? It is as if a tacit agreement was kept up all these years whereby the victors allowed some such traditions to remain in order to foster a spirit of reconciliation.
Mob pulling down the statue of George III at Bowling Green, New York City, 9 July 1776.
Painting by William Walcutt. 1854.
There is a big danger, on the other hand, that as the conversation turns to exorcising the ghosts of currently unpopular attitudes, we are doing it through the lens of presentism: the bias of judging the behavior of historical people by the standards of today. Oxford helpfully defines it as "uncritical adherence to present-day attitudes, especially the tendency to interpret past events in terms of modern values and concepts." We tend to view our present time as the best and most advanced, socially and intellectually, and as such judge all other eras as inferior. While that may be true (it's certainly debatable), it's unfair to judge how people reacted to the situations around them without considering the constraints and prejudices of the society of their day. It's probably how people a couple of hundred years from now will judge those of us who still eat meat: as utter barbarians, lesser humans.
Our present knowledge comes on the heels of wisdom gathered by generations before us. It has accumulated over time and by that standard should be richer, informed by greater experience and examination. Yet is it fair to expect that a person living 150 years ago should not have held the attitudes shared by most people of his or her time, who knew only what could be known by that point in history? Societies grow not only intellectually, reshaping their governments, but emotionally. It has taken the world a while to mature in empathy, and it's obviously nowhere near where it should be in that evolution.
While it is biased to judge a person from a different era for not having the moral foresight to stand up to his or her peers and end the tyranny of injustice, that is no excuse to celebrate attitudes and statements that go squarely against what we believe in now. A bad idea is like a virus: it can take hold and come back quickly, infecting millions. We have all seen that happen too often recently. If you think it's no big deal to have statues of slave traders and slave owners around, imagine if you were a Jewish person and had to pass by a statue of Heinrich Himmler, the Nazi architect of the Holocaust, every day on the way to work. It would just casually be there, kept up by people who believed that despite the inhuman treatment and death he subjected people to, Himmler did a lot of other "good things" and represented the heritage of the people of the town. As that scenario would be unacceptable, so are a lot of statues kept up around the United States as vestiges of a past we do not need to remember with veneration. And for those who are wondering – no, Germany does not have any Nazi statues or memorials around.
People in Rome tear down the statues of Mussolini. July 25, 1943.
Photo by Fototeca Gilardi/Getty Images.
If you think this debate concerns only Southern "heroes," consider that it took until this year to start taking down statues of Juan de Oñate, the Spanish conquistador who rampaged through what is now New Mexico in the late 1500s. In 1599, he ordered a massacre at Acoma Pueblo, killing about 500 of its men and 300 women and children. Many were thrown off a tall mesa to their deaths, while surviving men had a foot cut off. And yet, you could find statues of this man all over New Mexico. Only this year did they start to come down.
Ultimately, there's a big lesson here for statue-makers and those prone to worshipping idols; even the Bible spoke about this. Respecting and learning from historical figures is extremely useful and necessary, but putting anyone up on a pedestal is generally a losing proposition. Eastern Europe saw a whole century of statues being torn down every few years in the 1900s – from monarchic rulers to Communist heroes who fell in and out of favor, followed by a whole period of pulling down Lenin and Stalin figures in the '90s. Western Europe had its own idol carousel. Many other countries across the world that have had tumultuous histories and undergone historical reckonings did the same. It's a process that happens in societies that experience change.
Of course, the big question is – how far should this go? How far back do we have to extend the soul-searching to expunge all the wrongs in our country's past? Do the iconic Presidents get a pass? Are they coming for Mount Rushmore, put up on Native land without permission, and the Washington Monument? Besides being one of the country's main Founding Fathers, Washington was a lifelong slave-owner who changed his mind about the practice and freed all his slaves by the end of his life – the only slave-owning Founding Father to do so.
Statue of Lenin in Berlin, Germany, on November 13, 1991.
Photo by Patrick PIEL/Gamma-Rapho via Getty Images
Removing or altering some of the country's main symbols is probably a nonstarter at this point. But acknowledgements can be made and payments rendered, in accordance with our current laws. It's legitimate to have concerns about dangerous ideas from the past, but there have to be boundaries and the right pace. When you start retooling your whole foundational mythology quickly, you get violence and revolutions. People are not going to easily give up on what they have been taught for generations and what is part of their culture. Still, this doesn't make such conversations not worth having, since what constitutes a tribute to Southern heritage and values to some is a continual slap in the face to others, celebrating people who enslaved and brutalized them for centuries. The status quo is not acceptable in such an equation.
But what about the once-taboo topic of asking national forgiveness with money – paying reparations to the descendants of people brought to the country as slaves and to Native Americans, who were largely exterminated and whose land was taken? In his seminal essay "The Case for Reparations" for The Atlantic, Ta-Nehisi Coates argues that the early American economy was built on slave labor and that the wealth accumulated on the backs of the enslaved created a tremendous wealth gap that persists today in dramatic fashion – a ratio of 20 to 1 of white to black wealth.
"Perhaps no statistic better illustrates the enduring legacy of our country's shameful history of treating black people as sub-citizens, sub-Americans, and sub-humans than the wealth gap," he writes. "Reparations would seek to close this chasm. But as surely as the creation of the wealth gap required the cooperation of every aspect of the society, bridging it will require the same."
Could we achieve such cooperation now? At the moment, only about 20 percent of the American population supports paying reparations to the descendants of slaves.
Statue of Lenin taken down in Ulan Bator, Mongolia. October 2012.
Photo by Paula Bronstein/Getty Images
As Ta-Nehisi Coates observes, the notion of reparations is "frightening" to many because it might incur major economic costs and, maybe most importantly, it "threatens something much deeper—America's heritage, history, and standing in the world."
While the issue of paying for past sins might further drive a wedge into an already-divided society, Coates believes it would put a number on the history that made American prosperity "ill-gotten and selective in its distribution."
He further proposes that paying reparations would be more than just a payoff; it would lead to "a healing of the American psyche and the banishment of white guilt." "What I'm talking about," he adds, "is a national reckoning that would lead to spiritual renewal."
The cultural reexamination unleashed by the recent protests linked to widespread police brutality taps into the undercurrents of the American psyche. What's surprising is that so many Confederate statues can still be found around the U.S. – approximately 1,800 such monuments. Think about that – almost 2,000 signals of past attitudes that were defeated in a war and have been legislated against since. And yet, there they are, like guardians of not-so-secret inclinations America is unwilling to let go.
As many rightfully fear, however, efforts to tear things down based on the emotion of the moment can lead to mob rule and often less-than-nuanced opinions on history. A country deserves its past, and whitewashing it doesn't change the facts. But all the people living in the country today, which is much more diverse and getting more so according to clear census data, have a right to be part of a society that values their sensibilities and respects them equally. Hanging on to imperfect idols is understandable on the part of a population that is becoming less and less able to wield its will over minorities, but it is ultimately futile, as the statues will come down, as they always tend to do. The question is – will they come down as part of an elevated national consciousness or amidst violence? As coronavirus continues to expand its grip on the country, a more measured approach would serve us well.
Repeating lies makes people believe they are true, studies show.
- Two recent studies looked at the illusory truth effect.
- The effect describes our propensity to start believing untrue statements if they are repeated.
- The phenomenon is a universal bias linked to cognitive fluency but can be counterbalanced.
In an age already beset by rampant misinformation and personality-driven realities imposed upon large segments of the global population come new studies that show why finding the truth can be so hard. Both are concerned with the so-called illusory truth effect, which has been well exploited by politicians of all stripes as well as advertisers.
The illusory truth effect is a well-studied and replicated psychological phenomenon: if a falsehood is repeated often enough, people will start to believe it. This has to do with familiarity – it's easier to process information you've come across previously. That ease of processing creates a feeling of fluency, explains Matthew Warren in the BPS Research Digest. Unfortunately, we may come to treat the mere recognition of a piece of content as a signal that it's true.
Two recent studies delved further into this effect, first described in 1977, and came up with some sobering takeaways but also possible ways to use this bias to our advantage. Maybe you think your particular intelligence makes you immune to this play of the mind, but experiments carried out by Jonas De keersmaecker at Ghent University and an international team of psychologists showed that variations in cognition had no bearing on how strongly the illusory truth effect hit.
Across six experiments, with sample sizes ranging from 199 to 336 subjects, the researchers looked at how the effect varied with differences in cognitive ability or intelligence, the need for cognitive closure, and cognitive style. Participants read a mix of true and false trivia statements; one experiment used fake and real political headlines instead.
What the psychologists found is that in all the studies, the illusory truth effect prevailed. The more often participants saw false statements, the more likely they were to rate them as true or real. Differences in how people thought did not affect the strength of the effect, highlighting that most of us are likely to start believing oft-repeated information.
In their conclusion, the researchers pointed out that the effect is not necessarily as bad as it sounds. Rather, it may be a useful universal bias – a shortcut for picking out the truth that often works.
Another study, published in Cognition, looked at how we can try to stand up to this pervasive feature of our cognition. A team led by Nadia Brashier at Harvard University found that fact-checking bad claims using our own knowledge can help inoculate us against believing them later.
Their two-part study first presented 103 subjects with 60 statements. Some of these were true, like "The Italian city known for its canals is Venice," and some were false, like "The planet closest to the sun is Venus." One group of participants rated how interesting the statements were, while the other rated their truthfulness. For the second part of the study, the researchers added another 60 statements of mixed truth to the 60 the subjects had already seen.
The scientists found that the group that focused on how interesting the sentences were was more prone to the illusory truth effect than the one that focused on their accuracy. Just as important, the researchers discovered, education needs to be combined with a focus on accuracy. As they write, "Education only offers part of the solution to the misinformation crisis; we must also prompt people to carefully compare incoming claims to what they already know."
The development of implicit biases starts at a young age, and they are reinforced over time.
- Awareness of your implicit biases can lessen their effect.
- In the classic "Draw-A-Scientist Test," young students overwhelmingly drew similar representations of a scientist.
- Teaching young people to become aware of the idea of their "implicit biases" could help them better understand their peers.
Implicit bias refers to the unconscious ideas and stereotypes we absorb through repeated cultural conditioning, which affect our thoughts, actions, and decision-making processes. These are often automatic or hidden associations we involuntarily hold about other groups of people. People who identify as members of a stereotyped group can even show unconscious biases against their own group.
Researchers have found that implicit bias starts at a young age. We are often prone to stereotyping because it's easier than sifting through a more complex and nuanced view of the people and the world around us. That said, experts believe that implicit bias shapes how we engage with others, even if we consciously renounce prejudice and stereotypes.
It's much easier to unconsciously create a stock image of a group or class of people when we're presented with so much information throughout our lives. Yet this unconscious and often biased picture can, and often does, lead to skewed perceptions of others.
When it comes to perceptions of gender, race, and even what profession someone pursues, our implicit biases can take over. This often leaves us with faulty, diminished decision-making faculties, uncreative outlooks (aka sticking to "the status quo"), and unfair practices. For example, French researchers recently discovered that more women were promoted after the scientists in charge of awarding research positions were made aware of how their implicit bias impacted the process.
If implicit bias is something that can be overcome, even among adults (see: "You can teach an old brain new tricks"), might it be wise to give students the tools to recognize it when they're young?
Conscious awareness of bias
The French study leaves us with the encouraging idea that conscious awareness of our hidden biases can lessen their effect. While we might not be able to overcome our biases completely, we can get rid of the "hidden" factor.
The study examined the awards handed out during competitions for elite scientific research positions, reviewing the 414 committee members responsible for evaluating and picking the candidates. The researchers' assumption was that the decision made by the group would be representative of its internal makeup — that is, the members' decisions would reflect their own group bias.
Committee members were given Harvard's Implicit Association Test (IAT), which revealed implicit gender biases among them. After the members were made aware of these biases, the awards given out were less likely to be influenced by prejudiced notions, and more women, subsequently, were promoted.
It's important to note that there were some limitations to the study. For instance, the committee members may have been prompted to promote — or be more proactive about promoting — women because they felt they were being called out. The study was also by no means comprehensive; however, the researchers' conclusions do give us a baseline for considering the role implicit bias plays in judging others, even in real-world situations such as job promotions.
Learning about implicit biases
Caroline Simard, research director at Stanford's Clayman Institute for Gender Research, spoke with Berkeley Lab about implicit bias in science, how it affects us throughout our lives, and what we can do about it.
Our brains look for shortcuts to make quicker snap decisions; stereotypes are one of the ways we shortcut the decision-making process.
Simard gives a classic example from a famous study called the Draw-A-Scientist Test.
"If you ask kindergarteners to draw a scientist you're going to have about half of them draw a male scientist, half of them draw a female scientist. By third-grade 75 percent draw a male scientist and if you look at the drawings they all start looking eerily similar. They're starting to look like the stereotype: it's a middle-aged man with the white lab coat, the little pencils in the pocket, glasses, and very interesting hair patterns. This stereotype essentially evokes Einstein."
David Wade Chambers, who first published this research in the 1980s, showed that children begin to develop stereotypical views of scientists at a very young age and that the bias is progressively reinforced as they get older. The same applies to a great many other biases as well. Simard adds:
"Media images have a big role to play and, unfortunately, what research shows is that children's shows and children's cartoons are especially good at reinforcing stereotypes. But the opposite could also be true, you could reinforce the other stereotype by including more diversity in the media images."
Indeed, Simard believes there's a direct way to counterbalance implicit bias: while it's being reinforced through the media, educational awareness may dim its influence, as suggested by the results of the French committee study.
Some universities have started offering courses to mitigate this. The UC Managing Implicit Bias Series, for instance, is an online course designed to increase awareness of implicit bias in hopes of reducing its impact on the university community. "And just about everyone is prone to biases, whether you're male or female, white or non-white, scientist or not," says Simard.
A case can be made for implementing courses like this for younger students in grade school. Most courses today are directed at instructors, college students, and professionals in the workplace, and most use the Harvard IAT as a cornerstone of their instruction.
The IAT includes a number of thorough tests regarding race, religion, gender, disability, weight, skin tone, age, and more, all meant to give you a read on your own implicit biases.
A version of this test made for younger students, or at least the test given to their instructors, could help give students the one thing that reins in biases otherwise running unchecked: awareness. Instruction could take the form of creative prompts that push students to develop different ways of envisioning people — for example, in how they picture a scientist. Supplementary materials could include a diverse range of books, documentaries, and imagery of diverse scientists — again, just one example — throughout the centuries.
Contemporary students need new, flexible ways of viewing the world around them. Take, for example, the push for young students to learn architecture to develop critical thinking skills. Teaching them the skills of an architect, even though they may never enter the profession, was found to give them greater problem-solving abilities.
Perhaps an early understanding of implicit bias could, likewise, pave the way for students to expand their critical thinking skills — all the while developing a thoughtfulness when it comes to viewing people who seem different from themselves.
Do you know the implicit biases you have? Here are some ways to find them out.
- A study finds that even becoming aware of your own implicit bias can help you overcome it.
- We all have biases. Some of them are helpful — others not so much.
When we talk about a bias, what we're talking about, as Harvard University social psychologist Mahzarin Banaji puts it, is a shortcut our brain has created so that we don't have to spend time and energy thinking about how we feel each time we encounter something — we have an opinion already formed and ready to use.
Many of these shortcuts are useful: A bias against hangovers, for example, has one refusing alcohol without having to think about it. The problem is that the brain does a lot of this shortcutting silently. What's more, it creates shortcuts for people different from ourselves, sometimes based on actual personal experience, but often based on incorrect information we've unknowingly absorbed: other people's opinions, media depictions, cultural attitudes, for instance.
Worst of all, this kind of bias may be created and deployed without our even being aware of it — it's implicit in our actions in spite of ourselves and our conscious intentions.
Our brains don't always get things right. We make errors in judgment all of the time. An accurate bias is a great time-saver; an inaccurate bias is a serious problem, especially if it causes us to unknowingly discriminate against others. Consider, for instance, the systemic assumptions about women that keep them from advancing in scientific fields.
How we can curb the effects of implicit biases
Image source: Radachynskyi Serhii / Shutterstock / Big Think
New research, published in Nature Human Behaviour on August 26, suggests the gender bias that continues to prevent women from advancing in science has a lot to do with its hidden underbelly — human blindspots. During the study, French researchers discovered that more women were promoted after the scientists in charge of awarding research positions became consciously aware of the impact of their implicit bias.
When it was no longer being highlighted, the biases' discriminatory effect reasserted itself, with award grants regressing to their traditional, pro-male pattern. Other research suggests that diversity training doesn't really help and may even exacerbate the problem it seeks to address.
We can glean a new approach, though — one that could result in better outcomes — from the new research.
About the study
Image source: Tartila/Shutterstock/Big Think
What the new study encouragingly reveals is that a conscious awareness of one's own hidden bias can mitigate its effect. The mechanism, it would appear, is that awareness may not delete the bias so much as make it less implicit, or unconscious.
The study looked at the awards handed out during annual nationwide competitions for elite French research positions. There were 414 people on the committees altogether, assessing candidates' worthiness across a spectrum of research specialties — "from particle physics to political sciences." The study analyzed committee-level data without digging too deeply into whether a committee was internally gender-balanced. The assumption was that the consensus decision reached by a group represented the outcome of its internal makeup, whatever that may be.
The study took place over two years. In the first year, committee members were given Harvard's implicit association test (IAT), which established that there were significant implicit gender biases among them. Nonetheless, that year, the influence of such biases appeared to be significantly suppressed in the awards the committees handed out.
To the researchers, this outcome suggested that simply being aware of one's own implicit biases may take away their invisibility — the callout could make the bias more apparent and, therefore, something that can be more readily overridden.
The second year of the study, from the subjects' point of view at least, was quite silent. The researchers were still watching, but the issue of implicit bias wasn't called out. What ended up happening? The committee members returned to awarding more positions to men than women. A regression, it seemed.
It should be said that there are some possible flaws in the study: Perhaps the committee members were simply on their best behavior the first time around — until they thought that they were no longer being observed. Additionally, the study notes that there were more male submissions to the committees than female, which could skew the results. Further studies will need to be done to get a more accurate picture.
Nonetheless, the study's authors do conclude that becoming aware of one's own implicit biases may be the first step — maybe the most essential step — needed to overcome them.
How do I know if implicit bias is affecting my judgment?
Image source: AlexandreNunes / Shutterstock / Big Think
While the study looked at gender bias, of course, it's not the only variety to be concerned about; others pervade our culture: race bias, ethnicity bias, anti-LGBTQ bias, age bias, anti-Muslim bias, and so on. There are a couple of online methods available for sussing out our own. Note that if the researchers are correct, just making yourself aware of your implicit biases can help you combat them.
The IAT mentioned above is one widely used way to identify your own bias issues. Project Implicit — from psychologists at Harvard, the University of Virginia, and the University of Washington — offers a self-test you can take. Be aware, though, that the IAT requires multiple tests to produce a meaningful result.
If you're willing to invest a little time, there's also the "bias cleanse" offered by MTV in partnership with the Kirwan Institute for the Study of Race and Ethnicity. It's a seven-day program aimed at helping you sort out implicit gender, race, or anti-LGBTQ biases you may be harboring. Each day you receive three eye-opening email thought exercises, one for each type of bias.
Side note: Did you know that more people die in female-named hurricanes because they're typically perceived as less threatening? We didn't.
It's a well-worn bromide that simply acknowledging you have a problem is the first step to solving it, but the new study provides supporting evidence that this is especially true when dealing with implicit biases — a pernicious, stubborn problem in our society. Our brains are clever beasties, silently putting together shortcuts that reduce our cognitive load. We just need to be smarter about seeing and consciously assessing those shortcuts if we are ever to become the people we hope to be. That may mean, on occasion, being humble enough to receive feedback in the form of callouts.