People who see God as a white man tend to prefer white men for leadership positions
Participants were more likely to see God as old than young, and male rather than female.

When you picture God, who do you see: a young black woman, or an old white man? Chances are it's the latter — and a new study in the Journal of Personality and Social Psychology suggests that this image has consequences.
Across a series of seven studies, a team led by Steven O. Roberts at Stanford University found that the way we perceive God — and in particular our beliefs about God's race — may influence our decisions about who should be in positions of leadership more generally.
First, the team examined how 444 American Christians — a mixture of men and women, some black and some white — pictured God. In their "indirect" measure, the researchers asked participants to view 12 pairs of faces that differed either in age (young vs old), race (white vs black), or gender (man vs woman), and to pick the photo in each pair they thought looked more like God. Participants were also asked to explicitly rate God on each of these characteristics (e.g. whether they thought God was more likely to be white or black).
On both measures, participants were more likely to see God as old than young, and male rather than female. But participants' view of God's race depended on their own race: white participants tended to see God as white, while black participants tended to see God as black.
So people clearly conceptualise God in a specific way — but how does this relate to decisions they make in their everyday lives? For Christians, God is the ultimate leader, so perhaps they look for the characteristics they ascribe to God in other leaders too. So in a second study, the team asked more than 1,000 participants to complete the same direct and indirect measures as before, as well as a new task in which they imagined working for a company that was looking for a new supervisor. They saw 32 faces that varied in gender and race and had to rate how well each person would fit the position.
The team found that when participants saw God as white, they tended to give white candidates a higher rating compared to black candidates. The reverse was true too: participants who saw God as black tended to rate black candidates as more suited than white ones. People who saw God as male also rated males higher than females. A subsequent study found that even children aged 4 to 12 generally perceived God as male and white, and those who conceptualised God as white also viewed white people as more boss-like than black people.
The results suggest that the extent to which people see God as white and male predicts how much they will prefer white men for leadership roles. Interestingly, these effects held even after controlling for measures of participants' racial prejudice, sexism and political attitudes, suggesting that the effects couldn't simply be explained by these kinds of biases. Of course, when people saw God as black, these effects were reversed, with participants preferring black candidates. But the fact is that in America, the idea that God is white is a "deeply rooted intuition", the authors write, and so this conceptualisation could potentially reinforce existing hierarchies that disadvantage black people.
However, there's a big limitation here: these studies were all based on correlations between beliefs about God and beliefs about who should be leaders. That is, it wasn't clear whether perceptions of God's race actually cause people to prefer certain leaders or whether there's something else going on that could explain the link between the two.
To address this question of causality, the team turned to made-up scenarios, in which participants had to judge who made good rulers based on the characteristics of a deity. Participants read about a planet inhabited by different kinds of aliens — "Hibbles" or "Glerks" — who all worshipped a Creator. People tended to infer whether Hibbles or Glerks should rule over the planet depending on whether the Creator itself was Hibble or Glerk.
These final studies provide some evidence that the ways in which people picture God, or at least an abstract God-like being, do indeed filter down to actively influence beliefs and decisions in other areas of their lives. The authors suggest that future work should look at how to prevent people making these kinds of inferences.
Of course, the research focuses on a specific group: all participants lived in America, and in most studies they were Christian (some of the later studies also included atheists). It remains to be seen whether similar patterns exist amongst adherents of other religions, or in countries with different demographics. It would also be important to figure out whether perceptions of God influence decisions in the real world, and not just in the lab.
Still, as the authors write, the results "provide robust support for a profound conclusion: beliefs about who rules in heaven predict beliefs about who rules on earth."
Matthew Warren (@MattbWarren) is Editor of BPS Research Digest
Reprinted with permission of The British Psychological Society. Read the original article.
Did early humans hibernate?
New anthropological research suggests our ancestors enjoyed long slumbers.
- Neanderthal bone fragments discovered in northern Spain show growth patterns resembling those of hibernating animals like cave bears.
- Thousands of bone fragments, dating back 400,000 years, were discovered in this "pit of bones" 30 years ago.
- The researchers speculate that this physiological capacity, if confirmed, could prepare us for extended space travel.
Humans have a terrible sense of time. We think in moments, not eons, which may help explain why some people still don't believe in evolutionary theory: we simply can't imagine ourselves any differently than we are today.
Thankfully, scientists and researchers have vast imaginations. Their findings often depend on creative problem-solving. Anthropologists are especially adept at this skill, as their job entails imagining a prehistoric world in which humans and our forebears were very different creatures.
A new paper, published in the journal L'Anthropologie, takes a hard look at ancient bone health and arrives at a surprising conclusion: Neanderthals (and possibly early humans) might have endured long, harsh winters by hibernating.
Adaptability is the key to survival. Certain endotherms evolved the ability to depress their metabolism for months at a time: body temperature and metabolic rate fall while breathing and heart rate drop to nearly imperceptible levels. This handy technique solved a serious resource management problem, as food supplies were notoriously scarce during the frozen months.
While today the wellness industry eschews fat, it has long had an essential evolutionary function: it keeps us alive during times of food scarcity. As autumn months pass, large mammals become hyperphagic (experiencing intense hunger followed by overeating) and store nutrients in fat deposits; smaller animals bury food nearby for when they need a snack. This strategy is critical as hibernating animals can lose over a quarter of their body weight during winter.
For this paper, Antonis Bartsiokas and Juan-Luis Arsuaga, both in the Department of History and Ethnology at Democritus University of Thrace, scoured the remains from a "pit of bones" in northern Spain. In 1976, archaeologists found a 50-foot shaft leading down into a cave in Atapuerca, where thousands of bone fragments have since been discovered. The fragments date back 400,000 years — some may be as old as 600,000 years — and researchers believe the bodies were intentionally buried in this cave.
While the fragments have been well studied in the intervening decades, Arsuaga (who led an early excavation in Atapuerca) and Bartsiokas noticed something odd about the bones: they displayed signs of seasonal variations. These proto-humans appear to have experienced annual bone growth disruption, which is indicative of hibernating species.
In fact, the remains of cave bears were also found in this pit, increasing the likelihood that the burial site was reserved for species that shared common features. This could be the result of a dearth of food for bears and Neanderthals alike. The researchers note that modern Arctic peoples don't need to sleep for months at a time because they have an abundance of fat-rich fish and reindeer, resources that ancient Iberia lacked. They write,
"The aridification of Iberia then could not have provided enough fat-rich food for the people of Sima during the harsh winter—making them resort to cave hibernation."
The notion of hibernating humans is appealing, especially to those in cold climates, but some experts don't want to put the cart before the horse. Large mammals don't engage in textbook hibernation; their deep sleep is known as torpor. Even then, the metabolic demands of human-sized brains may have been too great for extended periods of slumber.
Still, as we continue to trace our animal origins to better understand how we evolved, the researchers note the potential value of this line of work:
"The present work provides an innovative approach to the physiological mechanisms of metabolism in early humans that could help determine the life cycle and physiology of extinct human species."
Bartsiokas speculates that this ancient mechanism could be co-opted for space travel in the future. If the notion of hibernating humans sounds far-fetched, consider that the idea has been contemplated for years: NASA began funding research on the topic in 2014. As the saying goes, everything old is new again.
--
Stay in touch with Derek on Twitter and Facebook. His new book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."
Does science tell the truth?
It is impossible for science to arrive at ultimate truths, but functional truths are good enough.
- What is truth? This is a very tricky question, trickier than many would like to admit.
- Science does arrive at what we can call functional truth, that is, when it focuses on what something does as opposed to what something is. We know how gravity operates, but not what gravity is, a notion that has changed over time and will probably change again.
- The conclusion is that there are no absolute final truths, only functional truths that are agreed upon by consensus. The essential difference is that scientific truths are agreed upon by factual evidence, while most other truths are based on belief.
Does science tell the truth? The answer to this question is not as simple as it seems, and my 13.8 colleague Adam Frank took a look at it in his article about the complementarity of knowledge. There are many levels of complexity to what truth is or means to a person or a community. Why?
It's complicated
First, "truth" itself is hard to define or even to identify. How do you know for sure that someone is telling you the truth? Do you always tell the truth? In groups, what may be considered true to a culture with a given set of moral values may not be true in another. Examples are easy to come by: the death penalty, abortion rights, animal rights, environmentalism, the ethics of owning weapons, etc.
At the level of human relations, truth is very convoluted. Living in an age where fake news has taken center stage only corroborates this obvious fact. However, not knowing how to differentiate between what is true and what is not leads to fear, insecurity, and ultimately, to what could be called worldview servitude — the subservient adherence to a worldview proposed by someone in power. The results, as the history of the 20th century has shown extensively, can be catastrophic.
The goal of science, at least on paper, is to arrive at the truth without recourse to any belief or moral system. Science aims to go beyond the human mess so as to be value-free. The premise here is that Nature doesn't have a moral dimension, and that the goal of science is to describe Nature the best possible way, to arrive at something we could call the "absolute truth." The approach is a typical heir to the Enlightenment notion that it is possible to take human complications out of the equation and have an absolute objective view of the world. However, this is a tall order.
It is tempting to believe that science is the best pathway to truth because, to a spectacular extent, science does triumph at many levels. You trust driving your car because the laws of mechanics and thermodynamics work. NASA scientists and engineers just managed to have the Ingenuity Mars Helicopter — the first craft to make a powered flight on another planet — hover above the Martian surface all by itself.
We can use the laws of physics to describe the results of countless experiments to amazing levels of accuracy, from the magnetic properties of materials to the position of your car in traffic using GPS locators. In this restricted sense, science does tell the truth. It may not be the absolute truth about Nature, but it's certainly a kind of pragmatic, functional truth at which the scientific community arrives by consensus based on the shared testing of hypotheses and results.
What is truth?
But at a deeper level of scrutiny, the meaning of truth becomes intangible, and we must agree with the pre-Socratic philosopher Democritus who declared, around 400 BCE, that "truth is in the depths." (Incidentally, Democritus predicted the existence of the atom, something that certainly exists in the depths.)
A look at a dictionary reinforces this view. "Truth: the quality of being true." Now, that's a very circular definition. How do we know what is true? A second definition: "Truth: a fact or belief that is accepted as true." Acceptance is key here. A belief may be accepted as true, as is the case with religious faith. There is no need for evidence to justify a belief. But note that a fact can also be accepted as true, even though belief and facts are very different things. This illustrates how the scientific community arrives at a consensus about what is true by acceptance: sufficient factual evidence supports taking a statement to be true. (Note that what counts as sufficient factual evidence is also agreed upon by consensus.) At least until we learn more.
Take the example of gravity. We know that an object in free fall will hit the ground, and we can calculate when it does using Galileo's law of free fall (in the absence of friction). This is an example of "functional truth." If you drop one million rocks from the same height, the same law will apply every time, corroborating the factual acceptance of a functional truth, that all objects fall to the ground at the same rate irrespective of their mass (in the absence of friction).
But what if we ask, "What is gravity?" That's an ontological question about what gravity is and not what it does. And here things get trickier. To Galileo, it was an acceleration downward; to Newton a force between two or more massive bodies inversely proportional to the square of the distance between them; to Einstein the curvature of spacetime due to the presence of mass and/or energy. Does Einstein have the final word? Probably not.
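To make the contrast concrete, the first two of these descriptions can each be written as a single textbook formula (standard forms, not spelled out in the original article). Galileo's law of free fall gives the distance fallen from rest after a time t as

d = (1/2) g t²

with no mention of the falling object's mass, which is why a pebble and a boulder dropped together land together (absent friction). Newton's law of universal gravitation gives the attractive force between two masses m₁ and m₂ separated by a distance r as

F = G m₁ m₂ / r²

where G is the gravitational constant. For example, with g ≈ 9.8 m/s², a rock dropped from 20 meters lands after t = √(2 × 20 / 9.8) ≈ 2 seconds, whether it weighs a gram or a kilogram. Both formulas describe what gravity does with great precision while saying nothing about what gravity is.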
Is there an ultimate scientific truth?
Final or absolute scientific truths assume that what we know of Nature can be final, that human knowledge can make absolute proclamations. But we know that this can't really work, for the very nature of scientific knowledge is that it is incomplete and contingent on the accuracy and depth with which we measure Nature with our instruments. The more accuracy and depth our measurements gain, the more they are able to expose the cracks in our current theories, as I illustrated last week with the muon magnetic moment experiments.
So, we must agree with Democritus, that truth is indeed in the depths and that proclamations of final or absolute truths, even in science, shouldn't be trusted. Fortunately, for all practical purposes — flying airplanes or spaceships, measuring the properties of a particle, the rates of chemical reactions, the efficacy of vaccines, or the blood flow in your brain — functional truths do well enough.
A canvas of nonsense: how Dada reflects a world gone mad through art
Using urinals, psychological collages, and animated furniture to shock us into reality.
A Dadaist artist is painted with the ashes of burned banknotes during the financial crisis.
- Dada is a provocative and surreal art movement born out of the madness of World War I.
- Tristan Tzara, a key Dada theorist, says Dada seeks "to confuse and upset, to shake and jolt" people from their comfort zones.
- Dada, like all avant-garde art, faces a key problem: how to stay true to its philosophy while courting success.
In a world gone mad, what can the few sane people left do? What can someone say when there are no words that seem up to the job? How can anyone hope to express ideas so terrible when doing so will only reduce those ideas?
These are some of the things that inspired the Dada movement, and in its absurd, surreal, and chaotic nonsense, we find the voice of the voiceless.
The origin of Dadaism
Dada was a response to the madness of World War I. Reasonable, intelligent, and sensitive people looked at the blood and mud graveyards of the trenches and wondered how any meaning or goodness could ever be found again. How can someone make sense of a world where millions of young, happy, hopeful men were scythed down in a spray of bullets? How could life go back to normal when returning soldiers, blinded and disfigured from gas, lay homeless in the streets? Out of this awful revulsion, there came one bitter voice, and it said: "Everything is nonsense."
And so, the Dada movement expressed itself in absurdity. Tzara, the closest you get to a Dadaist philosopher, put it like this: "Like everything in life, Dada is useless. Dada is without pretension, as life should be." Dada rejects all systems, all philosophy, all definite answers, and all truth. It is the living embrace of contradictions and nonsense. It seeks "to confuse and upset people, to shake and jolt". It aims to shout down the "shamefaced sex of comfortable compromise and good manners," when actually "everything happens in a completely idiotic way."
In short, Dada is a response to the world when all the usual methods have broken down. It's the recognition that dinner party conversations, Hollywood blockbusters, and Silicon Valley are not how life actually is; they are a false reality and order, a kind of veneer.
The Dada response to life is to embrace the personal and passionate madness of it all, where "the intensity of a personality is transposed directly, clearly into the work." It's to recognize the unique position of an artist, who can convey ideas and feelings in a way that goes beyond normal understanding. Art goes straight to the soul, but the intensity of it all can be hard to "enjoy" in the strictest sense.
Where is this Dada?
For instance, Dada is seen in the poems of Hugo Ball, who wrote in meaningless, foreign-sounding words. It's in Raoul Hausmann, who wrote works in disconnected phonemes. It's found in Marcel Duchamp's iconoclastic "Fountain," which sought to question what art or an artist really meant. It's in Hans Richter's short film "Ghost before Breakfast," with its incoherent montage of images loosely connected by the theme of inanimate objects in revolt. And it's in Kurt Schwitters' "psychological collages," which present fragments of objects juxtaposed with one another.
Kurt Schwitters, Merz-drawing 85, Zig-Zag Red, 1920, collage. Credit: Kurt Schwitters / Public Domain via Wikipedia
Dada is intended to shock. It's an artistic jolt asking, or demanding, that viewers reorient themselves in some way. It is designed to make us feel uncomfortable and does not make for easy appreciation. It's by throwing us so drastically outside our comfort zone that Dada pushes us to question how things are. It shakes us out of a conformist stupor to look afresh at the world.
The paradox of Dadaism
Of course, like all avant-garde art, Dada needs to address one major problem: how do you stay so provocative, so radical, and so anti-establishment when you also seek success? How can maverick rebels stay that way once they have a mortgage and want a good school for their kids? The problem is that young, inventive, and idealistic artists are inevitably sucked into the world of profit and commodity.
As Grayson Perry, a British modern artist, wrote: "What starts as a creative revolt soon becomes co-opted as the latest way to make money," and what was once fresh and challenging "falls away to reveal a predatory capitalist robot." With Dada, how long can someone actually live in a world of nonsense and nihilistic absurdity?
But there will always be new blood to keep movements like Dada going. As the revolutionaries of yesterday become the rich mansion-owners of today, there will be hot, young things to come and take up the mantle. There will always be something to challenge and questions to be asked. So, art movements like Dada will always be in the vanguard.
Dada is the art of the nihilist. It smashes accepted wisdom, challenges norms and values, and offends, upsets, and provokes us to re-examine everything. It's an absurd art form that reflects the reality it perceives — that life is nothing more than a dissonant patchwork of egos floating in an abyss of nothing.
Jonny Thomson teaches philosophy in Oxford. He runs a popular Instagram account called Mini Philosophy (@philosophyminis). His first book is Mini Philosophy: A Small Book of Big Ideas.
Study: Tripping might not be required for psychedelic therapy
Two different studies provide further evidence of the efficacy of psychedelics in treating depression.
