Are religious people more moral?
Why do people distrust atheists? And are they right to do so?
Why do people distrust atheists?
A recent study we conducted, led by psychologist Will Gervais, found widespread and extreme moral prejudice against atheists around the world. Across all continents, people assumed that those who committed immoral acts, even extreme ones such as serial murder, were more likely to be atheists.
Although this was the first demonstration of such bias at a global scale, its existence is hardly surprising.
Survey data show that Americans are less trusting of atheists than of any other social group. For most politicians, going to church is often the best way to garner votes, and coming out as an unbeliever could well be political suicide. After all, there are no open atheists in the U.S. Congress. The only known religiously unaffiliated representative describes herself as “none,” but still denies being an atheist.
So, where does such extreme prejudice come from? And what is the actual evidence on the relationship between religion and morality?
How does religion relate to morality?
It is true that the world’s major religions are concerned with moral behavior. Many, therefore, might assume that religious commitment is a sign of virtue, or even that morality cannot exist without religion.
Both of these assumptions, however, are problematic.
For one thing, the ethical ideals of one religion might seem immoral to members of another. For instance, in the 19th century, Mormons considered polygamy a moral imperative, while Catholics saw it as a mortal sin.
Moreover, religious ideals of moral behavior are often limited to group members and might even be accompanied by outright hatred against other groups. In 1543, for example, Martin Luther, one of the fathers of Protestantism, published a treatise titled “On the Jews and their Lies,” echoing anti-Semitic sentiments that have been common among various religious groups for centuries.
These examples also reveal that religious morality can and does change with the ebb and flow of the surrounding culture. In recent years, several Anglican churches have revised their moral views to allow contraception, the ordination of women and the blessing of same-sex unions.
Discrepancy between beliefs and behavior
In any case, religiosity is only loosely related to theology. That is, the beliefs and behaviors of religious people are not always in accordance with official religious doctrines. Instead, popular religiosity tends to be much more practical and intuitive. This is what religious studies scholars call “theological incorrectness.”
Buddhism, for example, may officially be a religion without gods, but most Buddhists still treat Buddha as a deity. Similarly, the Catholic Church vehemently opposes birth control, but the vast majority of Catholics practice it anyway. In fact, theological incorrectness is the norm rather than the exception among believers.
This discrepancy among beliefs, attitudes and behaviors is a much broader phenomenon. After all, communism is an egalitarian ideology, but communists do not behave any less selfishly.
So, what is the actual evidence on the relationship between religion and morality?
Do people practice what they preach?
Social scientific research on the topic offers some intriguing results.
When researchers ask people to report on their own behaviors and attitudes, religious individuals claim to be more altruistic, compassionate, honest, civic and charitable than nonreligious ones. Even among twins, more religious siblings describe themselves as being more generous.
But when we look at actual behavior, these differences are nowhere to be found.
Researchers have now looked at multiple aspects of moral conduct, from charitable giving and cheating in exams to helping strangers in need and cooperating with anonymous others.
In a classical experiment known as the “Good Samaritan Study,” researchers monitored who would stop to help an injured person lying in an alley. They found that religiosity played no role in helping behavior, even when participants were on their way to deliver a talk on the parable of the good Samaritan.
This finding has now been confirmed in numerous laboratory and field studies. Overall, the results are clear: No matter how we define morality, religious people do not behave more morally than atheists, although they often say (and likely believe) that they do.
When and where religion has an impact
On the other hand, religious reminders do have a documented effect on moral behavior.
Studies conducted among American Christians, for example, have found that participants donated more money to charity and even watched less porn on Sundays. However, they compensated on both accounts during the rest of the week. As a result, there were no differences between religious and nonreligious participants on average.
Likewise, a study conducted in Morocco found that whenever the Islamic call to prayer was publicly audible, locals contributed more money to charity. However, these effects were short-lived: Donations increased only within a few minutes of each call, and then dropped again.
Interestingly, one’s degree of religiosity does not seem to have a major effect in these experiments. In other words, the positive effects of religion depend on the situation, not the disposition.
Religion and rule of law
Not all beliefs are created equal, though. A recent cross-cultural study showed that those who see their gods as moralizing and punishing are more impartial and cheat less in economic transactions. In other words, if people believe that their gods always know what they are up to and are willing to punish transgressors, they will tend to behave better, and expect that others will too.
Such a belief in an external source of justice, however, is not unique to religion. Trust in the rule of law, in the form of an efficient state, a fair judicial system or a reliable police force, is also a predictor of moral behavior.
The co-evolution of God and society
Scientific evidence suggests that humans – and even our primate cousins – have innate moral predispositions, which are often expressed in religious philosophies. That is, religion is a reflection rather than the cause of these predispositions.
But the reason religion has been so successful in the course of human history is precisely its ability to capitalize on those moral intuitions.
The historical record shows that supernatural beings have not always been associated with morality. Ancient Greek gods were not interested in people’s ethical conduct. Much like the various local deities worshiped among many modern hunter-gatherers, they cared about receiving rites and offerings but not about whether people lied to one another or cheated on their spouses.
According to psychologist Ara Norenzayan, belief in morally invested gods developed as a solution to the problem of large-scale cooperation.
Early societies were small enough that their members could rely on people’s reputations to decide whom to associate with. But once our ancestors turned to permanent settlements and group size increased, everyday interactions were increasingly taking place between strangers. How were people to know whom to trust?
Religion provided an answer by introducing beliefs about all-knowing, all-powerful gods who punish moral transgressions. As human societies grew larger, so did the occurrence of such beliefs. And in the absence of efficient secular institutions, the fear of God was crucial for establishing and maintaining social order.
In those societies, a sincere belief in a punishing supernatural watcher was the best guarantee of moral behavior, providing a public signal of compliance with social norms.
Today we have other ways of policing morality, but this evolutionary heritage is still with us. Although statistics show that atheists commit fewer crimes than average, the widespread prejudice against them, as highlighted by our study, reflects intuitions that have been forged through centuries and might be hard to overcome.
The COVID-19 pandemic is making health disparities in the United States crystal clear. It is a clarion call for health care systems to redouble their efforts in vulnerable communities.
- The COVID-19 pandemic has exacerbated America's health disparities, widening the divide between the haves and have-nots.
- Studies show disparities in wealth, race, and online access have disproportionately harmed underserved U.S. communities during the pandemic.
- To begin curing this social ailment, health systems like Northwell Health are establishing relationships of trust in these communities so that the post-COVID world looks different from the pre-COVID one.
COVID-19 deepens U.S. health disparities<p>Communities on the pernicious side of America's health disparities have their unique histories, environments, and social structures. They are spread across the United States, but they all have one thing in common.</p><p>"There is one common divide in American communities, and that is poverty," said <a href="https://www.northwell.edu/about/leadership/debbie-salas-lopez" target="_blank">Debbie Salas-Lopez, MD, MPH</a>, senior vice president of community and population health at Northwell Health. "That is the undercurrent that manifests poor health, poor health outcomes, or poor health prognoses for future wellbeing."</p><p>Social determinants have far-reaching effects on health, and poor communities have unfavorable social determinants. To pick one of many examples, <a href="https://www.npr.org/2020/09/27/913612554/a-crisis-within-a-crisis-food-insecurity-and-covid-19" target="_blank" rel="noopener noreferrer">food insecurity</a> reduces access to quality food, leading to poor health and communal endemics of chronic medical conditions. The U.S. Centers for Disease Control and Prevention has identified some of these conditions, such as obesity and Type 2 diabetes, as increasing the risk of developing a severe case of coronavirus.</p><p>The pandemic didn't create poverty or food insecurity, but it exacerbated both, and the results have been catastrophic. A study published this summer in the <em><a href="https://link.springer.com/article/10.1007/s11606-020-05971-3" target="_blank">Journal of General Internal Medicine</a></em> suggested that "social factors such as income inequality may explain why some parts of the USA are hit harder by the COVID-19 pandemic than others."</p><p>That's not to say better-off families in the U.S. weren't harmed. 
A <a href="https://voxeu.org/article/poverty-inequality-and-covid-19-us" target="_blank" rel="noopener noreferrer">paper from the Centre for Economic Policy Research</a> noted that families in counties with a higher median income experienced adjustment costs associated with the pandemic—for example, lowering income-earning interactions to align with social distancing policies. However, the paper found that the costs of social distancing were much greater for poorer families, who cannot easily alter their living circumstances, which often include more individuals living in one home and a reliance on mass transit to reach work and grocery stores. They are also disproportionately represented in essential jobs, such as retail, transportation, and health care, where maintaining physical distance can be all but impossible.</p><p>The paper also cited a positive correlation between higher income inequality and higher rates of coronavirus infection. "Our interpretation is that poorer people are less able to protect themselves, which leads them to different choices—they face a steeper trade-off between their health and their economic welfare in the context of the threats posed by COVID-19," the authors wrote.</p><p>"There are so many pandemics that this pandemic has exacerbated," Dr. Salas-Lopez noted.</p><p>One example is the health-wealth gap. The mental stressors of maintaining a low socioeconomic status, especially in the face of extreme affluence, can have a physically degrading impact on health. 
<a href="https://www.scientificamerican.com/index.cfm/_api/render/file/?method=inline&fileID=123ECD96-EF81-46F6-983D2AE9A45FA354" target="_blank" rel="noopener noreferrer">Writing on this gap</a>, Robert Sapolsky, professor of biology and neurology at Stanford University, notes that socioeconomic stressors can increase blood pressure, reduce insulin response, increase chronic inflammation, and impair the prefrontal cortex and other brain functions through anxiety, depression, and cognitive load. </p><p>"Thus, from the macro level of entire body systems to the micro level of individual chromosomes, poverty finds a way to produce wear and tear," Sapolsky writes. "It is outrageous that if children are born into the wrong family, they will be predisposed toward poor health by the time they start to learn the alphabet."</p>Research on the economic and mental health fallout of COVID-19 is showing two things: That unemployment is hitting <a href="https://www.pewsocialtrends.org/2020/09/24/economic-fallout-from-covid-19-continues-to-hit-lower-income-americans-the-hardest/" target="_blank" rel="noopener noreferrer">low-income and young Americans</a> most during the pandemic, potentially widening the health-wealth gap further; and that the pandemic not only exacerbates mental health stressors, but is doing so at clinically relevant levels. As <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7413844/" target="_blank" rel="noopener noreferrer">the authors of one review</a> wrote, the pandemic's effects on mental health is itself an international public health priority.
Working to close the health gap
Northwell Health coronavirus testing center at Greater Springfield Community Church.
Credit: Northwell Health<p>Novel coronavirus may spread and infect indiscriminately, but pre-existing conditions, environmental stressors, and a lack of access to care and resources increase the risk of infection. These social determinants make the pandemic more dangerous, and erode communities' and families' abilities to heal from health crises that pre-date the pandemic.</p><p>How do we eliminate these divides? Dr. Salas-Lopez says the first step is recognition. "We have to open our eyes to see the suffering around us," she said. "Northwell has not shied away from that."</p><p>"We are steadfast in improving health outcomes for our vulnerable and underrepresented communities that have suffered because of the prevalence of chronic disease, a problem that led to the disproportionately higher death rate among African-Americans and Latinos during the COVID-19 pandemic," said Michael Dowling, Northwell's president and CEO. "We are committed to using every tool at our disposal—as a provider of health care, employer, purchaser and investor—to combat disparities and ensure the <a href="https://www.northwell.edu/education-and-resources/community-engagement/center-for-equity-of-care" target="_blank" rel="noopener noreferrer">equity of care</a> that everyone deserves." </p><p>With the need recognized, Dr. Salas-Lopez calls for health care systems to travel upstream and be proactive in those hard-hit communities. This requires health care systems to play a strong role, but not a unilateral one. They must build <a href="https://www.northwell.edu/news/insights/faith-based-leaders-are-the-key-to-improving-community-health" target="_blank" rel="noopener noreferrer">partnerships with leaders in those communities</a> and utilize those to ensure relationships last beyond the current crisis. </p><p>"We must meet with community leaders and talk to them to get their perspective on what they believe the community needs are and should be for the future. 
Together, we can co-create a plan to measurably improve [community] health and also to be ready for whatever comes next," she said.</p><p>Northwell has built relationships with local faith-based and community organizations in underserved communities of color. Those partnerships enabled Northwell to test more than 65,000 people across the metro New York region. The health system also offered education on coronavirus and precautions to curb its spread.</p><p>These initiatives began the process of building trust—trust that Northwell has counted on to return to these communities to administer flu vaccines to prepare for what experts fear may be a difficult flu season.</p><p>While Northwell has begun building bridges across the divides of the New York area, much will still need to be done to cure U.S. health care overall. There is hope that the COVID pandemic will awaken us to the deep disparities in the US.</p><p>"COVID has changed our world. We have to seize this opportunity, this pandemic, this crisis to do better," Dr. Salas-Lopez said. "Provide better care. Provide better health. Be better partners. Be better community citizens. And treat each other with respect and dignity.</p><p>"We need to find ways to unify this country because we're all human beings. We're all created equal, and we believe that health is one of those important rights."</p>
With just a few strategic tweaks, the Nazis could have won one of World War II's most decisive battles.
- The Battle of Britain is widely recognized as one of the most significant battles that occurred during World War II. It marked the first major victory of the Allied forces and shifted the tide of the war.
- Historians, however, have long debated the deciding factor in the British victory and German defeat.
- A new mathematical model took into account numerous alternative tactics the Germans could have employed and found that just two tweaks stood between them and victory over Britain.
Two strategic blunders<p>Now, historians and mathematicians from York St. John University have collaborated to produce <a href="http://www-users.york.ac.uk/~nm15/bootstrapBoB%20AAMS.docx" target="_blank">a statistical model (docx download)</a> capable of calculating what the likely outcomes of the Battle of Britain would have been had the circumstances been different. </p><p>Would the German war effort have fared better had they not bombed Britain at all? What if Hitler had begun his bombing campaign earlier, even by just a few weeks? What if they had focused their targets on RAF airfields for the entire course of the battle? Using a statistical technique called weighted bootstrapping, the researchers studied these and other alternatives.</p><p>"The weighted bootstrap technique allowed us to model alternative campaigns in which the Luftwaffe prolongs or contracts the different phases of the battle and varies its targets," said co-author Dr. Jaime Wood in a <a href="https://www.york.ac.uk/news-and-events/news/2020/research/mathematicians-battle-britain-what-if-scenarios/" target="_blank">statement</a>. Based on the different strategic decisions that the German forces could have made, the researchers' model enabled them to predict the likelihood that the events of a given day of fighting would or would not occur.</p><p>"The Luftwaffe would only have been able to make the necessary bases in France available to launch an air attack on Britain in June at the earliest, so our alternative campaign brings forward the air campaign by three weeks," continued Wood. "We tested the impact of this and the other counterfactuals by varying the probabilities with which we choose individual days."</p><p>Ultimately, two strategic tweaks shifted the odds significantly towards the Germans' favor. 
Had the German forces started their campaign earlier in the year and had they consistently targeted RAF airfields, an Allied victory would have been extremely unlikely.</p><p>Say the odds of a British victory in the real-world Battle of Britain stood at 50-50 (there's no real way of knowing what the actual odds are, so we'll just have to select an arbitrary figure). If this were the case, changing the start date of the campaign and focusing only on airfields would have reduced British chances at victory to just 10 percent. Even if a British victory stood at 98 percent, these changes would have cut them down to just 34 percent.</p>
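The core idea of weighted bootstrapping is simple: resample the actual days of the campaign with replacement, but draw some kinds of days more often than others to simulate a different strategic emphasis. The sketch below is only a minimal illustration of that resampling idea, not the researchers' actual model; the daily figures and weights are invented for demonstration.

```python
import random

def weighted_bootstrap(days, weights, n_resamples=10000, statistic=sum):
    """Build alternative 'campaigns' by resampling whole days with
    replacement. `weights` shifts the probability of drawing each day,
    standing in for a changed strategic emphasis; `statistic` summarizes
    each resampled campaign (here, total losses)."""
    results = []
    for _ in range(n_resamples):
        # Draw a campaign of the same length, day by day, with
        # probabilities proportional to the supplied weights.
        sample = random.choices(days, weights=weights, k=len(days))
        results.append(statistic(sample))
    return results

# Hypothetical daily RAF aircraft losses for a five-day stretch.
daily_losses = [3, 10, 5, 22, 7]
# Upweighting the heavy-attack days (2 and 4) models a counterfactual
# campaign that concentrates on that kind of operation.
weights = [1, 3, 1, 3, 1]

outcomes = weighted_bootstrap(daily_losses, weights)
```

Comparing the distribution of `outcomes` under different weightings is what lets the method attach probabilities to counterfactual campaigns, rather than producing a single what-if narrative.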
A tool for understanding history<p>This technique, said co-author Niall Mackay, "demonstrates just how finely-balanced the outcomes of some of the biggest moments of history were. Even when we use the actual days' events of the battle, make a small change of timing or emphasis to the arrangement of those days and things might have turned out very differently."</p><p>The researchers also claimed that their technique could be applied to other uncertain historical events. "Weighted bootstrapping can provide a natural and intuitive tool for historians to investigate unrealized possibilities, informing historical controversies and debates," said Mackay.</p><p>Using this technique, researchers can evaluate other what-ifs and gain insight into how differently influential events could have turned out if only the slightest things had changed. For now, at least, we can all be thankful that Hitler underestimated Britain's grit.</p>
The next era in American history can look entirely different. It's up to us to choose.
- The timeline of America post-WWII can be divided into two eras, according to author and law professor Ganesh Sitaraman: the liberal era which ran through the 1970s, and the current neoliberal era which began in the early 1980s. The latter promised a "more free society," but what we got instead was more inequality, less opportunity, and greater market consolidation.
- "We've lived through a neoliberal era for the last 40 years, and that era is coming to an end," Sitaraman says, adding that the ideas and policies that defined the period are being challenged on various levels.
- What comes next depends on if we take a proactive and democratic approach to shaping the economy, or if we simply react to and "deal with" market outcomes.
A new MIT report proposes how humans should prepare for the age of automation and artificial intelligence.
- A new report by MIT experts proposes what humans should do to prepare for the age of automation.
- The rise of intelligent machines is coming but it's important to resolve human issues first.
- Improving economic inequality, skills training, and investment in innovation are necessary steps.