When and why do people become atheists? New study uncovers important predictors
The less that parents "walk the walk" about religious beliefs, the more likely their children are to walk away.
In 2009, Joseph Henrich, a professor in the Psychology and Economics departments at the University of British Columbia (now at Harvard), proposed the idea of Credibility Enhancing Displays (CREDs). He was looking for a term to describe people who “convey one mental representation but actually believe something else.” At the very least, he continues, they fudge their level of commitment.
Henrich coined this term to make sense of manipulability, especially in regard to religious belief. While his focus was on cultural learning through evolutionary history, extrapolating CREDs to politics doesn’t tax our imagination. In fact, he argues that CREDs are an essential component of tribalism; they help you identify with a group and strengthen in-group bonds. Throughout history, this would have been a very useful device, yet evolutionary biology didn’t foresee the development of societies containing hundreds of millions of people. Our minds might move a million miles an hour, but deeply ingrained habits do not.
To make his case, Henrich turned to ritualized theater, such as firewalking and animal sacrifice. Such costly displays, he writes, “transmit higher levels of belief commitment and thereby promote cooperation and success in intergroup or interinstitution competition.” The more audacious a display, the more likely we are to buy into what’s being sold, even if the seller cares more about the purchase than the item itself.
Though we could spend pages applying this to the American electoral cycle, we’ll instead turn to Joseph Langston, who recently applied CREDs to the onset of atheism. As with everything involving religion and politics, there is plenty of inherent crossover.
The Age of Atheism
Langston, a Ph.D. student at New Zealand’s Victoria University and a researcher at the Atheist Research Collaborative, wanted to know when people become atheists. He realized CREDs provided a good means of measuring this, according to his new study, published in Religion, Brain & Behavior. It turns out that parents who talk about religion but fail to practice what they preach are more likely to produce little deniers.
CREDs are not limited to beliefs in the supernatural. Knowing which mushrooms to eat and which berries to avoid falls within their purview. What Henrich understood, and what Langston reiterates, is that this socially beneficial tool is malleable enough to be co-opted by hucksters. Sure, beliefs are often sincere, yet when a ritualistic display is being exploited, grand theatrics are more likely to secure a potential believer’s emotional investment.
Langston points to previous research that positions CREDs as at least partly responsible for intergenerational religious belief, which made him suspect they would offer insight into the age at which a person becomes an atheist. He gives special attention to the distance between religious choice and religious conflict: in post-industrial societies, where existential security is common, parents are less likely to rely on supernatural authority for survival—though America is unique in its fundamentalist proclivities.
Religious choice, he writes, is likely to produce greater numbers of atheists in future generations. Yet authoritarian parenting also creates atheistic tendencies through “alienation, personal disappointment, and rebellion.” Not allowing for choice, it appears, increases the likelihood of atheism.
For the study, 5,153 atheists were questioned on two sets of criteria. First, Langston wanted to know if the relationship between CREDs and atheism was influenced by religious importance, religious choice, and religious conflict. Second, he broadened the scope of questioning to include the acquisition and transmission of religious beliefs by studying other familial and social variables. These included questions such as “While you were growing up, would you say your [Mother or Father] was (1) Easy to Talk With, (2) Strict, and (3) Warm and Loving.”
Langston discovered that religious importance predicted a delay in the age at which people became atheists, while choice and conflict hastened the process. And, as he initially predicted, a lack of parental CREDs did indeed predict an earlier onset of atheism. When children hear their parents talk but don’t see them walk, it’s the children who end up walking away.
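The direction of these findings can be illustrated with a toy linear model. The sketch below uses entirely simulated data: the sample size mirrors the study’s 5,153 respondents, but every predictor, coefficient, and noise level is invented for illustration and does not come from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5153  # matches the study's sample size; everything else is invented

# Hypothetical standardized predictors (not the study's measures):
importance = rng.normal(0, 1, n)  # religious importance in the home
choice = rng.normal(0, 1, n)      # freedom to choose one's own beliefs
conflict = rng.normal(0, 1, n)    # religious conflict in the home
creds = rng.normal(0, 1, n)       # exposure to parents' credibility-enhancing displays

# Simulated age of atheism onset: importance and CREDs exposure delay it,
# while choice and conflict hasten it, matching the direction of the findings.
age = 18 + 2.0 * importance - 1.5 * choice - 1.0 * conflict + 1.8 * creds + rng.normal(0, 2, n)

# Ordinary least squares recovers the signs of the simulated effects.
X = np.column_stack([np.ones(n), importance, choice, conflict, creds])
coef, *_ = np.linalg.lstsq(X, age, rcond=None)
for name, b in zip(["intercept", "importance", "choice", "conflict", "CREDs"], coef):
    print(f"{name:>10}: {b:+.2f}")
```

The recovered coefficients mirror the reported directions: positive for importance and CREDs exposure (later onset), negative for choice and conflict (earlier onset).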
Some people find it difficult to think of belief as fluid, yet humans are generally open to being manipulated. Culture is created through interconnected layers of CREDs; if that weren't the case, consensus in societies would not exist. While there is a tendency to separate religious belief from other forms of social norms, there is nothing sacred or universal about any belief. They are all constructs, open to interpretation, and plastic.
In an interview, Langston admits to several limitations, namely the fact that believers were not included in this study.
“If we were to design a study that was superior to ours, then for that study we would have collected a large sample of nonbelievers and believers. Then we would be able to do direct comparisons between those two groups.”
Overall, Langston doesn’t see this as a problem, only cause for further research (which he’s already been conducting). In the future, he wants to know if nonreligious identifications are being deliberately transmitted by secular families, and if so, what kind of CREDs are being used; if believers experience different levels of religious choice and religious conflict than nonbelievers; and whether or not authoritarian-leaning religious parents unknowingly cause their child's pivot to atheism.
Shortly after receiving my degree in Religious Studies, I was walking over the Brooklyn Bridge with my father. By that point I had already turned atheist; I studied religion because I was fascinated by why people believe, not necessarily what they believe. I asked my father why I was raised with no religion at all.
His answer was immediate: “Because I was raised with too much of it.” He resented the fact that he had to attend the local Russian Orthodox Church every Sunday while his parents stayed home. By sixth grade, when I said I no longer wanted to go to CCD—a loose Catholicism from my mother's side—my parents were fine with it. The weekly class was more social activity than required training anyway.
I’m not sure what CREDs were being transmitted during my youth, but one thing is certain from Langston’s research: hypocrites rarely produce the results they desire. The theater might at first entrance, but the drugs do eventually wear off.
Students who think the world is just cheat less, but they need to experience justice to feel that way.
- Students in German and Turkish universities who believed the world is just cheated less than their pessimistic peers.
- The tendency to think the world is just is related to the occurrence of experiences of justice.
- The findings may prove useful in helping students adjust to college life.
The world is just? That’s news to a lot of people.

The study is the most recent addition to a long line of work focusing on the belief in justice, our behavior, and our reactions to evidence that injustice occasionally occurs. It focuses on a personal belief in a just world (PBJW) rather than a general belief in a just world (GBJW), and the difference between them must be highlighted.

GBJW is the stance that justice prevails all over the world and that people tend to get what they deserve. PBJW is focused on an individual’s own social environment and their belief that they, personally, tend to be treated justly. While several studies show PBJW correlates with a higher sense of well-being and a variety of other positive effects, a high GBJW is associated with less life satisfaction, negative behavior, and callousness toward the suffering of others. This study controlled for GBJW and focused on PBJW as much as possible.

To check that the findings were not specific to one culture, the study included students at universities in both Germany and Turkey.

The researchers gave students at the four participating universities a series of questionnaires asking whether they had ever cheated in class, whether they perceived the world to be just, whether they thought justice always prevailed everywhere, their tendencies toward socially appropriate behavior, their life satisfaction, and whether they felt they were treated justly by their teachers and fellow students.

The answers were statistically analyzed for relationships. While some of the connections seem trivially true, others were surprising.

PBJW turned out to be only an indirect predictor of whether a student was likely to cheat. Both a belief in a just world and a lower likelihood of cheating were mediated by the students’ justice experiences, with more of these positive experiences lowering the rate of cheating and strengthening their belief in justice. This was also associated with higher levels of life satisfaction.

These effects held across all demographics in both countries.

What does this mean? Is a belief in justice a self-fulfilling prophecy?

In a way, it seems to be. People who have reason to think the world is just to them tend to interpret events in a way that sustains that belief, and to behave justly themselves. In a larger sense, the takeaway from this study is that experiences of justice, from both peers and instructors, are vital to students’ well-being and to their understanding that the rules against cheating are part of a larger, legitimate system.

The researchers, citing previous studies on the perception of justice, note that "justice experiences (1) signal that university students are esteemed members of their social group, which in turn conveys feelings of belonging and social inclusion and (2) motivate them to accept and observe university rules and norms. These cognitive processes may thus strengthen their well-being and decrease the likelihood that they cheat."

The authors also suggest that if you want people (not only students) to act justly, consider treating them with "civility, respect, and dignity."

Sometimes, all it takes to help somebody act virtuously is to treat them well. Likewise, people treated harshly can rarely find reason to play by rules that don’t protect them. The findings of this study will certainly add to the literature on how we perceive justice in the world around us, but they might also help us remember that there are real consequences to our actions, which can be much larger than we imagine.
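The mediation claim, that PBJW predicts less cheating only through justice experiences, can be sketched with a toy simulation. Everything below is invented for illustration (simulated data and made-up path strengths, not the study’s dataset or model); it simply shows how an indirect effect is estimated in the classic two-regression way.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Hypothetical simulated data: PBJW leads to more justice experiences, which
# in turn reduce cheating. There is no direct PBJW -> cheating path here.
pbjw = rng.normal(0, 1, n)
justice_exp = 0.5 * pbjw + rng.normal(0, 1, n)
cheating = -0.4 * justice_exp + rng.normal(0, 1, n)

def ols(predictors, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([pbjw], justice_exp)[1]                  # path a: PBJW -> experiences
b = ols([pbjw, justice_exp], cheating)[2]        # path b: experiences -> cheating, PBJW held fixed
c_prime = ols([pbjw, justice_exp], cheating)[1]  # direct PBJW -> cheating effect

print(f"indirect effect (a*b): {a * b:+.2f}")    # close to the true 0.5 * -0.4 = -0.20
print(f"direct effect (c'):    {c_prime:+.2f}")  # close to zero: PBJW matters only indirectly
```

A near-zero direct effect alongside a sizable indirect effect is the signature of full mediation: belief in a just world relates to cheating only through what students actually experience.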
This could change how researchers approach vaccine development.
- The reason children suffer less from the novel coronavirus has remained mysterious.
- Researchers identified a cytokine, IL-17A, which appears to protect children from the ravages of COVID-19.
- This cytokine response could change how researchers approach vaccine development.
A member of staff wearing personal protective equipment (PPE) takes a child's temperature at the Harris Academy's Shortland's school on June 04, 2020 in London, England.
Photo by Dan Kitwood/Getty Images

Experts don't want to place kids at the back of the line, regardless of how strong their immune systems appear. At least one company, Moderna, hopes to begin testing vaccines in pediatric volunteers by year's end.

The innate immune response is especially strong during childhood (compared to the adaptive response). This makes evolutionary sense: nature wants an animal to survive until it's ready to procreate. It turns out the children in the study possessed high levels of cytokines that boost their immune response. The biggest impact is made by IL-17A, which appears to protect the youngest cohort from the ravages of the coronavirus.

While both age groups produced antibodies to fight off the infamous spike protein, adults who produce neutralizing antibodies actually suffer a worse fate. Herold says this "over-vigorous adaptive immune response" might promote inflammation, triggering acute respiratory distress syndrome (ARDS).

This matters for vaccine development. As Herold says,

"Our adult COVID-19 patients who fared poorly had high levels of neutralizing antibodies, suggesting that convalescent plasma—which is rich in neutralizing antibodies—may not help adults who have already developed signs of ARDS. By contrast, therapies that boost innate immune responses early in the course of the disease may be especially beneficial."

Herold says current vaccine trials are focused on boosting neutralizing-antibody levels. With this new information, researchers may want to work on vaccines that boost the innate immune response instead.

With at least 55 vaccine trials underway, every piece of data matters.

--

Stay in touch with Derek on Twitter, Facebook, and Substack. His next book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."