When and why do people become atheists? New study uncovers important predictors

The less that parents "walk the walk" about religious beliefs, the more likely their children are to walk away.

In 2009, Joseph Henrich, a professor in the Psychology and Economics departments at the University of British Columbia (now at Harvard), proposed the idea of Credibility Enhancing Displays (CREDs). He was looking for a term to describe people who "convey one mental representation but actually believe something else." At the very least, he continues, they fudge their level of commitment.


Henrich coined this term to make sense of manipulability, especially with regard to religious belief. While his focus was on cultural learning through evolutionary history, extrapolating CREDs to politics doesn't tax our imagination. In fact, he argues that CREDs are an essential component of tribalism; they help you identify with a group and strengthen in-group bonds. Throughout history, this would have been a very useful device, yet evolution didn't foresee the development of societies containing hundreds of millions of people. Our minds might move a million miles an hour, but deeply ingrained habits do not.

To make his case, Henrich turned to ritualized theater, such as firewalking and animal sacrifice. Such costly displays, he writes, "transmit higher levels of belief commitment and thereby promote cooperation and success in intergroup or interinstitution competition." The more audacious a display, the more likely we'll buy into what's being sold, even if the seller is focused more on our purchase than on the item itself.

Though we could spend pages applying this to the American electoral cycle, we'll instead turn to Joseph Langston, who recently applied CREDs to the onset of atheism. As with everything involving religion and politics, there is plenty of inherent crossover.

The Age of Atheism

Langston, a Ph.D. student at New Zealand's Victoria University and a researcher at the Atheist Research Collaborative, wanted to know when people become atheists. He realized CREDs provided a good means of measuring this, according to his new study, published in Religion, Brain & Behavior. It turns out that parents who talk about religion but fail to practice what they preach are more likely to produce little deniers.

CREDs are not limited to beliefs in the supernatural. Knowing which mushrooms to eat and which berries to avoid falls within their purview. What Henrich understood, and what Langston reiterates, is that this socially beneficial tool is malleable enough to be co-opted by hucksters. Sure, beliefs are often sincere, yet when a ritualistic display is being exploited, grand theatrics are more likely to secure a potential believer's emotional investment.

Langston points to previous research that positions CREDs as at least partly responsible for the intergenerational transmission of religious belief, which made him suspect they would offer insight into the age at which a person becomes atheistic. He gives special attention to the distinction between religious choice and religious conflict: in post-industrial societies, where existential security is common, parents are less likely to rely on supernatural authority for survival—though America is unique in its fundamentalist proclivities.

Religious choice, he writes, is likely to produce greater numbers of atheists in future generations. Yet authoritarian parenting also creates atheistic tendencies, through "alienation, personal disappointment, and rebellion." Not allowing for choice, it appears, also increases the likelihood of atheism.

The study

For the study, 5,153 atheists were questioned on two sets of criteria. First, Langston wanted to know if the relationship between CREDs and atheism was influenced by religious importance, religious choice, and religious conflict. Second, he broadened the scope of questioning to include the acquisition and transmission of religious beliefs by studying other familial and social variables. These included questions such as "While you were growing up, would you say your [Mother or Father] was (1) Easy to Talk With, (2) Strict, and (3) Warm and Loving?"

Langston discovered that religious importance predicted a delay in the age at which people became atheists, while choice and conflict hastened the process. And, as he initially predicted, low exposure to CREDs did indeed lead to an earlier onset of atheism. When children hear their parents talk the talk but never see them walk it, it's the children who end up walking away.

Some people find it difficult to think of belief as fluid, yet humans are generally open to being manipulated. Culture is created through interconnected layers of CREDs; if that weren't the case, consensus in societies would not exist. While there is a tendency to separate religious belief from other social norms, there is nothing sacred or universal about any belief. Beliefs are all constructs, open to interpretation, and plastic.

Limitations

In an interview, Langston admits to several limitations, chief among them the fact that believers were not included in this study.

"If we were to design a study that was superior to ours, then for that study we would have collected a large sample of nonbelievers and believers. Then we would be able to do direct comparisons between those two groups."

Overall, Langston doesn’t see this as a problem, only cause for further research (which he’s already been conducting). In the future, he wants to know if nonreligious identifications are being deliberately transmitted by secular families, and if so, what kind of CREDs are being used; if believers experience different levels of religious choice and religious conflict than nonbelievers; and whether or not authoritarian-leaning religious parents unknowingly cause their child's pivot to atheism.

Shortly after receiving my degree in Religious Studies, I was walking over the Brooklyn Bridge with my father. By that point I had already turned atheist; I studied religion because I was fascinated by why people believe, not necessarily what they believe. I asked my father why I was raised with no religion at all.

His answer was immediate: “Because I was raised with too much of it.” He resented the fact that he had to attend the local Russian Orthodox Church every Sunday while his parents stayed home. By sixth grade, when I said I no longer wanted to go to CCD—a loose Catholicism from my mother's side—my parents were fine with it. The weekly class was more social activity than required training anyway.

I’m not sure what CREDs were being transmitted during my youth, but one thing is certain from Langston’s research: hypocrites rarely produce the results they desire. The theater might at first entrance, but the drugs do eventually wear off.

--

Stay in touch with Derek on Facebook and Twitter.


Yale scientists restore brain function to 32 clinically dead pigs

Researchers hope the technology will further our understanding of the brain, but lawmakers may not be ready for the ethical challenges.

Still from John Stephenson's 1999 rendition of Animal Farm.
  • Researchers at the Yale School of Medicine successfully restored some functions to pig brains that had been dead for hours.
  • They hope the technology will advance our understanding of the brain, potentially developing new treatments for debilitating diseases and disorders.
  • The research raises many ethical questions and puts to the test our current understanding of death.

The image of an undead brain coming back to life is the stuff of science fiction. Not just any science fiction, either, but B-grade sci-fi specifically. What instantly springs to mind are the black-and-white horrors of films like Fiend Without a Face. Bad acting. Plastic monstrosities. Visible strings. And a spinal cord that, for some reason, is also a tentacle?

But like any good science fiction, it's only a matter of time before some manner of it seeps into our reality. This week's Nature published the findings of researchers who managed to restore function to pigs' brains that were clinically dead. At least, what we once thought of as dead.

What's dead may never die, it seems

The researchers did not hail from House Greyjoy — "What is dead may never die" — but came largely from the Yale School of Medicine. They connected 32 pig brains to a system called BrainEx. BrainEx is an artificial perfusion system — that is, a system that takes over the work of circulation, delivering to the organ what it would normally receive through blood flow. The pigs had been killed four hours earlier at a U.S. Department of Agriculture slaughterhouse, and their brains had been completely removed from their skulls.

BrainEx pumped an experimental solution into the brains that essentially mimics blood flow. It brought oxygen and nutrients to the tissues, giving brain cells the resources to resume many normal functions. The cells began consuming and metabolizing sugars. The brains' immune systems kicked in. Neuron samples could carry an electrical signal. Some brain cells even responded to drugs.

The researchers have managed to keep some brains functioning for up to 36 hours, and they do not yet know whether BrainEx could have sustained the brains longer. "It is conceivable we are just preventing the inevitable, and the brain won't be able to recover," said Nenad Sestan, Yale neuroscientist and the lead researcher.

As a control, other brains received either a fake solution or no solution at all. None revived brain activity, and all deteriorated as expected.

The researchers hope the technology can enhance our ability to study the brain and its cellular functions. One of the main avenues for such studies would be brain disorders and diseases. This could point the way to developing new treatments for the likes of brain injuries, Alzheimer's, Huntington's, and other neurodegenerative conditions.

"This is an extraordinary and very promising breakthrough for neuroscience. It immediately offers a much better model for studying the human brain, which is extraordinarily important, given the vast amount of human suffering from diseases of the mind [and] brain," Nita Farahany, the bioethicists at the Duke University School of Law who wrote the study's commentary, told National Geographic.

An ethical gray matter

Before anyone gets an Island of Dr. Moreau vibe, it's worth noting that the brains showed no neural activity anywhere near consciousness.

The BrainEx solution contained chemicals that prevented neurons from firing. To be extra cautious, the researchers also monitored the brains for any such activity and were prepared to administer an anesthetic should they see signs of consciousness.

Even so, the research signals a massive debate to come regarding medical ethics and our definition of death.

Most countries define death, clinically speaking, as the irreversible loss of brain or circulatory function. This definition was already at odds with some folk- and value-centric understandings, but where do we go if it becomes possible to reverse clinical death with artificial perfusion?

"This is wild," Jonathan Moreno, a bioethicist at the University of Pennsylvania, told the New York Times. "If ever there was an issue that merited big public deliberation on the ethics of science and medicine, this is one."

One possible consequence involves organ donations. Some European countries require emergency responders to use a process that preserves organs when they cannot resuscitate a person. They continue to pump blood throughout the body, but use a "thoracic aortic occlusion balloon" to prevent that blood from reaching the brain.

The system is already controversial because it raises concerns about what caused the patient's death. But what happens when brain death becomes readily reversible? Stuart Youngner, a bioethicist at Case Western Reserve University, told Nature that if BrainEx were to become widely available, it could shrink the pool of eligible donors.

"There's a potential conflict here between the interests of potential donors — who might not even be donors — and people who are waiting for organs," he said.

It will be a while before such experiments go anywhere near human subjects. A more immediate ethical question relates to how such experiments harm animal subjects.

Ethical review boards evaluate research protocols and can reject any that cause undue pain, suffering, or distress. Since dead animals feel no pain and suffer no trauma, they are typically approved as subjects. But how do such boards make a judgment regarding the suffering of a "cellularly active" brain? The distress of a partially alive brain?

The dilemma is unprecedented.

Setting new boundaries

Another science fiction story that comes to mind here is, of course, Frankenstein. As Farahany told National Geographic: "It is definitely has [sic] a good science-fiction element to it, and it is restoring cellular function where we previously thought impossible. But to have Frankenstein, you need some degree of consciousness, some 'there' there. [The researchers] did not recover any form of consciousness in this study, and it is still unclear if we ever could. But we are one step closer to that possibility."

She's right. The researchers undertook this work for the betterment of humanity, and we may one day reap some unimaginable medical benefits from it. The ethical questions, however, remain as unsettling as the stories they remind us of.
