Is atheism on the rise, or is religion? At times we hear polls claiming both, but new research shows it's not that simple.
You've likely seen conflicting headlines like this: “Atheism on the rise.” “Religion experiencing an increase.” “Millennials Less Likely to Be Religious.” “Churches Finding New Ways to Reach Young Audiences.” And so on. The question remains: Are we becoming more or less religious?
In a 2017 article published in the Journal for the Scientific Study of Religion, NYU sociology professor Michael Hout discusses the phenomenon of liminalism. Limen is Latin for “threshold.” Being liminal means you're on the fence about religion: you might claim a religion one day and none the next, depending on when or how you're asked.
This can sound wishy-washy; some atheists, and some of the faithful, believe agnostics simply need to make up their minds. But as Hout points out, this phenomenon partly explains why poll results seem to swing year after year. And no small percentage of Americans are liminal:
About 20 percent of Americans were liminal in recent years, 10 percent were consistently nonreligious, and 70 percent were consistently religious.
As Hout points out, the answer often depends on how you phrase the question. The consistently religious will answer consistently, as will atheists. But when “something else” is offered, things become less clear. If you aren't affiliated with Judaism or Protestantism, yet you don't want to check off “no religion,” into the liminal category you go, which might be odd if you're pagan or Taoist.
One of the most popular responses I've come across is that someone believes in God, the afterlife, or heaven and hell, but has no faith in organized religion. Likewise, the “spiritual but not religious” category fulfills a religious yearning without fitting into the folds of any particular religion.
And, of course, humans change. I think of my mother in this circumstance, who was raised Catholic but didn't pay much attention to her religion until her own mother passed. Suddenly she began attending church again and asking, during our phone conversations, whether I believe in God (I don't). This trend lasted for a few years after my grandmother's passing but has tapered off recently. Nonetheless, mortality is a powerful driver of religiosity for people who otherwise don't think much about it at all.
Our views generally become more conservative as we age, for a number of reasons: we move into like-minded enclaves when leaving city life; our trust in institutions falters the longer we live and the more experiences we accumulate; our relationship with money changes as economic divides grow; our bodies begin to slow and break down, making us sense mortality in ways we previously did not. Aging is change in many regards, so it makes sense that (a) conservatism and religion are often linked and (b) religion is more associated with baby boomers than millennials.
Then there's the function of religious institutions. In his 2016 book, Sweat Equity, Bloomberg's New York bureau chief, Jason Kelly, writes that yoga and CrossFit studios are filling the role churches and synagogues once did: they provide room for a shared experience among individuals with similar goals. Likewise, the explosion of ayahuasca tourism in South America offers an opportunity for spiritual experiences without the dogma of American religious rituals. These spaces make room for profound moments without requiring prior religious beliefs, which could account for the uptick in the number of those leaving religion behind.
And while liminalism does cause strange curvatures in studies, it does appear that fewer people have faith in religion. Hout's article covers 2006 to 2014, and there is one trend he's confident in expressing: people are becoming less religious, or at least claiming as much. In 2006, 14 percent of Americans preferred no religion; by 2014 that number had risen to 21 percent, with each two-year interval showing an increase.
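The arithmetic behind that trend is simple. Here is a minimal sketch; the 2006 and 2014 endpoints are from the article, while the per-wave average is my own back-of-the-envelope calculation, since the article doesn't give the intermediate values:

```python
# "No religion" share of Americans in Hout's data (percent).
# Endpoints are from the article; the article says only that each
# two-year survey wave showed an increase, not the exact values.
start_year, end_year = 2006, 2014
start_pct, end_pct = 14.0, 21.0

waves = (end_year - start_year) // 2              # four two-year intervals
avg_rise_per_wave = (end_pct - start_pct) / waves

print(f"Average rise per two-year wave: {avg_rise_per_wave:.2f} points")
# → Average rise per two-year wave: 1.75 points
```

Roughly 1.75 percentage points per survey wave, sustained over eight years, is what makes the trend hard to dismiss as noise.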
Hout believes the liminal population accounts for “the rapid decline of religious identification in the United States.” Yet he does not feel that is a promise of eventual atheism. In fact, he says the data points in the opposite direction:
“As they stand on the threshold between religious and nonreligious, nothing in the logic of their position or the evidence at hand foreordains that they will eventually step in the direction of being nonreligious. Two key observations point the other direction, toward a religious identity. Liminals are more likely to name a religion than not. A minority of persons raised with no religion displayed a consistent nonreligious identity as adults; a third of them were liminal, and a quarter of them were consistently religious.”
Religion is fluid, dependent upon culture and context. A 2017 Pew survey shows that the splits in Protestantism, which have divided the church for centuries, are no longer as important as before. Muslim births are projected to outnumber Christian births by 2035, while the “nones” are not procreating nearly as much. Neuroscience and the social sciences are explaining many human behaviors once attributed to religion, though with climate change and economic inequality affecting the psyche of a planet, religious and nationalistic tribalism is also on the rise.
Hout's data is a snapshot of our current moment. A fifth of Americans appear religiously dynamic. How that changes in the coming years is anybody's guess, but we can be certain that it won't be disassociated from external conditions. And right now it's pretty clear that we're better off working together than continuing to believe apart. We'll have to see what direction the curves shift next.
Derek is the author of Whole Motion: Training Your Brain and Body For Optimal Health. Based in Los Angeles, he is working on a new book about spiritual consumerism.
Everyone encounters stereotypes. But what you do afterward says something about you
There is a lot of debate in the scientific community over what exactly intelligence is. We can talk about IQ; that's one thing that's readily measurable. But beyond that, things get hazy. According to Harvard's Howard Gardner, there are multiple intelligences. At an elemental level, one of the earliest and most comprehensive definitions of intelligence is the ability to recognize patterns.
The human brain is arguably the world's most complex pattern recognition system. Previous research has found that those who are skilled at noticing patterns tend to earn more money, perform better at their jobs, and take better care of their health. In addition, advanced pattern detection may make one savvier in spotting opportunities and less likely to identify with authoritarian ideology.
“Pattern-matching” helps us discern the feelings of others, make plans, learn a new language, and much more. The problem is that everything has a downside. Those with excellent pattern recognition tend to apply it to other humans, making this type prone to stereotyping.
In a series of studies recently performed at New York University, researchers determined that those who were better at pattern-matching were also more likely to recognize social stereotypes and apply them. There was a saving grace: these types were also more willing to change their attitude or position in light of new information.
The lead author, David Lick, is a postdoctoral researcher in NYU’s Department of Psychology. Lick, along with Assistant Professors Jonathan Freeman and Adam Alter, joined forces to find out how pattern detectors operate when they come into contact with social stereotypes. The authors wrote, “Because pattern detection is a core component of human intelligence, people with superior cognitive abilities may be equipped to efficiently learn and use stereotypes about social groups.”
Researchers recruited 1,257 participants online through Amazon's Mechanical Turk, a platform where participants agree to become subjects in social science experiments in return for some form of compensation. Participants were put through six experiments in all. In the first two, they saw pictures of either blue or yellow aliens with varying dimensional differences, such as different face shapes, eye sizes, or ear sizes.
Recruits were told that blue aliens were “unfriendly”; they would take part in rude behavior, such as spitting in another's face. Yellow aliens, meanwhile, were “friendly”; they would do things like buying a bouquet of flowers for another. In the third leg, respondents took Raven's Advanced Progressive Matrices, a pattern-recognition assessment.
In the fourth segment, they underwent a memory test in which they were told to match faces with behaviors. Among the faces the viewers encountered were some blue and yellow ones they'd never seen before. The study showed that pattern detectors were more likely to associate blue faces with unfriendly behavior and yellow faces with friendly behavior. Researchers say this constitutes learned behavior.
In the next test, respondents encountered human faces, all male, with either a wide or narrow nose. For one set of participants, the wide-nosed faces were given unfriendly traits and the narrow-nosed faces friendly ones; in the second group, the pairings were reversed. The example of unfriendly behavior was laughing at a homeless person, while the positive example was bringing a bouquet of flowers to a sick friend.
Next, participants were told they'd be taking a break from the study, which was a deception. They were asked if they'd like to play a game in which they would lend money to other participants. Players chose their avatar from a group of faces and played 12 rounds, partnering with a different-looking avatar in each.
Participants didn’t know it, but they weren’t playing with real partners. Instead, researchers were selecting avatars to pair them up with, to see if they operated under any sort of bias. Respondents who did better with pattern recognition often gave less money to those avatars whose noses they had learned to stereotype. Yet, when they encountered information that bucked the bias, pattern-detectors altered the way they played the game.
In the last simulation, researchers looked at real-world stereotypes related to traditionally male-oriented traits, such as being authoritative, and female-oriented ones, such as being submissive. Pattern detectors who were shown repeated examples of women actually being the more authoritative ones showed a significant decrease in stereotyping behavior.
Lick, Freeman, and Alter say that certain advanced cognitive abilities may come with specific shortcomings. Besides this bias toward stereotyping, pattern-matching types are also more prone to OCD-like symptoms and behavior. Fortunately, the study also shows that this type may be the most amenable to correcting their biases.
David Lick responded to some questions I had about this study via email. He told me that he and colleagues can accurately predict how likely participants are to apply stereotypes if given the chance.
In fact, social psychologists have done quite a bit of work on the topic using implicit measures similar to the ones described in our paper. There's also been some work on methods to reduce stereotyping, though the literature is considerably smaller. Irene Blair (2002) and Kerry Kawakami (2005, 2007) have done some of the best work on counter-stereotype training procedures, and have shown some success in reducing explicit/implicit stereotyping. However, a number of questions still remain about the long-term effects of such training, and I think we need to do more research before making broad claims about the efficacy of these programs.
I asked if someday, we could use these findings to develop a sort of bias screening tool. But Lick said he wasn’t comfortable with that for a couple of reasons:
(1) These findings are restricted to fictional groups, “which could differ from real-world stereotypes in a number of important ways.”
(2) It's not clear that such a tool would even be useful. “Although there is a statistically reliable association between pattern detection and stereotyping, that doesn't mean there's a 1:1 mapping or that every good pattern detector will stereotype in every situation,” he said. Such a tool could only tell you whether someone was likely to stereotype, and acting on that could cause serious problems, such as false accusations that damage interpersonal relationships or reputations. “Even if the intentions were good, we'd need a lot more research with more diverse groups of people before beginning to think about a screening tool,” Lick said.
Still, these findings are paving the way for future research, allowing us to come to understand different cognitive styles in a deeper and more comprehensive way. From there, we could develop an anti-stereotyping program complete with different tracks, each tailored to reach a particular cognitive style.
Researchers succeed in deleting key genes from ants, significantly modifying their behavior.
A staple of bad science fiction, mutant ants have been more a figment of the imagination than scientific reality. We've genetically altered mice and fruit flies, but growing mutant ants has eluded scientists due to the little critters' complex life cycle. Now two teams have announced that they managed to edit certain genes out of lab ants, altering their behavior.
The team from Rockefeller University published a paper outlining how they removed orco, a gene that plays a key part in an ant's odor receptors. Deleting the gene using the CRISPR-Cas9 technique cost the ants about 90% of their “olfaction,” leaving them unable to socialize. The ants changed in other ways as well: they laid very few eggs, wandered aimlessly, and showed poor parenting.
The other team, including scientists from NYU, Vanderbilt University, the University of Pennsylvania, and Arizona State University, also used CRISPR to delete orco in ants, disrupting their pheromone-based communication and causing “aberrant social behavior and defective neural development.”
Researchers modified the ants' ability to detect pheromones through porous hairs on their antennae. Credit: Rockefeller University.
This kind of interference with the social behavior of ants is considered a success because of the difficulty of altering the nature of insects with such a sophisticated social structure. NYU professor Claude Desplan, who was involved in one of the studies, called the modified ant they created “the first mutant in any social insect.”
“While ant behavior does not directly extend to humans, we believe that this work promises to advance our understanding of social communication, with the potential to shape the design of future research into disorders like schizophrenia, depression or autism that interfere with it,” said Desplan.
Why edit ant genes at all? Daniel Kronauer, author of the Rockefeller University study, says there are “interesting biologic questions” you can only study in ants.
“It was well known that ant language is produced through pheromones, but now we understand a lot more about how pheromones are perceived,” says Kronauer. “The way ants interact is fundamentally different from how solitary organisms interact, and with these findings we know a bit more about the genetic evolution that enabled ants to create structured societies.”
A study finds an increasing number of Americans live with serious mental issues and their access to healthcare is getting worse.
A new study raises alarms about the rising number of people suffering from mental illness in the U.S. and the many who are not getting the care they need. Researchers found that more than 8.3 million adult Americans (or 3.4% of the population) are battling serious psychological distress (SPD).
The study, carried out by researchers from the NYU Langone Medical Center, analyzed a federal health database, looking at 200,000 participants of surveys from 2006 to 2014.
The CDC, which manages the surveys, defines SPD as a combination of emotional states like sadness, worthlessness and restlessness that severely affect a person, requiring treatment.
The number of people impacted by mental illness appears to be rising (previous surveys put the figure at 3% or less), and their access to healthcare is getting worse.
“Although our analysis does not give concrete reasons why mental health services are diminishing, it could be from shortages in professional help, increased costs of care not covered by insurance, the great recession, and other reasons worthy of further investigation,” said the study’s lead investigator Dr. Judith Weissman.
The study specifically points to how the poor with mental issues are often prevented from getting treatment. People with SPD are three times more likely to be unable to afford general health care. Similarly, Weissman found that 9.5% of people needing help did not have the kind of insurance that would give them access to a psychiatrist. Around 10% had delays getting help due to mental health coverage issues or could not afford to pay for necessary medications.
“Based on our data, we estimate that millions of Americans have a level of emotional functioning that leads to lower quality of life and life expectancy,” said Dr. Weissman. “Our study may also help explain why the U.S. suicide rate is up to 43,000 people each year.”
She pointed out that access to treatment for mental illnesses deteriorated despite implementation of the Mental Health Parity Act of 2008 and Obamacare's provisions for increasing medical coverage in such cases. It also bears noting that in its current form, the American Health Care Act, aka Trumpcare, could result in loss of coverage for over a million people dealing with mental illnesses.
In Weissman’s opinion, the recession from 2007 to 2009 probably contributed to the worsening mental health picture. Many people who needed care got left behind and are still paying the price.
“There is this generation of middle aged adults that are really suffering right now and if policies change, if we increase access to mental health care and we increase coverage for mental health care, we can save the next generation,” she added.
Overall, an estimated 43.4 million U.S. adults (18 and older) live with some form of mental illness, according to 2015 statistics from the National Institute of Mental Health. That's about 17.9% of all American adults.
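As a sanity check, the two statistics in this piece imply roughly the same U.S. adult population, which suggests they are mutually consistent. A quick sketch; the counts and shares come from the article and the NIMH figures above, while the division is my own arithmetic:

```python
# Each (count, share) pair implies a U.S. adult population size:
# population ≈ count / share.
spd_count, spd_share = 8.3e6, 0.034    # serious psychological distress (NYU Langone study)
ami_count, ami_share = 43.4e6, 0.179   # any mental illness (NIMH, 2015)

implied_adults_spd = spd_count / spd_share
implied_adults_ami = ami_count / ami_share

print(f"Implied adult population: {implied_adults_spd/1e6:.0f}M vs {implied_adults_ami/1e6:.0f}M")
# → Implied adult population: 244M vs 242M
```

Both pairs point to an adult population in the low 240 millions, so the SPD figure and the broader mental-illness figure describe the same base population at different severity thresholds.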
The study was published in the journal Psychiatric Services.
The advent of portable technology has exploited our reptilian addiction switch like never before.
It's not your screen you're addicted to — it's just the conduit for your high. NYU professor Adam Alter explains that behavioral addiction is similar to substance addiction: it feels good in the short term, but over time can negatively impact your mental state, social life, financial stability, and physiological wellbeing. There's been a steep takeoff of digital addiction in recent years, with approximately half the developed world now exhibiting addictive tendencies when it comes to the internet. It comes down to portability. The more wireless our devices become, the more our addiction follows us around, and the more we turn to our phones as "adult pacifiers" — just a swipe of your screen is enough to feel relaxed again. Adam Alter is the author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked.