Is life worse or better than non-existence? And if so, who gets to judge? Welcome to anti-natalism, a small but lively corner of philosophy.
Is being born worth it? If you weighed life’s pleasures against its suffering and sorrow, would you end up ahead? Gustave Flaubert claimed that he would have cursed himself had he become a father, as he desired to “transmit to no one the aggravations and the disgrace of existence.” Fyodor Dostoyevsky was even bleaker in The Brothers Karamazov, writing, “I'd have let them kill me in the womb, so as not to come out into the world at all.”
Arthur Schopenhauer was especially pessimistic on this topic:
If children were brought into the world by an act of pure reason alone, would the human race continue to exist? Would not a man rather have so much sympathy with the coming generation as to spare it the burden of existence, or at any rate not take it upon himself to impose that burden upon it in cold blood?
We can even find this view in Ecclesiastes, here in the New International Version of the Bible:
And I declared that the dead, who had already died, are happier than the living, who are still alive. But better than both is the one who has never been born, who has not seen the evil that is done under the sun.
Welcome to anti-natalism, a small but lively corner of philosophy that, in our time of climate change, prospects of nuclear war, and divisive populist politics, has been growing of late. Though David Benatar, one of the chief modern architects of this philosophy, may or may not have coined the term “anti-natalism”—he’s done “intellectual archaeology” to figure it out, and his jury of one is still debating—his recent appearance on Sam Harris’s Waking Up podcast further solidified his stake in this long-debated topic: Is life worth living? Benatar says no, at least for the unborn.
According to Benatar, head of the Department of Philosophy at the University of Cape Town and author of Better Never to Have Been, being born is “not always a harm, but always a very serious harm.” Summarizing his philosophy, he continues:
We ought not to bring new people into existence, but I think the view is broader, that we ought not to bring new sentient beings into existence. It’s not just the view that it’s harmful to come into existence, but a further view that it’s wrong to bring beings into existence.
Harris finds a parallel with Buddhism. According to a translation of Buddhist texts by Sir Hari Singh Gour, Buddha claimed that men are ignorant of the suffering they unleash: existence is the cause of old age and death, and if man realized this harm he would immediately stop procreating. That might offer insight into why Buddha named his own son Rāhula, which means “fetter” or “impediment.” Of course, Buddha had his son before embarking on his legendary quest, so the name suggests, somewhat selfishly, that Rāhula was getting in the way of his father’s search for enlightenment.
Morals are a critical component of Buddhism, and they are also the founding principle of anti-natalism. Benatar believes there exists “an asymmetry of values between the good and bad things in life.” When we consider the uninhabited corners of the universe (which is most of the universe), we don’t lament the absence of good that could be out there. But if we were to contemplate that suffering does not exist on, say, Mars, we would count it as positive that the beings who don’t exist have escaped suffering. Benatar devotes much energy to this perceived absence of pain.
Harris mentions that Benatar’s observation stands in direct opposition to philosophers working on existential risk, the idea that a catastrophic event could drastically curtail or end human existence. Harris cites Oxford philosopher William MacAskill, who says the greatest possible wrong would be to do something (e.g. start a nuclear war) that puts us at risk of self-annihilation, because it closes the door to all the untold goods that could follow from untold years of creative involvement with the cosmos. Harris believes those hypothetical losses matter just as much as any suffering that might be wiped out.
Harris then speculates on what it would take to create a “life worth living,” a phrase Benatar calls “an ambiguous sentiment.” Benatar distinguishes between a life worth starting and a life worth continuing. Miss that distinction and his fundamental point becomes impossible to understand, for he is not advocating suicide. When it comes to bringing others into existence, however, he holds that the bar for starting a life must be much higher than it currently is.
If you’re thinking about bringing someone into existence, you’re not just thinking about when they’re young, but also when they’re in their eighties. Parents don’t think about the cancer that might ravage their future child’s body decades after they themselves die.
Benatar draws an analogy to a play you were looking forward to seeing. You buy tickets and attend the show, which turns out to be subpar. Had you known in advance that it was not what you expected, you would not have wasted your time. Again, this aligns with Buddhism, though from that perspective it’s your perception that needs to change; you don’t necessarily need to wipe the slate clean.
Harris continues to search for benefits. There’s no telling how beautiful life could have been if you’re not willing to give it a shot in the first place. Turning the lights out in a universe with the potential for beauty is not as bad as bringing life into a world that is purely hell, but that’s not the situation we find ourselves in at this moment. We don’t know how good life can be, at least not in our current experience.
This, Harris believes, is an especially important question as we design artificial intelligence, as we might build minds that suffer to degrees we cannot even understand without being aware that we’ve done so. We have the potential to create hells inside of our computers in our ignorance.
Harris, of course, leans heavily on science, though Benatar counters that present suffering isn’t worth it: many generations would have to keep suffering for a potential benefit a thousand years from now. While Harris holds that many possible existences are better than non-existence, Benatar simply cannot imagine any existence that is better than never having existed.
The two-hour conversation is exhilarating and exhausting, as the same ground is covered through numerous analogies. But, as in Buddhist debating traditions, those details are necessary. Anti-natalism is not a philosophy that can be summarized in an elevator pitch, especially given that it goes against our most basic biological impulse. Tell almost any parent that their child should not have been born and a reasoned response is not forthcoming.
Happily, the conversation never becomes heated—a feat for a topic this emotional. Harris is always a reasoned debater, while Benatar has trodden this territory for decades. When Harris mentions those who grow from their suffering—many people come out the other side of pain with unforeseen cognitive and emotional benefits—Benatar concedes that your perception of existence shapes your understanding of reality. If you perceive your life to have grown richer from an experience, it did.
In the end, though, the suffering still isn’t worth it. Benatar invokes the example of rape victims. A survivor can take that experience and help others through counseling and therapy, but would that good ever be valuable enough to justify the suffering that produced it? It is an analogy for a bigger question about existence that the living will continue to grapple with, but if you ask Benatar, it is the unborn who benefit most.
Derek is the author of Whole Motion: Training Your Brain and Body For Optimal Health. Based in Los Angeles, he is working on a new book about spiritual consumerism. Stay in touch on Facebook and Twitter.
In 1966, just over 50 years ago, the distinguished Canadian-born anthropologist Anthony Wallace confidently predicted the global demise of religion at the hands of an advancing science: ‘belief in supernatural powers is doomed to die out, all over the world, as a result of the increasing adequacy and diffusion of scientific knowledge’. Wallace’s vision was not exceptional. On the contrary, the modern social sciences, which took shape in 19th-century western Europe, took their own recent historical experience of secularisation as a universal model. At the core of the social sciences lay an assumption, sometimes presumed and sometimes explicitly predicted: that all cultures would eventually converge on something roughly approximating secular, Western, liberal democracy. Then something closer to the opposite happened.
Not only has secularism failed to continue its steady global march but countries as varied as Iran, India, Israel, Algeria and Turkey have either had their secular governments replaced by religious ones, or have seen the rise of influential religious nationalist movements. Secularisation, as predicted by the social sciences, has failed.
To be sure, this failure is not unqualified. Many Western countries continue to witness decline in religious belief and practice. The most recent census data released in Australia, for example, shows that 30 per cent of the population identify as having ‘no religion’, and that this percentage is increasing. International surveys confirm comparatively low levels of religious commitment in western Europe and Australasia. Even the United States, a long-time source of embarrassment for the secularisation thesis, has seen a rise in unbelief. The percentage of atheists in the US now sits at an all-time high (if ‘high’ is the right word) of around 3 per cent. Yet, for all that, globally, the total number of people who consider themselves to be religious remains high, and demographic trends suggest that the overall pattern for the immediate future will be one of religious growth. But this isn’t the only failure of the secularisation thesis.
Scientists, intellectuals and social scientists expected that the spread of modern science would drive secularisation – that science would be a secularising force. But that simply hasn’t been the case. If we look at those societies where religion remains vibrant, their key common features are less to do with science, and more to do with feelings of existential security and protection from some of the basic uncertainties of life in the form of public goods. A social safety net might be correlated with scientific advances but only loosely, and again the case of the US is instructive. The US is arguably the most scientifically and technologically advanced society in the world, and yet at the same time the most religious of Western societies. As the British sociologist David Martin concluded in The Future of Christianity (2011): ‘There is no consistent relation between the degree of scientific advance and a reduced profile of religious influence, belief and practice.’
The story of science and secularisation becomes even more intriguing when we consider those societies that have witnessed significant reactions against secularist agendas. India’s first prime minister Jawaharlal Nehru championed secular and scientific ideals, and enlisted scientific education in the project of modernisation. Nehru was confident that Hindu visions of a Vedic past and Muslim dreams of an Islamic theocracy would both succumb to the inexorable historical march of secularisation. ‘There is only one-way traffic in Time,’ he declared. But as the subsequent rise of Hindu and Islamic fundamentalism adequately attests, Nehru was wrong. Moreover, the association of science with a secularising agenda has backfired, with science becoming a collateral casualty of resistance to secularism.
Turkey provides an even more revealing case. Like most pioneering nationalists, Mustafa Kemal Atatürk, the founder of the Turkish republic, was a committed secularist. Atatürk believed that science was destined to displace religion. In order to make sure that Turkey was on the right side of history, he gave science, in particular evolutionary biology, a central place in the state education system of the fledgling Turkish republic. As a result, evolution came to be associated with Atatürk’s entire political programme, including secularism. Islamist parties in Turkey, seeking to counter the secularist ideals of the nation’s founders, have also attacked the teaching of evolution. For them, evolution is associated with secular materialism. This sentiment culminated in the decision this June to remove the teaching of evolution from the high-school classroom. Again, science has become a victim of guilt by association.
The US represents a different cultural context, where it might seem that the key issue is a conflict between literal readings of Genesis and key features of evolutionary history. But in fact, much of the creationist discourse centres on moral values. In the US case too, we see anti-evolutionism motivated at least in part by the assumption that evolutionary theory is a stalking horse for secular materialism and its attendant moral commitments. As in India and Turkey, secularism is actually hurting science.
In brief, global secularisation is not inevitable and, when it does happen, it is not caused by science. Further, when the attempt is made to use science to advance secularism, the results can damage science. The thesis that ‘science causes secularisation’ simply fails the empirical test, and enlisting science as an instrument of secularisation turns out to be poor strategy. The science and secularism pairing is so awkward that it raises the question: why did anyone think otherwise?
Historically, two related sources advanced the idea that science would displace religion. First, 19th-century progressivist conceptions of history, particularly associated with the French philosopher Auguste Comte, held to a theory of history in which societies pass through three stages – religious, metaphysical and scientific (or ‘positive’). Comte coined the term ‘sociology’ and he wanted to diminish the social influence of religion and replace it with a new science of society. Comte’s influence extended to the ‘young Turks’ and Atatürk.
The 19th century also witnessed the inception of the ‘conflict model’ of science and religion. This was the view that history can be understood in terms of a ‘conflict between two epochs in the evolution of human thought – the theological and the scientific’. This description comes from Andrew Dickson White’s influential A History of the Warfare of Science with Theology in Christendom (1896), the title of which nicely encapsulates its author’s general theory. White’s work, as well as John William Draper’s earlier History of the Conflict Between Religion and Science (1874), firmly established the conflict thesis as the default way of thinking about the historical relations between science and religion. Both works were translated into multiple languages. Draper’s History went through more than 50 printings in the US alone, was translated into 20 languages and, notably, became a bestseller in the late Ottoman empire, where it informed Atatürk’s understanding that progress meant science superseding religion.
Today, people are less confident that history moves through a series of set stages toward a single destination. Nor, despite its popular persistence, do most historians of science support the idea of an enduring conflict between science and religion. Renowned collisions, such as the Galileo affair, turned on politics and personalities, not just science and religion. Darwin had significant religious supporters and scientific detractors, as well as vice versa. Many other alleged instances of science-religion conflict have now been exposed as pure inventions. In fact, contrary to conflict, the historical norm has more often been one of mutual support between science and religion. In its formative years in the 17th century, modern science relied on religious legitimation. During the 18th and 19th centuries, natural theology helped to popularise science.
The conflict model of science and religion offered a mistaken view of the past and, when combined with expectations of secularisation, led to a flawed vision of the future. Secularisation theory failed at both description and prediction. The real question is why we continue to encounter proponents of science-religion conflict. Many are prominent scientists. It would be superfluous to rehearse Richard Dawkins’s musings on this topic, but he is by no means a solitary voice. Stephen Hawking thinks that ‘science will win because it works’; Sam Harris has declared that ‘science must destroy religion’; Steven Weinberg thinks that science has weakened religious certitude; Colin Blakemore predicts that science will eventually make religion unnecessary. Historical evidence simply does not support such contentions. Indeed, it suggests that they are misguided.
So why do they persist? The answers are political. Leaving aside any lingering fondness for quaint 19th-century understandings of history, we must look to the fear of Islamic fundamentalism, exasperation with creationism, an aversion to alliances between the religious Right and climate-change denial, and worries about the erosion of scientific authority. While we might be sympathetic to these concerns, there is no disguising the fact that they arise out of an unhelpful intrusion of normative commitments into the discussion. Wishful thinking – hoping that science will vanquish religion – is no substitute for a sober assessment of present realities. Continuing with this advocacy is likely to have an effect opposite to that intended.
Religion is not going away any time soon, and science will not destroy it. If anything, it is science that is subject to increasing threats to its authority and social legitimacy. Given this, science needs all the friends it can get. Its advocates would be well advised to stop fabricating an enemy out of religion, or insisting that the only path to a secure future lies in a marriage of science and secularism.
This article was originally published at Aeon and has been republished under Creative Commons.
There are many people who preach the supposed benefits of psychedelics, but none do it as well, or as reliably, as these philosophers and scientists.
- The world is enjoying a bit of a psychedelic renaissance.
- The phenomenon of microdosing, in which a fraction of a hit of LSD is taken to gain the supposed benefits without the hassle of hallucinations, is increasingly popular in Silicon Valley.
- Medical research into psychedelics of all kinds is also expanding and finding new beneficial uses for these drugs in the treatment of psychological disorders.
Scientific evidence for the benefits of drugs
After decades of prohibition on research, the scientific evidence for the benefits of such drugs is limited. Many people preach the supposed benefits of these drugs, but few of them can be called philosophers or respected scientists. Here, we offer the experiences of a few real philosophers and scientists on the possible benefits of psychedelics.
Gerald Heard, a British author who wrote many books on science, history, and human consciousness, tried LSD earlier than most, in the mid-1950s. His use, and his private praise of the drug as a catalyst for moments of near-religious insight, prompted many other intellectuals to give it a try, including psychedelic research pioneer Timothy Leary and his friend Aldous Huxley, the final entry on this list. Heard described the drug like this: "There are the colors and the beauties, the designs, the beautiful way things appear... But that's only the beginning. Suddenly you notice that there aren't these separations. That we're not on a separate island shouting across to somebody else trying to hear what they are saying and misunderstanding. You know. You used the word yourself: empathy." The interview this quote comes from has also been sampled in the song 'Waking Bliss'.
Alan Watts, a pro-LSD philosopher
Alan Watts, the British philosopher best known for popularizing Eastern philosophy for a Western audience, also experimented with LSD and other drugs. He saw them as being of use in offering "glimpses" of a greater spirituality, and in helping individuals understand their connection to the universe. He later concluded: "If you get the message, hang up the phone. For psychedelic drugs are simply instruments, like microscopes, telescopes, and telephones. The biologist does not sit with eye permanently glued to the microscope, he goes away and works on what he has seen."
Sam Harris: Can psychedelics help you expand your mind?
Sam Harris, an American neuroscientist and so-called horseman of new atheism, experimented with MDMA for its mental effects rather than its physical ones. His MDMA trip resulted in a profound understanding that he was connected to every sentient being in existence. The trip was so powerful that it took him years to fully integrate the ideas into his intellectual life.
He also mentions that, despite being an advocate of secular meditation, he recognizes meditation might not work for everybody, whereas psychoactive drugs will cause some effect if taken in a large enough dose. He tempers this notion, however, stating that anything you can do with psychedelics can be done without them. Still, he accepts that he would never have supposed such an experience was possible had he not taken the drugs in the first place.
British philosopher Aldous Huxley, best known as the author of Brave New World, experimented with psychedelic drugs in the late 1950s. His ideas on the subject are recorded in his books The Doors of Perception and Heaven and Hell. Huxley believed that drugs such as mescaline and LSD allowed us to view the world "as is" rather than as we normally experience it—filtered in a way more fitting for survival. He called this manner of viewing the world the "mind at large," and argued that it was a wonderful perspective from which many people would benefit.
He also argued that every culture across time has sought some kind of chemical escape from daily life. In his opinion, psychedelics were a healthier alternative to tobacco and alcohol, achieving the goals of escape alongside psychological and mystical realizations.
However, Huxley also believed that LSD should not be popularly available, but used only by "the best and brightest". He mentions at the end of his book that drugs are not enlightenment, but merely helpful for the intellectual who might be attached to words and symbols. His occasional enjoyment of drugs lasted the rest of his life; his last words were a request to his wife to be injected with LSD before dying. She obliged him.
There are, of course, other philosophers and thinkers who tried the stuff and had things to say about it: George Carlin, Richard Feynman, and Steve Jobs, for example. The less philosophically inclined who still got a great deal out of their trips, and were open about it, include Jimi Hendrix, Ken Kesey, Cary Grant, and George Harrison.
While all these icons of art and science disagree about whether such drugs should be generally available to the public, or even about what their benefits are, they converge on one thing: the mind-bending effects are good for some people.
That's not to be interpreted as blind endorsement—Sam Harris is perhaps clearest on that when he says: "This is not to say that everyone should take psychedelics... these drugs pose certain dangers. Undoubtedly, some people cannot afford to give the anchor of sanity even the slightest tug." As the West continues to consider the pros and cons of differing chemical substances, the testimony of some intelligent and successful people must be included in any discussion.
Where are the four "horsewomen" of new atheism? Well, here are two of them, secular scholars Rebecca Goldstein and Susan Jacoby.
In 2006 Wired contributing editor Gary Wolf wrote a story on emerging trends in atheism. In his skeptical piece Wolf coined “new atheism,” a term later applied to the “four horsemen”: Richard Dawkins, Sam Harris, Daniel Dennett, and the late Christopher Hitchens.
These men had varying responses to the term. Harris, for one, pointed out that “atheist” never appears in the book that kicked off this movement, The End of Faith. Alas, the four horsemen are the usual go-to thinkers when considering atheism in the 21st century, which raises one important question: What about women?
In general there are more male than female atheists. A 2010 survey found that men outnumber women in confessed atheism: in the United States, 6 percent of men compared to 1.2 percent of women. (The “not religious” category is closer, as it is in most nations.) In Russia the gap was 6.1 to 3 percent, whereas in Switzerland it was 9 to 7 percent.
Numbers become confusing with examples like this 2012 poll, which reports that while women make up 52 percent of the US population, they account for only 36 percent of “atheists and agnostics.” The problem with this differentiation is that everyone is agnostic, in that no one “knows” whether a god exists. You’re either theistically or atheistically agnostic. Many choose not to think much about it. That’s qualitatively different from pronouncing your atheism.
On top of that these are self-reported polls, and there might be reasons women do not claim their atheism. In a 2015 discussion, secular scholars Susan Jacoby and Rebecca Goldstein explore the question of why more women don’t profess critical skepticism of faith. They point first to social reasons: children of women who admit their atheism are more likely to be bullied at school, for example.
Personal beliefs are one thing, but social circles tend to be tight-knit. If your circle is made up of devout followers, expressing atheism might ostracize you from the network, which could lead to larger problems for the entire household. Jacoby believes this is a driving factor in why some women stay “in the closet” regarding atheism.
Jacoby also points to an education gap. She says there is an “enormous deficit in math and science education between women and men.” The more educated one is in the sciences, she says, the more likely one is to be skeptical regarding divinity. While medical schools are seeing roughly equivalent numbers of men and women, Jacoby reminds listeners that there are very few female surgeons. Her preference appears to be for the more rigorous degrees.
There are other reasons. Humans are generally more reactive than proactive, and stringent religious dictates—President Trump announcing that transgender people will not be allowed to serve in the military appeals to specific Christian sensibilities, for example—turn people off religion and its questionable metaphysics. Sociology professor Phil Zuckerman believes this is turning many young people, specifically women, away from religion, as Kyle Fitzpatrick reports:
Zuckerman believes this has to do with traditional organized religions' male-centrism: teaching women that they're second class, must remain virginal, and must stay out of leadership positions. Pair this with the amount of women in the workplace rivaling men, and the group doesn’t need to turn to a church for social or financial support that churches typically offer.
This is an important about-face for women willing to declare their unbelief. In the Los Angeles Review of Books Zuckerman writes about Elmina Drake Slenker, the mid-19th century ex-Quaker atheist who scandalized the nation when she publicly declared her atheism in 1856. She was prosecuted shortly thereafter. Zuckerman points out her actual “crime,” which led to months in prison because she refused to swear heavenly allegiance on a bible:
Writing leaflets and personal letters to various people about human sexuality, marital relations, birth control, and bestiality. She was put on trial, and it only took the jury 10 minutes to find her guilty.
How things have changed. Instead of submitting to public pressure and governmental interference women have, thankfully, fought back, especially when they’ve been personally affected by religious mandates. Ayaan Hirsi Ali still remains a contentious figure in Islam, where she’s constantly harassed by dogmatic followers, but her secular foundation, dedicated to combating the ravages of archaic religious displays of power, such as female genital mutilation and honor violence, is flourishing.
Technology has helped such movements. Jacoby believes many female freethinkers existed in the past, but their voices were never heard, since publishing was a male game. Women who broke through often had to assume male pen names just to do so. With easy access to social media, this has changed dramatically.
Jacoby believes the next step in inviting more women into the fold is educating people that morals do not depend on religion. She expresses disdain for those who feel that moral decisions depend on religion, or on what she finds to be an innocuous-sounding term: spirituality.
The statement “I’m spiritual but not religious” makes me want to throw up. What this sentence means is I’m not religious, I don’t go to church, but I am a good person. And this word spiritual comes to stand for being a good person, just as people were talking about religion as a transcendent experience, as if it’s different from what people experience when they listen to great music.
She admits women appear to be more religious than men, thanks to biology and a penchant for spirituality. During their talk Goldstein points to social psychologist Jonathan Haidt’s work on purity as one possible motivation for religion: women tend to associate more with the concept of being “pure,” in part due to its long history in patriarchal power structures. Both women agree that a link between spirituality and sexuality also aligns more women than men with religion.
And both women agree that intellectual equality and freedom will level the gender playing field regarding atheism. Jacoby states that comforting people in the face of tragedy—she cites Newtown as an example—is possible without allegiance to a metaphysical figure or a prophet. Reason, she says, is more likely to foster relationships based on equality and sharing, as the pretensions of right and wrong promoted by religious ideology dissolve. What remains is our human nature: fallible and beautiful, imperfect though empathetic, no deity required.
New research claims religious terrorism is on the rise, and it appears that it's going to get worse before we see a decline in such horrendous acts.
In the latest episode of Common Sense, Dan Carlin considers what’s being lost in the current age of terrorism. He posits that America would look much different right now had 9/11 never occurred—terror being a main driver of the rift between Left and Right. The word “terror” need not even be spoken for us to feel it lurking behind so much of our discontent.
That’s because, he continues, terrorists take advantage of our “human reaction,” which is, namely, security at any cost. The payment for this security, real or imagined, is huge:
To be so afraid turns off your natural filters. You no longer think what you’re sacrificing for this extra security is valuable at all.
What makes terrorism “so particularly abhorrent,” he continues, “is that it strikes against innocence.” In one of the best debates I’ve heard on this topic, Fareed Zakaria recently appeared on Sam Harris’s podcast. The two have long been at odds on Islamic terrorism. Zakaria reminds Harris that the majority of victims are Muslim, many of them, as Carlin also points out, children.
A young boy views tributes left opposite the main entrance of Bataclan concert hall in Paris, France.
Carlin believes there are two ways out of this mess. The first is to normalize terrorism and adjust, which he considers the safer option. The worse road, one we’re witnessing evidence of regularly, is to allow terrorists to reshape society, forcing citizens to act in ways we never would have if terrorism had never existed. This means changing, for one, how we treat one another.
While the Zakaria-Harris conversation grows heated (though never disrespectful), the two agree that peaceful resolution is the goal: a lofty ambition, but ultimately the saving grace. Zakaria grew up Muslim in India, a country he claims is predominantly secular. While he is an atheist, he advocates against lumping all Muslims into one category or denying their religiosity. His main criticism of Harris is the practice of citing ancient scriptures as if all of Islam followed those codes today.
Yet new research claims religious terrorism is on the rise, and it appears that it is going to get worse before we see a decline in such horrendous acts. Whereas terrorism in the sixties was predominantly secular, writes University of California, Berkeley lecturer Bruce Newsome, the nineties gave birth to a new breed of religious terrorism, a wave that led to a more recent uptick which he dubs “newest terrorism.”
He makes the distinction in part between how terrorists today strike quickly instead of prolonging events with hijackings and hostages.
While the new terrorists prioritized spectacular lethality in long-planned hijackings or bombings of mass transit, offices or hotels, the “newest” terrorists encourage more frequent active violence, hostage-takings and kidnappings. They seek to kill in the most horrifying ways.
This means walking into a club and opening fire, driving a van down a crowded sidewalk, or taking a machete to passersby. More frequent small attacks are supplanting large-scale, 9/11-style tragedies, keeping us on constant alert, always on edge in fear of a suicide bomber or suspicious driver just around the corner.
And while Zakaria is certainly right that many Muslims do not promote such heinous acts, Newsome notes that religion is the catalyst. It is common to hear claims that these attacks “are not real religion,” a mistake that Harris, among others, frequently points out. Newsome writes that religious terrorists are more violent and deadly than their secular counterparts, and that we should take them at their word.
For example, Newsome crunched 46 years of hostage data and found that secular terrorists released hostages 51 percent of the time, compared to 31 percent for religious terrorists. Twice as many people die at the hands of the religious; religious terrorists kill more people while deploying fewer operatives per event. Since they are more likely to be willing to die or commit suicide in the process, maximizing death and destruction is a stated and lauded goal.
And, of course, religious terrorists are social media experts, consummate marketers, both online and on the ground:
We found that newest terrorists choose more public targets, such as theaters and shopping malls, theoretically in pursuit of higher lethality and terror. Old terrorists choose more politically useful or symbolic targets, such as government buildings or military barracks.
Newsome predicts that terrorism is going to get worse in the coming years. Religious extremism, combined with easier access to weapons and ease of communication, is the culprit. It might be some time before we witness another 9/11 (if ever), but an uptick in smaller attacks appears imminent. How we respond, whether by normalizing it, committing atrocities in the name of security, or spreading secular education broadly, will be our great challenge in the coming years.
Derek's next book, Whole Motion: Training Your Brain and Body For Optimal Health, will be published on 7/17 by Carrel/Skyhorse Publishing. He is based in Los Angeles. Stay in touch on Facebook and Twitter.