The Flaws in Defending Morality With Religion
When we think of those opposed to homosexuality – which still sounds weird to me, like opposing left-handed people* – or stem-cell research or euthanasia, we tend to conclude they’re justifying themselves because of religion. But, as with almost anything underpinned by religion, the pendulum swings both ways: religious people also support these causes. And, perhaps lacking the energy or the ability to justify their moral decisions, many fall back on their god’s say-so to provide a foundation for otherwise empty assertions that certain things are right or wrong. Of course, one tends to forget that this is true even of those who support views one endorses.
Justifying moral views because god says so is inherently flawed. I have not seen an escape from the problem posed by Plato's Euthyphro Dilemma over two millennia ago. After looking at the Dilemma, I'll highlight what I consider the fundamental problem with religion-based ethics.
As Plato first portrayed it, we have to ask, with James Rachels, a two-part question: “(1) Is conduct right because the gods command it, or (2) do the gods command it because it is right?”
“Conduct is moral because god says so”
If (1), then conduct takes on the afterglow of being moral because of the gods’ wishes, rendering morality arbitrary. It is merely their blessing which “makes” it good, not the thing itself - which is not in itself troubling, since, for example, utilitarianism operates the same way. Before something is good or bad, it is amoral: rape, torturing babies, hugging bunnies, and so on could be made good or bad.
The difference between (1) and other moral frameworks, like utilitarianism, is that what gives conduct moral currency is up to the gods. This means the whims and wishes of beings who are not us, beyond us and our scrutiny: as Yahweh did in the Bible, this could render genocide, trophy-wives and so on moral just because a god says so (or because powerful men tell us god says so). I know few people who would follow through with what they believe their god says all the time, as Adam Lee, at Daylight Atheism, pointed out with his Abraham Test. Furthermore, this makes ethics a useless subject, since we need only consult the gods. Further still, even if we believe all this to be true, religious people of the same religion cannot agree on moral matters: whether homosexuality is right or wrong, capital punishment, abortion in dire circumstances, etc. All this, too, is premised on the recognition that some kind of morally engaged deity exists.
“God commands it because it is right”
If (2), then we must ask, simply, “why is this conduct right?” Basically, we are repeating ourselves! If the gods are saying “helping others in need is good” because “helping others in need is good”, we’ve reached a tautology: “God commands a good action because it is a good action.” This doesn’t help us at all. We still want to know why it is good. And, remember, if we reply “Because god says it is good”, we’re back to the problems pointed out in the previous section.
It might also be an opportunity to say that the gods are useless, since if the action is right, why do we need the gods to recognise it? We are already using another standard if we are proclaiming “helping others in need is good”: what do we mean by good? This places us on proper ethical platforms to discuss our meanings of good.
"God Would Never Do Evil"
One popular method of trying to save face is to proclaim that one's god would never do or will anything other than good; that is, that there is in fact a third option. As popular religious ethicist Greg Koukl says: “The third option is that an objective standard exists. However, the standard is not external to God, but internal. Morality is grounded in the immutable character of God, who is perfectly good. His commands are not whims, but rooted in His holiness” (quoted from this blogpost). All that’s happened here is that god is already being defined as good: the Christian god is automatically good. But one can immediately see the problem: what is meant by “good”? By what standard are we even saying god is good? We can’t simply say “god is good” before the conversation on what constitutes good has even begun, because that would render the discussion circular. Equating God with good doesn't answer the question of what constitutes good; it just redefines God.
Again, we might merely restate the original dilemma: "Is god good because he says so, or is he good because he really is good?" If the former, then it’s arbitrary, unclear, uncertain and so on – whereas, if it’s the latter, we still haven’t answered the question of how we know what good is.
Why this Matters
The point, as Paul Cliteur highlights in The Secular Outlook, is that any religion-based ethics is therefore fundamentally flawed. By definition, a moral decision based on religion will be a command, a handed-down assertion, a view propped up by circularity rather than consistency. Whether it comes from god or the Bible, you are not making a proper moral decision if someone else is telling you what to do: it is not a decision, it is a command being obeyed. To be able to reason morally, you must be able to engage freely.
To be free, you must not be able to point to the whims of another individual as your moral justification. One may appeal to reasons made by smarter people, but then you are engaging with their reasoning, which any other free agent can assess and dispute: not so the Creator of the Universe (who I think suffers from the minor problems of inconsistency and non-existence), whom you cannot dispute because by definition he “is good” or “must be obeyed”. The circularity traps everyone, not just you, in a prison of moral myopia, where we mistake the bars for protective fences.
That is why, when people like Alise Wright argue that it’s wrong to accuse Christians like her, who support gay marriage for example, of not being “true” or proper or “really” Christians, she’s right. The problem she misses, however, and which I consider central to my criticism of people like her, is that there is a fundamental problem for everyone who bases their ethics on god, regardless of whether those conclusions square with nonbelievers’. So by “people like her”, I don’t see a Christian who supports a moral view I endorse: I see someone who is basing her ethics on the Bible. That’s my problem, and it should be a problem for everyone, including Christians, as I’ve highlighted: it fundamentally undermines ethical deliberation, which requires free-thinking beings, not those following orders. This doesn't mean Christians can't be free-thinking beings (of course they are); it means anyone who appeals to religion, specifically theism, as their basis for morality makes a flawed argument, no matter how they dress it up.
EDIT: Rephrased and fixed some sentences. Apologies.
UPDATE: Friend and member of the loyal opposition, theologian Jordan Pickering has written a reply to me.
* Thanks to reader Birnam420 for this brilliant suggestion.
Image Credit: Platón Academia de Atenas/WikiPedia (source)
New anthropological research suggests our ancestors enjoyed long slumbers.
- Neanderthal bone fragments discovered in northern Spain show growth patterns resembling those of hibernating animals like cave bears.
- Thousands of bone fragments, dating back 400,000 years, were discovered in this "pit of bones" 30 years ago.
- The researchers speculate that this physiological function, if confirmed, could prepare us for extended space travel.
Humans have a terrible sense of time. We think in moments, not eons, which may account for the number of people who still don't believe in evolutionary theory: we simply can't imagine ourselves any differently than we are today.
Thankfully, scientists and researchers have vast imaginations. Their findings often depend on creative problem-solving. Anthropologists are especially adept at this skill, as their job entails imagining a prehistoric world in which humans and our forebears were very different creatures.
A new paper, published in the journal L'Anthropologie, takes a hard look at ancient bone health and arrives at a surprising conclusion: Neanderthals (and possibly early humans) might have endured long, harsh winters by hibernating.
Adaptability is the key to survival. Certain endotherms evolved the ability to depress their metabolism for months at a time; their body temperature and metabolic rate lowered while their breathing and heart rate dropped to nearly imperceptible levels. This handy technique solved a serious resource management problem, as food supplies were notoriously scarce during the frozen months.
While today the wellness industry eschews fat, it has long had an essential evolutionary function: it keeps us alive during times of food scarcity. As autumn months pass, large mammals become hyperphagic (experiencing intense hunger followed by overeating) and store nutrients in fat deposits; smaller animals bury food nearby for when they need a snack. This strategy is critical as hibernating animals can lose over a quarter of their body weight during winter.
For this paper, Antonis Bartsiokas and Juan-Luis Arsuaga, both in the Department of History and Ethnology at Democritus University of Thrace, scoured the remains from a "pit of bones" in northern Spain. In 1976, archaeologists found a 50-foot shaft leading down into a cave in Atapuerca, where thousands of bone fragments have since been discovered. The fragments date back 400,000 years (some may be as old as 600,000 years), and researchers believe the bodies were intentionally buried in this cave.
While the fragments have been well studied in the intervening decades, Arsuaga (who led an early excavation in Atapuerca) and Bartsiokas noticed something odd about the bones: they displayed signs of seasonal variations. These proto-humans appear to have experienced annual bone growth disruption, which is indicative of hibernating species.
In fact, the remains of cave bears were also found in this pit, increasing the likelihood that the burial site was reserved for species that shared common features. This could be the result of a dearth of food for bears and Neanderthals alike. The researchers note that modern Arctic peoples don't need to sleep for months at a time because they have an abundance of fish and reindeer, resources that ancient Iberia lacked. They write,
"The aridification of Iberia then could not have provided enough fat-rich food for the people of Sima during the harsh winter—making them resort to cave hibernation."
The notion of hibernating humans is appealing, especially to those in cold climates, but some experts don't want to put the cart before the horse. Large mammals don't engage in textbook hibernation; their deep sleep is known as a "torpor." Even then, the demands of human-sized brains could have been too large for extended periods of slumber.
Still, as we continually discover our animalistic origins to better understand how we evolved, the researchers note the potential value of this research.
"The present work provides an innovative approach to the physiological mechanisms of metabolism in early humans that could help determine the life cycle and physiology of extinct human species."
Bartsiokas speculates that this ancient mechanism could be co-opted for space travel in the future. If the notion of hibernating humans sounds far-fetched, consider that the idea has been contemplated for years: NASA began funding research on the topic in 2014. As the saying goes, everything old is new again.
Stay in touch with Derek on Twitter and Facebook. His new book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."
It is impossible for science to arrive at ultimate truths, but functional truths are good enough.
- What is truth? This is a very tricky question, trickier than many would like to admit.
- Science does arrive at what we can call functional truth, that is, when it focuses on what something does as opposed to what something is. We know how gravity operates, but not what gravity is, a notion that has changed over time and will probably change again.
- The conclusion is that there are no absolute final truths, only functional truths that are agreed upon by consensus. The essential difference is that scientific truths are agreed upon by factual evidence, while most other truths are based on belief.
Does science tell the truth? The answer to this question is not as simple as it seems, and my 13.8 colleague Adam Frank took a look at it in his article about the complementarity of knowledge. There are many levels of complexity to what truth is or means to a person or a community. Why?
First, "truth" itself is hard to define or even to identify. How do you know for sure that someone is telling you the truth? Do you always tell the truth? In groups, what may be considered true to a culture with a given set of moral values may not be true in another. Examples are easy to come by: the death penalty, abortion rights, animal rights, environmentalism, the ethics of owning weapons, etc.
At the level of human relations, truth is very convoluted. Living in an age where fake news has taken center stage only corroborates this obvious fact. However, not knowing how to differentiate between what is true and what is not leads to fear, insecurity, and ultimately, to what could be called worldview servitude — the subservient adherence to a worldview proposed by someone in power. The results, as the history of the 20th century has shown extensively, can be catastrophic.
The goal of science, at least on paper, is to arrive at the truth without recourse to any belief or moral system. Science aims to go beyond the human mess so as to be value-free. The premise here is that Nature doesn't have a moral dimension, and that the goal of science is to describe Nature the best possible way, to arrive at something we could call the "absolute truth." The approach is a typical heir to the Enlightenment notion that it is possible to take human complications out of the equation and have an absolute objective view of the world. However, this is a tall order.
It is tempting to believe that science is the best pathway to truth because, to a spectacular extent, science does triumph at many levels. You trust driving your car because the laws of mechanics and thermodynamics work. NASA scientists and engineers just managed to have the Ingenuity Mars Helicopter — the first powered aircraft to fly on another planet — hover above the Martian surface all by itself.
We can use the laws of physics to describe the results of countless experiments to amazing levels of accuracy, from the magnetic properties of materials to the position of your car in traffic using GPS locators. In this restricted sense, science does tell the truth. It may not be the absolute truth about Nature, but it's certainly a kind of pragmatic, functional truth at which the scientific community arrives by consensus based on the shared testing of hypotheses and results.
But at a deeper level of scrutiny, the meaning of truth becomes intangible, and we must agree with the pre-Socratic philosopher Democritus who declared, around 400 BCE, that "truth is in the depths." (Incidentally, Democritus hypothesized the existence of the atom, something that certainly exists in the depths.)
A look at a dictionary reinforces this view. "Truth: the quality of being true." Now, that's a very circular definition. How do we know what is true? A second definition: "Truth: a fact or belief that is accepted as true." Acceptance is key here. A belief may be accepted to be true, as is the case with religious faith: there is no need for evidence to justify a belief. But note that a fact can also be accepted as true, even though belief and facts are very different things. This illustrates how the scientific community arrives at a consensus of what is true by acceptance: sufficient factual evidence supports that a statement is true. (Note that what counts as sufficient factual evidence is also accepted by consensus.) At least until we learn more.
Take the example of gravity. We know that an object in free fall will hit the ground, and we can calculate when it does using Galileo's law of free fall (in the absence of friction). This is an example of "functional truth." If you drop one million rocks from the same height, the same law will apply every time, corroborating the factual acceptance of a functional truth, that all objects fall to the ground at the same rate irrespective of their mass (in the absence of friction).
But what if we ask, "What is gravity?" That's an ontological question about what gravity is and not what it does. And here things get trickier. To Galileo, it was an acceleration downward; to Newton a force between two or more massive bodies inversely proportional to the square of the distance between them; to Einstein the curvature of spacetime due to the presence of mass and/or energy. Does Einstein have the final word? Probably not.
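As a sketch, the three descriptions of gravity just mentioned can be written compactly in their standard textbook forms (the symbols below are the conventional ones, not drawn from this article):

```latex
% Galileo: free fall as uniform downward acceleration g
% (distance fallen in time t, neglecting friction)
d = \tfrac{1}{2}\, g\, t^{2}

% Newton: attractive force between two masses,
% inversely proportional to the square of their separation
F = G\, \frac{m_{1} m_{2}}{r^{2}}

% Einstein: curvature of spacetime (left side) sourced by
% mass-energy (right side), the field equations in brief
G_{\mu\nu} = \frac{8 \pi G}{c^{4}}\, T_{\mu\nu}
```

Each formula answers "what gravity does" with increasing depth, yet none of them settles the ontological question of what gravity is.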
Is there an ultimate scientific truth?
Final or absolute scientific truths assume that what we know of Nature can be final, that human knowledge can make absolute proclamations. But we know that this can't really work, for the very nature of scientific knowledge is that it is incomplete and contingent on the accuracy and depth with which we measure Nature with our instruments. The more accuracy and depth our measurements gain, the more they are able to expose the cracks in our current theories, as I illustrated last week with the muon magnetic moment experiments.
So, we must agree with Democritus, that truth is indeed in the depths and that proclamations of final or absolute truths, even in science, shouldn't be trusted. Fortunately, for all practical purposes — flying airplanes or spaceships, measuring the properties of a particle, the rates of chemical reactions, the efficacy of vaccines, or the blood flow in your brain — functional truths do well enough.
Using urinals, psychological collages, and animated furniture to shock us into reality.
- Dada is a provocative and surreal art movement born out of the madness of World War I.
- Tzara, a key Dada theorist, says Dada seeks "to confuse and upset, to shake and jolt" people from their comfort zones.
- Dada, as all avant-garde art, faces a key problem in how to stay true to its philosophy.
In a world gone mad, what can the few sane people left do? What can someone say when there are no words that seem up to the job? How can anyone hope to express ideas so terrible when doing so will only reduce those ideas?
These are some of the things that inspired the Dada movement, and in its absurd, surreal, and chaotic nonsense, we find the voice of the voiceless.
The origin of Dadaism
Dada was a response to the madness of World War I. Reasonable, intelligent, and sensitive people looked at the blood and mud graveyards of the trenches and wondered how any meaning or goodness could ever be found again. How can someone make sense of a world where millions of young, happy, hopeful men were scythed down in a spray of bullets? How could life go back to normal when returning soldiers, blinded and disfigured from gas, lay homeless in the streets? Out of this awful revulsion, there came one bitter voice, and it said: "Everything is nonsense."
And so, the Dada movement expressed itself in absurdity. Tzara, the closest you get to a Dadaist philosopher, put it like this: "Like everything in life, Dada is useless. Dada is without pretension, as life should be." Dada rejects all systems, all philosophy, all definite answers, and all truth. It is the living embrace of contradictions and nonsense. It seeks "to confuse and upset people, to shake and jolt". It aims to shout down the "shamefaced sex of comfortable compromise and good manners," when actually "everything happens in a completely idiotic way."
In short, Dada is a response to the world when all the usual methods have broken down. It's the recognition that dinner party conversations, Hollywood blockbusters, and Silicon Valley are not how life actually is. This is a false reality and order, like some kind of veneer.
The Dada response to life is to embrace the personal and passionate madness of it all, where "the intensity of a personality is transposed directly, clearly into the work." It's to recognize the unique position of an artist, who can convey ideas and feelings in a way that goes beyond normal understanding. Art goes straight to the soul, but the intensity of it all can be hard to "enjoy" in the strictest sense.
Where is this Dada?
For instance, Dada is seen in the poems of Hugo Ball, who wrote in meaningless foreign-sounding words. It's in Hausmann, who wrote works in disconnected phonemes. It's found in Duchamp's iconoclastic "Fountain," which sought to question what art or an artist really meant. It's in Hans Richter's short film "Ghost before Breakfast," an incoherent montage of images loosely connected by the theme of inanimate objects in revolt. And it's in Kurt Schwitters' "psychological collages," which juxtapose fragments of everyday objects.
Dada is intended to shock. It's an artistic jolt asking, or demanding, that viewers reorient themselves in some way. It is designed to make us feel uncomfortable and does not make for easy appreciation. Only when we're thrown so drastically outside our comfort zone does Dada's demand that we question how things are take hold. It shakes us out of a conformist stupor to look afresh at things.
The paradox of Dadaism
Of course, like all avant-garde art, Dada must address one major problem: how do you stay provocative, radical, and anti-establishment when you also seek success? How can maverick rebels stay that way once they get a mortgage and want a good school for their kids? The problem is that young, inventive, and idealistic artists are inevitably sucked into the world of profit and commodity.
As Grayson Perry, a British modern artist, wrote: "What starts as a creative revolt soon becomes co-opted as the latest way to make money," and what was once fresh and challenging "falls away to reveal a predatory capitalist robot." With Dada, how long can someone actually live in a world of nonsense and nihilistic absurdity?
But there will always be new blood to keep movements like Dada going. As the revolutionaries of yesterday become the rich mansion-owners of today, there will be hot, young things to come and take up the mantle. There will always be something to challenge and questions to be asked. So, art movements like Dada will always be in the vanguard.
Dada is the art of the nihilist. It smashes accepted wisdom, challenges norms and values, and offends, upsets, and provokes us to re-examine everything. It's an absurd art form that reflects the reality it perceives — that life is nothing more than a dissonant patchwork of egos floating in an abyss of nothing.