Does it matter if there was a historical Jesus?
The message matters more than the man.
- Early Christianity was a synthesis of Jewish and Greek ideas and rituals, though it's often presented as brand new.
- Jesus's teachings can predominantly be traced back to earlier apocalyptic Judaism.
- An important question persists: Is it the man or the message that really matters to modern Christians?
Although Jesus Christ is inextricably linked to Christianity, historically he was a major factor but not the totality of the faith. For the bulk of the last 2,000 years, Christians were more concerned with other facets of their religion. As religion professor Stephen Prothero writes, the obsessive focus on the Christ figure is a relatively recent innovation.
Within America, Christ has many meanings for different people. Prothero notes that American Christians have portrayed him as "black and white, male and female, straight and gay, a socialist and a capitalist, a pacifist and a warrior, a Ku Klux Klansman and a civil rights agitator." Of course, each denomination claims to worship the correct Jesus, which is more assumption than grounded truth.
Christ's malleability is historically constant. We can see it in action today. For supporters of Bernie Sanders, he's the wild-eyed prophet raging against moneymen in the temple; evangelicals pleased with the choice of Amy Coney Barrett are rejoicing over the potential overturning of Roe v. Wade, a cultural wedge issue they claim Jesus demands. With this kind of imagined flexibility, does the actual historical Christ even matter?
The historical Jesus has always been elusive. The four gospels—the only widely circulated accounts of his life—were penned decades after his death, between roughly 65 and 100 CE. Gospel writers never even divulge his appearance. As religious scholar Lawrence Cunningham writes, New Testament authors "do not say what Jesus looked like." From Christianity's very origins, Christ is portrayed more as archetype than man.
Scripture and prophecy don't always match up: the Hebrew Bible claimed the coming prophet would be born in Bethlehem, a prediction "Jesus of Nazareth" doesn't obviously fulfill. Luke says his parents were visiting Bethlehem for a census, while Matthew claims they lived there until fleeing to Egypt under duress (and only then settled in Nazareth). The earliest gospel, Mark, keeps it simple: Jesus of Nazareth is from Nazareth.
Remember, we aren't even discussing Q or the Gospel of Mary.
The Jesus of History versus the Christ of Faith
These aren't the only contradictions, though what can you expect when four writers tackle one subject on hearsay over the course of decades?
There's also the question of narrative lineage. As Brian Muraresku writes in "The Immortality Key," the Church has gone to great lengths to make it appear as if Christianity emerged whole-cloth amid a world of pagan worship. In fact, the ruling Romans considered early Christians to be atheists due to their belief in only one God. Christians certainly held distinct beliefs, but they were also heavily influenced by their environment.
Most biblical stories have precedent. As Muraresku notes, you don't get to Jesus without Dionysus; Dionysus without El; El without Osiris; Osiris without a rich oral history that predates written language. In each case, the mythological archetype mattered most. With Christianity, an emphasis was placed on a man, which in some ways is also traceable to the Greeks.
As Edith Hamilton writes in her epic survey on mythology, the Greek emphasis on human deities (in sculpture, painting, and story) broke from prior traditions, which dreamed up animal totems and animal-human hybrids. Christians merely took human worship to the next level, even though the roots of this practice are distinctly Greek.
As with all religions, Christianity was a cult for quite some time. Early Christian writers fused Jewish and Greek ideas during the Patristic Era to create their doctrines. Various Christologies were introduced to suit the temperament of each faithful tribe. After the Nicene Creed (325 CE) dubbed Jesus the "only-begotten Son of God," a host of competing Christian offshoots nodded in agreement. The man finally usurped the myth, and the cult took form as a global religion.
Though Jesus is presented as revolutionary, his philosophical bent squares well with the apocalyptic prophets of Judaism, especially as presented in Second Isaiah. Jesus remixed a long-held Jewish belief in a heavenly kingdom on earth. Amos and First Isaiah feature plenty of discussion about speaking up for the poor and weak. The exploitation of the lower class had been a sin for at least seven centuries by the time Jesus took to the soapbox. If anything, Jesus was a synthesist, not a creationist, as were the writers who honored him.
This doesn't denigrate Jesus's role in any capacity. Instead, it grounds his humanity. Every religion is a synthesis of previous religions. As Muraresku shows, the Greek influence on Christian symbolism is too often overlooked. Understanding historical circumstances helps us recognize the forces such prophets were fighting and provides context for their messages. Better to evolve a tradition than pretend it emerged from a vacuum.
As religious scholar Karen Armstrong points out, Christians seem particularly interested in the origins of their religion, certainly much more so than Buddhists, making Muraresku's research even more revealing: Why wouldn't you want to know about the pharmacological connection between Dionysus and Jesus in the early Church? If we're talking about the supposed world savior, would an honest biography really dampen our enthusiasm for the Eucharist? How much do his flesh and blood matter when the goal is to live his values in our time?
Christ has long been weighed down by false assumptions. German philosopher Hermann Samuel Reimarus was the first modern thinker to question myths around the historical Jesus. For example, he writes that Jesus never claimed to atone for the sins of mankind. That feature was added by St. Paul, arguably the real founder of Christianity. Reimarus writes that Jesus isn't God, but a teacher of a "remarkable, simple, exalted and practical religion."
If we want to investigate Jesus's most pertinent messages—treat the poor and underserved with respect; question authority; refrain from hatred; love your neighbor as yourself—then the actual person is irrelevant. Many have espoused the same principles before and after Jesus. The man is second to the message, which, if you read his instructions closely, is how he'd likely want it.
If you think turning water into wine and walking on water is amazing, imagine the magic of universal basic income and healthcare for all. That's a practical and living religion we can all take part in.
Stay in touch with Derek on Twitter and Facebook. His new book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."
So much for resting in peace.
- Australian scientists found that bodies kept moving for 17 months after being pronounced dead.
- Researchers used time-lapse photography, capturing images at 30-minute intervals throughout the day, to record the movement.
- This study could help investigators better estimate time of death.
We're learning new things about death every day. Much has been said and theorized about the great divide between life and the Great Beyond. While every person and every culture has their own philosophies and unique ideas on the subject, we're beginning to learn a lot of new scientific facts about the deceased corporeal form.
An Australian scientist has found that human bodies move for more than a year after being pronounced dead. These findings could have implications for fields as diverse as pathology and criminology.
Dead bodies keep moving
Researcher Alyson Wilson studied and photographed the movements of corpses over a 17-month timeframe. She recently told Agence France-Presse about the shocking details of her discovery.
Reportedly, she and her team focused a camera for 17 months at the Australian Facility for Taphonomic Experimental Research (AFTER), taking images of a corpse every 30 minutes during the day. For the entire 17-month duration, the corpse continually moved.
"What we found was that the arms were significantly moving, so that arms that started off down beside the body ended up out to the side of the body," Wilson said.
The researchers expected some kind of movement during the very early stages of decomposition, but Wilson explained that the corpse's continued movement throughout the study completely surprised the team:
"We think the movements relate to the process of decomposition, as the body mummifies and the ligaments dry out."
During one of the studies, arms that had started out beside the body eventually ended up splayed out to its side.
The team's subject was one of the bodies stored at the "body farm," which sits on the outskirts of Sydney. (Wilson took a flight every month to check in on the cadaver.) Her findings were recently published in the journal Forensic Science International: Synergy.
Implications of the study
The researchers believe that understanding these postmortem movements and the rate of decomposition could help better estimate the time of death. Police, for example, could benefit: a firmer timeframe could help link a missing person's disappearance with an unidentified corpse. According to the team:
"Understanding decomposition rates for a human donor in the Australian environment is important for police, forensic anthropologists, and pathologists for the estimation of PMI to assist with the identification of unknown victims, as well as the investigation of criminal activity."
While scientists haven't found any evidence of necromancy, the discovery remains a curious new insight into what happens to the body after we die.
Metal-like materials have been discovered in a very strange place.
- Bristle worms are odd-looking, spiky, segmented worms with super-strong jaws.
- Researchers have discovered that the jaws contain metal.
- It appears that biological processes could one day be used to manufacture metals.
Bristle worms, also known as polychaetes, have been around for an estimated 500 million years. Scientists believe these super-resilient animals have survived five mass extinctions, and there are some 10,000 species of them.
Be glad if you haven't encountered a bristle worm. Getting stung by one is an extremely itchy affair, as people who own saltwater aquariums can tell you after they've accidentally touched a bristle worm that hitchhiked into a tank aboard a live rock.
Bristle worms are typically one to six inches long when found in a tank, but capable of growing up to 24 inches long. All polychaetes have a segmented body, with each segment possessing a pair of legs, or parapodia, with tiny bristles. ("Polychaete" is Greek for "much hair.") The parapodia and their bristles can shoot outward to snag prey, which is then transferred to a bristle worm's eversible mouth.
The jaws of one bristle worm — Platynereis dumerilii — are super-tough, virtually unbreakable. It turns out, according to a new study from researchers at the Technical University of Vienna, this strength is due to metal atoms.
Metals, not minerals
Fireworm, a type of bristle worm. Credit: prilfish / Flickr
This is pretty unusual. The study's senior author Christian Hellmich explains: "The materials that vertebrates are made of are well researched. Bones, for example, are very hierarchically structured: There are organic and mineral parts, tiny structures are combined to form larger structures, which in turn form even larger structures."
The bristle worm jaw, by contrast, replaces the minerals from which other creatures' bones are built with atoms of magnesium and zinc arranged in a super-strong structure. It's this structure that is key. "On its own," he says, "the fact that there are metal atoms in the bristle worm jaw does not explain its excellent material properties."
Just deformable enough
What makes conventional metal so strong is not just its atoms but the interactions between the atoms and the ways in which they slide against each other. The sliding allows for a small amount of elastoplastic deformation when pressure is applied, endowing metals with just enough malleability not to break, crack, or shatter.
Co-author Florian Raible of Max Perutz Labs surmises, "The construction principle that has made bristle worm jaws so successful apparently originated about 500 million years ago."
Raible explains, "The metal ions are incorporated directly into the protein chains and then ensure that different protein chains are held together." This leads to the creation of three-dimensional shapes the bristle worm can pack together into a structure that's just malleable enough to withstand a significant amount of force.
"It is precisely this combination," says the study's lead author Luis Zelaya-Lainez, "of high strength and deformability that is normally characteristic of metals.
So the bristle worm jaw is both metal-like and yet not. As Zelaya-Lainez puts it, "Here we are dealing with a completely different material, but interestingly, the metal atoms still provide strength and deformability there, just like in a piece of metal."
Observing the creation of a metal-like material from biological processes is a bit of a surprise and may suggest new approaches to materials development. "Biology could serve as inspiration here," says Hellmich, "for completely new kinds of materials. Perhaps it is even possible to produce high-performance materials in a biological way — much more efficiently and environmentally friendly than we manage today."
Dealing with rudeness can nudge you toward cognitive errors.
- Anchoring is a common bias that makes people fixate on one piece of data.
- A study showed that those who experienced rudeness were more likely to anchor themselves to bad data.
- In some simulations with medical students, this effect led to higher mortality rates.
Cognitive biases are funny little things. Everyone has them, nobody likes to admit it, and they can range from minor to severe depending on the situation. Biases can be influenced by factors as subtle as our mood or various personality traits.
A new study soon to be published in the Journal of Applied Psychology suggests that experiencing rudeness can be added to the list. More disturbingly, the study's findings suggest that it is a strong enough effect to impact how medical professionals diagnose patients.
Life hack: don't be rude to your doctor
The team of researchers behind the project tested to see if participants could be influenced by the common anchoring bias, defined by the researchers as "the tendency to rely too heavily or fixate on one piece of information when making judgments and decisions." Most people have experienced it. One of its more common forms involves being given a particular value, say in negotiations on price, which then becomes the center of reasoning even when reason would suggest that number should be ignored.
It can also pop up in medicine. As co-author Dr. Trevor Foulk explains, "If you go into the doctor and say 'I think I'm having a heart attack,' that can become an anchor and the doctor may get fixated on that diagnosis, even if you're just having indigestion. If doctors don't move off anchors enough, they'll start treating the wrong thing."
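The mechanics of anchoring are often described as "insufficient adjustment": a final judgment ends up as a blend of the evidence-based value and the initial anchor. A minimal toy sketch of that model, in Python, shows how a stronger pull toward the anchor (as the study suggests rudeness induces) inflates the error. The weights and values here are hypothetical illustrations, not figures from the study.

```python
# Toy model of anchoring bias: a judgment is a weighted blend of the
# evidence-based value and an initial (incorrect) anchor.
# All numbers are hypothetical, for illustration only.

def anchored_estimate(true_value, anchor, anchor_weight):
    """Blend the evidence-based value with the anchor.

    anchor_weight in [0, 1]: 0 means the anchor is ignored,
    1 means the judge fixates on it entirely.
    """
    return (1 - anchor_weight) * true_value + anchor_weight * anchor

true_score = 10.0   # what the evidence actually supports
anchor = 40.0       # the incorrect initial suggestion

# Suppose rudeness raises the anchor's weight (per the study's narrowing effect).
calm_error = abs(anchored_estimate(true_score, anchor, 0.2) - true_score)
rude_error = abs(anchored_estimate(true_score, anchor, 0.6) - true_score)

print(f"error after calm exposure: {calm_error}")  # 6.0
print(f"error after rude exposure: {rude_error}")  # 18.0
```

The point of the sketch is only that the same evidence yields a worse judgment when the anchor's weight grows, which is the pattern the researchers report.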
Lots of things can make somebody more or less likely to anchor themselves to an idea. The authors of the study, who have several papers on the effects of rudeness, decided to see if that could also cause people to stumble into cognitive errors. Past research suggested that exposure to rudeness can limit people's perspective — perhaps anchoring them.
In the first version of the study, medical students were given a hypothetical patient to treat and access to information on their condition alongside an (incorrect) suggestion on what the condition was. This served as the anchor. In some versions of the tests, the students overheard two doctors arguing rudely before diagnosing the patient. Later variations switched the diagnosis test for business negotiations or workplace tasks while maintaining the exposure to rudeness.
Across all iterations of the test, those exposed to rudeness were more likely to anchor themselves to the initial, incorrect suggestion despite the availability of evidence against it. The effect was weaker for participants who scored higher on a measure of perspective-taking. The disposition of these participants, who answered in the affirmative to questions like, "Before criticizing somebody, I try to imagine how I would feel if I were in his/her place," effectively negated the narrowing effects of rudeness.
What this means for you and your healthcare
The effects of anchoring when a medical diagnosis is on the line can be substantial. Dr. Foulk explains that, in some simulations, exposure to rudeness can raise the mortality rate as doctors fixate on the wrong problems.
The authors of the study suggest that managers take a keener interest in ensuring civility in workplaces and giving employees the tools they need to avoid judgment errors after dealing with rudeness. These steps could help prevent anchoring.
Also, you might consider being nicer to people.