Is "To Thine Own Self Be True" Actually Good Advice?
Antiquated phrasings don't make you any more profound than antiquated notions.
"To thine own self be true," says Polonius in Hamlet.
This phrase has become enormously popular, so much so that entire Tumblrs collect photographs of people bearing "to thine own self be true" tattoos and other paraphernalia. People often appeal to this injunction when they feel defensive and want to say something smart and deep in their own favor. With the added cachet of being a Shakespeare quote, this faux profundity (fauxfundity?) is often too hard to resist.
Without getting into the details of how well-received men and supposed fools are actually treated in Shakespeare, I will just note that the intent of the author was likely not to represent Polonius as profound, but rather as a blowhard. So what does it mean, and what's the problem?
It's a way of saying that nothing at all matters more to how we should act than our own esteem. It says that we should stick to our principles, not assimilate, and that we should do what we believe. It is certainly beautifully phrased, and it invokes ideas with positive connotations: truth, self-ownership, individuality. But are these virtues really hiding a fundamental vice?
They are. The phrase echoes something I have heard subscribers to a particular brand of therapy repeat as a sort of mantra: "I just really need to focus on me right now." In fact, the phrase appeals to our complacency, not to our resilience. Its function is to swell our laziness, not to stoke our resolve. Its use is to excuse our disagreements with society, not to force us to reconcile them with fact. We are all victims, suffering in vain, alone in our wisdom, against an unfair society that condemns iconoclasts.
"How do I square the circle of perceived condemnation? How do I ignore the majority opinion telling me I must do something, or be something which isn't expedient for me?" "It doesn't matter what anyone thinks, or what I know is good. This is who I am, and I'm just being true to myself."
It's a universal excuse, a get out of jail free card from the prison of having to consider and acknowledge your own failings and biases and whims. I don't have to conform to the world; it has to conform to me.
Of course, there are always some lone victims who are genuinely iconoclastic and genuinely oppressed, and it is they who move our society forward. But it is not they who cling to "to thine own self be true". They don't need an excuse to do nothing, because they are too busy finding an excuse to do something.
There is no absolute self to be virtuously true to. Besides, cognitive neuroscience has shown that we are especially bad judges of our own character and desires. Really, everything that needs to be said against this platitude was said by the great George Bernard Shaw, who remarked, "Life isn't about finding yourself; life is about creating yourself."
New anthropological research suggests our ancestors enjoyed long slumbers.
- Neanderthal bone fragments discovered in northern Spain show growth patterns resembling those of hibernating animals like cave bears.
- Thousands of bone fragments, dating back 400,000 years, were discovered in this "pit of bones" 30 years ago.
- The researchers speculate that this physiological capacity, if confirmed, could prepare us for extended space travel.
Humans have a terrible sense of time. We think in moments, not eons, which helps explain why a number of people still don't believe in evolutionary theory: we simply can't imagine ourselves any differently than we are today.
Thankfully, scientists and researchers have vast imaginations. Their findings often depend on creative problem-solving. Anthropologists are especially adept at this skill, as their job entails imagining a prehistoric world in which humans and our forebears were very different creatures.
A new paper, published in the journal L'Anthropologie, takes a hard look at ancient bone health and arrives at a surprising conclusion: Neanderthals (and possibly early humans) might have endured long, harsh winters by hibernating.
Adaptability is the key to survival. Certain endotherms evolved the ability to depress their metabolism for months at a time; their body temperature and metabolic rate lowered while their breathing and heart rate dropped to nearly imperceptible levels. This handy technique solved a serious resource management problem, as food supplies were notoriously scarce during the frozen months.
While today the wellness industry eschews fat, it has long had an essential evolutionary function: it keeps us alive during times of food scarcity. As autumn months pass, large mammals become hyperphagic (experiencing intense hunger followed by overeating) and store nutrients in fat deposits; smaller animals bury food nearby for when they need a snack. This strategy is critical as hibernating animals can lose over a quarter of their body weight during winter.
For this paper, Antonis Bartsiokas and Juan-Luis Arsuaga, both in the Department of History and Ethnology at Democritus University of Thrace, scoured the remains of a "pit of bones" in northern Spain. In 1976, archaeologists found a 50-foot shaft leading down into a cave in Atapuerca, where thousands of bone fragments have since been discovered. The fragments date back 400,000 years (some may be as old as 600,000 years), and researchers believe the bodies were intentionally buried in this cave.
While the fragments have been well studied in the intervening decades, Arsuaga (who led an early excavation in Atapuerca) and Bartsiokas noticed something odd about the bones: they displayed signs of seasonal variations. These proto-humans appear to have experienced annual bone growth disruption, which is indicative of hibernating species.
In fact, the remains of cave bears were also found in this pit, increasing the likelihood that the burial site was reserved for species that shared common features. This could be the result of a dearth of food for bears and Neanderthals alike. The researchers note that modern Arctic peoples don't need to sleep for months at a time because fish and reindeer are abundant there; no such abundance existed in Spain. They write,
"The aridification of Iberia then could not have provided enough fat-rich food for the people of Sima during the harsh winter—making them resort to cave hibernation."
The notion of hibernating humans is appealing, especially to those in cold climates, but some experts don't want to put the cart before the horse. Large mammals don't engage in textbook hibernation; their deep sleep is known as torpor. Even then, the metabolic demands of human-sized brains may have been too great for extended periods of slumber.
Still, as we continually discover our animalistic origins to better understand how we evolved, the researchers note the potential value of this research.
"The present work provides an innovative approach to the physiological mechanisms of metabolism in early humans that could help determine the life cycle and physiology of extinct human species."
Bartsiokas speculates that this ancient mechanism could be coopted for space travel in the future. If the notion of hibernating humans sounds far-fetched, the idea has been contemplated for years, as NASA began funding research on this topic in 2014. As the saying goes, everything old is new again.
Stay in touch with Derek on Twitter and Facebook. His new book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."
Two different studies provide further evidence of the efficacy of psychedelics in treating depression.
- A phase 2 clinical trial by Imperial College London found psilocybin to be as effective at treating depression as escitalopram, a commonly prescribed antidepressant.
- A different study by the University of Maryland showed that blocking the hallucinogenic effects of magic mushrooms in mice did not reduce the antidepressant effect.
- Combined, these studies could lead to new ways of applying psychedelics to patient populations that don't want to trip.
Due to stigma, illegal status, and the difficulty of finding control groups, research with psychedelics has been a challenge. But research increasingly shows that this class of drugs has legitimate medicinal uses, and they may be just as good as, or even better than, more traditional therapies.
Now, the Centre for Psychedelic Research at Imperial College London reports in the New England Journal of Medicine that when pitted against escitalopram (brand name: Lexapro), psilocybin was as effective as the popular SSRI (selective serotonin reuptake inhibitor) in treating moderate to severe depression. Perhaps most significantly, these results were obtained when comparing 6 weeks of daily doses of escitalopram to just two administrations of psilocybin.
Robin Carhart-Harris, head of the center, who has published over 100 papers on psychedelics, is confident this study represents another step forward in applying psychedelics to mental health treatment protocols while also reducing the fears many people have around these substances. In a press release, he said:
"One of the most important aspects of this work is that people can clearly see the promise of properly delivered psilocybin therapy by viewing it compared with a more familiar, established treatment in the same study. Psilocybin performed very favorably in this head-to-head."
Credit: Robin Carhart-Harris et al, NEJM, 2021.
As depicted above, the phase 2 clinical trial included 59 volunteers. The escitalopram (control) group received six weeks of daily escitalopram in addition to two tiny (1-mg) doses of psilocybin — a dose so low that it is unlikely to produce hallucinogenic effects. The psilocybin (experimental) group received two 25-mg doses of psilocybin three weeks apart with placebo given on all the other days.
At the end of the study, both groups saw a decrease in depressive symptoms, though the difference between the groups was not statistically significant. (That isn't necessarily bad: if the two drugs have similar effects, we wouldn't expect a statistically significant difference between them. Still, a larger study is needed to confirm that psilocybin is "just as good as" escitalopram.)
Additionally, several other outcomes favored psilocybin over escitalopram. For instance, 57 percent in the psilocybin group saw a remission of symptoms compared to 28 percent in the escitalopram group. This result was significant.
Psychedelics without tripping
As psychedelics become decriminalized and potentially legalized for therapeutic use, a large population of people might desire the antidepressant effects without the hallucinations. For example, the psychedelic ibogaine may be useful for treating addiction, so the company MindMed is developing an analog that works without producing the unwanted hallucinogenic side effects.
A new research article, published in the journal PNAS, investigated the antidepressant effects of psilocybin on a group of chronically stressed mice. (Under immense stress, mice develop something resembling human depression.) As with humans, depressed mice lose a sense of joy, which can be assessed by determining their preference for sugar water over tap water. Normal mice prefer sugar water, but depressed mice simply don't care.
Once the mice were no longer juicing up on the sweetened water, the team dosed them with psilocybin alongside a drug called ketanserin, a 5-HT2A serotonin receptor antagonist that eliminates psychedelic effects. Within 24 hours of receiving the dose, the mice were rushing back to the sugar water, indicating that tripping is not necessary for psilocybin to work as an antidepressant.
While the team is excited about these results, they acknowledge that the findings need to be replicated in humans.
"The possibility of combining psychedelic compounds and a 5-HT2AR antagonist offers a potential means to increase their acceptance and clinical utility and should be studied in human depression."
Photo: Cannabis_Pic / Adobe Stock
The future of psychedelic therapy
Psychedelics such as psilocybin and LSD have a long track record of efficacy in clinical trials and anecdotal experiences. Almost all volunteers of the famous Marsh Chapel experiment claimed their experience on Good Friday in 1962 was one of the most significant events of their lives — and this was a quarter-century after the fact. A more recent, controlled study found that a single dose of psilocybin showed antidepressant effects six months later.
Proponents of macrodosing and ritualistic experiences sometimes argue that the full-blown mystical trip is the therapy, though this is anecdotal, not clinical research. As the Maryland team noted, a number of people are contraindicated for psychedelics, whether through a family history of schizophrenia or current antidepressant treatments.
Senior author Scott Thompson is excited for future research on this topic. As he said of his team's findings:
"The psychedelic experience is incredibly powerful and can be life-changing, but that could be too much for some people or not appropriate… These findings show that activation of the receptor causing the psychedelic effect isn't absolutely required for the antidepressant benefits, at least in mice."
Hopefully, with more research occurring on psychedelics now than even in the 1950s (when studies relied predominantly on anecdotal evidence and received little government support), the longstanding stigmatization of psychedelics is beginning to recede. This could open up new possibilities for both clinical research and, for those curious about the ritual effects, a continuation of introspective experiences.
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
For many people around the globe, the past year has been the worst of their lives: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also make this morbid list, as the year the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely produced a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that the sun always appeared to be in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." Even creepier, he wrote, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and widespread hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640, when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.