Your Storytelling Brain
The brain is hardwired for storytelling. What stories give us, in the end, is reassurance. And as childish as it may seem, that sense of security – that coherent sense of self – is essential to our survival.
Cognitive neuroscientist Michael Gazzaniga, a pioneer in the study of hemispheric (left vs. right brain) specialization, describes "the Interpreter" – a left-hemisphere function that organizes our memories into plausible stories. Less romantic, perhaps, than Gone With the Wind, the Interpreter may help to explain our species' profound relationship with storytelling.
'A Mind-Blowing Triumph!'
Mock these movie poster clichés if you will, but they speak to something we want from a story from about the age of two onward. Some of us might get a bit finicky in later years about which stories we allow to seduce us, and how many spoonfuls of critical reflection we want along with our dose of narrative intoxicant, but there's no getting around it: humans love stories. In fact, in some fundamental sense, we need them.
Cognitive science has long recognized narrative as a basic organizing principle of memory. From early childhood, we tell ourselves stories about our actions and experiences. Accuracy is not the main objective – coherence is. If necessary, our minds will invent things that never happened, people who don't exist, simply to hold the narrative together. How often have you had a fierce disagreement with a partner or sibling over who gave you that Three Tenors CD or which of you made the pathetic clay reindeer Christmas ornament? How can two eyewitnesses at a trial be absolutely convinced of two conflicting accounts of the same events?
This tendency to confabulate – to fill in the gaps of memory with plausible inventions that preserve narrative continuity – is most pronounced in patients with significant memory loss, or in laboratory tests with participants whose left and right hemispheres have been surgically disconnected (a procedure that, surprisingly enough, rarely results in death or significant impairment of function). Michael Gazzaniga, a cognitive neuroscientist and the author of Who's in Charge?, has performed countless experiments with split-brain participants. These experiments revealed a function of the left hemisphere called 'the Interpreter,' which jumps in to make sense of memories even when it has no direct access to those memories or to the context in which they were formed.
What's the Significance?
The arts and sciences have had an uneasy relationship over the past couple of centuries, as science has attempted to disentangle itself from its roots in superstition and magic and build a firm foundation on more empirical grounds. So lovers of film and literature may react with suspicion to any attempt at neurocognitive analysis of their passions. This is misguided, says Gazzaniga – understanding our hardwired need for narrative coherence doesn't diminish the aesthetic power of a great story, nor will it enable us anytime soon to program computers to write like William Blake. But it may help to explain what's going on when we are mesmerized or stunned by a novel or the latest Matt Damon flick.
Gazzaniga suspects that narrative coherence helps us to navigate the world – to know where we're coming from and where we're headed. It tells us where to place our trust and why. One reason we may love fiction, he says, is that it enables us to find our bearings in possible future realities, or to make better sense of our own past experiences. What stories give us, in the end, is reassurance. And as childish as it may seem, that sense of security – that coherent sense of self – is essential to our survival.
New anthropological research suggests our ancestors enjoyed long slumbers.
- Neanderthal bone fragments discovered in northern Spain show growth patterns resembling those of hibernating animals like cave bears.
- Thousands of bone fragments, dating back 400,000 years, were discovered in this "pit of bones" 30 years ago.
- The researchers speculate that this physiological capacity, if confirmed, could help prepare us for extended space travel.
Humans have a terrible sense of time. We think in moments, not eons, which may help explain why some people still don't accept evolutionary theory: we simply can't imagine ourselves any differently than we are today.
Thankfully, scientists and researchers have vast imaginations. Their findings often depend on creative problem-solving. Anthropologists are especially adept at this skill, as their job entails imagining a prehistoric world in which humans and our forebears were very different creatures.
A new paper, published in the journal L'Anthropologie, takes a hard look at ancient bone health and arrives at a surprising conclusion: Neanderthals (and possibly early humans) might have endured long, harsh winters by hibernating.
Adaptability is the key to survival. Certain endotherms evolved the ability to depress their metabolism for months at a time; their body temperature and metabolic rate lowered while their breathing and heart rate dropped to nearly imperceptible levels. This handy technique solved a serious resource management problem, as food supplies were notoriously scarce during the frozen months.
While today the wellness industry eschews fat, it has long had an essential evolutionary function: it keeps us alive during times of food scarcity. As autumn months pass, large mammals become hyperphagic (experiencing intense hunger followed by overeating) and store nutrients in fat deposits; smaller animals bury food nearby for when they need a snack. This strategy is critical as hibernating animals can lose over a quarter of their body weight during winter.
For this paper, Antonis Bartsiokas and Juan-Luis Arsuaga, both in the Department of History and Ethnology at Democritus University of Thrace, combed through the remains of a "pit of bones" in northern Spain. In 1976, archaeologists found a 50-foot shaft leading down into a cave in Atapuerca, where thousands of bone fragments have since been discovered. The fragments date back 400,000 years – some may be as old as 600,000 years – and researchers believe the bodies were intentionally buried in this cave.
While the fragments have been well studied in the intervening decades, Arsuaga (who led an early excavation in Atapuerca) and Bartsiokas noticed something odd about the bones: they displayed signs of seasonal variations. These proto-humans appear to have experienced annual bone growth disruption, which is indicative of hibernating species.
In fact, the remains of cave bears were also found in this pit, increasing the likelihood that the burial site was reserved for species that shared common features – possibly a shared response to winter food scarcity. The researchers note that modern Arctic peoples don't need to sleep for months at a time because fish and reindeer remain abundant through the winter; no such abundance existed in ancient Iberia. They write,
"The aridification of Iberia then could not have provided enough fat-rich food for the people of Sima during the harsh winter—making them resort to cave hibernation."
The notion of hibernating humans is appealing, especially to those in cold climates, but some experts don't want to put the cart before the horse. Large mammals don't engage in textbook hibernation; their deep sleep is known as "torpor." Even then, the energy demands of human-sized brains may have been too great for extended periods of slumber.
Still, as we continually discover our animalistic origins to better understand how we evolved, the researchers note the potential value of this research.
"The present work provides an innovative approach to the physiological mechanisms of metabolism in early humans that could help determine the life cycle and physiology of extinct human species."
Bartsiokas speculates that this ancient mechanism could be co-opted for space travel in the future. If the notion of hibernating humans sounds far-fetched, consider that the idea has been contemplated for years: NASA began funding research on the topic in 2014. As the saying goes, everything old is new again.
A Harvard professor's study identifies the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been among the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most have never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also earn a place on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor to one of the worst periods in human history. Early that year, a volcanic eruption took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski of the Climate Change Institute at the University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and widespread hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
A newly discovered coronavirus — but not the one that causes COVID-19 — has made some dogs very sick.
- A different coronavirus outbreak in late 2019 made many dogs in the UK very ill.
- The strangeness of the disease led veterinarians to send questionnaires to their peers and pet owners.
- The findings point toward the need for better systems to identify disease outbreaks in animals.
A recent study suggests that a mysterious disease plaguing dogs in the UK is caused by a novel coronavirus. This virus, which coincidentally appeared in late 2019 and began to concern veterinarians in early 2020, is not related to the virus which causes COVID-19, but can make your four-legged friend feel quite ill.
Novel coronavirus in dogs
The term "coronavirus" doesn't refer to a single disease, but a family of viruses (more formally, Coronaviridae) that share a shape similar to a crown (hence the name, "corona"). They infect many different kinds of animals and cause various diseases from COVID-19 and SARS to the common cold.
This new coronavirus, a variant of canine enteric coronavirus, was first noticed in January 2020, when a veterinarian in the United Kingdom treated "an unusually high number" of dogs with severe vomiting and other gastrointestinal issues. Concerned about this spike in canine indigestion, the vet reached out to other veterinarians to see whether they were observing a similar outbreak.
Online questionnaires were sent out to more than a thousand vets and pet owners to map the outbreak and collect information on which animals were being affected. Analysis of this data showed that nearly all of the cases involved vomiting and a loss of appetite, and half of them also involved diarrhea. Most of the cases took place in south and northwest England, though a large outbreak also occurred in and around the Scottish city of Edinburgh.
The data also suggested that male dogs in contact with other dogs were most likely to be infected, hinting at "either transmission between dogs or a common environmental source." The dogs recovered in more than 99 percent of cases.
Hoping to move beyond the questionnaire, the authors also turned to records to piece together what happened.
While public health data for animals is less frequently gathered than it is for humans, electronic records of pet admissions to veterinarian offices and pet insurance payouts do exist. The researchers accessed these records and found that the number of dogs recorded with stomach problems rose between December 2019 and March 2020, with nearly double the number of expected cases occurring during that time. There was also a concomitant rise in prescriptions for drugs to treat those conditions.
A later comparison of samples from dogs that were sick and healthy control dogs confirmed the presence of the novel coronavirus in the ill dogs. All of this was later compiled into a study that was recently published in Emerging Infectious Diseases, which is produced by the CDC.
Should I be concerned?
This coronavirus only affects dogs, and the researchers didn't find anything to suggest that humans could become infected.
However, the scale of the outbreak and the lack of tools immediately available to determine what was happening led the researchers to suggest that better organization is needed. Many of the authors are involved in creating a disease surveillance system for dogs, known as SAVSNet-Agile.
The authors also mention that "previous CeCoV [canine enteric coronavirus] seasonality suggests further outbreaks may occur." Thanks to this study, your local vet might be a little more prepared for it next time.