The Literary Darwinists: The Evolutionary Origins of Storytelling
In question is nothing less than the nature of literature from an evolutionary perspective.
The summer of 1975 was a rough time for beachgoers at Amity Island. Citizens of the Long Island getaway were terrorized when a shark devoured a young woman and two young boys in the span of a few days. Luckily, three brave men – Brody, Quint and Hooper – traveled out to sea in a small boat, killed the sea monster, returned safely and restored peace to the town.
Fortunately, the only real harm Spielberg’s Jaws inflicted was on future American moviegoers. Jaws is the prototypical thriller that opened the floodgates for big-budget summer blockbusters such as Star Wars and Indiana Jones. It is also credited with inspiring Ridley Scott’s Alien and other man-eating animal thrillers. But once Hollywood realized the draw of über-stimulating summertime films, it was only a matter of time before Deep Impact and The Expendables 2 hit the big screen.
Jaws was a groundbreaking film in some respects, but the plot was nothing new. About 1,200 years earlier Saxons told an eerily similar story. Beowulf is set in a peaceful village that’s suddenly terrorized by the monster Grendel, who resides in a nearby lake. Luckily, the hero of the story – Beowulf – slays the monster (along with his evil mother) and restores peace to his village.
The British journalist Christopher Booker parallels these two tales in a recent book to argue that all stories are built off of seven basic plot templates. For Booker, Jaws and Beowulf fall into the “Overcoming the Monster” category along with Little Red Riding Hood and Dumas’ The Three Musketeers. From Homer’s Odyssey to the James Bond series, Booker’s widespread analysis reminds us that literature is a human universal; where there are people, there are stories.
Given the ubiquity and pleasure of stories, as well as the important role they play in human societies, it’s not surprising that literary scholars and evolutionary psychologists are beginning to critically examine literature with a Darwinian lens.
What’s odd about literature is that a mind immersed in fantasy appears ripe for elimination by natural selection, since fantasy leads to erroneous beliefs and distracts from reality. Wouldn’t a daydreaming hunter-gatherer be easy prey for a predator on the African savanna? It turns out that our ability to invent and tell stories might be a key adaptation that significantly distinguished our species. There are at least three reasons for this.
First, stories are time-savers. For instance, we don’t have to jump off a cliff to know what will happen. We can just simulate it in our minds and imagine the deadly result. Steven Pinker puts it this way:
Intelligent systems often best reason by experiment, real or simulated: they set up a situation whose outcome they cannot predict beforehand, let it unfold according to fixed causal laws, observe the results, and file away a generalization about what becomes of such entities in such situations. Fiction, then, would be a kind of thought experiment, in which agents are allowed to play out plausible interactions in a more-or-less lawful virtual world.
Second, stories make communication easier by embedding important information about the world and people into coherent narratives. We have a hunger for social and factual information; stories help us organize all of it so we can efficiently exchange information and make more informed decisions. This is why stories are often reduced to maxims, e.g., the early bird gets the worm or he who hesitates is lost. The brain prefers a concrete scenario to a laundry list of information. Stories, then, serve as mnemonic devices for important tips on how to navigate the world.
Finally, stories help us understand and organize our lives. Instead of seeing life as a collection of random events, our ability to look at sequences of events and weave explanations into them serves as a key existential aid. An October 2009 StrategyOne poll found that when Americans were asked what metaphor best describes their lives, 51 percent responded with “A Journey,” 11 percent with “A Battle” and 8 percent with “A Novel.” It’s nearly impossible not to think about life as a journey in which a person is a traveler, a purpose is a destination, teachers and coaches are guides, and birth denotes a starting point, death an ending point. In English, for instance, we describe ourselves as being “lost”, “found” or “at a crossroads”; we encounter “twists and turns” and manage to “find our way.”
Another line of reasoning argues that storytelling is simply a delightful byproduct of human evolution. This helps explain why we like Jaws; it’s a vicarious experience in which we get a taste of what it’s like to save the day. As Pinker says:
Fiction may be, at least in part, a pleasure technology, a co-opting of language and imagery as a virtual reality device which allows a reader to enjoy pleasant hallucinations like exploring interesting territories, conquering enemies, hobnobbing with powerful people, and winning attractive mates. Fiction, moreover, can tickle people’s fancies without even having to project them into a thrilling vicarious experience.
Stories might also be the byproduct of our instinctual craving for gossip. Because humans are inherently social animals, information about other humans is valuable. This is why companies do background checks and why profiles on matchmaking websites are so thorough: reputations are fairly good predictors of future behavior; information is power.
Stories might feast on our social instincts by cherry picking dramatic and outlandish scenarios. As the behavior scientist Daniel Nettle argues:
Conversations are only interesting to the extent that you know about the individuals involved and your social world is bound into theirs… Given that dramatic characters are mostly strangers to us, then, the conversation will have to be unusually interesting to hold our attention. That is, the drama has to be an intensified version of the concerns of ordinary conversation.
Nettle goes on to argue that we don’t watch films about people going shopping; we watch films about people going shopping who are having an affair with an ex-lover. This helps explain why gossip tabloids surround so many checkout counters and why so many people seem to prefer watching Friends instead of being with friends: we love drama. Likewise, few would read a book about a man who goes fishing. However, Hemingway’s The Old Man and the Sea reminds us that many would enjoy a book about a man who goes fishing, ends up enduring an epic battle with a giant marlin and returns veiled in tragedy.
Returning to the question of literature as an adaptation versus a byproduct, my hunch aligns with the literary scholar Jonathan Gottschall, who says in his recent book The Storytelling Animal that
… we probably gravitate to story for a number of different evolutionary reasons. There may be elements of storytelling that bear the impression of evolutionary design… there may be other elements that are evolutionary by-product… and there may be elements of story that are highly functional now but were not sculpted by nature for that purpose, such as hands moving over the keys of a piano or a computer.
The real mystery is not just the evolutionary origins of literature, but movements and attitudes such as modernism that insist on transcending the traditional plot lines that Booker diagnoses. If Booker is right and all stories fall into seven basic templates, then writers who strive for complete originality might be out of luck. The human mind, it appears, has its limits on literature. This is supported by several cross-cultural studies clearly demonstrating that all humans gravitate towards similar literary themes. As Hume said, “the general principles of taste are uniform in human nature… the same Homer who pleased at Athens and Rome two thousand years ago, is still admired at Paris and London.”
Of course, the fact that humans share certain literary hot buttons didn’t stop Joyce from throwing out plots altogether in Finnegans Wake. Nor did Virginia Woolf hesitate when crafting the free-flowing Mrs. Dalloway. For various reasons, writers in the 20th century were motivated to create stories that don’t appeal to these shared instincts. Pinker explains that a “compelling story may simulate juicy gossip about desirable or powerful people, put us in an exciting time or place, tickle our language instincts with well-chosen words, and teach us something new about the entanglements of families, politics, or love.” Why, then, were so many authors in the 20th century obsessed with disjointed narration, bewildering characters and exhausting prose? And why did they (and do they) look down on the mainstream?
These examples test the limits of literary Darwinism. Science gives us some reasons, which I hoped to clarify above, for why we liked Jaws, why moviegoers paid to see Deep Impact and The Expendables 2, and why we have recycled, replayed and rehearsed the same plot lines for thousands of years. Why the metafiction of Vonnegut’s Slaughterhouse-Five and Stoppard’s The Real Thing? It’s unclear. But elite art is not off-limits to scientific investigation.
Why mega-eruptions like the ones that covered North America in ash are the least of your worries.
- The supervolcano under Yellowstone produced three massive eruptions over the past few million years.
- Each eruption covered much of what is now the western United States in an ash layer several feet deep.
- The last eruption was 640,000 years ago, but that doesn't mean the next eruption is overdue.
The end of the world as we know it
Panoramic view of Yellowstone National Park
Image: Heinrich Berann for the National Park Service – public domain
Of the many freak ways to shuffle off this mortal coil – lightning strikes, shark bites, falling pianos – here's one you can safely scratch off your worry list: an outbreak of the Yellowstone supervolcano.
As the map below shows, previous eruptions at Yellowstone were so massive that the ash fall covered most of what is now the western United States. A similar event today would not only claim countless lives directly, but also create enough subsidiary disruption to kill off global civilisation as we know it. A relatively recent eruption of the Toba supervolcano in Indonesia may have come close to killing off the human species (see further below).
However, just because a scenario is grim does not mean that it is likely (insert topical political joke here). In this case, the doom mongers claiming an eruption is 'overdue' are wrong. Yellowstone is not a library book or an oil change. Just because the previous mega-eruption happened long ago doesn't mean the next one is imminent.
Ash beds of North America
Ash beds deposited by major volcanic eruptions in North America.
Image: USGS – public domain
This map shows the location of the Yellowstone plateau and the ash beds deposited by its three most recent major outbreaks, plus two other eruptions – one similarly massive, the other the most recent one in North America.
The Huckleberry Ridge eruption occurred 2.1 million years ago. It ejected 2,450 km3 (588 cubic miles) of material, making it the largest known eruption in Yellowstone's history and in fact the largest eruption in North America in the past few million years.
This is the oldest of the three most recent caldera-forming eruptions of the Yellowstone hotspot. It created the Island Park Caldera, which lies partly in Yellowstone National Park in Wyoming and extends westward into Idaho. Ash from this eruption covered an area from southern California to North Dakota, and southern Idaho to northern Texas.
About 1.3 million years ago, the Mesa Falls eruption ejected 280 km3 (67 cubic miles) of material and created the Henry's Fork Caldera, located in Idaho, west of Yellowstone.
It was the smallest of the three major Yellowstone eruptions, both in terms of material ejected and area covered: 'only' most of present-day Wyoming, Colorado, Kansas and Nebraska, and about half of South Dakota.
The Lava Creek eruption was the most recent major eruption of Yellowstone: about 640,000 years ago. It was the second-largest eruption in North America in the past few million years, creating the Yellowstone Caldera.
It ejected only about 1,000 km3 (240 cubic miles) of material, i.e. less than half of the Huckleberry Ridge eruption. However, its debris is spread out over a significantly wider area: basically, Huckleberry Ridge plus larger slices of both Canada and Mexico, plus most of Texas, Louisiana, Arkansas, and Missouri.
Long Valley
This eruption occurred about 760,000 years ago. It was centered on southern California, where it created the Long Valley Caldera, and spewed out 580 km3 (139 cubic miles) of material. This makes it North America's third-largest eruption of the past few million years.
The material ejected by this eruption is known as the Bishop ash bed, and covers the central and western parts of the Lava Creek ash bed.
Mount St Helens
The eruption of Mount St Helens in 1980 was the deadliest and most destructive volcanic event in U.S. history: it created a mile-wide crater, killed 57 people and caused economic damage in the neighborhood of $1 billion.
Yet by Yellowstone standards, it was tiny: Mount St Helens only ejected 0.25 km3 (0.06 cubic miles) of material, most of the ash settling in a relatively narrow band across Washington State and Idaho. By comparison, the Lava Creek eruption left a large swathe of North America in up to two metres of debris.
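To put those figures side by side, here is a quick back-of-the-envelope comparison using only the ejecta volumes quoted above (plain Python; the numbers come from this article, not from an external dataset):

```python
# Ejecta volumes in cubic kilometres, as quoted in the article above.
eruptions = {
    "Huckleberry Ridge": 2450,
    "Lava Creek": 1000,
    "Long Valley": 580,
    "Mesa Falls": 280,
    "Mount St Helens (1980)": 0.25,
}

baseline = eruptions["Mount St Helens (1980)"]
for name, volume in eruptions.items():
    ratio = volume / baseline
    print(f"{name}: {volume} km^3 ({ratio:,.0f}x Mount St Helens)")
```

Even the smallest of the three Yellowstone super-eruptions, Mesa Falls, comes out at more than a thousand times the volume of the 1980 Mount St Helens event.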
The difference between quakes and volcanoes
The volume of dense rock equivalent (DRE) ejected by the Huckleberry Ridge event dwarfs all other North American eruptions. It is itself overshadowed by the DRE ejected at the most recent eruption at Toba (present-day Indonesia). This was one of the largest known eruptions ever and a relatively recent one: only 75,000 years ago. It is thought to have caused a global volcanic winter which lasted up to a decade and may be responsible for the bottleneck in human evolution: around that time, the total human population suddenly and drastically plummeted to between 1,000 and 10,000 breeding pairs.
Image: USGS – public domain
So, what are the chances of something that massive happening anytime soon? The aforementioned mongers of doom often claim that major eruptions occur at intervals of 600,000 years and point out that the last one was 640,000 years ago. Except that (a) the first interval was about 140,000 years longer than the second, (b) two intervals is not a lot to base a prediction on, and (c) those intervals don't really mean anything anyway. Not in the case of volcanic eruptions, at least.
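The arithmetic behind that first objection is worth spelling out. Using the eruption dates quoted earlier in this article (2.1 million, 1.3 million and 640,000 years ago):

```python
# Dates of the three caldera-forming Yellowstone eruptions, in years ago,
# as quoted in the article: Huckleberry Ridge, Mesa Falls, Lava Creek.
dates = [2_100_000, 1_300_000, 640_000]

# Gaps between consecutive eruptions.
intervals = [earlier - later for earlier, later in zip(dates, dates[1:])]
print(intervals)                    # [800000, 660000]
print(intervals[0] - intervals[1])  # the first gap is 140,000 years longer
print(dates[-1])                    # time since the last eruption: 640,000 years
```

Two data points with gaps of 800,000 and 660,000 years do not make a "600,000-year cycle," and 640,000 years since the last eruption sits comfortably inside the shorter of the two observed intervals.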
Earthquakes can be 'overdue' because the stress on fault lines is built up consistently over long periods, which means quakes can be predicted with a relative degree of accuracy. But this is not how volcanoes behave. They do not accumulate magma at constant rates. And the subterranean pressure that causes the magma to erupt does not follow a schedule.
What's more, previous super-eruptions do not necessarily imply future ones. Scientists are not convinced that there ever will be another big eruption at Yellowstone. Smaller eruptions, however, are much likelier. Since the Lava Creek eruption, there have been about 30 smaller outbreaks at Yellowstone, the last lava flow being about 70,000 years ago.
As for the immediate future (give or take a century): the magma chamber beneath Yellowstone is only 5 to 15 percent molten. Most scientists agree that this is as un-alarming as it sounds, and that it's statistically more relevant to worry about death by lightning, shark, or piano.
Strange Maps #1041
Got a strange map? Let me know at firstname.lastname@example.org.
Measuring a person's movements and poses, smart clothes could be used for athletic training, rehabilitation, or health-monitoring.
In recent years there have been exciting breakthroughs in wearable technologies, like smartwatches that can monitor your breathing and blood oxygen levels.
But what about a wearable that can detect how you move as you do a physical activity or play a sport, and could potentially even offer feedback on how to improve your technique?
And, as a major bonus, what if the wearable were something you'd actually already be wearing, like a shirt or a pair of socks?
That's the idea behind a new set of MIT-designed clothes that use special fibers to sense a person's movement via touch. Among other things, the researchers showed that their clothes can determine whether someone is sitting, walking, or doing particular poses.
The group from MIT's Computer Science and Artificial Intelligence Lab (CSAIL) says that their clothes could be used for athletic training and rehabilitation. With patients' permission, they could even help passively monitor the health of residents in assisted-care facilities and determine if, for example, someone has fallen or is unconscious.
The researchers have developed a range of prototypes, from socks and gloves to a full vest. The team's "tactile electronics" use a mix of more typical textile fibers alongside a small amount of custom-made functional fibers that sense pressure from the person wearing the garment.
According to CSAIL graduate student Yiyue Luo, a key advantage of the team's design is that, unlike many existing wearable electronics, theirs can be incorporated into traditional large-scale clothing production. The machine-knitted tactile textiles are soft, stretchable, breathable, and can take a wide range of forms.
"Traditionally it's been hard to develop a mass-production wearable that provides high-accuracy data across a large number of sensors," says Luo, lead author on a new paper about the project that is appearing in this month's edition of Nature Electronics. "When you manufacture lots of sensor arrays, some of them will not work and some of them will work worse than others, so we developed a self-correcting mechanism that uses a self-supervised machine learning algorithm to recognize and adjust when certain sensors in the design are off-base."
The team's clothes have a range of capabilities. Their socks predict motion by looking at how different sequences of tactile footprints correlate to different poses as the user transitions from one pose to another. The full-sized vest can also detect the wearer's pose, activity, and the texture of the contacted surfaces.
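The paper's actual models aren't described here, but the underlying idea of matching a pressure map to a pose can be illustrated with a deliberately simple sketch. Everything below – the frame shape, the pose names, and the nearest-centroid method – is a hypothetical toy, not the MIT team's approach:

```python
import numpy as np

# Hypothetical sketch: classify a tactile pressure frame from a smart sock
# by comparing it to an average pressure map ("centroid") for each known pose.
# Frame shape, poses, and method are illustrative assumptions only.

def nearest_centroid(frame, centroids):
    """Return the pose whose reference pressure map is closest to `frame`."""
    flat = frame.ravel()
    return min(centroids, key=lambda pose: np.linalg.norm(flat - centroids[pose].ravel()))

# Toy reference maps on an 8x4 sensor grid: standing loads the whole sole,
# tiptoe concentrates pressure in the front rows only.
standing = np.ones((8, 4))
tiptoe = np.zeros((8, 4))
tiptoe[:3, :] = 2.0
centroids = {"standing": standing, "tiptoe": tiptoe}

# A noisy reading from the sock still lands on the right pose.
rng = np.random.default_rng(0)
noisy_frame = standing + 0.1 * rng.standard_normal((8, 4))
print(nearest_centroid(noisy_frame, centroids))  # → standing
```

A real system would classify sequences of frames rather than single snapshots, which is what lets the socks distinguish transitions between poses.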
The authors imagine a coach using the sensors to analyze people's postures and give suggestions for improvement. They could also be used by an experienced athlete to record their posture so that beginners can learn from it. In the long term, the team even imagines that robots could be trained to do different activities using data from the wearables.
"Imagine robots that are no longer tactilely blind, and that have 'skins' that can provide tactile sensing just like we have as humans," says corresponding author Wan Shou, a postdoc at CSAIL. "Clothing with high-resolution tactile sensing opens up a lot of exciting new application areas for researchers to explore in the years to come."
The paper was co-written by MIT professors Antonio Torralba, Wojciech Matusik, and Tomás Palacios, alongside PhD students Yunzhu Li, Pratyusha Sharma, and Beichen Li; postdoc Kui Wu; and research engineer Michael Foshey.
The work was partially funded by Toyota Research Institute.
How imagining the worst case scenario can help calm anxiety.
- Stoicism is the philosophy that nothing about the world is good or bad in itself, and that we have control over both our judgments and our reactions to things.
- It is hardest to control our reactions to the things that come unexpectedly.
- By meditating every day on the "worst case scenario," we can take the sting out of the worst that life can throw our way.
Are you a worrier? Do you imagine nightmare scenarios and then get worked up and anxious about them? Does your mind get caught in a horrible spiral of catastrophizing over even the smallest of things? Worrying, particularly imagining the worst case scenario, seems to be a natural part of being human and comes easily to a lot of us. It's awful, perhaps even dangerous, when we do it.
But, there might just be an ancient wisdom that can help. It involves reframing this attitude for the better, and it comes from Stoicism. It's called "premeditation," and it could be the most useful trick we can learn.
Broadly speaking, Stoicism is the philosophy of choosing your judgments. Stoics believe that there is nothing about the universe that can be called good or bad, valuable or valueless, in itself. It's we who add these values to things. As Shakespeare's Hamlet says, "There is nothing either good or bad, but thinking makes it so." Our minds color the things we encounter as being "good" or "bad," and given that we control our minds, we therefore have control over all of our negative feelings.
Put another way, Stoicism maintains that there's a gap between our experience of an event and our judgment of it. For instance, if someone calls you a smelly goat, you have an opportunity, however small and hard it might be, to pause and ask yourself, "How will I judge this?" What's more, you can even ask, "How will I respond?" We have power over which thoughts we entertain and the final say on our actions. Today, Stoicism has influenced and finds modern expression in the hugely effective "cognitive behavioral therapy."
Helping you practice Stoicism. Credit: Robyn Beck via Getty Images
One of the leading figures of Roman Stoicism was the statesman Seneca, who argued that the unexpected and unforeseen blows of life are the hardest to take control of. The shock of a misfortune can strip away the power we have to choose our reaction. For instance, being burglarized feels so horrible because we had felt so safe at home. A stomach ache, out of the blue, is harder than a stitch thirty minutes into a run. A sudden bang makes us jump, but a firework makes us smile. Misfortunes that arrive in one fell swoop hurt more than hardships we saw coming.
What could possibly go wrong?
So, how can we resolve this? Seneca suggests a Stoic technique called "premeditatio malorum" or "premeditation." At the start of every day, we ought to take time to indulge our anxious and catastrophizing mind. We should "rehearse in the mind: exile, torture, war, shipwreck." We should meditate on the worst things that could happen: your partner will leave you, your boss will fire you, your house will burn down. Maybe, even, you'll die.
This might sound depressing, but the important thing is that we do not stop there.
The Stoic also rehearses how they will react to these things as they come up. For instance, another Stoic (and Roman Emperor) Marcus Aurelius asks us to imagine all the mean, rude, selfish, and boorish people we'll come across today. Then, in our heads, we script how we'll respond when we meet them. We can shrug off their meanness, smile at their rudeness, and refuse to be "implicated in what is degrading." Thus prepared, we take control again of our reactions and behavior.
The Stoics cast themselves into the darkest and most desperate of conditions but then realize that they can and will endure. With premeditation, the Stoic is prepared and has the mental vigor necessary to take the blow on the chin and say, "Yep, I can deal with this."
Catastrophizing as a method of mental inoculation
Seneca wrote: "In times of peace, the soldier carries out maneuvers." This is also true of premeditation, which acts as the war room or training ground. The agonizing cut of the unexpected is blunted by preparedness. We can prepare the mind for whatever trials may come, in just the same way we can prepare the body for some endurance activity. The world can throw nothing as bad as that which our minds have already imagined.
Stoicism teaches us to embrace our worrying mind but to embrace it as a kind of inoculation. With a frown over breakfast, try to spend five minutes of your day deliberately catastrophizing. Get your anti-anxiety battle plan ready and then face the world.
A study on charity finds that reminding people how nice it feels to give yields better results than appealing to altruism.