It’s hard to scare people without a visual imagination
Next time you listen to scary campfire stories, sit with a friend who has aphantasia.
A strong imagination is generally viewed as being a good thing, even if at times an over-active one can result in self-induced terror as you repeat to yourself, "Just because I can vividly picture something terrible happening doesn't mean it will."
A study from researchers at the University of New South Wales (UNSW) in Sydney, Australia, suggests that a visual imagination may actually be a requirement for experiencing fear. Some people may be less likely to be frightened simply because they lack the imagination fear requires. This also implies that visual imagery has a special connection to fear, and perhaps to other emotional experiences.
The study is published in Proceedings of the Royal Society B.
Credit: Martin Villadsen/Adobe Stock/Big Think
It's known that some people have trouble picturing things in their minds. This is called "mind-blindness," or more clinically, "aphantasia." The UNSW Sydney researchers conducted experiments to see if people with aphantasia were harder to scare.
It's believed that aphantasia affects between two and five percent of people, and science is just beginning to understand it. Says the study's senior author Joel Pearson of UNSW Science's Future Minds Lab, "Aphantasia is neural diversity. It's an amazing example of how different our brain and minds can be."
Previous research on aphantasia at UNSW found that it's associated with a widespread pattern of altered cognitive processes, including memory, imagination, and dreaming.
Pearson says, "Aphantasia comes in different shapes and sizes. Some people have no visual imagery, while other people have no imagery in one or all of their other senses. Some people dream while others don't."
The new research connects aphantasia for the first time to skin conductivity, a worthy finding all by itself. "This evidence further supports aphantasia as a unique, verifiable phenomenon," says co-author Rebecca Keogh. "This work may provide a potential new objective tool which could be used to help to confirm and diagnose aphantasia in the future."
The current study was prompted by comments on aphantasia message boards in which people with the condition expressed a lack of interest in fiction.
Imagining disturbing imagery when you read scary stories
Credit: pure julia/Unsplash/Big Think
The experiments involved 22 people with aphantasia and 24 people with normal visual imaginations. Individuals were seated alone in a darkened room with electrodes attached to their skin to measure electrical conductivity, which increases when a person experiences strong emotions. Subjects were shown a succession of short phrases of three to seven words, each displayed for two seconds, that together developed a frightening narrative.
The stories started innocently enough: "You are at the beach, in the water" or "You're on a plane, by the window." Little by little, unsettling elements were introduced — a mention of a dark flash among distant waves, or people standing on the beach pointing, or the plane shaking as the cabin lights dim.
Pearson reports, "Skin conductivity levels quickly started to grow for people who were able to visualize the stories. The more the stories went on, the more their skin reacted."
Not so for the aphantasic participants, of whom he says: "the skin conductivity levels pretty much flatlined."
Reacting to scary imagery
Credit: Mark Kostich/Adobe Stock
The researchers confirmed that it was the aphantasia which accounted for the different reactions between the two groups by running the experiment again, but this time with pictures instead of words. Visual imagination wasn't necessary; all the disturbing imagery, which included a dead human body and a snake baring its fangs in threat, was supplied.
This time, both groups of people became similarly unnerved. "The emotional fear response was present when participants actually saw the scary material play out in front of them," says Pearson.
"The findings suggest," Pearson says, "that imagery is an emotional thought amplifier. We can think all kinds of things, but without imagery, the thoughts aren't going to have that emotional 'boom.'"
It also suggests a couple of things about telling scary stories. First, the importance of visual imagination suggests that providing lots of visual details will give a scary story more oomph. Second, people with aphantasia are probably lousy campfire audiences.
Next, the researchers plan to investigate the ways in which disorders such as PTSD might be different for people with aphantasia.
If you don't want to know anything about your death, consider this your spoiler warning.
- For centuries cultures have personified death to give this terrifying mystery a familiar face.
- Modern science has demystified death by divulging its biological processes, yet many questions remain.
- Studying death is not meant to be a morbid reminder of a cruel fate, but a way to improve the lives of the living.
Black cloak. Scythe. Skeletal grin. The Grim Reaper is the classic visage of death in Western society, but it's far from the only one. Ancient societies personified death in a myriad of ways. Greek mythology has the winged god Thanatos, Norse mythology the gloomy and reclusive Hel, and Hindu traditions the wildly ornate King Yama.
Modern science has de-personified death, pulling back its cloak to discover a complex pattern of biological and physical processes that separate the living from the dead. But with the advent of these discoveries, in some ways, death has become more alien.
1) You are conscious after death
Many of us imagine death will be like drifting to sleep. Your head gets heavy. Your eyes flutter and gently close. A final breath and then… lights out. It sounds perversely pleasant. Too bad it may not be that quick.
Dr. Sam Parnia, the director of critical care and resuscitation research at NYU Langone Medical Center, researches death and has proposed that our consciousness sticks around while we die. This is due to brainwaves firing in the cerebral cortex — the conscious, thinking part of the brain — for roughly 20 seconds after clinical death.
Studies on lab rats have shown their brains surge with activity in the moments after death, resulting in an aroused and hyper-alert state. If such states occur in humans, it may be evidence that the brain maintains a lucid consciousness during death's early stages. It may also explain how patients brought back from the brink can remember events that took place while they were technically dead.
But why study the experience of death if there's no coming back from it?
"In the same way that a group of researchers might be studying the qualitative nature of the human experience of 'love,' for instance, we're trying to understand the exact features that people experience when they go through death, because we understand that this is going to reflect the universal experience we're all going to have when we die," he told LiveScience.
2) Zombie brains are a thing (kind of)
There is life after death if you're a pig... sorta. (Image source: Wikimedia Commons)
Recently at the Yale School of Medicine, researchers received 32 dead pig brains from a nearby slaughterhouse. No, it wasn't some Mafia-style intimidation tactic. They'd placed the order in the hopes of giving the brains a physiological resurrection.
The researchers connected the brains to an artificial perfusion system called BrainEx. It pumped a solution through them that mimicked blood flow, bringing oxygen and nutrients to the inert tissues.
This system revitalized the brains and kept some of their cells "alive" for as long as 36 hours postmortem. The cells consumed and metabolized sugars, the brains' immune systems kicked back in, and some samples even carried electrical signals.
Because the researchers weren't aiming for Animal Farm with Zombies, they included chemicals in the solution that prevented neural activity representative of consciousness from taking place.
Their actual goal was to design a technology that will help us study the brain and its cellular functions longer and more thoroughly. With it, we may be able to develop new treatments for brain injuries and neurodegenerative conditions.
3) Death is not the end for part of you
Researchers used zebrafish to gain insights into postmortem gene expression. Image source: ICHD / Flickr
There is life after death. No, science hasn't discovered proof of an afterlife or how much the soul weighs. But our genes keep going after our demise.
A study published in the Royal Society's Open Biology looked at gene expression in dead mice and zebrafish. The researchers were unsure if gene expression diminished gradually or stopped altogether. What they found surprised them. Over a thousand genes became more active after death. In some cases, these spiked expressions lasted for up to four days.
"We didn't anticipate that," Peter Noble, study author and microbiology professor at the University of Washington, told Newsweek. "Can you imagine, 24 hours after [time of death] you take a sample and the transcripts of the genes are actually increasing in abundance? That was a surprise."
Increased expression was seen not only in stress-response and immune genes but also in developmental genes. Noble and his co-authors suggest this shows that the body undergoes a "step-wise shutdown," meaning vertebrates die gradually and not all at once.
4) Your energy lives on
Even our genes will eventually fade, and all that we are will become clay. Do you find such oblivion disheartening? You're not alone, but you may take solace in the fact that part of you will continue on long after your death. Your energy.
According to the first law of thermodynamics, the energy that powers all life continues on and can never be destroyed. It is transformed. As comedian and physicist Aaron Freeman explains in his "Eulogy from a Physicist":
"You want the physicist to remind your sobbing mother about the first law of thermodynamics; that no energy gets created in the universe, and none is destroyed. You want your mother to know that all your energy, every vibration, every Btu of heat, every wave of every particle that was her beloved child remains with her in this world. You want the physicist to tell your weeping father that amid energies of the cosmos, you gave as good as you got."
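The physics behind the eulogy can be written down in one line. As a sketch (using the standard textbook sign convention, which is not part of the article itself), the first law of thermodynamics says that the change in a closed system's internal energy equals the heat added to it minus the work it does on its surroundings; energy is only ever transferred or transformed, never created or destroyed:

```latex
\Delta U = Q - W
```

Here $\Delta U$ is the change in internal energy, $Q$ the heat flowing into the system, and $W$ the work the system performs. Summed over the universe as a whole, the total is conserved, which is the sense in which "your energy lives on."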
5) Near-death experiences may be extreme dreams
Near-death experiences come in a variety of styles. Some people float above their bodies. Some go to a supernatural realm and meet passed-on relatives. Others enjoy the classic dark-tunnel-bright-light scenario. One thing they all have in common: We don't know what's going on.
A study published in Neurology suggests near-death experiences stem from a type of sleep-wake state. It compared survivors who had near-death experiences with those who did not. The researchers found that people with near-death experiences were more likely to also undergo REM intrusions, states in which sleep intrudes upon wakeful consciousness.
"People who have near-death experiences may have an arousal system that predisposes them to REM intrusion," Kevin Nelson, professor at the University of Kentucky and the study's lead author, told the BBC.
It's worth noting that the study does have its limitations. Only 55 participants were interviewed in each group, and the results relied on anecdotal evidence. These highlight key difficulties in studying near-death experiences. Such experiences are rare and cannot be induced in a controlled setting. (Such a proposal would be a huge red flag for any ethics board.)
The result is sparse data open to a lot of interpretation, but it is unlikely that the soul enjoys a postmortem romp. One experiment installed pictures on high shelves in 1,000 hospital rooms. These images would only be visible to people whose souls departed the body and returned.
No cardiac arrest survivor reported seeing the images. Then again, if they did manage to sever their fleshy fetters, they may have had more pressing matters to attend to.
6) Animals may mourn the dead too
Elephants form strong familial bonds, and some eyewitness accounts suggest they may mourn the dead, too. Image source: Cocoparisienne / Pixabay
We're still not sure, but eyewitness accounts suggest the answer may be yes.
Field researchers have witnessed elephants staying with the dead — even if the deceased is not from the same family herd. This observation led the researchers to conclude the elephants had a "generalized response" to death. Dolphins too have been seen guarding deceased members of their species. And chimpanzees maintain social routines with the dead, such as grooming.
No other species has been observed performing human-like memorial rituals, which require abstract thought, but these events suggest animals possess their own understanding of and response to death.
As Jason Goldman writes for BBC, "[F]or every facet of life that is unique to our species, there are hundreds that are shared with other animals. As important as it is to avoid projecting our own feelings onto animals, we also need to remember that we are, in an inescapable way, animals ourselves."
7) Who first buried the dead?
Anthropologist Donald Brown has studied human cultures and discovered hundreds of features shared by each and every one. Among them, every culture has its own way to honor and mourn the dead.
But who was the first? Humans or another hominin in our ancestral lineage? The answer is difficult to pin down because it is shrouded in the fog of our prehistoric past. However, we do have a candidate: Homo naledi.
Several fossils of this extinct hominin were discovered in a cave chamber at the Rising Star Cave system, Cradle of Humankind, South Africa. To access the chamber required a vertical climb, a few tight fits, and much crawling.
This led researchers to believe it unlikely so many individuals ended up there by accident. They also ruled out geological traps like cave-ins. Given the seemingly deliberate placement, some have concluded the chamber served as a Homo naledi graveyard. Others aren't so sure, and more evidence is needed before we can definitively answer this question.
8) Walking corpse syndrome
The medieval Danse Macabre fresco at the Holy Trinity Church in Hrastovlje, Slovenia. (Photo: Marco Almbauer/Wikimedia Commons)
For most of us, the line between life and death is stark. We are alive; therefore, we are not dead. It's a notion many take for granted, and we should be thankful we can manage it so effortlessly.
People afflicted with Cotard's syndrome don't see the divide so cleanly. This rare condition, first described by Dr. Jules Cotard in 1882, afflicts people who believe they are dead, are missing body parts, or have lost their souls. This nihilistic delusion manifests in a prevailing sense of hopelessness, neglect of health, and difficulty dealing with external reality.
In one case, a 53-year-old Filipino woman with Cotard's syndrome believed herself to smell like rotting fish and wished to be brought to the morgue so she could be with her kind. Thankfully, a regimen of antipsychotics and antidepressants improved her condition. Others with this debilitating mental disorder have also been known to improve with proper treatment.
9) Do hair and fingernails grow after death?
Nope. This is a myth, but one that does have a biological origin.
The reason hair and fingernails don't grow after death is because new cells can't be produced. Glucose fuels cell division, and cells require oxygen to break down glucose into cellular energy. Death puts an end to the body's ability to intake either one.
It also ends the intaking of water, leading to dehydration. As a corpse's skin desiccates, it pulls away from the fingernails (making them look longer) and retracts around the face (giving a dead man's chin a five-o'clock shadow). Anyone unlucky enough to exhume a corpse could easily mistake these changes as signs of growth.
Interestingly, the appearance of postmortem hair and fingernail growth provoked lore about vampires and other creatures of the night. When our ancestors dug up fresh corpses and found what looked like hair growth and blood spots around mouths (the result of natural blood pooling), their minds naturally wandered to undeath.
Not that becoming undead is anything we need to worry about today. (Unless, of course, you donate your brain to the Yale School of Medicine.)
10) Why do we die?
People who live to be 110 years old, called super-centenarians, are a rare breed. Those who live to be 120 rarer still. The longest-living human on record was Jeanne Calment, a Frenchwoman who lived an astounding 122 years.
But why do we die in the first place? Setting spiritual and existential responses aside, the simple answer is that nature is done with us after a certain point.
Success in life, evolutionarily speaking, is passing on one's genes to offspring. As such, most species die soon after their fecund days end. Salmon die soon after making their upriver trek to spawn. For them, reproduction is a one-way trip.
Humans are a bit different. We invest heavily in our young, so we require a longer lifespan to continue parental care. But human lives outpace their fecundity by many years. This extended lifespan allows us to invest time, care, and resources in grandchildren (who share our genes). This is known as the grandmother effect.
But if grandparents are so useful, why is the cap set at 100-some-odd years? Because our evolution did not invest in longevity beyond that. Nerve cells do not replicate, brains shrink, hearts weaken, and we die. If evolution needed us to hang around longer, maybe these kill switches would have been weeded out, but evolution as we know it requires death to promote adaptive life.
At this age, however, it is likely that our children may be entering their grandparent years themselves, and our genes will continue to be cared for in subsequent generations.
The author of 'How We Read Now' explains.
During the pandemic, many college professors abandoned assignments from printed textbooks and turned instead to digital texts or multimedia coursework.
As a professor of linguistics, I have been studying how electronic communication compares to traditional print when it comes to learning. Is comprehension the same whether a person reads a text onscreen or on paper? And are listening and viewing content as effective as reading the written word when covering the same material?
The answers to both questions are often “no," as I discuss in my book “How We Read Now," released in March 2021. The reasons relate to a variety of factors, including diminished concentration, an entertainment mindset and a tendency to multitask while consuming digital content.
Print versus digital reading
The benefits of print particularly shine through when experimenters move from posing simple tasks – like identifying the main idea in a reading passage – to ones that require mental abstraction – such as drawing inferences from a text. Print reading also improves the likelihood of recalling details – like “What was the color of the actor's hair?" – and remembering where in a story events occurred – “Did the accident happen before or after the political coup?"
Studies show that both grade school students and college students assume they'll get higher scores on a comprehension test if they have done the reading digitally. And yet, they actually score higher when they have read the material in print before being tested.
Educators need to be aware that the method used for standardized testing can affect results. Studies of Norwegian tenth graders and U.S. third through eighth graders report higher scores when standardized tests were administered using paper. In the U.S. study, the negative effects of digital testing were strongest among students with low reading achievement scores, English language learners and special education students.
My own research and that of colleagues approached the question differently. Rather than having students read and take a test, we asked how they perceived their overall learning when they used print or digital reading materials. Both high school and college students overwhelmingly judged reading on paper as better for concentration, learning and remembering than reading digitally.
The discrepancies between print and digital results are partly related to paper's physical properties. With paper, there is a literal laying on of hands, along with the visual geography of distinct pages. People often link their memory of what they've read to how far into the book it was or where it was on the page.
But equally important is mental perspective, and what reading researchers call a “shallowing hypothesis." According to this theory, people approach digital texts with a mindset suited to casual social media, and devote less mental effort than when they are reading print.
Podcasts and online video
Given increased use of flipped classrooms – where students listen to or view lecture content before coming to class – along with more publicly available podcasts and online video content, many school assignments that previously entailed reading have been replaced with listening or viewing. These substitutions have accelerated during the pandemic and move to virtual learning.
Surveying U.S. and Norwegian university faculty in 2019, University of Stavanger Professor Anne Mangen and I found that 32% of U.S. faculty were now replacing texts with video materials, and 15% reported doing so with audio. The numbers were somewhat lower in Norway. But in both countries, 40% of respondents who had changed their course requirements over the past five to 10 years reported assigning less reading today.
A primary reason for the shift to audio and video is students refusing to do assigned reading. While the problem is hardly new, a 2015 study of more than 18,000 college seniors found only 21% usually completed all their assigned course reading.
Maximizing mental focus
Audio shows a similar pattern. Researchers compared university students who read an article with those who listened to a podcast of the same text, and a related study confirms that students do more mind-wandering when listening to audio than when reading.
Results with younger students are similar, but with a twist. A study in Cyprus concluded that the relationship between listening and reading skills flips as children become more fluent readers. While second graders had better comprehension with listening, eighth graders showed better comprehension when reading.
Research on learning from video versus text echoes what we see with audio. For example, researchers in Spain found that fourth through sixth graders who read texts showed far more mental integration of the material than those watching videos. The authors suspect that students “read" the videos more superficially because they associate video with entertainment, not learning.
The collective research shows that digital media have common features and user practices that can constrain learning. These include diminished concentration, an entertainment mindset, a propensity to multitask, lack of a fixed physical reference point, reduced use of annotation and less frequent reviewing of what has been read, heard or viewed.
Digital texts, audio and video all have educational roles, especially when providing resources not available in print. However, for maximizing learning where mental focus and reflection are called for, educators – and parents – shouldn't assume all media are the same, even when they contain identical words.
Humans may have evolved to be tribalistic. Is that a bad thing?
- From politics to everyday life, humans have a tendency to form social groups that are defined in part by how they differ from other groups.
- Neuroendocrinologist Robert Sapolsky, author Dan Shapiro, and others explore the ways that tribalism functions in society, and discuss how—as social creatures—humans have evolved for bias.
- But bias is not inherently bad. The key to seeing things differently, according to Beau Lotto, is to "embody the fact" that everything is grounded in assumptions, to identify those assumptions, and then to question them.
Ancient corridors below the French capital have served as its ossuary, playground, brewery, and perhaps soon, air conditioning.