Trigger warnings don’t help people cope with distressing material
So why should we keep using them?
Imagine you're a lecturer teaching a celebrated novel that features violent scenes – say, F Scott Fitzgerald's The Great Gatsby (1925).
It transpires that one of your students has themselves been a victim of violence and now, as you describe the novel's violent scenes in class, your words cause them to relive their trauma. Could you, should you, have done more to protect this person?
Beginning in 2013, many students at universities in the United States began demanding that their lecturers do just that and provide 'trigger warnings' ahead of any potentially upsetting content. For instance, one student at Rutgers University in New Jersey highlighted the potential harm that The Great Gatsby might cause, with its 'variety of scenes that reference gory, abusive and misogynistic violence'.
As you might have noticed, the use of trigger warnings has since spread beyond US universities to educational institutions around the world, and further: into theatres, festivals and even news stories. The warnings have become another battlefield in the culture wars, with many seeing them as threatening free speech and the latest sign of 'political correctness' gone mad.
Ideology aside, one could make a basic ethical case for giving warnings in the sense that it's the considerate thing to do. If I invite a friend round to watch a movie that I know features disturbing scenes, it's simply courteous and thoughtful to alert my friend in advance, in case she'd rather watch something more anodyne – and one could make the same case for a lecturer about to discuss distressing topics.
But as the debate over trigger warnings has raged, advocates for them have made strong psychological claims. First, they've argued that trigger warnings give people with a history of trauma a welcome chance to avoid the upsetting content. The literature scholar Mason Stokes of Skidmore College in New York has said that his teaching of Jim Grimsley's novel Dream Boy (1995), which explores themes of child sexual abuse, caused one of his students – an incest survivor – to need in-patient psychiatric care. 'I've warned students about the emotions this novel might trigger every time I've taught it since,' he wrote in The Chronicle of Higher Education in 2014, the implication being that, in future, any of his students with a history of trauma will be able to avoid his upsetting lectures and therefore avoid needing acute psychiatric care.
Second, trigger-warning advocates say that such warnings give students and others the opportunity to brace themselves emotionally. In her New York Times op-ed 'Why I Use Trigger Warnings' (2015), the philosophy lecturer Kate Manne of Cornell University in New York argued that they 'allow those who are sensitive to [potentially upsetting] subjects to prepare themselves for reading about them, and better manage their reactions'.
Whereas the ideological arguments for and against trigger warnings are difficult to settle, the specific psychological claims can be tested against the evidence. On the first claim, that trigger warnings enable survivors of trauma to avoid re-experiencing the negative associated emotions, critics argue that the avoidance of potentially upsetting material is actually a counterproductive approach because it offers no chance to learn to manage one's emotional reactions. As a result, fears deepen and catastrophic thoughts go unchallenged.
Consider a 2007 meta-analysis of 39 studies by researchers at Sam Houston State University in Texas, which found a 'clear, consistent association' between using avoidance-based coping strategies (that is, staying away from upsetting stressors or avoiding thinking about them) and increased psychological distress. For a more concrete example, look at the findings from a study, published in 2011, of women who witnessed the Virginia Tech shooting of 2007 – those who tried to avoid thinking about what happened tended to experience more symptoms of depression and anxiety in the months that followed.
On the question of whether trigger warnings give people the chance to brace themselves emotionally, a spate of recent studies suggests that this simply isn't how the mind works. In 2018, researchers at Harvard University asked hundreds of volunteers on Amazon's Mechanical Turk survey website to read graphic literary passages – such as the murder scene in Fyodor Dostoevsky's Crime and Punishment (1866) – that either were or weren't preceded by a trigger warning of distressing content, and then rate their feelings. The warnings had little beneficial effect on the volunteers' emotional reactions.
In the spring of 2019, researchers at the University of Waikato in New Zealand had nearly 1,400 participants, across six studies, watch graphic video footage that either was or wasn't preceded by warnings. This time, the warnings reduced the upsetting impact of the videos, but the size of this effect was 'so small as to lack practical significance' – and this was true regardless of whether the participants had a history of trauma or not.
Around the same time, a group at Flinders University in Australia looked at the effect of trigger warnings on people's experience of ambiguous photos accompanied by different headlines – such as a picture of passengers boarding a plane either with an upsetting crash-related headline or an innocuous business-related headline. Trigger warnings increased participants' negative feelings prior to the photo presentation, presumably as they anticipated what was to come. But, once again, the warnings didn't make much difference to how volunteers responded emotionally to the photos.
It was a similar story in the summer of 2019 when researchers at McKendree University in Illinois gave volunteers warnings (or not) prior to watching educational videos about suicide or sexual assault. Again, the warnings had no meaningful effect on the emotional impact of the videos, including for volunteers who'd had their own personal experience of the topics. Post-video quizzes also showed the trigger warnings had no benefit for the participants' learning.
And just this autumn, another relevant paper was published online. It wasn't about trigger warnings per se, but investigated a cognitive principle central to the trigger-warnings debate. A team from the University of Würzburg in Germany wanted to see if advance warnings could allow people to better ignore distracting negative images while they were engaged in another task. Their consistent finding across three experiments was that people cannot use warnings to prepare or shield themselves from being distracted by an upsetting image.
All these new research findings don't undermine the ethical or ideological case for trigger warnings, but they do cast serious doubt on the psychological arguments mustered by trigger-warning advocates. At the same time, the results provide some support for other psychological claims made by trigger-warning critics – such as the attorney Greg Lukianoff and the social psychologist Jonathan Haidt, authors of the book The Coddling of the American Mind (2018) – namely, that these warnings encourage a belief in the vulnerability of people with a history of trauma and, in fact, in people's vulnerability in general.
For instance, the Harvard research found that the use of trigger warnings increased participants' belief in the vulnerability of people with post-traumatic stress disorder – an unwelcome effect that the researchers described as a form of 'soft stigma' (also, for the subgroup of participants who started out the study believing in the power of words to harm, the trigger warnings actually increased the negative impact of the passages). Similarly, the McKendree research found that the only meaningful effect of trigger warnings was to increase people's belief in the sensitivity of others to upsetting material and in the need for warnings.
It's important not to overstate the scientific case against trigger warnings. Research into their effects is still in its infancy and, most notably, none of the recent studies has focused on their use among people with mental-health diagnoses. Yet already the results are surprisingly consistent in undermining the specific claim that trigger warnings allow people to marshal some kind of mental defence mechanism. There is also a solid evidence base that avoidance is a harmful coping strategy for people recovering from trauma or dealing with anxiety. The clear message from psychology, then, is that trigger warnings should come with their own warning – they won't achieve much, except encourage maladaptive coping and the belief that folk are sensitive and need protecting.
Evolution doesn't clean up after itself very well.
- An evolutionary biologist got people swapping ideas about our lingering vestigia.
- Basically, this is the stuff that served some evolutionary purpose at some point, but now is kind of, well, extra.
- Here are the six traits that inaugurated the fun.
The plica semilunaris
The human eye in alarming detail. Image source: Henry Gray / Wikimedia Commons
At the inner corner of our eyes, closest to the nasal ridge, is that little pink thing (which is probably what most of us call it), properly called the caruncula. Next to it is the plica semilunaris, and it's what's left of a third eyelid that used to – ready for this? – blink horizontally. It's thought to have offered protection for our eyes, and some birds, reptiles, and fish still have such a thing.
Palmaris longus
Palmaris longus muscle. Image source: Wikimedia Commons
We don't have much need these days, at least most of us, to navigate from tree branch to tree branch. Even so, about 86 percent of us still have the wrist muscle that used to help us do it. To see if you have it, place the back of your hand on a flat surface and touch your thumb to your pinkie. If a muscle becomes visible in your wrist, that's the palmaris longus. If you don't have one, consider yourself more evolved (just joking).
Darwin's tubercle
Darwin's tubercle. Image source: Wikimedia Commons
Yes, maybe the shell of your ear does feel like a dried apricot. Maybe not. But there's a ridge in that swirly structure, a vestige from the days when our ears could be angled toward interesting sounds. These days, we just turn our heads, but there it is.
Goosebumps
Goosebumps. Photo credit: Tyler Olson via Shutterstock
It's not entirely clear what purpose made goosebumps worth retaining evolutionarily, but there are two circumstances in which they appear: fear and cold. For fear, they may have been a way of making body hair stand up so we'd appear larger to predators, much the way a cat's tail puffs up – numerous creatures exaggerate their size when threatened. In the cold, they may have trapped additional heat for warmth.
Tailbone
Coccyx. Image source: decade3d-anatomy online via Shutterstock
Way back, we had tails that probably helped us balance upright and were useful for moving through trees. We still have the stump of one as embryos, from weeks 4–6, and then the body mostly dissolves it during weeks 6–8. What's left is the coccyx.
The palmar grasp reflex
Palmar reflex activated! Photo credit: Raul Luna on Flickr
You've probably seen how non-human primate babies grab onto their parents' hands to be carried around. We used to do this, too. Even now, if you touch your finger to a baby's palm, or touch the sole of their foot, the palmar grasp reflex will cause the hand or foot to try to close around your finger.
Other people's suggestions
Amir, the evolutionary biologist who started the thread, saw her followers dive right in, offering both cool and questionable additions to her list.
Fangs?
"Lower mouth plate behind your teeth. Some have protruding bone under the skin which is a throw back to large fangs. Almost like an upsidedown Sabre Tooth." – neil crud (@neilcrud66), January 16, 2019
Hiccups
"Sure: https://t.co/DjMZB1XidG" – Stephen Roughley (@SteBobRoughley), January 16, 2019
Hypnic jerk as you fall asleep
"What about when you 'jump' just as you're drifting off to sleep, I heard that was a reflex to prevent falling from heights." – Bann face (@thebanns), January 16, 2019
This thing, often called the "alpha jerk" as you drop into alpha sleep, is properly called the hypnic jerk. It may actually be a carryover from our arboreal days. The hypothesis (https://www.livescience.com/39225-why-people-twitch-falling-asleep.html) is that you suddenly jerk awake to avoid falling out of your tree.
Nails screeching on a blackboard response?
"Everyone hate the sound of fingernails on a blackboard. It's _speculated_ that this is a vestigial wiring in our head, because the sound is similar to the shrill warning call of a chimp. https://t.co/ReyZBy6XNN" – Pet Rock (@eclogiter), January 16, 2019
Ear hair
"Ok what is Hair in the ears for? I think cuz as we get older it filters out the BS." – Sarah21 (@mimix3), January 16, 2019
Nervous laughter
"You may be onto something. Tooth-bearing with the jaw clenched is generally recognized as a signal of submission or non-threatening in primates. Involuntary smiling or laughing in tense situations might have signaled that you weren't a threat." – Jager Tusk (@JagerTusk), January 15, 2019
Um, yipes.
"Sometimes it feels like my big toe should be on the side of my foot, was that ever a thing?" – @whimbrel17, January 16, 2019
So far, 30 student teams have entered the Indy Autonomous Challenge, scheduled for October 2021.
- The Indy Autonomous Challenge will task student teams with developing self-driving software for race cars.
- The competition requires cars to complete 20 laps within 25 minutes, meaning cars would need to average about 110 mph.
- The organizers say they hope to advance the field of driverless cars and "inspire the next generation of STEM talent."
Completing the race in 25 minutes means the cars will need to average about 110 miles per hour. So, while the race may end up being a bit slower than a typical Indy 500 competition, in which winners average speeds of over 160 mph, it's still set to be the fastest autonomous race featuring full-size cars.
"There is no human redundancy there," Matt Peak, managing director for Energy Systems Network, a nonprofit that develops technology for the automation and energy sectors, told the Pittsburgh Post-Gazette (https://www.post-gazette.com/business/tech-news/2020/06/01/Indy-Autonomous-Challenge-Indy-500-Indianapolis-Motor-Speedway-Ansys-Aptiv-self-driving-cars/stories/202005280137). "Either your car makes this happen or smash into the wall you go."
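Purely as a worked example, here is a minimal Python sketch that unpacks the figures quoted above (20 laps, a 25-minute limit, an average of roughly 110 mph) into the race distance and per-lap length they imply; the inputs come straight from the article, and nothing else is assumed.

```python
# Unpack the quoted race parameters: 20 laps within 25 minutes at ~110 mph.
laps = 20                    # race length in laps (from the article)
time_limit_hours = 25 / 60   # 25-minute limit, converted to hours
avg_speed_mph = 110          # average speed quoted in the article

race_distance_miles = avg_speed_mph * time_limit_hours   # distance = speed * time
lap_length_miles = race_distance_miles / laps             # implied length of one lap

print(f"Implied race distance: {race_distance_miles:.1f} miles "
      f"(~{lap_length_miles:.2f} miles per lap)")
# -> roughly 46 miles in total, or a little under 2.3 miles per lap
```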
Illustration of the Indy Autonomous Challenge. Image credit: Indy Autonomous Challenge
The Indy Autonomous Challenge describes itself (https://www.indyautonomouschallenge.com/rules) as a "past-the-post" competition, which "refers to a binary, objective, measurable performance rather than a subjective evaluation, judgement, or recognition."
This competition design was inspired by the 2004 DARPA Grand Challenge, which tasked teams with developing driverless cars and sending them along a 150-mile route in Southern California for a chance to win $1 million. But that prize went unclaimed, because within a few hours of starting, all the vehicles had suffered some kind of critical failure.
Indianapolis Motor Speedway. Image credit: Indy Autonomous Challenge
One factor that could prevent a similar outcome in the upcoming race is the ability to test-run cars on a virtual racetrack. The simulation software company Ansys Inc. has already developed a model of the Indianapolis Motor Speedway on which teams will test their algorithms as part of a series of qualifying rounds.
"We can create, with physics, multiple real-life scenarios that are reflective of the real world," Ansys President Ajei Gopal told The Wall Street Journal (https://www.wsj.com/articles/autonomous-vehicles-to-race-at-indianapolis-motor-speedway-11595237401). "We can use that to train the AI, so it starts to come up to speed."
Still, the race could reveal that self-driving cars aren't quite ready to race at speeds of over 110 mph. After all, regular self-driving cars already face enough logistical and technical roadblocks, including crumbling infrastructure and communication issues (https://www.bbc.com/news/technology-53349313), as well as the fateful moral decisions driverless cars will have to make in split seconds (https://bigthink.com/paul-ratner/would-you-ride-in-a-car-thats-programmed-to-kill-you). But the Indy Autonomous Challenge says (https://static1.squarespace.com/static/5da73021d0636f4ec706fa0a/t/5dc0680c41954d4ef41ec2b2/1572890638793/Indy+Autonomous+Challenge+Ruleset+-+v5NOV2019+%282%29.pdf) its main goal is to advance the industry by challenging "students around the world to imagine, invent, and prove a new generation of automated vehicle (AV) software and inspire the next generation of STEM talent."
A new Harvard study finds that the language you use affects patient outcome.
- A study at Harvard's McLean Hospital claims that using the language of chemical imbalances worsens patient outcomes.
- Though psychiatry has largely abandoned DSM categories, professor Joseph E Davis writes that the field continues to strive for a "brain-based diagnostic system."
- Chemical explanations of mental health appear to benefit pharmaceutical companies far more than patients.
Challenging the Chemical Imbalance Theory of Mental Disorders: Robert Whitaker, Journalist (video)
This is a far cry from Howard Rusk's 1947 NY Times editorial calling for mental health disorders to be treated similarly to physical diseases such as diabetes and cancer. This mindset – not attributable to Rusk alone; he was merely relaying the psychiatric currency of the time – has dominated the field for decades: mental anguish is a genetic and/or chemical-deficiency disorder that must be treated pharmacologically.
Even as psychiatry untethered itself from DSM categories, the field still used chemistry to validate its existence. Psychotherapy, arguably the most efficient means for managing much of our anxiety and depression, is time- and labor-intensive. Counseling requires an empathetic and seasoned ear to guide the patient to do the work. Ingesting a pill to do that work for you is more seductive, and easier. As Davis writes, even though the industry abandoned the DSM, it continues to strive for a "brain-based diagnostic system."
That language has infiltrated public consciousness. The team at McLean surveyed 279 patients seeking acute treatment for depression. As they note, the presumed causes of psychological distress have constantly shifted over the millennia: humoral imbalance in the ancient world; spiritual possession in medieval times; early childhood experiences around the time of Freud; maladaptive thought patterns dominant in the latter half of the last century. While the team found that psychosocial explanations remain popular, biogenetic explanations (such as the chemical imbalance theory) are becoming more prominent.
Interestingly, the 80 people Davis interviewed for his book predominantly relied on biogenetic explanations. Instead of doctors diagnosing patients, as you might expect, they increasingly serve to confirm what patients come in suspecting. Patients arrive at medical offices confident in their self-diagnoses. They believe a pill is the best course of treatment, largely because they saw an advertisement or listened to a friend. Doctors too often oblige without further curiosity as to the reasons for their distress.
Image: Illustration Forest / Shutterstock
While medicalizing mental health softens the stigma of depression – if a disorder is inheritable, it was never really your fault – it also disempowers the patient. The team at McLean writes:
"More recent studies indicate that participants who are told that their depression is caused by a chemical imbalance or genetic abnormality expect to have depression for a longer period, report more depressive symptoms, and feel they have less control over their negative emotions."
Davis points out the language used in the direct-to-consumer advertising prevalent in America. Doctors, media, and advertising agencies converge around common messages: that the everyday blues are a "real medical condition," that everyone is susceptible to clinical depression, and that drugs correct underlying somatic conditions you never consciously control. He continues:
"Your inner life and evaluative stance are of marginal, if any, relevance; counseling or psychotherapy aimed at self-insight would serve little purpose."
The McLean team discovered a similar phenomenon: patients expect little from psychotherapy and a lot from pills. When depression is treated as the result of an internal and immutable essence instead of environmental conditions, behavioral changes are not expected to make much difference. Chemistry rules the popular imagination.