Sartre Fallacy Part IV: Philosophizing Without a Ph.D.
Consider one last autobiographical note before I answer the question: “How do we avoid the Sartre Fallacy?”
I conducted an independent study my senior year that focused on biases and heuristics. I lived in this literature for months, covering what seemed like every study demonstrating how we humans screw up; it was difficult not to conclude that we are hopelessly irrational. What occurred to me, however, was that the word “rational” was misleading. “Rational” is usually defined relative to the homo economicus perspective, in which good decisions are about surveying alternatives, estimating probabilities, and optimizing self-interest. But what looks irrational by that standard may be perfectly sensible in context: a deviation in judgment may reflect not a deficit of the mind but a shortcoming of the homo economicus perspective itself. That is, it might be better to judge biases and heuristics against the environment rather than against logic and optimization.
Like every idea I’ve ever had, I discovered that I was not the first person to think it. Gerd Gigerenzer, one of the most important decision-making theorists of the last few decades, has made a career pursuing this line of reasoning. I remember buying his book Rationality for Mortals and finding a passage that crystallized my insight: “Violations of logical reasoning [are] interpreted as cognitive fallacies, yet what appears to be a fallacy can often also be seen as adaptive behavior, if one is willing to rethink the norm.” I realized that if we judge behavior not relative to logic and optimization but to the physical and social environments we live in, then we cannot understand judgment by studying only the literature on decision-making. (We’ve seen that doing so actually increases your bias.)
It’s fitting that Gigerenzer’s insight struck me during an especially bibulous weekend, not while I was bingeing on the judgment literature but while dancing to Katy Perry on top of a couch. I still remember the idea arriving, clearly, in my conscious mind despite the inebriation. In that moment I pledged to step outside the body of research I had been living in. I needed to go out and see how we decide in reality. It was the only way I could think clearly about biases and heuristics.
Much later, after I graduated, I realized that an overlooked source of wisdom in college is the double life: one in the classroom and the other socializing. That is, the library and the party are not opposites but complements: you don’t know what you know until you’ve tested your knowledge in the real world. Therefore, the first step in avoiding the Sartre Fallacy is not to stop thinking but to start experiencing.
Nietzsche, Buridan & Elliot
The second step is balancing the two. Young Nietzsche outlined a strategy in The Birth of Tragedy. Recall his two central characters: the Dionysian is chaotic, untamed, a partygoer; the Apollonian is measured, rational, and self-controlled. Nietzsche argued that ancient Athens thrived by balancing the two until the playwright Euripides, influenced by Socrates’ stubborn commitment to truth and reason, turned the spotlight on the Apollonian, suffocating the city’s vitality.
As an illustration, consider the thought experiment “Buridan’s Donkey,” named after the medieval philosopher and priest Jean Buridan. An equally thirsty and hungry donkey stands exactly midway between a stack of hay and a pail of water. The donkey is rational, so he needs a reason to pick one over the other. But both are equally far away, so he idles between the two until he dies. I’ll translate: a high dose of the Apollonian kills; a deadlocked mind requires a degree of chaos and randomness.*
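The donkey’s predicament can be sketched as a toy decision procedure. This is purely illustrative (the function names and the two-option setup are my own invention, not anything from the decision-making literature): a strictly “rational” chooser that demands a reason for every preference stalls on an exact tie, while one that tolerates a little randomness always acts.

```python
import random

def rational_choice(options, utility):
    """Pick the highest-utility option; refuse to act on an exact tie."""
    best = max(utility(o) for o in options)
    top = [o for o in options if utility(o) == best]
    if len(top) > 1:
        return None  # deadlock: no reason to prefer one over the other
    return top[0]

def dionysian_choice(options, utility):
    """Same, but break exact ties at random instead of stalling."""
    best = max(utility(o) for o in options)
    top = [o for o in options if utility(o) == best]
    return random.choice(top)  # a little chaos resolves the deadlock

# Buridan's setup: hay and water, equally attractive.
options = ["hay", "water"]
utility = lambda o: 1.0

print(rational_choice(options, utility))   # None: the donkey starves
print(dionysian_choice(options, utility))  # "hay" or "water": the donkey acts
```

The point is not that the random chooser is smarter; it is that pure optimization has no answer when the reasons run out, so a tie-breaker from outside the logic is required.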
If this sounds too philosophical, here’s an equivalent example from neuroscience. Perhaps you know Elliot, a centerpiece in Antonio Damasio’s Descartes’ Error. Elliot damaged his frontal lobes and subsequently lost his capacity to make trivial decisions. In a now famous example, Damasio
suggested two alternative dates, both in the coming month and just a few days apart from each other. [Elliot] pulled out his appointment book and began consulting the calendar. The behavior that ensued, which was witnessed by several investigators, was remarkable. For the better part of a half hour, the patient enumerated reasons for and against each of the two dates: previous engagements, proximity to other engagements, possible meteorological conditions, virtually anything that one could reasonably think about concerning a simple date... He was now walking us through a tiresome cost-benefit analysis, an endless outlining and fruitless comparison of options and possible consequences. It took enormous discipline to listen to all of this without pounding on the table and telling him to stop.
Damasio concludes that rationality is impossible without emotion – we need the Dionysian to push us one way or another or else we will remain stuck analyzing the pros and cons, just like Buridan’s Donkey.
Research on creativity is rife with supporting examples. Insights occur not in stressful moments but during warm showers, long walks, weekends, even exercise. The unconscious mind needs time to incubate ideas before delivering them to the conscious mind. I’m not a sucker for counterintuitive research suggesting that alcohol and drowsiness are the “keys” to creativity, but there’s no doubt that a clenched state of mind is a bane to creativity. Stepping away from a problem is often the secret to solving it. Without the Dionysian, creative output ceases, like Buridan’s donkey and Elliot’s capacity to decide.
For another example, consider two reappearing characters in Nassim Taleb’s books on uncertainty.
The first, Nero Tulip, is a risk-averse scholar who lives in books. His background is in statistics and probability, and he has a particular interest in medical texts. He is a disciplined thinker, an erudite autodidact-philosopher at heart. The second, Fat Tony, is a street-smart Brooklyn type (although he lives in New Jersey) who does not read and despises name-dropping intellectuals. He’s classy: he likes fine wine and good food; I imagine him as Tony Soprano without the violence. Fat Tony benefits from chaos and unpredictability. Nero does not, but he is not fragile either. Both hate boredom and avoid being “the turkey.” Nassim’s characters, likely disguised reflections of his own experiences, are in many ways extensions of Nietzsche’s Apollonian (Nero) and Dionysian (Fat Tony).
Nassim writes about the “Barbell Strategy.” As a financial strategy, it means playing it safe most of the time (e.g., buying Treasury bills) and making “extremely speculative” bets with the rest. As a lifestyle, it means shunning the middlebrow and moderation. Consider stuffing yourself with fat and carbs and then fasting; taking long leisurely walks and then sprinting for short bursts; reading gossip magazines one day and a Platonic dialogue, in Latin, the next; staying sober six days a week and drinking excessively on the seventh. Wittgenstein, for example, switched between writing philosophical texts and teaching elementary school. Why combine extremes and avoid the middle?
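The financial version has a simple arithmetic appeal, which a minimal sketch makes concrete. The 90/10 split and the returns below are hypothetical numbers of my own choosing, not figures from Taleb: with most of your wealth in the safe pile, even a total wipeout of the speculative bets costs you only the small pile, while the upside of those bets is uncapped.

```python
def barbell_outcome(wealth, safe_frac, spec_return, safe_return=0.0):
    """Split wealth into a safe pile and a speculative pile; return final wealth."""
    safe = wealth * safe_frac * (1 + safe_return)
    speculative = wealth * (1 - safe_frac) * (1 + spec_return)
    return safe + speculative

# 90% safe, 10% speculative, starting from 100.
print(barbell_outcome(100.0, 0.9, spec_return=-1.0))  # bets go to zero -> 90.0
print(barbell_outcome(100.0, 0.9, spec_return=9.0))   # bets pay 10x -> 190.0
```

The asymmetry is the point: the downside is capped at the speculative fraction, the upside is not, which is why the strategy pairs an extreme of caution with an extreme of risk rather than sitting in the middle.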
The human body hates moderation and loves stressors (up to a point). What happens when you lie in bed all day? You become exhausted and lazy. The best way to get over a hangover? Go running. Vaccines harm the body a little so that it becomes stronger in the long run. Bones become denser when subjected to episodes of stress; lifting weights strengthens your muscles. Think of the Hydra, the mythological serpent that grew two heads every time it lost one, or the airline industry: every plane crash makes flying safer. The barbell strategy benefits from the Dionysian but, importantly, does not abandon the Apollonian. For Nassim it is a means toward an “antifragile” lifestyle, in which you live part Fat Tony, part Nero Tulip. For our purposes it means not spending too much time in the library.
Imagine a student at the University of Chicago. Let’s call him John. This enlightened individual deprives himself of the weekend, forgoing late-night dorm-room bull sessions, an unaccredited source of wisdom. John has a promising but boring future as a philosophy professor. He agrees that all X’s are Y’s but not all Y’s are X’s, yet cringes with fear when someone from the South Side approaches him. He wants to learn Chinese because he figures it will look good on his résumé, so he studies grammar books and buys Rosetta Stone. Of course, if he really wanted to learn the language he would move to Beijing and find a Mandarin-speaking girlfriend; unfortunately, his lack of social skills makes this difficult. John wonders why the unofficial slogan of his alma mater is “Where fun comes to die.” One day he will write a dissertation on Kant’s categorical imperative, but he will not live more virtuously or ethically than anybody else. Despite a successful life, John will be boring.
How do you avoid the Sartre Fallacy? Don’t be like John. But don’t be a college dropout either. Balance the Apollonian with the Dionysian, Fat Tony with Nero. Learn about biases and then commit them in the real world. Understanding how the mind judges and decides requires experience in physical and social environments, in the same way bones require stress and the Hydra requires volatility. This is the virtue of adopting the Barbell strategy: it forces you to step outside the library and test your theories in real life, in the midst of chaos. Doing so will help you think about your irrationality, rationally.
This is the final installment of the Sartre Fallacy Series.
Image via Wikipedia Creative Commons
* Fittingly, I learned about Buridan’s Donkey in one of the most boring classes I ever took: modern philosophy. It didn’t help that it met in the morning and the professor, a Descartes specialist, was dreadfully dry.
Evolution doesn't clean up after itself very well.
- An evolutionary biologist got people swapping ideas about our lingering vestigia.
- Basically, this is the stuff that served some evolutionary purpose at some point, but now is kind of, well, extra.
- Here are the six traits that inaugurated the fun.
The plica semilunaris<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgwMS9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTY3NDg5NTg1NX0.kdBYMvaEzvCiJjcLEPgnjII_KVtT9RMEwJFuXB68D8Q/img.png?width=980" id="59914" width="429" height="350" data-rm-shortcode-id="b11e4be64c5e1f58bf4417d8548bedc7" data-rm-shortcode-name="rebelmouse-image" />
The human eye in alarming detail. Image source: Henry Gray / Wikimedia commons<p>At the inner corner of each eye, closest to the nasal ridge, is that little pink thing ("that little pink thing" is probably what most of us call it), properly called the caruncula. Next to it is the plica semilunaris, the remnant of a third eyelid that used to — ready for this? — blink horizontally. It's thought to have offered protection for our eyes, and some birds, reptiles, and fish still have one.</p>
Palmaris longus<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgwNy9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzMzQ1NjUwMn0.dVor41tO_NeLkGY9Tx46SwqhSVaA8HZQmQAp532xLxA/img.jpg?width=980" id="879be" width="1920" height="2560" data-rm-shortcode-id="4089a32ea9fbb1a0281db14332583ccd" data-rm-shortcode-name="rebelmouse-image" />
Palmaris longus muscle. Image source: Wikimedia commons<p>We don't have much need these days, at least most of us, to navigate from tree branch to tree branch. Still, about 86 percent of us have the wrist muscle that used to help us do it. To see if you have it, place the back of your hand on a flat surface and touch your thumb to your pinkie. If a muscle becomes visible in your wrist, that's the palmaris longus. If you don't have one, consider yourself more evolved (just joking).</p>
Darwin's tubercle<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgxMi9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTY0ODUyNjA1MX0.8RuU-OSRf92wQpaPPJtvFreOVvicEwn39_jnbegiUOk/img.jpg?width=980" id="687a0" width="819" height="1072" data-rm-shortcode-id="ff5edf0a698e0681d11efde1d7872958" data-rm-shortcode-name="rebelmouse-image" />
Darwin's tubercle. Image source: Wikimedia commons<p>Yes, maybe the shell of your ear does feel like a dried apricot. Maybe not. But there's a small bump on the rim of that swirly structure, a leftover from the days when our ears could swivel toward interesting sounds. These days we just turn our heads, but there it is.</p>
Goosebumps<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMxNC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYyNzEyNTc2Nn0.aVMa5fsKgiabW5vkr7BOvm2pmNKbLJF_50bwvd4aRo4/img.jpg?width=980" id="d8420" width="1440" height="960" data-rm-shortcode-id="8827e55511c8c3aed8c36d21b6541dbd" data-rm-shortcode-name="rebelmouse-image" />
Goosebumps. Photo credit: Tyler Olson via Shutterstock<p>It's not entirely clear what purpose made goosebumps worth retaining evolutionarily, but there are two circumstances in which they appear: fear and cold. For fear, they may have been a way of making body hair stand up so we'd appear larger to predators, much the way a cat's tail puffs up — numerous creatures exaggerate their size when threatened. In the cold, they may have trapped additional heat for warmth.</p>
Tailbone<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMxNi9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTY3MzQwMjc3N30.nBGAfc_O9sgyK_lOUo_MHzP1vK-9kJpohLlj9ax1P8s/img.jpg?width=980" id="9a2f6" width="1440" height="1440" data-rm-shortcode-id="4fe28368d2ed6a91a4c928d4254cc02a" data-rm-shortcode-name="rebelmouse-image" />
Image source: Decade3d-anatomy online via Shutterstock<p>Way back, we had tails that probably helped us balance upright and were useful for moving through trees. We still have the stump of one as embryos, from weeks 4–6, and then the body mostly dissolves it during weeks 6–8. What's left is the coccyx.</p>
The palmar grasp reflex<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMyMC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzNjY0MDY5NX0.OSwReKLmNZkbAS12-AvRaxgCM7zyukjQUaG4vmhxTtM/img.jpg?width=980" id="8804c" width="1440" height="960" data-rm-shortcode-id="67542ee1c5a85807b0a7e63399e44575" data-rm-shortcode-name="rebelmouse-image" />
Palmar reflex activated! Photo credit: Raul Luna on Flickr<p>You've probably seen how non-human primate babies grab onto their parents' hands to be carried around. We used to do this, too. Even now, if you touch your finger to a baby's palm, or touch the sole of their foot, the palmar grasp reflex will cause the hand or foot to try to close around your finger.</p>
Other people's suggestions<p>Amir's followers dove right in, offering both cool and questionable additions to her list. </p>
Fangs?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Lower mouth plate behind your teeth. Some have protruding bone under the skin which is a throw back to large fangs. Almost like an upsidedown Sabre Tooth.</p>— neil crud (@neilcrud66) <a href="https://twitter.com/neilcrud66/status/1085606005000601600?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hiccups<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sure: <a href="https://t.co/DjMZB1XidG">https://t.co/DjMZB1XidG</a></p>— Stephen Roughley (@SteBobRoughley) <a href="https://twitter.com/SteBobRoughley/status/1085529239556968448?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hypnic jerk as you fall asleep<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">What about when you “jump” just as you’re drifting off to sleep, I heard that was a reflex to prevent falling from heights.</p>— Bann face (@thebanns) <a href="https://twitter.com/thebanns/status/1085554171879788545?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> <p> This thing, often called the "alpha jerk" because it happens as you drop into alpha sleep, is properly called the hypnic jerk. It may actually be a carryover from our arboreal days. The <a href="https://www.livescience.com/39225-why-people-twitch-falling-asleep.html" target="_blank" data-vivaldi-spatnav-clickable="1">hypothesis</a> is that you suddenly jerk awake to avoid falling out of your tree.</p>
Nails screeching on a blackboard response?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Everyone hate the sound of fingernails on a blackboard. It's _speculated_ that this is a vestigial wiring in our head, because the sound is similar to the shrill warning call of a chimp. <a href="https://t.co/ReyZBy6XNN">https://t.co/ReyZBy6XNN</a></p>— Pet Rock (@eclogiter) <a href="https://twitter.com/eclogiter/status/1085587006258888706?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Ear hair<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Ok what is Hair in the ears for? I think cuz as we get older it filters out the BS.</p>— Sarah21 (@mimix3) <a href="https://twitter.com/mimix3/status/1085684393593561088?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Nervous laughter<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">You may be onto something. Tooth-bearing with the jaw clenched is generally recognized as a signal of submission or non-threatening in primates. Involuntary smiling or laughing in tense situations might have signaled that you weren’t a threat.</p>— Jager Tusk (@JagerTusk) <a href="https://twitter.com/JagerTusk/status/1085316201104912384?ref_src=twsrc%5Etfw">January 15, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Um, yipes.<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sometimes it feels like my big toe should be on the side of my foot, was that ever a thing?</p>— B033? K@($ (@whimbrel17) <a href="https://twitter.com/whimbrel17/status/1085559016011563009?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>