Daniel Dennett on Reductio ad Absurdum, the Philosopher's Crowbar
Philosopher Daniel Dennett discusses reductio ad absurdum, "the workhorse of philosophical argumentation," with which thinkers test an opponent's argument by showing that its premises lead to an absurd conclusion.
Daniel C. Dennett is the author of Intuition Pumps and Other Tools for Thinking, Breaking the Spell, Freedom Evolves, and Darwin's Dangerous Idea and is University Professor and Austin B. Fletcher Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University. He lives with his wife in North Andover, Massachusetts, and has a daughter, a son, and a grandson. He was born in Boston in 1942, the son of a historian by the same name, and received his B.A. in philosophy from Harvard in 1963. He then went to Oxford to work with Gilbert Ryle, under whose supervision he completed the D.Phil. in philosophy in 1965. He taught at U.C. Irvine from 1965 to 1971, when he moved to Tufts, where he has taught ever since, aside from periods visiting at Harvard, Pittsburgh, Oxford, and the École Normale Supérieure in Paris.
His first book, Content and Consciousness, appeared in 1969, followed by Brainstorms (1978), Elbow Room (1984), The Intentional Stance (1987), Consciousness Explained (1991), Darwin's Dangerous Idea (1995), Kinds of Minds (1996), and Brainchildren: A Collection of Essays 1984-1996 (1998). Sweet Dreams: Philosophical Obstacles to a Science of Consciousness was published in 2005. He co-edited The Mind's I with Douglas Hofstadter in 1981, and he is the author of over three hundred scholarly articles on various aspects of the mind, published in journals ranging from Artificial Intelligence and Behavioral and Brain Sciences to Poetics Today and the Journal of Aesthetics and Art Criticism.
Dennett gave the John Locke Lectures at Oxford in 1983, the Gavin David Young Lectures at Adelaide, Australia, in 1985, and the Tanner Lecture at Michigan in 1986, among many others. He has received two Guggenheim Fellowships, a Fulbright Fellowship, and a Fellowship at the Center for Advanced Studies in Behavioral Science. He was elected to the American Academy of Arts and Sciences in 1987.
He was the Co-founder (in 1985) and Co-director of the Curricular Software Studio at Tufts, and has helped to design museum exhibits on computers for the Smithsonian Institution, the Museum of Science in Boston, and the Computer Museum in Boston.
Daniel Dennett: One of the reasons I wrote this book is that, oddly enough, philosophers -- who are famous, notorious even, for being navel-gazers, for being reflective -- are in fact often remarkably unreflective about their own methodology. I wanted to draw attention to how philosophers actually go about their business and get them thinking more self-consciously about the tools they use and how they use them.
A tool that everybody should be familiar with -- and, in fact, people use it all the time -- is the reductio ad absurdum argument. It's the sort of general-purpose crowbar of rational argument where you take your opponent's premises and deduce something absurd from them. That is, officially, you deduce a contradiction. We use it all the time without paying much attention to it. If you say something like, "If he gets here in time for supper, he'll have to fly like Superman."
Which is absurd -- nobody can fly that fast. You don't bother spelling it out; you just point out that something somebody imagined or proposed has a ridiculous consequence. Well, let's look at one of the great granddaddy reductio ad absurdum arguments of all time. That's Galileo's proof that, leaving friction aside, heavy things don't fall faster than light things. He argued as follows.
Okay, suppose you take the premise that you're gonna show is false: suppose heavier things do fall faster than light things. Now, take a stone A which is heavier than another stone B. That means if we tied B to A with a string, B should act as a drag on A when we drop it, because A will fall faster and B will fall slower, so A tied to B should fall slower than A by itself. But A and B tied together are heavier than A by itself, so the pair should fall faster. It should fall both faster and slower than A by itself. That's a manifest contradiction. So we know that the premise with which we began has to be false. That's a classic reductio ad absurdum. That's been known and named for several millennia, I guess. And, as I say, it's the workhorse of philosophical argumentation.
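Spelled out schematically (a sketch of the logic in notation of my own, not Dennett's, with W for weight and v for speed of fall), the derivation runs:

```latex
% Galileo's reductio, schematically. P is the premise to be refuted.
\begin{align*}
P:&\quad \text{for any bodies } x, y:\; W(x) > W(y) \Rightarrow v(x) > v(y)\\
1.&\quad W(A) > W(B) \quad \text{(choose stones } A, B\text{)}\\
2.&\quad v(B) < v(A), \text{ so } B \text{ drags on } A
   \;\Rightarrow\; v(A{+}B) < v(A) \quad \text{(by } P \text{ and } 1\text{)}\\
3.&\quad W(A{+}B) > W(A) \;\Rightarrow\; v(A{+}B) > v(A) \quad \text{(by } P\text{)}\\
4.&\quad v(A{+}B) < v(A) \;\wedge\; v(A{+}B) > v(A)
   \quad \text{contradiction; hence } \neg P
\end{align*}
```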
With his new book "Intuition Pumps and Other Tools for Thinking," Dennett offers a kind of self-help book for deep thinkers: a series of thought experiments designed as a workout for the deliberative mind.
The distances between the stars are so vast that they can make your brain melt. Take, for example, the Voyager 1 probe, which has been traveling at 35,000 miles per hour for more than 40 years and was the first human-made object to cross into interstellar space. That sounds wonderful, except that, at its current speed, it will still take another 40,000 years to cross the typical distance between stars.
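As a rough sanity check on that timescale, here is a back-of-the-envelope calculation (a sketch using round textbook numbers, not mission data; the assumed 4-light-year star spacing is my own stand-in for "typical"):

```python
# Back-of-envelope check of the interstellar travel timescale quoted above.
MPH_TO_KM_S = 1.60934 / 3600        # miles per hour -> kilometers per second
SECONDS_PER_YEAR = 3.156e7
KM_PER_LIGHT_YEAR = 9.461e12

voyager_speed_km_s = 35_000 * MPH_TO_KM_S   # ~15.6 km/s
star_separation_ly = 4.0                    # assumed typical spacing near the Sun

seconds = star_separation_ly * KM_PER_LIGHT_YEAR / voyager_speed_km_s
years = seconds / SECONDS_PER_YEAR
print(f"Crossing ~{star_separation_ly} ly at Voyager speed: ~{years:,.0f} years")
# -> tens of thousands of years, the same order of magnitude as the
#    40,000-year figure in the text.
```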
Worse still, if you are thinking about interstellar travel, nature provides a hard limit on acceleration and speed. As Einstein showed, it's impossible to accelerate any massive object beyond the speed of light. Since the galaxy is more than 100,000 light-years across, if you are traveling at less than light speed, then most interstellar distances would take more than a human lifetime to cross. If the known laws of physics hold, then it seems a galaxy-spanning human civilization is impossible.
Unless of course you can build a warp drive.
Ah, the warp drive, that darling of science fiction plot devices. So, what about a warp drive? Is that even really a thing?
Let's start with the "warping" part of a warp drive. Albert Einstein's theory of general relativity ("GR") represents space and time as a four-dimensional "fabric" that can be stretched, bent, and folded. Gravitational waves, ripples in the fabric of spacetime, have now been directly observed. So, yes, spacetime can be warped. The warping part of a warp drive usually means distorting the shape of spacetime so that two distant locations are brought close together, letting you somehow "jump" between them.
This was a basic idea in science fiction long before Star Trek popularized the name "warp drive." But until 1994, it remained science fiction, meaning there was no science behind it. That year, Miguel Alcubierre wrote down a solution to the basic equations of GR representing a region that compressed spacetime ahead of it and expanded spacetime behind it, creating a kind of traveling warp bubble. This was really good news for warp drive fans.
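For reference, the solution Alcubierre wrote down has a compact standard form (reproduced here from the general literature, not from the papers discussed below):

```latex
% Alcubierre's warp-bubble metric (1994): v_s is the bubble's speed and
% f(r_s) a smooth shape function equal to 1 inside the bubble, 0 far away.
ds^{2} = -c^{2}\,dt^{2} + \bigl(dx - v_{s}\,f(r_{s})\,dt\bigr)^{2} + dy^{2} + dz^{2}
```

Inside the bubble, space is flat and passengers feel no acceleration; all the motion is carried by the warping term.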
The problems with a warp drive
There were some problems, though. Most important was that this "Alcubierre drive" required lots of "exotic matter" or "negative energy" to work. Unfortunately, as far as anyone knows, there's no such thing. These are things theorists dreamed up to stick into the GR equations in order to do cool things like make stable open wormholes or functioning warp drives.
Researchers have raised other concerns about the Alcubierre drive as well, such as how it would violate quantum mechanics, or how, when you arrived at your destination, it would destroy everything in front of the ship in an apocalyptic flash of radiation.
Warp drives: A new hope
Recently, however, there seemed to be good news on the warp drive front with the publication this April of a new paper by Alexey Bobrick and Gianni Martire entitled "Introducing Physical Warp Drives." The good thing about the Bobrick and Martire paper is that it is extremely clear about what a warp drive actually means.
Understanding the equations of GR means understanding what's on either side of the equals sign. On one side, there is the shape of spacetime; on the other, the configuration of matter-energy. The traditional route with these equations is to start with a configuration of matter-energy and see what shape of spacetime it produces. But you can also go the other way around: assume the shape of spacetime you want (like a warp bubble) and determine what configuration of matter-energy you would need to produce it (even if that matter-energy is the dream stuff of negative energy).
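In symbols, that equals sign belongs to the Einstein field equations (standard GR notation, given here as background rather than anything quoted from the paper):

```latex
% Einstein field equations: spacetime geometry (left) = matter-energy (right).
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Here G_{\mu\nu} encodes the curvature of spacetime and T_{\mu\nu} the configuration of matter and energy. A warp-drive analysis fixes the left-hand side to the bubble geometry you want and then solves for the T_{\mu\nu} required to source it.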
What Bobrick and Martire did was step back and look at the problem more generally. They showed that all warp drives are composed of three regions: an interior spacetime called the passenger space; a shell of material, with either positive or negative energy, called the warping region; and an outside that, far enough away, looks like normal, unwarped spacetime. In this way, they could see exactly what was and was not possible for any kind of warp drive. (Watch this lovely explainer by Sabine Hossenfelder for more details.) They even showed that you could use good old normal matter to create a warp drive that, while moving slower than light, produced a passenger area where time flowed at a different rate than in the outside spacetime. So even though it was a sub-light-speed device, it was still an actual warp drive, and one that could use normal matter.
That was the good news.
The bad news was that this clear vision also showed them a real problem with the "drive" part of the Alcubierre drive. First of all, it still needed negative energy to work, so that bummer remains. But worse, Bobrick and Martire reaffirmed a basic consequence of relativity: there is no way to accelerate an Alcubierre drive past light speed. Sure, you could just assume that you started with something already moving faster than light, and the Alcubierre drive with its negative-energy shell would make sense. But crossing the light-speed barrier was still prohibited.
So, in the end, the Star Trek version of the warp drive is still not a thing. I know this may bum you out if you were hoping to build that version of the Enterprise sometime soon (as I was). But don't be too despondent. The Bobrick and Martire paper really did make headway. As the authors put it in the end:
"One of the main conclusions of our study is that warp drives are simpler and much less mysterious objects than the broader literature has suggested"
That really is progress.
The Black Death wasn't the only plague in the 1300s.
- In a unique study, researchers have determined how many people in medieval England had bunions.
- A fashion trend towards pointed toe shoes made the affliction common.
- Even monks got in on the trend, much to their discomfort later in life.
Late Medieval England had its share of problems. The Wars of the Roses raged, the Black Death killed off large parts of the population, and passing ruffians could say "Ni" at will to old ladies.
To make matters worse, a first-of-its-kind study published in the International Journal of Paleopathology has demonstrated that much of the population suffered from another plague: a plague of bunions, likely caused by a ridiculous medieval fashion trend.
If the shoe fits, it won't cause bunions
[Image: the outlines of a leather shoe from the King's Ditch, Cambridge. It is easy to see how these shoes might be constricting. Credit: Cambridge Archaeological Unit.]
The bunion, known to medicine as hallux valgus, is a deformity of the joint connecting the big toe to the rest of the foot. It is painful and can cause other issues, including poor balance. The condition is associated with having worn constrictive shoes for a long period of time, as well as with genetic factors. Today, it is often caused by wearing high-heeled shoes.
The medieval English didn't care for high heels as much as modern fashionistas do, but there was a major fashion trend toward shoes with long, pointed toes, called "poulaines" or "crakows" after their supposed place of origin: Krakow, Poland.
This trend, already silly-looking to a modern observer, got out of hand in a hurry. According to some records, the points on noblemen's shoes could grow so long that they had to be tied to the leg with string for the wearer to walk. At one point, King Edward IV had to ban commoners from wearing points longer than two inches. A couple of years later, he saw fit to ban the shoes altogether.
But just knowing that people back in the day made poor fashion choices doesn't prove they suffered for it. That is where digging up old skeletons to look at their feet comes in.
Beauty is pain: the price of high medieval fashion
To learn how bad the bunion epidemic was, the researchers looked to four burial sites in and around Cambridge. One was a rural cemetery where poor peasants were buried. Another was the All Saints by the Castle parish, which had a mixed collection of people that tended toward poverty. The Hospital of St. John's burial ground contained both the poor charges of a charity hospital and wealthy benefactors. Lastly, they considered the cemetery of a local Augustinian friary, home to monks and well-to-do philanthropists.
The team considered 177 adult skeletons that were at least a quarter complete and still had enough of their feet to make studying them possible. The remains were classified by age and sex through observation and DNA testing. Each was examined for evidence of bunions and for signs of complications from the condition, such as fractures from falls.
Those buried in the monastery's graveyard were the most affected. Nearly half, 43 percent, of the remains found there had bunions. This includes five of the eleven members of the clergy they found. Twenty-three percent of those laid to rest at the Hospital of St. John had bunions, though only 10 percent of those at the All Saints by the Castle parish graveyard did.
The rural cemetery had a much lower rate, only 3 percent, suggesting that these peasants were able to avoid at least one plague.
Overall, 18 percent of the individuals examined had bunions, with men more likely to have them than women. Those buried at cemeteries known for exclusivity were more likely to have them as well, though the condition clearly affected members of other classes too. This makes sense, as these shoes are known to have had mass appeal.
The authors note that the rural cemetery having fewer cases is partly because that cemetery "went out of use prior to the wide adoption of pointed shoes, and it is likely that those residing in the parish predominately wore soft leather shoes, or possibly went barefoot."
Skeletons with evidence of bunions were more likely to have fractures indicative of a fall. This was more common among those estimated or recorded as having lived past age 45.
In our much more enlightened times, 23 percent of the population endures bunions, most of them women, and one of the leading culprits is still the high-heeled shoe.
Some things never change.
Maybe eyes really are windows into the soul — or at least into the brain, as a new study finds.
- Researchers find a correlation between pupil size and differences in cognitive ability.
- The larger the pupil, the higher the intelligence.
- The explanation for why this happens lies within the brain, but more research is needed.
What can you tell by looking into someone's eyes? You can spot a glint of humor, signs of tiredness, or maybe that they don't like something or someone.
But outside of assessing an emotional state, a person's eyes may also provide clues about their intelligence, suggests new research. A study carried out at the Georgia Institute of Technology shows that pupil size is "closely related" to differences in intelligence between individuals.
The scientists found that larger pupils may be connected to higher intelligence, as demonstrated by tests that gauged reasoning skills, memory, and attention. In fact, the researchers claim that the relationship between intelligence and pupil size is so pronounced that it showed up in their two previous studies as well and can be spotted with the naked eye, without any scientific instruments. You should be able to tell who scored highest or lowest on the cognitive tests just by looking at them, the researchers say.
The pupil-IQ link
The connection was first noticed across memory tasks, looking at pupil dilations as signs of mental effort. The studies involved more than 500 people aged 18 to 35 from the Atlanta area. The subjects' pupil sizes were measured by eye trackers, which use a camera and a computer to capture light reflecting off the pupil and cornea. As the scientists explained in Scientific American, pupil diameters range from two to eight millimeters. To determine average pupil size, they took measurements of the pupils at rest when the participants were staring at a blank screen for a few minutes.
Another part of the experiment involved having the subjects take a series of cognitive tests that evaluated "fluid intelligence" (the ability to reason when confronted with new problems), "working memory capacity" (how well people can retain information over time), and "attention control" (the ability to keep focusing attention even while being distracted). An example of the latter is a test that tries to divert a person's focus from a disappearing letter by showing a flickering asterisk on another part of the screen. If a person pays too much attention to the asterisk, they might miss the letter.
The conclusions of the research were that having a larger baseline pupil size was related to greater fluid intelligence, having more attention control, and even greater working memory capacity, although to a smaller extent. In an email exchange with Big Think, author Jason Tsukahara pointed out, "It is important to consider that what we find is a correlation — which should not be confused with causation."
The researchers also found that pupil size seemed to decrease with age. Older people had more constricted pupils, but when the scientists standardized for age, the pupil-size-to-intelligence connection remained.
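Mechanically, that kind of age adjustment can be done with a partial correlation. Here is a minimal sketch on synthetic data (the study's actual data and code are not reproduced here; every variable name and number below is made up for illustration):

```python
# Sketch: correlation between pupil size and test score after removing the
# linear effect of age from both variables (a simple partial correlation).
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(18, 35, n)                        # age range as in the study
pupil = 6.0 - 0.05 * age + rng.normal(0, 0.5, n)    # toy: pupils shrink with age
score = 100 + 8.0 * (pupil - pupil.mean()) + rng.normal(0, 5, n)  # toy scores

def residualize(y, x):
    """Subtract the ordinary-least-squares fit of y on x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Correlation of pupil size and score with age "standardized" away:
r = np.corrcoef(residualize(pupil, age), residualize(score, age))[0, 1]
print(f"age-adjusted pupil-score correlation: r = {r:.2f}")
```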
Why are pupils linked to intelligence?
The connection between pupil size and IQ likely resides within the brain. Pupil size has been previously connected to the locus coeruleus, a part of the brain that's responsible for synthesizing the hormone and neurotransmitter norepinephrine (noradrenaline), which mobilizes the brain and body for action. Activity in the locus coeruleus affects our perception, attention, memory, and learning processes.
As the authors explain, this region of the brain "also helps maintain a healthy organization of brain activity so that distant brain regions can work together to accomplish challenging tasks and goals." Because it is so important, loss of function in the locus coeruleus has been linked to conditions like Alzheimer's disease, Parkinson's, clinical depression, and attention deficit hyperactivity disorder (ADHD).
The researchers hypothesize that people who have larger pupils while in a restful state, like staring at a blank computer screen, have "greater regulation of activity by the locus coeruleus." This leads to better cognitive performance. More research is necessary, however, to truly understand why having larger pupils is related to higher intelligence.
In an email to Big Think, Tsukahara shared, "If I had to speculate, I would say that it is people with greater fluid intelligence that develop larger pupils, but again at this point we only have correlational data."
Do other scientists believe this?
As the scientists point out in the beginning of their paper, their conclusions are controversial and, so far, other researchers haven't been able to duplicate their results. The research team addresses this criticism by explaining that other studies had methodological issues and examined only memory capacity but not fluid intelligence, which is what they measured.
Being mortal makes life so much sweeter.
- Since the beginning of time, humans have fantasized about and quested for "eternal life."
- Lobsters and a kind of jellyfish offer us clues about what immortality might look like in the natural world.
- Evolution does not lend itself easily to longevity, and philosophy might suggest that life is more precious without immortality.
One of the oldest pieces of epic literature we have is known as the Epic of Gilgamesh. It's easy to get lost in all the ancient mythology — talking animals and heroic battles — but at its heart lies one of the most fundamental and universal quests of all time: the search for immortality. It's all about Gilgamesh wanting to live forever.
From Mesopotamian poetry to Indiana Jones and the Last Crusade, from golden apples to the philosopher's stone, humans, everywhere, have wanted and sought after eternal life.
And yet, perhaps the secret to immortality is not as elusive as we might think. Rather than holy objects or science fiction, we need only look to the animal world to see how nature, that most magical of places, might be able to answer one of the oldest questions there is.
If you ever find yourself at Red Lobster or about to munch into a lobster roll, take a moment to consider that you might just be eating a clue to perpetual youth. To see why, we have to know a tiny bit about aging.
As you get older, it's impossible not to notice how everything creaks a little more, how easy jobs now require great effort, and how hangovers are no longer a laughing matter. Our bodies are designed to degrade and wear away. This deterioration, known as "senescence" in biology, occurs at the cellular level. It's when the cells in our body stop dividing, yet remain in our body, active and alive. We need our cells to divide so that we can grow and repair. For instance, when we cut ourselves or lift weights in the gym, it is cell division that replaces and rebuilds the damage done. But, over time, our cells just stop dividing. They stay around to do the best they can, but like the macroscopic humans they make up, cells get slower and more error-prone — and so, we age.
But not lobsters. In normal cell division, the protective caps at the ends of our chromosomes, called telomeres, are remade a bit smaller with each division, and so become a bit less effective at protecting our DNA. When this reaches a certain point, the cell enters senescence and stops dividing. It won't self-destruct but will just carry on and wallow as it is. Lobsters, though, have a special enzyme (unsurprisingly called telomerase) that keeps their cells' telomeres as long and brilliant as they've always been. Their cells never enter senescence, and so a lobster just won't age.
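A toy model makes the difference vivid (a sketch only; all the numbers are illustrative stand-ins, not measured biology):

```python
# Toy model of replicative senescence: telomeres shorten with each cell
# division until a critical length halts division. Numbers are illustrative.

TELOMERE_START = 10_000    # telomere length in base pairs (illustrative)
LOSS_PER_DIVISION = 70     # base pairs lost per division (illustrative)
CRITICAL_LENGTH = 4_000    # below this, the cell enters senescence

def divisions_before_senescence(telomerase_active: bool, cap: int = 10_000) -> int:
    """Count divisions until senescence; telomerase keeps telomeres full-length."""
    length = TELOMERE_START
    for division in range(1, cap + 1):
        if not telomerase_active:
            length -= LOSS_PER_DIVISION
        if length <= CRITICAL_LENGTH:
            return division
    return cap  # never senesced within the cap: "immortal" in this toy model

print(divisions_before_senescence(False))  # ~86 divisions, then senescence
print(divisions_before_senescence(True))   # hits the cap: divides indefinitely
```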
However, what evolution giveth with one hand, it taketh away with the other. As crustaceans, lobsters wear their skeletons on the outside, and a constantly growing body means they are forever outgrowing their exoskeletal homes. They must abandon their old shells and regrow new ones again and again. This, of course, requires huge reserves of energy, and once a lobster reaches a certain size, it simply cannot consume enough calories to build the shell equivalent of a mansion. Lobsters do not die of old age but of exhaustion (as well as disease and New England fishermen).
The jellyfish that reverses its life cycle
Although lobsters might not have perfected immortality, perhaps there's something to learn.
But there's another animal that does even better than the lobster, and it's the only creature recognized to be properly immortal. That's the jellyfish known as Turritopsis dohrnii. These jellyfish are tiny — about the size of a fly at their biggest — but they've mastered one ridiculous trick: they can reverse their life cycle.
An embryonic jellyfish starts as a fertilized egg before hooking onto some kind of surface to grow. In this stage, it stretches out to look like any other jellyfish. Eventually, it breaks away from this surface to become a mature, fully developed jellyfish, which is in turn ready to reproduce. So far, so normal.
Yet Turritopsis dohrnii does something remarkable. When things get tough, such as when the environment turns hostile or there's a conspicuous absence of food, it can change back to an earlier stage in its life cycle. It's like a frog becoming a tadpole or a fly becoming a maggot. It's the human equivalent of a mature adult saying, "Right, I've had enough of this job, that mortgage, this stress, and that anxiety, so I'm going to turn back into a toddler." Or it's like an old man deciding to become a fetus again, for one more round.
Obviously, a fingernail-sized jellyfish is not immortal in the sense we'd probably want the word to carry. It's as squishable and digestible as any animal. But its ability to change back to earlier forms of life, forms better adapted to a hostile environment or to scarce food, means that it could, in theory, go on forever.
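For the programmatically minded, the trick can be pictured as a life-cycle state machine with a backward edge (purely an illustrative analogy; the simplified stage names are mine, not the article's):

```python
# The Turritopsis dohrnii life cycle as a state machine with a backward edge.
LIFE_CYCLE = ["egg", "polyp", "mature_jellyfish"]  # simplified stages

def next_stage(stage: str, environment_hostile: bool) -> str:
    """Advance through the life cycle, or revert when conditions turn bad."""
    i = LIFE_CYCLE.index(stage)
    if environment_hostile and i > 0:
        return LIFE_CYCLE[1]  # revert to the attached stage and start over
    return LIFE_CYCLE[min(i + 1, len(LIFE_CYCLE) - 1)]

stage = "egg"
for hostile in [False, False, True, False]:  # a rough patch mid-life
    stage = next_stage(stage, hostile)
    print(stage)
# -> polyp, mature_jellyfish, polyp, mature_jellyfish: in principle, forever.
```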
Why do we want to live forever?
Although the quest for immortality is as old as humanity itself, it's surprisingly hard to find across the diverse natural world. Truth be told, evolution doesn't care about how long we live, so long as we live long enough to pass on our genes and to make sure our children are vaguely looked after. Anything more than that is redundant, and evolution doesn't have much time for needless longevity.
The more philosophical question, though, is why do we want to live forever? We're all prone to existential anguish, and we all, at least some of the time, fear death. We don't want to leave our loved ones behind, we want to finish our projects, and we much prefer the known life to an unknown afterlife. Yet, death serves a purpose. As the German philosopher Martin Heidegger argued, death is what gives meaning to life.
Having an end makes the journey worthwhile. It's fair to say that playing a game is only fun because it doesn't go on forever, a play will always need its curtain call, and a word only makes sense at its last letter. As philosophy and religion have repeated throughout the ages: memento mori, or "remember you'll die."
Being mortal in this world makes life so much sweeter, which is surely why lobsters and tiny jellyfish have such ennui.

Jonny Thomson teaches philosophy in Oxford. He runs a popular Instagram account called Mini Philosophy (@philosophyminis). His first book is Mini Philosophy: A Small Book of Big Ideas.