Who's in the Video
Steven Pinker is an experimental psychologist who conducts research in visual cognition, psycholinguistics, and social relations. He grew up in Montreal and earned his BA from McGill and his PhD[…]

Steven Pinker, a renowned Canadian cognitive psychologist and author, speaks to Big Think in this wide-ranging conversation on topics such as human progress, the tragedy of the commons, Bayesian reasoning, and strategies to curb our most destructive instincts, with the ultimate goal of improving the world for everyone.

Central to Pinker’s argument is the promotion of rationality. His 2021 book, Rationality: What It Is, Why It Seems Scarce, Why It Matters, delves into the enigmatic nature of human progress, examining how we have achieved so many impressive scientific breakthroughs while concurrently succumbing to widespread irrationality, like fake news and conspiracy theories. Pinker maintains that humans are not innately irrational; instead, our thought processes are better adapted to low-tech environments.

So, how can we cultivate greater rationality in today’s complex world? Pinker believes that one key lies in improving education. By teaching children critical thinking skills, we can better equip them to identify biases, emotional reasoning, and cognitive distortions. A more effective educational approach might emphasize probability theory, the elements of persuasive rhetoric, and developing empathy by understanding the perspectives of those with differing viewpoints. By fostering these skills, we can nurture a more rational and informed society capable of addressing the challenges of our modern era.

STEVEN PINKER: My name is Steve Pinker. I am a professor of psychology at Harvard University. My most recent book is called "Rationality: What It Is, Why It Seems Scarce, Why It Matters."

NARRATOR: Why does rationality matter?

PINKER: The third part of the subtitle of my book "Rationality" is "Why It Matters." And one of the reasons that it matters is that it helps to explain phenomena that I documented in the two preceding books, "The Better Angels of Our Nature: Why Violence Has Declined" and "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress." Now, those two books presented more than a hundred graphs showing the existence of progress, that is, plotting various bad things like deaths in war, deaths from crime, violence against women, violence against children, racism, and showing that over the course of history, those had decreased, and lots of good things like happiness, like prosperity, like freedom, like democracy, that over time increased. Now, a question that I often get is: "Oh, does this mean that you believe in progress?" Well, there's a sense in which I don't believe in progress, at least not as a force in the Universe. I'd like to quote the humorist Fran Lebowitz: "I don't believe in anything you have to believe in," because there isn't any arc bending toward justice. There's no force that's lifting us ever upward- quite the contrary. The Universe doesn't care about us, and it often seems to be out to get us. There are pathogens and parasites that want to eat us from the inside, and they are Darwinian creatures, after all- they're just interested in surviving, and we're big, yummy hunks of meat from their point of view. And they can evolve faster than we can. There are the laws of entropy. There are more ways for things to be disordered than ordered. There are more ways for things to go wrong than to go right. So as stuff happens at random, stuff's gonna get worse. There's human nature. We were not selected by the processes of evolution to be particularly nice. We have the capacity to be nice, but we also have the capacity for revenge and exploitation, and bigotry and sadism, and much else. So that's what's lined up against us. All of those forces are going to push against progress, but, nonetheless, progress has happened. How do we explain what might seem like a miracle? The answer is rationality. That is, we are a cognitive species- we do have the wherewithal to try to figure out how the world works. We've got language so we can share our insights, our discoveries, our epiphanies, our trial-and-error accidents, the results of our experiments, the mistakes we hope not to repeat. And if people deploy their rationality- including their cognition, their language- with the goal of making other people better off, making them live longer, happier, freer, more prosperous lives- well, every once in a while, our species stumbles on the means to improve things, to push back against all these forces that tend to make us more miserable. If we remember the ones that work, and try not to repeat our mistakes, then the result over time is what we call progress.

NARRATOR: How can we measure human progress?

PINKER: Well, the three great things people desire are to be healthy, wealthy, and wise. And so we can start our survey of progress with those three. Health: the ultimate meaning in life is to be alive rather than dead, and longevity has vastly increased. We live more than twice as long as our ancestors just a few generations ago. And this is not just true in the West, but it is true worldwide. People in the poorest countries today live longer than people in the richest countries a hundred years ago. We live more than twice as long as our ancestors, so it's not just that we have extra life- it's as if we've been granted an extra life. That includes the stupendous gift that fewer children die. It used to be that more than a third of children didn't live to see their fifth birthday. Now, in the more fortunate countries, it's less than a percentage point. And worldwide it's still just a few percentage points- still too high- but better than it used to be. Maternal mortality: far fewer women die in childbirth. Famines are far less common. Famine used to be one of the Four Horsemen of the Apocalypse. It was just something that could bring misery and death to any country following a bad harvest, but because of the agricultural revolution- the development of crop rotation, and vigorous hybrid seeds, and synthetic fertilizer, and the mechanization of agriculture, and transportation networks that can bring food from one part of the world to another- famines today don't occur because of a shortage of food; they occur because of wars and revolutions that prevent food from getting to the people who need it. But that is tremendous progress from an era in which a failed harvest was taken as divine punishment, and millions of people would starve. Wealthy: 200 years ago, by the current definition of extreme poverty, 90% of the world lived in extreme poverty. Today, about 9% of the world lives in extreme poverty, and that proportion falls every day. Wise: our natural state is illiteracy and ignorance. Until pretty recently, only a small aristocratic minority was able to read and write; now it's a majority of the world's population- 90% of the world's population under the age of 25, which shows which way the world is going- and in almost every part of the world, girls, not just boys. So those are three big dimensions of human well-being. Another one would be violence. Rates of interpersonal violence are much lower now than they were in the Middle Ages, or in any frontier region that is beyond the reach of law enforcement and a court system, like the American Wild West. And since World War II, wars have become less frequent. They've become shorter. They've become less damaging. They've had fewer casualties. Now this hasn't happened in a straight line- there have been huge setbacks like the Vietnam War and the Iran-Iraq War. Most recently, the most flagrant violation of this "long peace," as historians have called it, has been the invasion of Ukraine by Putin's Russia. At the moment we're speaking, which is late summer of 2022, the invasion is about seven months old. And no honest person could say how it will play out. It has been a violation of a historic process whereby countries are much less likely to invade each other. Countries are much less likely to go to war in general. Countries are less likely to seize, and try to annex, territory.
And countries are more in the business of making their citizens wealthy and happy than of rectifying historic injustices or expanding spheres of influence. Putin has pushed back against all of those processes. We don't know how it'll end up. So far, as tragic and criminal as the war has been, it has not taken us back to the era in which wars would kill people by the millions, and we can hope that it will not turn into one now. And despite some pernicious trends in the United States- which in many ways is an outlier among countries that have enjoyed progress; we are not number one in a lot of measures of human well-being, and Western Europe and Commonwealth countries do way better- still, we're not doing badly. Suicides have gone down worldwide by 40% over the last 30 years. Mental health: despite some scary headlines and some worsening in some demographics, overall, as best we can measure it, rates of mental illness have been pretty constant over the last 30 years. Leisure time has increased, both because of the shortening of the work week- which used to average more than 60 hours and is now less than 40- and because of the universal availability of labor-saving devices: the availability of energy and running water, of stoves and refrigerators and microwaves, and store-bought food. We have an additional 15 hours a week- compared to just 50 or 60 years ago- of time not spent on chores and housework. And then there are opportunities in the richness of life. More people travel. More people consume a variety of ethnic foods. More people consume a variety of music, compared to the past, when we were kind of housebound and people sometimes would not venture more than a few miles from their place of birth. So on these and many other dimensions, as Barack Obama said a decade ago, "If you had to choose a period of history to live in, and you didn't know where you'd be, who you'd be, what race you'd be, what class you'd be, you'd pick now."

NARRATOR: Is your view of progress optimistic?

PINKER: When people see my argument that many things have, on average, gotten better, the reaction is often, "Oh, it's so nice you're an optimist." And I always resist that; I don't really consider myself an optimist. I just consider myself someone who looks at data rather than headlines. Headlines are guaranteed to make you pessimistic, even cynical or fatalistic, because headlines are a non-random sample of the worst things happening on Earth at any given time. And most things that happen, especially things that happen suddenly and photogenically, are bad things. Things blow up. There's a shooter. There's a terrorist. There's a pandemic. Good things often consist of nothing happening: like a part of the world that is at peace, that people forgot used to be at war for decades. A city that is not shot up by terrorists, a city that is not plagued by a crime wave- but these aren't something that happened on a particular Thursday in October, so you never read about them in the news. It's when you plot data- which include not only the things that happened, but also the things that didn't happen, the things that go into the denominator, and gradual trends, things that might creep up by a few percentage points a year but that can exponentiate and transform the world- that you see with your own eyes how things have gotten better, as a fact about human history. It's not a matter of seeing the world through rose-colored glasses, of seeing the glass as half-full; it's just an awareness of the world that is much more complete than either the stories and headlines and images from the news, or the battles and kings and revolutions from conventional history.

NARRATOR: Are we a rational or an irrational species?

PINKER: Raising the question of how rational our species is bumps you immediately up against a kind of paradox. On the one hand, by some measures we've never been more rational. We have smartphones, which are supercomputers that can fit into a jacket pocket. We have antibiotics and vaccines and blood transfusions, and other fruits of biomedical ingenuity that continue to extend our lives. Vaccines for COVID were developed in less than a year. In area after area, we seem to be reaching new heights of rationality. We have evidence-based medicine, we've got 3D printing, we've got robotics, we've got artificial intelligence, we've got data-driven policing, we've got Moneyball in sports- the use of statistics and hypothesis testing instead of folklore. That's at the high end, but we're also seeing an awful lot of what you might call "rationality inequality." That is, at the other end, there's an awful lot of irrationality, of nuttiness. There are conspiracy theories, such as that the COVID vaccines are actually a plot by Bill Gates to inject microchips to surveil us. There's the theory that jet contrails are actually mind-altering drugs dispersed through the atmosphere in a secret government program. There's paranormal woo-woo: people who believe they can access past lives, who believe in reincarnation, in ghosts, in spirits. There's fake news, like "Joe Biden bans the Pledge of Allegiance," or "Hillary Clinton had an affair with Yoko Ono," that spreads fast and that people avidly consume. There's the Big Lie, such as the claim that the American election of 2020 was stolen, in defiance of all evidence. How do you explain, or how do I explain, as someone who claims to know a thing or two about rationality, how the same species could indulge in both? And part of the answer is that our feats of rationality never come from, like, one genius who figures it all out. We're not that smart. We're fallible, we are limited in our knowledge. No one is vouchsafed the truth by God. We kind of blunder our way to the truth in institutions: in scientific societies, in government recordkeeping agencies, in responsible journalistic outlets that try to pursue the truth, that have checks and balances, 'cause everyone, of course, thinks they're right, but not everyone can be right. So you always need open debate, criticism, free speech, so that when someone is wrong- which most people are most of the time- someone else can say they're wrong, and you need mechanisms of recordkeeping and data collection. With this complicated apparatus, we can be collectively much more rational than any of us is individually, but it crucially depends on the rules of the game. That is, do you play by rules that are gonna make the whole group of you collectively more rational? Such as: admit you're wrong when you're wrong. If you make a claim, you gotta prove it; you can't just jam it down people's throats by authority, or power, or prestige. You gotta test your ideas with experiments or data. All of these rules and norms are what allow networks of us to kind of blunder our way toward rationality and truth. Left to our own devices, we humans- I would not say that we're irrational: After all, most of us can show up to work on time, and get gas in the car so we don't run outta gas, and keep food in the fridge, and get the kids clothed and off to school on time. So we do plan our lives in ways that have to respect the laws of logic and cause and effect, 'cause those laws are merciless.
They won't let you violate them- but that only holds for our concrete day-to-day existence. When it comes to anything that transcends our experience- What is the origin of the Universe? What is the cause of fortune and misfortune and disease? What really happened in the planning that led to Vietnam, or Watergate, or the financial crisis of 2008?- those are realms where none of us was there. None of us can see the microscopic organisms that cause disease, or go back in time to the Big Bang. We've got to trust the communities of people who make it their business to understand these things- and that isn't intuitive, the fact that we actually are capable, collectively, of groping our way toward the truth. For most people, it's like, "You can't find out." And so you might as well believe the best story, the one that makes you and your tribe look great, that makes your enemies look evil and stupid, that conveys the right moral message, that makes the kids grow up to have the right values. And that's what people tend to gravitate toward if they don't trust the institutions of science, and recordkeeping, and history. So we have these seeds of rationality in us: That's how we spread out all over the planet. That's how we manage to survive in unforgiving places like the Arctic and the rainforest and the desert, but that rationality tends to be confined to the here and now of our lives. It's only since the Enlightenment, really, that we have robust institutions that could answer questions about what happened, what's true, what's false, how the Universe works at bigger scales- but you gotta sign on to the rules of those games. You gotta trust the institutions that do.

NARRATOR: What are the current threats to rationality and progress?

PINKER: The ideals of the Enlightenment- that we can use knowledge to improve human well-being- aren't particularly natural. For most of our history and pre-history, there was no knowledge that you could act on to do things like reduce infection, or reduce tribal warfare, or extend life, or prevent a famine. And so the idea that we have a moral imperative to know how the world works, and to apply it to make people better off- everyone, all human beings- to make them happier, healthier, more secure, longer-lived: until recently, even as an aspiration, that would've been a crazy thing to hope for. So it's not intuitive; it's not how we evolved. It's not how our institutions developed. The ideals have to be reinforced, and there's always a tendency toward backsliding. And we're seeing several forms of backsliding. For example, in the authoritarian-nationalist and populist right, there's the widespread idea that the ultimate good is to maximize the glory, the greatness, the eminence of the nation. The individual people are just kind of cells of an organism- they don't really matter. What matters is how great America is, or Russia, or Eurasian civilization. There's the blowing-off of scientific findings- the efficacy of vaccines, the reality of climate change, data on crime- where people just don't look at the data, they don't look at the findings; they have a belief that reinforces an uplifting narrative. On the woke left, we've got suppression of opinion and speech, despite the fact that our species' only way of understanding the world is to try out ideas, and let other people say what's wrong with them, because at any given time a lot of our factual beliefs are bound to be wrong, and a lot of our moral beliefs are bound to be wrong. We may think that we're angels, but subsequent generations will find the flaws in our morality just as we look back in horror at our ancestors who burned heretics and kept slaves, and sacrificed innocent victims. So the engine of progress- the broaching of ideas, and the evaluation of their coherence and truth- is being disabled by cancel culture, by the attempt to snuff out unpopular beliefs in newspapers, on university campuses, in corporations, in government agencies. So these are the threats. I don't believe that it means we're gonna go back to the age of wars of religion, and burning heretics, and superstition replacing science. The advantages of Enlightenment thinking are just too obvious, but they need constant maintenance, defense, upkeep. Their advantages have to be emphasized. We have to be reminded why it really is good to have a court system with a presumption of innocence, and the ability to cross-examine witnesses, rather than, say, trial by ordeal- dunking a witch, and if she floats, that shows that she's guilty- or lynch mobs. Courts of justice really are a big advance, and we gotta remember why they're an advance. Liberal democracy: the idea that the head of state is just like a chairman who's given certain fiduciary duties on a limited basis, but has to keep the interests of the nation foremost, and then steps down when his term is over. That is an Enlightenment principle that was kind of stomped into the Earth by Donald Trump, who assumed that what's good for him is what the president should be allowed to do, including refusing to accept a result that compromised his own esteem. The war in Ukraine is another flouting of Enlightenment values- the idea that the ultimate good is the happiness and the lives of people.
For Putin, several tens of thousands, maybe several hundreds of thousands, of people dying- their schools, their hospitals, their apartments reduced to rubble- is a small price to pay for the glory of Eurasian civilization, and the avenging of the humiliation of the Soviet Union. There are other examples. So, needless to say, we're not wired for Enlightenment humanism, but, nonetheless, I am prepared to argue that, objectively, it leaves us better off. We ought to endorse it, we ought to advance it. We ought to remind ourselves of what's so great about it.

NARRATOR: Why do new technologies tend to increase irrational thinking?

PINKER: Whenever there's a new technology of sharing- the printing press, television, satellite transmission, the internet, social media- there's a Wild West of crazy stuff that gets shared. There are conspiracy theories, there's plagiarism, there's fakery. In 19th-century America, the newspapers were just fountains of fake news: civilizations discovered on Mars, and sea monsters spotted in a lake that no one was ever expected to go check for themselves. And it took a while for newspapers to get their act together, and to develop codes of ethics and accuracy, and fact-checking and error correction, to kind of clean up their own act so that people could start to trust them. And that's had to happen with medicine as well. Until recently, most doctors were quacks, and people were healthier if they stayed away from doctors, who tended to go from patient to patient spreading disease. It was a point of pride to have blood and bodily fluids all over your coat. It showed you were a real doctor- until antisepsis was discovered and accepted. So institutions often take time to get their house in order. And when there are new technologies, they can often enable all kinds of weird stuff. It means that with the new technologies, including most recently social media, we need the safeguards and workarounds and norms so that not just anything that someone tries to get to go viral is accepted. Norms that encourage people to think twice, check the source. Don't believe things with exclamation points and headlines like, "The real truth about," and "What they don't want you to know about"- those are all kind of dead giveaways for nonsense. Trust sources that have earned it by admitting when they're wrong, by showing their work, how their ideas are tested and corrected. We're in the midst of a chaotic period where those new safeguards have not yet been put into place.

NARRATOR: How do institutions both enable and hinder progress?

PINKER: So, institutions of rationality and truth-seeking- such as academia, such as scientific societies, such as literary intellectual societies, such as responsible journalism, government agencies, especially recordkeeping agencies- each one of these institutions is absolutely critical to realizing the ideals of the Enlightenment: namely, greater understanding, approaching the truth, deeper knowledge, simply because no one can do it on their own. No one's smart enough, no one's wise enough, no one's objective enough. We agree to play a game where the rules of the game will tend to steer us toward greater truth and rationality. It's those leagues that have rules that steer us toward greater rationality that are responsible for the fact that we have learned stuff: We do know what causes diseases. We know what causes the weather. We know how the Earth came into being, thanks to all these institutions. But the institutions have to keep up their credibility. They've got to maintain just the principles that will allow us fallible, imperfect, bias-riddled people to blunder our way toward the truth- and not all of them have been doing such a good job at it. Academia in particular has become notorious for an increasingly political monoculture: especially in humanities faculties, science faculties, and public health faculties, there are fewer and fewer people who would call themselves conservative or even centrist. It's moving inexorably leftward. Unless you believe that the left has the truth about everything- in which case we don't need science; just ask a left-wing pundit and we have the answer!- then you are committed to the idea that we need diversity of ideas, of viewpoints: not just diversity of gonads and skin color, but also diversity of beliefs, the most important kind of diversity there is. As universities have driven out the various dissidents, and heretics and heterodox thinkers, often in comically outrageous episodes, which get replayed endlessly in popular right-wing outlets, the credibility of universities gets corroded, and by extension, public health agencies and media outlets. The problem being that I don't think that universities, or media outlets, or the deep state is rotten to the core. I think there are termites nibbling at the foundations, but the foundations are still in place. And so when universities and these other truth-seeking institutions do get things right- like anthropogenic climate change is real, it's happening- the fact that that comes out of the university doesn't mean it's false. In fact, there are good reasons to believe it's true. But if universities don't establish their credibility, if they develop a well-earned reputation for drumming out anyone who disagrees, they'll be blown off by leaders; they'll be blown off by the majority of the population. And I think that our institutions have done a poor job of safeguarding their credibility, and instead, by following the natural tendency to form their own tribe- a tribe of right-thinking people- they have blown off people in power, and huge portions of the population.

NARRATOR: How does cancel culture stifle rationality?

PINKER: The thing about campaigns of intimidation is that they can often take root, even if they're not particularly popular, because everyone assumes that everyone else thinks they're a good idea. This is sometimes called "pluralistic ignorance," or the "spiral of silence." In a famous case, all the members of a fraternity thought that it's really stupid to drink until you puke and pass out, but everyone thought that all the other frat boys thought it was cool. It turns out no one thought it was cool, but if you don't have people blurting out what they think, if you don't have a little boy saying the emperor is naked, then these beliefs can become entrenched- especially when there is punishment. If a person who does breach the consensus gets fired, driven out, shown a world of pain, then people can sometimes think, "Well, I better punish, lest I be the one who's punished." And you might even have everyone punishing for a belief that they don't think is so bad in the first place. Now, I don't know to what extent heterodox beliefs- beliefs that are actually held by a majority of Americans outside universities- are also held by people inside universities but go unexpressed, because you get punished if you say them, leading to the spiral of silence. So what's critical is to have the little boys who say the emperor is naked- that is, to have a space in which people can say things that they think are true without getting punished. And there is a pushback. There are organizations like the Academic Freedom Alliance, the Foundation for Individual Rights in Education, the Heterodox Academy, that are saying what a lot of people think privately but are afraid to express. Namely: disagreement is good, disagreement is inevitable, and the kinds of things you get canceled for saying might even be true. How are we gonna know unless you can say them?

NARRATOR: What are "tragedies of the commons," and how can they impede progress?

PINKER: One idea that is essential to make sense of our current predicament comes from game theory. Game theory is one of the cognitive tools that I try to explain in my book "Rationality," as one of the normative benchmarks of rationality that every educated and informed person ought to have at their fingertips. Game theory asks: what's the rational thing to do if you are in a situation where the outcome depends on what other rational people do? And the tragedy of the commons is a game-theoretic situation or predicament- it goes by other names, sometimes called a "public goods game"- where what's rational for each individual leaves everyone worse off when everyone does it. The original scenario was: you've got a town green, a commons, with grass growing, and every shepherd thinks, "Well, should I bring my sheep to graze on the commons? Yeah, there's lots of grass there, and my sheep will be fatter and I'll be better off." And the second shepherd thinks that, and the third shepherd, and so on. But then, if you have more sheep than the commons can support, if they graze the grass faster than it grows back, then all the sheep starve and everyone is worse off. Yet no shepherd is gonna say, "Well, I'm gonna be the one who forgoes the commons and lets everyone else enjoy it," because it doesn't make any sense for him. There are lots of situations in which tragedies of the commons confront us. Greenhouse gas emissions: If I wait for the bus in the rain instead of driving my SUV, I'm not gonna save the climate. So it makes sense for me to take the SUV. Well, if everyone thinks that, then we're all in danger of being cooked. Now, an additional arena in which we have a tragedy of the commons is rationality itself. And this is part of the puzzle of how a species that obviously is capable of great feats of rationality can believe so much nonsense. Namely, you've got a person who thinks, "Should I believe this or should I believe that? Well, if I believe this, I will be a hero to all the people that matter to me: my friends, my coworkers, the people I rub shoulders with." It may not be true, but that's what everyone wants to believe 'cause it makes our side look good. And another member of the group thinks that, and another member, and they all think, "Well, if you doubt that, then you're making us look kind of stupid and evil, and shame on you if you bring disrepute onto the group." Well, if everyone believes that, then you could have two sides, each of which is kind of individually rational- in the sense that each member gets the respect of his buddies, his pals, his colleagues- but the whole society is worse off, because you just have warring tribes instead of a joint search for the truth. So there's a kind of tragedy of the rationality commons, in that what's rational for every individual to believe- namely, ratifying the beliefs of their tribe- is irrational for the society as a whole. Just as in the case of the original commons you really do want grazing permits, or some way of limiting access, in the case of the rationality commons you want norms of objectivity, truth-seeking, impartiality, the commitment to truth as more important than a slogan that makes your side look good, in order for everyone to enjoy what is objectively good for everyone.
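The payoff structure Pinker describes can be put in a few lines of code. Here is a minimal sketch in Python; the pasture capacity, flock sizes, and the quadratic overgrazing penalty are all illustrative assumptions, not anything from the interview:

```python
# Minimal sketch of the tragedy of the commons as Pinker describes it.
# All numbers and the quadratic overgrazing penalty are illustrative
# assumptions, not taken from the interview.

def payoff(my_sheep: int, total_sheep: int, capacity: int = 100) -> float:
    """Grass each shepherd's flock yields; grazing past capacity
    degrades the commons for everyone."""
    if total_sheep <= capacity:
        grass_per_sheep = 1.0
    else:
        grass_per_sheep = (capacity / total_sheep) ** 2  # overgrazed pasture
    return my_sheep * grass_per_sheep

SHEPHERDS = 20

# Everyone restrains themselves (5 sheep each) vs. everyone defects (6 each):
print(payoff(5, SHEPHERDS * 5))  # 5.00 - commons at capacity, all fine
print(payoff(6, SHEPHERDS * 6))  # 4.17 - everyone defects, everyone loses

# A lone defector while the other 19 restrain themselves:
total = 19 * 5 + 6
print(payoff(6, total))  # 5.88 - defecting pays for the individual...
print(payoff(5, total))  # 4.90 - ...at a small cost spread over the rest
```

The asymmetry is the whole predicament: 5.88 beats 5.00, so each shepherd individually prefers to defect, yet 4.17 is what everyone gets when they all do.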

NARRATOR: Do tragedies of the commons contribute to political polarization?

PINKER: Certainly, the tragedy of the commons is one piece of the puzzle of why we see so much polarization and irrationality in the political arena, in the country as a whole, and sadly even in academia, which ought to be dedicated to the disinterested pursuit of truth. Namely, what gains you brownie points within your community is not necessarily what's true. It ought to be- we should jigger the rules of the game so that they are the same. But what earns you points as a hero, and what on the other end might make you a pariah- lead to social death, lead to ostracism- is whether you take seriously some belief that the other tribe holds. Individually, it is completely rational not to become a pariah. But it's not rational for the society as a whole if everyone, trying to avoid being a pariah, won't even consider the possibility that one of their beliefs is mistaken, 'cause most of our beliefs are mistaken most of the time. We ought to continually try to correct them, but if your criterion for believing something is "Will it make me popular?" that is not a roadmap toward everyone getting to the truth.

NARRATOR: How can narrative thinking skew our perception of the world?

PINKER: We are storytelling animals: we spin narratives. That's one of the ways that we make sense of the world, but often the narratives can be more entertaining than accurate as an understanding of the world. Now, I have nothing against narratives; we need them to make moral sense of our world. We need them as a constructive way to occupy our brains, but we do have a habit of falling back on narratives when it comes to big, important questions like: How did the world come into being? Why do bad things happen to good people? What really happens in corporate boardrooms, or the White House, or 10 Downing Street? Now, if you're a scientist, if you're a historian, if you're a journalist, you say, "Well, we can find the answers to those questions: Microbiology and immunology tell us why people get sick, and cosmology tells us how old the Universe is, and government transcripts of conversations among leaders tell us what actually happened in the White House." And with enough dogged work and attention to detail, and facts and peer review and open criticism, we can kind of figure out what really happened, what really causes grand, cosmic, historic events- but this is a radical, revolutionary, unnatural mindset. It's a gift of the Enlightenment that we have good, objective science and history and recordkeeping, and data sets and journalism. The human mind hasn't really caught up to that. And so, when it comes to these big cosmic questions, we seek narratives, we're satisfied by narratives: "Why is there a Depression?" "Well, it was a conspiracy of rich Jews." "Why is there an epidemic?" "Well, a nefarious ethnic minority poisoned the wells." They're great stories. They would make fabulous fiction. They're false, and, needless to say, they can be dangerous. When we put on the mindset of what really happened, we are much better off setting aside our narratives, keeping them in the realm of fiction, and trying to determine what really happened.

NARRATOR: What are cognitive illusions, and do they explain irrationality?

PINKER: I'm a cognitive psychologist, and like many cognitive psychologists, I think one of our proudest achievements is the documentation of the various fallacies and biases, the quick shortcuts and rules of thumb, that people use instead of actually applying the optimal statistical or logical formulas. This work has been made famous by Daniel Kahneman and Amos Tversky, among others. Kahneman won a Nobel Prize for it, and it is an essential set of insights about what makes us tick. I love teaching it, and I planned to write a book about it, but I quickly realized that a lot of the fallacies and irrationalities that we see around us are not just because of the classic cognitive illusions. Among those illusions, for example, is the "availability bias": we tend to overestimate the likelihood of events that we can easily recall from memory. If there's a plane crash, we think that plane travel is dangerous. If there's a terrorist attack, we're afraid to go out in public. If there's a shark attack, we don't want to get in the water. And we forget all the boring things, like people falling off ladders and getting into car crashes, that actually kill far more people but that don't have gaudy headlines that engage this availability heuristic. We tend to reason by stereotypes: So if I describe to you Penelope, who is sensitive and loves composing sonnets, and loves to summer in Italy and France- is she more likely to be a psychology major, or an art history major? People say art history major 'cause she fits the stereotype. They don't really stop to think that there are many, many more psychology majors than art history majors. Psychology is often the most popular major. So knowing nothing about her, you've got to start off with the assumption that she's more likely to be a psychology major, regardless of how well she matches some other stereotype. That's another classic fallacy, called "representativeness." Anyway, there's a long list. The "gambler's fallacy": if a roulette wheel lands red five times in a row, you bet on black 'cause it's kind of due for black- a misunderstanding of the so-called "law of averages." People think the law of averages means that random processes kind of go outta their way to try to look random and fair, whereas, in fact, the roulette wheel has no memory. So the chance of a red or a black is exactly 50% regardless of the string of reds beforehand. Anyway, there's a list of those fallacies, but what I quickly discovered is they're not gonna explain QAnon. Why do people believe, with no evidence, that there is a cabal of Satan-worshiping, cannibalistic pedophiles in the American deep state? The gambler's fallacy gives you no insight. And so, I had to range into other parts of our psychology that could explain these 'nutball' beliefs. And among them is the fact that we often aren't so committed to the factual veracity of beliefs that have a strong moral component, that are more in the realm of mythology. So if you consider a person who thinks that Hillary Clinton ran a child sex ring out of a pizzeria in Washington- like, a completely crazy belief- what does it actually mean when they say they believe that? It's not as if they call the police, which is what you'd do if you really thought that kids were being raped in the basement. Some of them would leave a one-star review on Yelp, but it's really an open question whether "I believe that Hillary Clinton ran a child sex ring" is just another way of saying, "Boo, Hillary! I think she is depraved enough that she could do it!"
And who's to say whether she did it or not? You can't find out. A lot of beliefs are in this mythological realm where people just don't care whether they're true or false- they're good things to believe in that moral community. So I think that's part of the answer. We have to recognize- at least those of us who have a kind of modern, Enlightenment, scientific mindset- that there is a truth, that potentially you could find out what it is, and that you ought to believe only things that are true, and not believe things just because they are pleasing narratives. That's a very unnatural cognitive mindset. It's a good mindset; I think we should try to encourage it as much as possible, but left to their own devices, that's not what people fall back on.
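Pinker's claim that the wheel has no memory is easy to check by simulation. A minimal sketch (illustrative only, idealizing the wheel as 50/50 red/black and ignoring the green zero of a real roulette wheel):

```python
# Checking "the wheel has no memory": after five reds in a row, the
# next spin is still 50/50. Idealized 50/50 wheel, no green zero.
import random

random.seed(0)
spins = [random.choice("RB") for _ in range(1_000_000)]

# Collect the outcome that followed every run of five consecutive reds.
after_streak = [spins[i + 5]
                for i in range(len(spins) - 5)
                if spins[i:i + 5] == ["R"] * 5]

print(len(after_streak))                            # ~31,000 such streaks
print(after_streak.count("R") / len(after_streak))  # ~0.5: never "due for black"
```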

NARRATOR: What is Bayesian reasoning?

PINKER: The late, great astronomer and science popularizer Carl Sagan had a famous saying: "Extraordinary claims require extraordinary evidence," sometimes abbreviated ECREE. In this he was echoing an argument by David Hume as to why we shouldn't believe in miracles, even if someone recounts them in a credible way. Hume said, "Well, what's more likely? That the laws of the Universe as we've always experienced them are wrong, or that some guy misremembered something?" Now, these are all versions of a kind of reasoning that is called Bayesian, after the Reverend Thomas Bayes, and it's a simple mathematical formula with only four terms, which you can translate into common sense, into everyday English; you don't even have to do any mathematics to appreciate it. So what Bayes' theorem gives you is a posterior probability. Again, that sounds more threatening than it is. It just means: after you've seen all of the evidence, how much should you believe something? And it assumes that you don't just believe something or disbelieve it; you assign a degree of belief between zero, for impossible, and one, for necessary. Most of your beliefs are gonna be in between, where you have different degrees of credence that you put in an idea. Okay, we all want that. We don't wanna be black-and-white, dichotomous absolutists. We wanna calibrate our degree of belief to the strength of the evidence. Bayes' theorem is how you ought to do that, and here's how you do it: The posterior probability- that is, the credence in an idea after looking at the evidence- can be estimated from the prior: how much credence did the idea have even before you looked at those data, at that evidence? And, in fact, that is one bit of Bayes' theorem that has escaped from statistics and probability into common parlance. When you say, "Well, those are his priors," the term 'priors' comes from the first term in Bayes' theorem, the prior probability of a hypothesis. Now, there is some subjectivity: how much did you believe it before you even looked at the evidence? But that doesn't mean that Bayes' theorem allows you to believe anything you want, to set any prior you want. The prior should be based on something: on everything that we know so far, on data gathered in the past, on our best-established theories. In the case of something like whether a person has a disease, it might be the base rate, or prevalence, of the disease in the population- anything that's relevant to how much you should believe something before you look at the new evidence. Okay, so that's the first term: the prior. The second term is sometimes called the 'likelihood,' and that refers to: if the hypothesis is true, how likely is it that you would see the evidence that you are now seeing? So if a person, say, has Rocky Mountain spotted fever, chances are pretty good that he's got spots, and probably that he's been to the Rocky Mountains. You just divide that product- the prior times the likelihood- by the commonness of the data, the probability of the data: how often you expect to see that evidence across the board, whether the idea you're testing is true or false. If the evidence is very common- for example, lots of things give people headaches and back pain- you don't diagnose some exotic disease whose symptoms happen to be back pain and headaches, just because so many different things can give you headaches and back pain. That goes into the denominator.
If the evidence is common, the whole fraction is gonna be small, and that yanks down your posterior- that is, your degree of credence in the hypothesis after looking at that evidence. So that's Bayesian reasoning. It may sound kind of obvious, but it isn't always. And, in fact, one of the major findings of cognitive psychology, and its application called "behavioral economics," is that people are often crummy Bayesian reasoners. They often forget to take into account the base rate. Just to give a concrete example- and this happened to someone I know very well- her two-year-old daughter had twitches, and a doctor said, "Oh, she might have Tourette syndrome, because people with Tourette syndrome often have twitches." Well, that terrified this young mother until she kind of thought it through, and rediscovered Bayes' theorem just from her own rationality, and said, "Well, wait a second. An awful lot of kids have twitches. Not very many people have Tourette syndrome. Even if most people with Tourette syndrome have twitches, that doesn't mean that most people with twitches have Tourette syndrome." The doctor failed to take into account the prior, the base rate. Now, she was fortunate that she had the wherewithal to reconstruct Bayesian thinking with her own common sense, but a lot of people don't- and a lot of people have to be explicitly disabused of their base-rate neglect. There's a cliche in medical education: "If you hear hoofbeats outside the window, don't conclude that it's a zebra- it's much more likely to be a horse." And that's another way of getting people to take into account Bayesian priors.
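For reference, here is the formula Pinker is walking through, with the terms he names labeled (a standard statement of Bayes' theorem, not a transcription from the video):

```latex
\[
\underbrace{P(H \mid D)}_{\text{posterior}}
  = \frac{\overbrace{P(H)}^{\text{prior}} \times
          \overbrace{P(D \mid H)}^{\text{likelihood}}}
         {\underbrace{P(D)}_{\text{probability of the data}}}
\]
```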

NARRATOR: What's a situation where people tend to neglect Bayesian reasoning?

PINKER: A classic example of how people often don't engage in Bayesian reasoning is what's called the "medical diagnosis problem." It's been given to thousands of people over many decades in experiments. It's a simple task: How do you interpret the results of a medical test? So take a disease- let's say it's breast cancer- and let's say its prevalence in the population is 1%; 1% of women at any given time have it. Let's say there's a test whose sensitivity is 90%. That is, if someone has the disease, 90% of the time the test will pick it up. But it also has a false-positive rate: 9% of the time, when a woman doesn't have the disease, there'll be a bogus positive result. So, someone tests positive- what are the chances that she has the disease? Well, people think, "Oh my God, a positive test. What a nightmare- that's a 90% chance she's got the disease." The correct answer is 9%- that's a big difference. And, in fact, a majority of doctors get it wrong, which is somewhat disconcerting when you count on your doctor to interpret a medical test for you. Where do they go wrong? Well, they forget that if a disease has a low base rate, it ought to have a low prior- that's the first term in Bayes' theorem, in this case 1%- and what that means is that most of the positives are gonna be false positives. Now you plug those three numbers into Bayes' theorem, and out pops 9%. Intuitively, we tend not to do that. However, there is a way in which you can get people to reconstruct Bayesian reasoning on their own. The reason that people flub the medical diagnosis problem is not that we're just inherently irrational- as Mr. Spock might put it, "Humans are irrational"- it's that we're not used to dealing with the quantification of the probability of a single event. And, in fact, it is somewhat of a mind-bending concept. What's the probability that that woman has cancer? Well, what do you mean, the probability? Either she has it or she doesn't. What do you mean, the probability for one person? Now, if you switch the terms of the puzzle from the probability that that patient has cancer to frequencies- what percentage of people in the population have it?- well, now it suddenly becomes more intuitive. So take the medical diagnosis problem and just reframe it, the same numbers but in different terms: You say, out of every thousand women in the population at any given time, 10 have breast cancer. Of the 10 who have cancer, nine of them will get a positive test result. Of the 990 who don't have breast cancer, 89 will get a false positive. Someone gets a positive result. What are the chances that they have the disease? Well, now people say, "Well, gee, of all the people who test positive, for nine of them it's a true positive and for 89 it's a false positive- nine out of 98, that's about 9%, a little less than 10%." And now everyone's a genius. Everyone is a statistician- including kids, they get the answer right. So the lesson that we have to take away from a lot of the work on human rationality and irrationality, from the cognitive psychologists and from behavioral economics, is not that people are irrational, but that they have to consume the information in a mind-friendly format, one that harmonizes with the way that we experience things in our lives. And in our lives, unless you take a statistics course, you're not going to see a 1% probability.
That's kind of invisible, but you are gonna see a bunch of sick and healthy people with symptoms, and you can kind of mentally tally how many of the people with the symptoms end up with the disease or don't. And then it all becomes much more intuitive. A way of capturing this in a little bit of jargon is to say that people have "ecological rationality." Now, that doesn't mean out in the rainforest hugging a tree, being green: ecological just means in a natural environment, a humanly natural environment- that is, encountering events in a personal and social context- as opposed to what you might call formal, or academic, intelligence, where you have an all-purpose, handy-dandy formula, and you plug the numbers into the P's and Q's, and X's and Y's and H's and D's. That, we're not so good at, and that's where you have to go to school and learn the formula. Now, it's a good thing to do that, because those formulas are really, really powerful. You don't have to be familiar with everything in order to make good predictions about it. With an abstract formula, brand-new things get plugged into the formula, and you can have almost the gift of prophecy- but it's those abstract formulas that tend not to be so intuitive.
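The medical diagnosis problem makes a compact worked example. A minimal sketch with the numbers Pinker quotes (1% prevalence, 90% sensitivity, 9% false-positive rate), computed both with the formula and in his frequency format:

```python
# The medical diagnosis problem with the numbers from the interview:
# 1% prevalence, 90% sensitivity, 9% false-positive rate.

prevalence = 0.01     # prior: P(cancer)
sensitivity = 0.90    # likelihood: P(positive | cancer)
false_pos = 0.09      # P(positive | no cancer)

# Bayes' theorem: prior times likelihood over the probability of the data.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_pos
print(prevalence * sensitivity / p_positive)   # 0.0917... ~ 9%, not 90%

# The same arithmetic in the frequency format Pinker recommends:
women = 1000
sick = int(women * prevalence)                       # 10 women have cancer
true_pos = int(sick * sensitivity)                   # 9 of them test positive
false_pos_count = round((women - sick) * false_pos)  # 89 healthy women also do
print(true_pos / (true_pos + false_pos_count))       # 9 / 98 ~ 9%
```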

NARRATOR: Why is Bayesian reasoning indispensable for scientists and AI researchers?

PINKER: Bayesian reasoning is one of the most powerful tools in science, and it's seen a revival in one field after another: in cognitive psychology, in artificial intelligence, in engineering, in forensics, pretty much anywhere in which you've got to adjust your degree of belief in a hypothesis based on evidence, or in the case where you're building an intelligence, say an AI, or studying intelligence, say as a psychologist, you've got to have a benchmark for how you ought to calibrate your beliefs- and then you can use that to compare how humans, how animals, how robots actually do adjust their beliefs.

NARRATOR: How useful is Bayesian logic for everyday reasoning?

PINKER: There's a somewhat eccentric group of people who call themselves the "Rationality Community," who try to promote principles of sound reasoning in all realms of life. And they would be the first to say that Bayes' theorem- Bayesian thinking- is probably the most important cognitive tool that we ought to apply widely. One of them has, I think in his Twitter handle or on his website, "P(H|D) = P(H) × P(D|H) / P(D)- the rest is commentary." That's Bayes' theorem in its mathematical form. It was a play on the famous line from Rabbi Hillel- "What is hateful unto you, do not do unto your neighbor- the rest is commentary"- his summary of the entire Torah, meeting the challenge of summarizing it while the questioner stood on one leg. That's how much stock rational people put in Bayesian reasoning: that's the core, and everything else is commentary. Well, why? Well, it's the basis for what you should believe, and how much you should believe it, if you're a rational person. And there is no realm of calibrating your belief- assuming you want to believe true things, or things that are more likely to be true- where Bayes' theorem is not relevant. It does require one crucial leap, which is why it's not entirely intuitive. The theorem itself has three terms, and it's actually just an algebraic rejiggering of the definition of conditional probability. So when you take Probability 101 and you learn the formula for conditional probability, you almost already know Bayes' theorem. It just takes one leap, and that is that your degree of belief in a hypothesis- your degree of credence- can be treated like a probability. Now, that's not obvious; it could even be disputed. But the idea is: "Maybe it's true, maybe it's false; I'm kind of indifferent"- okay, 0.5, 50/50. Or, "I'm almost certain that it's true. It's gotta be true. 99 times out of a hundred it would be true"- okay, your belief has a credence of 0.99. That's the leap you have to take for Bayes' theorem to be relevant to everyday reasoning.
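The "algebraic rejiggering" Pinker mentions takes two lines: write the definition of conditional probability both ways around, then substitute (a standard derivation, not from the video):

```latex
\[
P(H \mid D) = \frac{P(H \wedge D)}{P(D)}, \qquad
P(D \mid H) = \frac{P(H \wedge D)}{P(H)}
\;\Longrightarrow\;
P(H \mid D) = \frac{P(H)\,P(D \mid H)}{P(D)}
\]
```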

NARRATOR: Why is it so hard to get people to use Bayesian reasoning?

PINKER: Bayesian reasoning is not always intuitive. Among other things- and probably the most fundamental problem- not everyone thinks that their beliefs ought to be grounded in the best possible evidence. That is a radical, post-Enlightenment, scientifically informed mindset. For most people most of the time, it's like you can't know, you can't find out: "I'm gonna believe what is most morally uplifting, what makes for the greatest story, what makes my people look wisest and noblest, and what makes my adversaries look most foolish and stupid." But the idea is: well, no, you can't just believe something because it makes your side look good- that's the big hurdle to get over. Once you get over that, then Bayesian reasoning commends itself much more naturally, but that's the very first barrier.

NARRATOR: When can a focus on Bayesian reasoning become problematic?

PINKER: There are many realms in life in which, if all we cared about was being optimal statisticians, we should apply Bayes' theorem- just plug the numbers in. But there are things in life other than making the best possible statistical prediction, and sometimes we legitimately say, "Sorry, you can't look at the base rate; you've gotta look at the particulars of this case." So, for example, in a fraud trial, do you look at the person's ethnic group and say, "Well, store owners who have this religion, or this ethnicity, have a much higher rate of defrauding their customers than those from some other religion. So even though we're not sure about the evidence in this case, let's go with the statistics and say this person is more likely to lose the lawsuit"? The same goes for rates of criminal violence, or rates of success in school. It almost gives you the creeps to think, "Well, let's look up how Asian people do, or how Jewish people do." It's like, "No, you really don't wanna go there." And it's true, you may not have the same statistical predictive power, but predictive power isn't the only thing in life. You may also want fairness. You may want not to perpetuate a vicious circle where some kinds of people, through disadvantage, might succeed less often- but then, if everyone assumes they'll succeed less often, they'll succeed even less often. You wanna break that cycle. You might wanna say, "No, don't be a Bayesian," say, in the courtroom, or in college admissions. And that can be defended. It can also go too far. There can be cases where you might claim that there is racism or sexism just by saying, "Well, only 10% of mechanical engineers are women, so there must be a lot of sexism in mechanical engineering programs that causes women to fail." And you might say, "Well, wait a sec. What is the base rate of women who wanna be mechanical engineers in the first place?" There, if you're accusing lots of people of sexism without looking at the base rate, you might be making a lot of false accusations. And that can happen with religion; that can happen with race. So I think we've got to think very carefully about the realms in which, morally, we want not to be Bayesians, such as the courtroom, and the realms in which we do wanna be Bayesians, such as journalism and social science, where we just wanna understand the world. It's one of the touchiest, most difficult, politically sensitive hot buttons out there. The psychologist Philip Tetlock, who studies the mindset of taboo, identifies "forbidden base rates" as one of the biggest sources of secular taboos. And the problem with that taboo, and with all taboos, is that if you discuss, "Gee, is it rational or irrational to have this taboo? Should we try to overcome this taboo?" you're kind of already breaching the taboo just by bringing it up. The paradox is, it's a little bit like the command: for the next 30 seconds, don't think about a polar bear. It's very hard to carry out that command, 'cause the harder you try, the more impossible it is to do- and that's a dilemma that faces us with all taboos, including forbidden base rates. Still, we can't evade the responsibility of deciding when base rates are permissible and when they are forbidden.

NARRATOR: Why do some journalists and scientists forgo Bayesian reasoning?

PINKER: Unfortunately, a lot of journalism, even some science journalism, is innocent of quantitative thinking, and a victim of the availability bias, by which we judge the likelihood of an event according to how easily we can dredge examples out of memory- especially since journalism is almost guaranteed to be kind of an availability machine, an availability trap: that's what it feeds us, images and narratives and anecdotes. Unless a story is accompanied by a base rate, it misleads. How many people every year really do get eaten by sharks? If you're talking about a shark attack, that really should be part of the story. How many people are killed by terrorists? How many people are shot by the police? People have crazy misestimates of a lot of really important things in life because they assume, "If I read about it in the paper, it must happen a lot." That's very often false. So that's one fallacy of reasoning that I think journalists should be more mindful of. Another is a failure of Bayesian reasoning- namely the habit, and this, I think, is an affliction in particular of science journalism, of playing up every revolution, every upset apple cart, every young upstart, every challenge to dogma and conventional wisdom, as if it overturned our understanding. The problem is that a lot of cool, nifty, counterintuitive, weird, challenging findings turn out not to replicate. They turn out to be bogus, which we should have known beforehand, because if a finding challenges the consensus- well, consensus means a kind of high Bayesian prior; there must have been at least a halfway-decent reason to believe it in the first place. If there is a halfway-decent reason to believe it in the first place, then one datum, one finding, one experiment that seems to challenge it should maybe tweak our Bayesian prior, pull it down a little bit, but it shouldn't throw it out. It's a kind of amnesia to assume that every experiment decides the issue- the kind of headline: "Was Darwin wrong?" "Was Einstein wrong?" "Is this young revolutionary challenging everything you think you know?" A lot of those claims are probably gonna be wrong, simply because they flout the Bayesian priors. There's a saying that 90% of what's in science journals is wrong, and 90% of what's in science textbooks is right. That is, new findings taken in isolation are unreliable, especially given publication bias- not just among journalists, but among scientists themselves and editors: you're more likely to get something published if it challenges conventional wisdom. "The umpteenth finding that shows something that we knew all along- eh, who wants to read that?" But something that makes it seem as if everything that we believed is wrong: "Ooh, that's news, let's publish that!" That's a distorter, it's a biaser. And it's one of the reasons why, not just in the kind of softish sciences like psychology, but even in what we like to think are harder sciences like biomedicine, there are so many bogus claims: things that were claimed to be bad for you that are good for you, and vice versa; the miracle drug that turns out to be no better than the placebo; the supplement that makes you worse instead of better; the gene for X that washes out in the bigger sample. A lot of these disappointments come from a failure of Bayesian reasoning even among scientists: preferring the new, the revolutionary, to the already established.
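Pinker's "tweak the prior, don't throw it out" can be made numeric. A minimal sketch with illustrative numbers (none from the interview): a consensus claim with a 0.95 prior, and a single contrarian study that is somewhat more likely to appear if the consensus is wrong:

```python
# How much should one contrarian finding move a strong consensus prior?
# All numbers are illustrative assumptions, not from the interview.

prior = 0.95              # credence in the consensus before the new study
p_study_if_true = 0.10    # contrarian results happen even when the consensus
                          # holds (noise, p-hacking, publication bias)
p_study_if_false = 0.60   # they are more likely if the consensus is wrong

evidence = prior * p_study_if_true + (1 - prior) * p_study_if_false
posterior = prior * p_study_if_true / evidence
print(posterior)   # 0.76: pulled down a little, nowhere near overturned
```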

NARRATOR: Is the media responsible for our inability to employ Bayesian reasoning?

PINKER: Bayesian reasoning is not widely available at our fingertips as a way of approaching every problem that we face. In some realms we think like good Bayesians, but we don't have it as an all-purpose cognitive tool, especially in unfamiliar domains where there are good statistics, even if we don't encounter them in our own lives. The fact that, even though in some realms people are natural Bayesians, journalism, even science editing, and scientists themselves display failures of Bayesian reasoning means it is something that we always have to keep in mind, be conscious of, and I believe it should be part of the training of every journalist. I think there is a problem in that journalism is often something you go into if you're not good at math. That should not be true: journalism inherently involves trends, and trends are quantitative. It involves things that change, things that get better, things that get worse. Those are all quantitative phenomena. If you're in the business of just serving up images and anecdotes, you're leaving out the proper context- and journalists agree that events should always be presented in context- but that context has to include the base rates, the likelihoods, the trends, the data, the number of times something could have happened and didn't, not just cherry-picking all the times that it did happen. That really ought to be part of the mindset of journalism. And I don't mean to pick on journalists; it should be a part of the mindset of everyone.
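
A minimal sketch of the kind of base-rate context he has in mind, with rough, illustrative figures rather than sourced statistics: the denominator of people who could have been affected and weren't is what turns a count into a rate.

```python
# Minimal sketch (rough, illustrative figures, not sourced statistics):
# "N deaths this year" means little without a denominator. Dividing by the
# population that could have been affected turns a headline count into a rate.

us_population = 330_000_000        # rough round figure
shark_deaths_per_year = 1          # hypothetical count for illustration
road_deaths_per_year = 40_000      # hypothetical count for illustration

def annual_rate_per_million(events, population):
    return events / population * 1_000_000

print(annual_rate_per_million(shark_deaths_per_year, us_population))  # ~0.003
print(annual_rate_per_million(road_deaths_per_year, us_population))   # ~121
```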

NARRATOR: Can you simplify Bayes' theorem for us?

PINKER: Bayes' theorem, at first glance, looks kind of scary 'cause it's got all of these letters and symbols, but really it's mathematically pretty simple- it's just got three terms- and more important, conceptually it's simple, and at some level I think we all know it. What Bayes' theorem says is that your degree of belief in a hypothesis should be determined by how likely the hypothesis is beforehand, before you even look at the evidence, and by the odds that you would see the evidence that you are seeing if the hypothesis were true, scaled by how common that evidence is across the board, whether the hypothesis is true or false. If you could follow what I just said, you understand Bayes' theorem.
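
In symbols, with H for the hypothesis and E for the evidence, the three terms he describes are the prior, the likelihood, and the overall probability of the evidence:

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad
P(E) = P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)
```

Here P(H) is "how likely the hypothesis is beforehand," P(E | H) is "the odds that you would see the evidence if the hypothesis were true," and P(E) is "how common that evidence is across the board," whether the hypothesis is true or false.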

NARRATOR: Can we become more rational?

PINKER: Can we become more rational? Can more people become more rational? It's a pressing question because irrational beliefs can lead to public health disasters, as when people forego vaccination. They can lead to wars and genocides, when people blame a misfortune on a conspiracy theory that implicates an ethnic group. We'd really be better off if more of us were more rational, and the question is how to make that happen. Now, a common answer among my people, cognitive psychologists, or academics in general, is education. We've got to teach kids critical thinking, how to avoid fallacies, probability, statistics. There's often a suggestion that instead of teaching high school kids trigonometry, which for most people is pretty useless, we should teach them probability theory, which is extraordinarily useful. Curricula in critical thinking try to get people out of natural fallacies, like when you argue against someone, you set up a straw man that's easy to refute, and you knock down the straw man. Instead you should set up a steel man, that is, the strongest possible version of the position you're arguing against, and find a flaw in that. You should avoid arguments from authority- that is, "So-and-so said it, and he's got a Nobel Prize"- keeping in mind that a lot of Nobel Prize winners have said a lot of goofy things outside their areas of expertise. And you should avoid ad hominem arguments: "You're only saying that to sell books and get invited to be on Big Think." "The only reason you're saying that is that that's what people on the left think, or people on the right think, or white people think, or Black people think"- generalizations about a person's background as opposed to the strength of their arguments. So all of these habits, I agree, should be inculcated in school. But the thing about school- and I say this even as a guy who gets paid to do education for a living; I'm a professor, that's what I do, and so I really do believe it's a good thing- is that education can't be the only part of the answer. And the reason is all too familiar: students take a course, they cram for the exam, the exam is over. By the time the ink is dry, they've forgotten most of what they crammed for, and they don't generalize. That is a major finding in cognitive psychology. So you teach someone a principle of critical thinking, like: try to see the issue from the other side to really understand it. "So what did World War I look like from the point of view of the Germans?" Okay, you answer an exam question. Now you say, "Okay, well, what does Donald Trump look like from the point of view of a Trump supporter?" And it just doesn't occur to people to think that way. It's like, "Oh, I thought that taking the other point of view had something to do with Germans in World War I." And it's like, "No, no, it's not about Germans in World War I. It's about always taking the other point of view in becoming more rational." People aren't very good at making that kind of leap. Or, to give another example, you teach the principle of sunk costs. The fallacy is deciding on a course of action depending on how much you have sunk into it so far. "Oh, I should continue on this project even though it hasn't led to any improvements, because we spent so much money on it already." Well, that's the wrong way to think. You gotta ask: given the money that we can or can't spend going forward, should we proceed with it?
This is sometimes called the "Concorde fallacy," after the fact that the British and French governments continued to pour money into the supersonic plane called the Concorde on the argument, "Well, look how much we've poured into it so far." It always lost money, and it was canceled in the early 2000s. Okay, so you teach people about the sunk-cost fallacy: "Okay, we pull the plug on the Concorde if it doesn't make sense economically going forward." And now you say, "Well, you start reading a book, and you're really not enjoying it. Do you go through it to the bitter end even though you're learning nothing, and it's just miserable?" And a lot of people say, "Yes, when I start a book, I have to finish it." Well, that's the sunk-cost fallacy. It's not about money, it's not about airplanes. Or how about being in a bad relationship? "Well, I've been with this person for a year-and-a-half. It would seem sad to end it now." That's not the right question. The question is not what happened in the past, but how good will it be going forward? Again, the sunk-cost fallacy- but people don't make the leap between, say, money, and reading a book, and staying in a relationship. So that's the limitation of education, at least when it's not done with an eye toward encouraging the right generalization. And since a lot of education kind of ends when the exam is over, it can't just be that you take a course in critical thinking: it's gotta be that the principles of critical thinking are just part of what it means to be a decent, thinking, respectable person. So if you're writing an op-ed or a blog post, or commenting on a tweet, or even having an argument in a bar, you gotta keep those principles of critical thinking in mind. Set up a steel man, not a straw man. Attack the idea, not the person. Don't trust something just because a big shot said it. It's gotta be second nature. It's gotta be part of our norms, like not going out naked in public- just something that's part of the expectation of what decent people do. Needless to say, that's much harder to engineer than simply having kids take a course.
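
The forward-looking rule he describes is easy to state precisely. Here is a minimal sketch (hypothetical function and numbers, not any standard library): the amount already sunk is passed in only to show that a rational decision deliberately ignores it.

```python
# Minimal sketch: a forward-looking decision rule. The amount already sunk
# appears in the signature only to make the point that it is deliberately
# unused; only expected future costs and benefits enter the decision.

def should_continue(expected_future_benefit, expected_future_cost, sunk_cost):
    # sunk_cost is intentionally ignored: past spending can't be recovered
    return expected_future_benefit > expected_future_cost

# Hypothetical Concorde-style numbers: huge sunk cost, poor prospects.
print(should_continue(expected_future_benefit=50,
                      expected_future_cost=200,
                      sunk_cost=10_000))    # False: pull the plug
# Same prospects with nothing sunk: the answer is identical.
print(should_continue(50, 200, 0))          # False
```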

NARRATOR: Do we have what it takes to be collectively more rational?

PINKER: Well, despite our vulnerability to fallacies, and conspiracy thinking, and paranormal nonsense and ghost stories, and superstitions and magical thinking, and all the rest, there is a capacity in us to become collectively more rational; we can see it just by looking backwards. Several hundred years ago, people believed in the existence of werewolves and unicorns. They thought that there were omens in eclipses, and rainbows, and comets. They believed in astrology- some people still do, but it used to be the common, accepted belief. People believed that you could placate angry gods by sacrificing innocent people, making a blood sacrifice. People would settle disputes by dueling among men of honor, like Alexander Hamilton. People would take the whole family out to an asylum to laugh at the insane for Sunday-afternoon entertainment. In area after area, we really do see progress. Sometimes it's slow, sometimes it's hard to see, and it doesn't eradicate irrationality. But consider what many people would regard as the most irrational presidential administration in recent memory, with former President Trump advising people to inject bleach to treat COVID, and insisting that the 2020 election had been stolen- all that nonsense- still, he didn't invoke any astrology, or past lives, or omens, or messages from God. There are whole areas that become more and more off-limits, even in the wild arena of American politics. So there is hope. It doesn't happen instantly, and it doesn't happen to everyone- there'll always be big pockets of irrationality- but we can try to steer the ocean liner slowly and gradually in the direction of greater rationality.

