Human Irrationality is a Fact, not a Fad
Once upon a time, we were taught that people are basically rational—at least when they have to be: at the stock market, the voting booth, the courtroom, the hospital, the school, the employment office and other important places. Economists, in particular, depended on their version of this idea (rationality = everyone is out for himself, all the time), but the reliable rule of reason was important for politics, law, medicine and other fields as well. Then some economists changed their minds and put the word "behavioral" in front of their discipline. They reported abundant evidence that people don't know when they're being rational and can't decide to use reason at will. As these researchers pushed against the old assumption, they (and their popularizers even more) banged the We're-Irrational drum pretty hard. So now there's a backlash: people pointing out that, foolish as it may be to say that people always think straight, it is equally dumb to claim they never think straight.
As a complaint about what I call post-rational thinking, this is largely a straw man. I've never run across a neuroscientist, psychologist or economist who claimed that people are utterly incapable of reason (what would be the point of showing them evidence that they can't make sense of evidence?). Maybe Jon Haidt didn't acknowledge his dependence on reason enough in his book. Maybe David Brooks sentimentally devalued conscious reasoning in his, as Thomas Nagel pointed out. But they didn't deny the rational mind's abilities. And in Thinking, Fast and Slow, Daniel Kahneman, the world's most famous and most honored behavioral economist, continually reminds readers that people can and do reason well all the time. So I don't agree with my new fellow blogger, Steven Mazie, that we're awash in "a faddish denial that human beings can think straight."
Indeed, the problems posed by post-rational research (and the reasons I'm interested in it) stem from the way human beings can "think straight." It's just that they can't tell when they are doing so, can't will themselves to do so, and often think they're being rational when they are not. As a result, there is a gap between the way our important institutions officially work and the way they really work, and this gap causes a great deal of harm.
Two examples: Officially, markets are efficient sorters of information that help all participants find the true prices of goods and services. In reality, markets aren't meeting places of rational beings, and hence are susceptible to runs, panics, bubbles and fraud. Officially, judges are trained experts who objectively apply the law. In reality, judges who have had to make many decisions without a break hand down harsher rulings than they do just after a break. And judges who have rolled a die and gotten a high number will choose a longer sentence for the same criminal than judges who rolled a low number. We need to fix markets that rest on the false assumption of perfect rationality. We need to protect the justice system against the assumption that judges are consistent from 10 to 6. To ignore the evidence, by, say, dismissing it as a fad, is to let our institutions run badly in order to preserve the fiction of rationality on which they rest. That wouldn't be terribly rational of us, would it?
In between the absurd extremes of Perfect Rationality and Perfect Irrationality (which nobody believes in anyway) are important questions, all of them still open, and none simple: When is Reason actually engaged? How can we tell? What rules do we follow instead of logic? And, most importantly, what do we mean by "thinking straight"?
This last question is seldom addressed in books and articles on human irrationality. Instead, as Mazie notes, a lot of this material uses a thin and impoverished notion of human thought. (Deirdre McCloskey makes a similar point about the allied field of "happiness economics" here.) Often, the model of what thought is, and what it is for, is as bad as the old Rational Economic Man models. In fact, often, it is the old REM model: after all, to say that people make systematic errors when they make choices is to say that we know what is correct, and that "correct" is doing what old-school economists would do. But perhaps those economists were wrong.
For example, people who have to choose between three options (call them A, B and C) will value them differently if they previously had to choose between A and B. Economists say this is faulty thinking, because the value of A and the value of B are not altered by the presence of C. But many creatures, including slime molds, are subject to this "error." So we should at least admit the possibility that it could be more appropriate for a living being to make the "error" than to act like a 20th-century economist. More broadly, we should realize that the end goal of a better understanding of human behavior isn't a few tweaks and nudges, but a better definition of what it means to think, and be, and be well.
Research into human irrationality, then, has the potential to cure some of our most important institutions of the habitual harms they inflict on us. And longer-term, it can contribute to a better understanding of what Reason is, and what we (sometimes) rational animals are. For both those reasons, I don't think this is just a passing fad.
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 pulses of global intraplate magmatism
The dates provided the scientists with a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
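That reassurance is just simple arithmetic on the study's reported numbers. Here's a minimal sketch in Python (the figures come from the article above; the variable names are mine):

```python
# Back-of-the-envelope check, using the figures reported in the study.
cycle_myr = 27.5          # reported interval between clusters of geologic events
last_pulse_myr_ago = 7.0  # rough time since the most recent cluster

next_pulse_myr = cycle_myr - last_pulse_myr_ago
print(f"Next pulse expected in roughly {next_pulse_myr} million years")
```

On these numbers, the next "beat" is about 20.5 million years away, which the article rounds to "about another 20 million years."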
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
- A new study proposes the "babble hypothesis" of becoming a group leader.
- Researchers show that intelligence is not the most important factor in leadership.
- Those who talk the most tend to emerge as group leaders.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends neither on group members' intelligence nor on their personality. Leaders emerge based on the quantity of speaking, not its quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.
- The recent passing of the great theoretical physicist Steven Weinberg brought back memories of how his book got me into the study of cosmology.
- Going back in time, toward the cosmic infancy, is a spectacular effort that combines experimental and theoretical ingenuity. Modern cosmology is an experimental science.
- The cosmic story is, ultimately, our own. Our roots reach down to the earliest moments after creation.
When I was a junior in college, my electromagnetism professor had an awesome idea. Apart from the usual homework and exams, we were to give a seminar to the class on a topic of our choosing. The idea was to gauge which area of physics we would be interested in following professionally.
Professor Gilson Carneiro knew I was interested in cosmology and suggested a book by Nobel Prize Laureate Steven Weinberg: The First Three Minutes: A Modern View of the Origin of the Universe. I still have my original copy in Portuguese, from 1979, that emanates a musty tropical smell, sitting on my bookshelf side-by-side with the American version, a Bantam edition from 1979.
Inspired by Steven Weinberg
Books can change lives. They can illuminate the path ahead. In my case, there is no question that Weinberg's book blew my teenage mind. I decided, then and there, that I would become a cosmologist working on the physics of the early universe. The first three minutes of cosmic existence — what could be more exciting for a young physicist than trying to uncover the mystery of creation itself and the origin of the universe, matter, and stars? Weinberg quickly became my modern physics hero, the one I wanted to emulate professionally. Sadly, he passed away July 23rd, leaving a huge void for a generation of physicists.
What excited my young imagination was that science could actually make sense of the very early universe, meaning that theories could be validated and ideas could be tested against real data. Cosmology, as a science, only really took off after Einstein published his paper on the shape of the universe in 1917, two years after his groundbreaking paper on the theory of general relativity, the one explaining how we can interpret gravity as the curvature of spacetime. Matter doesn't "bend" time, but it affects how quickly it flows. (See last week's essay on what happens when you fall into a black hole).
The Big Bang Theory
For most of the 20th century, cosmology lived in the realm of theoretical speculation. One model proposed that the universe started from a small, hot, dense plasma billions of years ago and has been expanding ever since — the Big Bang model; another suggested that the cosmos stands still and that the changes astronomers see are mostly local — the steady state model.
Competing models are essential to science but so is data to help us discriminate among them. In the mid 1960s, a decisive discovery changed the game forever. Arno Penzias and Robert Wilson accidentally discovered the cosmic microwave background radiation (CMB), a fossil from the early universe predicted to exist by George Gamow, Ralph Alpher, and Robert Herman in their Big Bang model. (Alpher and Herman published a lovely account of the history here.) The CMB is a bath of microwave photons that permeates the whole of space, a remnant from the epoch when the first hydrogen atoms were forged, some 400,000 years after the bang.
The existence of the CMB was the smoking gun confirming the Big Bang model. From that moment on, a series of spectacular observatories and detectors, both on land and in space, have extracted huge amounts of information from the properties of the CMB, a bit like paleontologists that excavate the remains of dinosaurs and dig for more bones to get details of a past long gone.
How far back can we go?
Confirming the general outline of the Big Bang model changed our cosmic view. The universe, like you and me, has a history, a past waiting to be explored. How far back in time can we dig? Is there some ultimate wall we cannot pass?
Because matter gets hot as it gets squeezed, going back in time meant looking at matter and radiation at higher and higher temperatures. There is a simple relation that connects the age of the universe and its temperature, measured in terms of the temperature of photons (the particles of visible light and other forms of invisible radiation). The fun thing is that matter breaks down as the temperature increases. So, going back in time means looking at matter at more and more primitive states of organization. After the CMB formed 400,000 years after the bang, there were hydrogen atoms. Before, there weren't. The universe was filled with a primordial soup of particles: protons, neutrons, electrons, photons, and neutrinos, the ghostly particles that cross planets and people unscathed. Also, there were very light atomic nuclei, such as deuterium and tritium (both heavier cousins of hydrogen), helium, and lithium.
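That simple relation between age and temperature can be written down explicitly. In the early, radiation-dominated era it takes roughly this form (a standard textbook approximation, in the spirit of Weinberg's account, not a formula quoted from this essay):

```latex
% Approximate age-temperature relation in the radiation-dominated era:
% the photon temperature falls as the inverse square root of cosmic time.
T(t) \;\approx\; 10^{10}\,\mathrm{K}\,\left(\frac{t}{1\,\mathrm{s}}\right)^{-1/2}
```

At one second the universe sits near ten billion kelvin; by three minutes it has cooled to around a billion kelvin, cool enough for the first light nuclei to hold together.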
So, to study the universe after 400,000 years, we need to use atomic physics, at least until large clumps of matter aggregate due to gravity and start to collapse to form the first stars, a few million years later. What about earlier on? The cosmic history is broken down into chunks of time, each the realm of a different kind of physics. Before atoms form, all the way back to about a second after the Big Bang, it's nuclear physics time. That's why Weinberg brilliantly titled his book The First Three Minutes. It is during the interval between one-hundredth of a second and three minutes that the light atomic nuclei (made of protons and neutrons) formed, a process called, with poetic flair, primordial nucleosynthesis. Protons collided with neutrons and, sometimes, stuck together due to the attractive strong nuclear force. Why did only a few light nuclei form then? Because the expansion of the universe made it hard for the particles to find each other.
What about the nuclei of heavier elements, like carbon, oxygen, calcium, gold? The answer is beautiful: all the elements of the periodic table after lithium were made and continue to be made in stars, the true cosmic alchemists. Hydrogen eventually becomes people if you wait long enough. At least in this universe.
In this article, we got all the way up to nucleosynthesis, the forging of the first atomic nuclei when the universe was a minute old. What about earlier on? How close to the beginning, to t = 0, can science get? Stay tuned, and we will continue next week.
To Steven Weinberg, with gratitude, for all that you taught us about the universe.
Long before Alexandria became the center of Egyptian trade, there was Thônis-Heracleion. But then it sank.