John Gray on Jonathan Haidt's "The Righteous Mind"
John Gray's review of Jonathan Haidt's The Righteous Mind is fun because Gray is vehemently opposed to almost everything, but he clearly thinks this is a pretty good book anyway. Gray actually seems slightly irritated that Haidt is so intellectually sophisticated, as if he'd been itching to rail righteously against errors he was later disappointed to discover Haidt doesn't actually make. Nevertheless, he does manage to charge Haidt with a number of philosophical misdemeanors, of few of which he is really guilty. The attempt to make the study of morality scientifically tractable earns Haidt the curse of "scientism," an epithet anxiously deployed by humanities scholars whenever anyone steps on their turf armed with a caliper. And Haidt comes in for abuse for confusing descriptive and normative matters, for not really understanding utilitarianism or intuitionism, and more.
I'm pretty certain Haidt understands the is/ought distinction perfectly well. Likewise, he understands that his descriptive theory of morality has no clear normative upshot, and I'm sure he agrees with Gray that "moralities that have emerged by natural selection have no overriding authority." Early on, Gray tendentiously asserts that "[w]hen 'morality' becomes a term of art in a supposedly scientific discipline, there is no longer any difference between good and bad moralities." This is the sort of thing people tend to say if they think 'good' and 'bad' must be fixed by some kind of transcendental or culture-invariant standard. But when Haidt, the author of a notably pluralist descriptive theory of morality, does seem to appeal to a transcendental, culture-invariant normative standard, utilitarianism, according to which one can say something about the difference between good and bad moralities, Gray dings him for not understanding Isaiah Berlin's pluralist objections to utilitarianism.
Better still, Gray dings Haidt for failing to understand why utilitarianism is unlikely to be widely adopted, despite the fact that Haidt in this very book lays out an entire, highly elaborated theory that illuminates this very fact. "One of the problems of morally diverse societies is that utilitarian understandings of harm may not be widely enough shared to form an agreed basis for public policies," Gray says. I'd like to see the look on Haidt's face when he receives this little gem of instruction. Indeed, most of Gray's complaints about Haidt's alleged utilitarianism are quite nicely supported by Haidt's descriptive theory of morality. For example:
Making public policies on a basis of utilitarian reasoning requires a high degree of convergence, not diversity, in moral intuitions. Such policies will not be accepted as legitimate if they violate deep-seated and widely held intuitions regarding, for example, sexuality and the sanctity of human life.
As Haidt's theory would predict!
The problem is, I think, that Gray has been confused by Haidt's weak affirmation of utilitarianism.
When we talk about making laws and implementing public policies in Western democracies that contain some degree of ethnic and moral diversity, then I think there is no compelling alternative to utilitarianism.
Unlike Gray, I do not take Haidt to have by these words committed himself to utilitarianism as the one true moral theory. I take him to have said that in Western democracies, most of us share a general conception of human well-being and that, though we're bound to disagree about many other moral matters, we mostly agree that improving well-being is morally important. Haidt is saying that the best we can do when arguing publicly about public policy is to reason from the common content of our diverse moral worldviews.
I very much doubt Haidt even means to deny that there is more to the overlapping consensus of Western, democratic moral opinion. As he shows us empirically, we are all of us animated at least a little by feelings and thoughts grounded on the other dimensions of moral sentiment. I think Haidt intends to say little more than that utilitarianism is the best available method for reasoning about policy, the closest thing we do have to a consensus standard of evaluation, given the fact of our moral diversity. I don't agree that the shared basis for public deliberation is quite so thin, but I do agree that common concerns about welfare are a large part of that shared basis. There is no compelling alternative to the best we can do, but the best we can do might not be very good.
Most of Gray's complaints about Haidt's utilitarianism dissolve under this more modest interpretation of his endorsement. Even were Haidt half the utilitarian Gray supposes, it remains silly to think, as Gray appears to think, that if one has affirmed utilitarianism, one is thereby saddled with all of Jeremy Bentham's opinions. There's this:
Haidt assumes a connection between utilitarianism and the values of liberal democracy that dissolves with a moment’s critical reflection. Jeremy Bentham, the founder of modern utilitarianism, believed that utilitarian ethics applied universally, and advocated enlightened despotism throughout much of the world.
James Madison was a founder of modern America, and he owned slaves! Suck it, Americans.
Then there's this:
Making public policies on a basis of utilitarian reasoning requires a high degree of convergence, not diversity, in moral intuitions. Such policies will not be accepted as legitimate if they violate deep-seated and widely held intuitions regarding, for example, sexuality and the sanctity of human life. Bentham was clear that there may be an unbridgeable gulf between moral intuition and the results of utilitarian reasoning—and when such a discrepancy was the case, he was never in any doubt that it was intuition that must be sacrificed.
Gray is confusing Haidt's psychological moral "intuitionism" (the idea that moral judgment and cognition are driven primarily by passion, not reason) with metaethical intuitionism (the idea that we come to moral truth through intuitive apprehension). Anyway, Henry Sidgwick, who was a better philosopher than Jeremy Bentham, argued that the principle of utility is itself founded on ... guess what? Intuition! Sidgwick also correctly noted that when there is a discrepancy between moral intuition and the results of explicit utilitarian reasoning, one must, insofar as one is a good utilitarian, sacrifice whatever does less for utility. The principle of utility, the truth of which we apprehend through intuition, may well require that we at times jettison utilitarian reasoning and uphold intuition. Gray knows all this. He's playing dumb to land a few cheap shots.
Gray makes some fine points along the way, some of which I may discuss in another post, but mostly he confines himself to raging against "scientism" and the uselessness of having theories at all. Yet he doesn't quite want to say science is good for nothing. "Certainly we know a good deal more about human origins, and about the workings of the human brain, than we did [in the days of phrenology and dialectical materialism]," he concedes. "But we are no better equipped to deal with moral and political conflict. Intellectually, we may be less well prepared than previous generations, if only because we know less of our own history." So it's not that we don't know more than we used to about human nature, it's just that it's all totally useless compared to the sort of insight one might glean from a John Gray book.
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 global pulsations in intraplate magmatism
These dates gave the scientists a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
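To get a feel for this kind of analysis, here is a toy version. This is not the study's code or data: it generates synthetic event dates clustered every 27.5 million years over a 260-million-year span, then recovers the period with a Rayleigh-style circular test, one standard way of looking for periodicity in event times.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic event dates (millions of years ago): clusters every 27.5 Myr
# over the last 260 Myr, with a few Myr of scatter. Illustrative only.
true_period = 27.5
cluster_centers = np.arange(7, 260, true_period)          # 10 cluster dates
events = np.concatenate([c + rng.normal(0, 2.0, size=8) for c in cluster_centers])

def rayleigh_power(dates, period):
    """Rayleigh statistic: how tightly the dates bunch at one phase of `period`."""
    phases = 2 * np.pi * dates / period
    n = len(dates)
    return (np.cos(phases).sum() ** 2 + np.sin(phases).sum() ** 2) / n

# Scan trial periods and pick the one where clustering is strongest.
trial_periods = np.linspace(10, 60, 2001)
power = np.array([rayleigh_power(events, p) for p in trial_periods])
best = trial_periods[np.argmax(power)]
print(f"best-fit period: {best:.1f} Myr")
```

With well-separated clusters, the scan peaks near the true 27.5 Myr period; with genuinely random dates, no trial period stands out. The actual study applied more careful spectral and significance tests, but the underlying question is the same.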
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends on neither the intelligence nor the personality of group members. Leaders emerge based on the quantity of speaking, not the quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
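As an illustration of what "controlling for" those variables means, here is a minimal sketch on invented numbers (not the study's data or model): an ordinary least squares fit that estimates the effect of speaking time on nominations while holding intelligence and prior familiarity with the game fixed.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic participants: nominations are built to depend on speaking time
# (coefficient 0.5) even after intelligence and familiarity are accounted for.
speaking = rng.normal(10, 3, n)            # minutes spoken during planning
intelligence = rng.normal(100, 15, n)      # test score
familiarity = rng.integers(0, 2, n)        # played the game before? (0/1)
nominations = (0.5 * speaking + 0.01 * intelligence
               + 0.2 * familiarity + rng.normal(0, 1, n))

# Least squares with controls: column of ones gives the intercept.
X = np.column_stack([np.ones(n), speaking, intelligence, familiarity])
beta, *_ = np.linalg.lstsq(X, nominations, rcond=None)
print(f"effect of speaking time, holding the controls fixed: {beta[1]:.2f}")
```

The fitted coefficient on speaking time comes back close to the value baked into the synthetic data, which is the sense in which the researchers could say talking more predicted nominations "after controlling for" knowledge, personality, and intelligence.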
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.
- The recent passing of the great theoretical physicist Steven Weinberg brought back memories of how his book got me into the study of cosmology.
- Going back in time, toward the cosmic infancy, is a spectacular effort that combines experimental and theoretical ingenuity. Modern cosmology is an experimental science.
- The cosmic story is, ultimately, our own. Our roots reach down to the earliest moments after creation.
When I was a junior in college, my electromagnetism professor had an awesome idea. Apart from the usual homework and exams, we were to give a seminar to the class on a topic of our choosing. The idea was to gauge which area of physics we would be interested in following professionally.
Professor Gilson Carneiro knew I was interested in cosmology and suggested a book by Nobel Prize Laureate Steven Weinberg: The First Three Minutes: A Modern View of the Origin of the Universe. I still have my original copy in Portuguese, from 1979, that emanates a musty tropical smell, sitting on my bookshelf side-by-side with the American version, a Bantam edition from 1979.
Inspired by Steven Weinberg
Books can change lives. They can illuminate the path ahead. In my case, there is no question that Weinberg's book blew my teenage mind. I decided, then and there, that I would become a cosmologist working on the physics of the early universe. The first three minutes of cosmic existence — what could be more exciting for a young physicist than trying to uncover the mystery of creation itself and the origin of the universe, matter, and stars? Weinberg quickly became my modern physics hero, the one I wanted to emulate professionally. Sadly, he passed away July 23rd, leaving a huge void for a generation of physicists.
What excited my young imagination was that science could actually make sense of the very early universe, meaning that theories could be validated and ideas could be tested against real data. Cosmology, as a science, only really took off after Einstein published his paper on the shape of the universe in 1917, two years after his groundbreaking paper on the theory of general relativity, the one explaining how we can interpret gravity as the curvature of spacetime. Matter doesn't "bend" time, but it affects how quickly it flows. (See last week's essay on what happens when you fall into a black hole).
The Big Bang Theory
For most of the 20th century, cosmology lived in the realm of theoretical speculation. One model proposed that the universe started from a small, hot, dense plasma billions of years ago and has been expanding ever since — the Big Bang model; another suggested that the cosmos stands still and that the changes astronomers see are mostly local — the steady state model.
Competing models are essential to science, but so is data to help us discriminate among them. In the mid-1960s, a decisive discovery changed the game forever. Arno Penzias and Robert Wilson accidentally discovered the cosmic microwave background radiation (CMB), a fossil from the early universe predicted to exist by George Gamow, Ralph Alpher, and Robert Herman in their Big Bang model. (Alpher and Herman later published a lovely account of the history.) The CMB is a bath of microwave photons that permeates the whole of space, a remnant from the epoch when the first hydrogen atoms were forged, some 400,000 years after the bang.
The existence of the CMB was the smoking gun confirming the Big Bang model. From that moment on, a series of spectacular observatories and detectors, both on land and in space, have extracted huge amounts of information from the properties of the CMB, a bit like paleontologists who excavate the remains of dinosaurs and dig for more bones to get details of a past long gone.
How far back can we go?
Confirming the general outline of the Big Bang model changed our cosmic view. The universe, like you and me, has a history, a past waiting to be explored. How far back in time could we dig? Was there some ultimate wall we cannot pass?
Because matter gets hot as it gets squeezed, going back in time meant looking at matter and radiation at higher and higher temperatures. There is a simple relation that connects the age of the universe and its temperature, measured in terms of the temperature of photons (the particles of visible light and other forms of invisible radiation). The fun thing is that matter breaks down as the temperature increases. So, going back in time means looking at matter at more and more primitive states of organization. After the CMB formed 400,000 years after the bang, there were hydrogen atoms. Before, there weren't. The universe was filled with a primordial soup of particles: protons, neutrons, electrons, photons, and neutrinos, the ghostly particles that cross planets and people unscathed. Also, there were very light atomic nuclei, such as deuterium and tritium (both heavier cousins of hydrogen), helium, and lithium.
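The age-temperature relation mentioned above can be made concrete with a back-of-the-envelope sketch. In the radiation-dominated era, temperature falls roughly as the inverse square root of time; the normalization below (about 1.5 × 10^10 K at one second) is a rough textbook value I am assuming, not a figure from this essay, and it ignores details like changes in the particle content of the plasma.

```python
# Rough temperature of the early universe, assuming the radiation-era
# scaling T ~ t^(-1/2), normalized so T is about 1.5e10 K at t = 1 s.
def temperature_kelvin(t_seconds):
    return 1.5e10 / t_seconds ** 0.5

for t, label in [(0.01, "one-hundredth of a second"),
                 (1.0, "one second"),
                 (180.0, "three minutes")]:
    print(f"t = {t:g} s ({label}): T ~ {temperature_kelvin(t):.1e} K")
```

Plugging in one-hundredth of a second gives on the order of 10^11 K, and three minutes gives on the order of 10^9 K: the cooling window in which Weinberg's nucleosynthesis story plays out.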
So, to study the universe after 400,000 years, we need to use atomic physics, at least until large clumps of matter aggregate due to gravity and start to collapse to form the first stars, a few hundred million years later. What about earlier on? The cosmic history is broken down into chunks of time, each the realm of different kinds of physics. Before atoms form, all the way to about a second after the Big Bang, it's nuclear physics time. That's why Weinberg brilliantly titled his book The First Three Minutes. It is during the interval between one-hundredth of a second and three minutes that the light atomic nuclei (made of protons and neutrons) formed, a process called, with poetic flair, primordial nucleosynthesis. Protons collided with neutrons and, sometimes, stuck together due to the attractive strong nuclear force. Why did only a few light nuclei form then? Because the expansion of the universe made it hard for the particles to find each other.
What about the nuclei of heavier elements, like carbon, oxygen, calcium, gold? The answer is beautiful: all the elements of the periodic table after lithium were made and continue to be made in stars, the true cosmic alchemists. Hydrogen eventually becomes people if you wait long enough. At least in this universe.
In this article, we got all the way up to nucleosynthesis, the forging of the first atomic nuclei when the universe was a minute old. What about earlier on? How close to the beginning, to t = 0, can science get? Stay tuned, and we will continue next week.
To Steven Weinberg, with gratitude, for all that you taught us about the universe.
Long before Alexandria became the center of Egyptian trade, there was Thônis-Heracleion. But then it sank.