Study: A Little Forethought Can Cure the Urge Toward "Mindless Accumulation"
At the beginning of the Great Depression, John Maynard Keynes made a bold but logical prediction: In the long run, humanity was solving its economic problems, so that by 2030, in "progressive countries," a 15-hour work week would be the norm. Now, 17 years short of 2030, the world seems to have fulfilled Keynes' prophecy that we would be eight times better off economically than people were when he was writing. So where is the leisure he foresaw? Why are we all still working like fiends? In this paper, Christopher K. Hsee and his co-authors suggest that at least part of the explanation is psychological. Where rational economic creatures would work until they had earned enough to satisfy their needs, Homo sapiens has a propensity for "mindless accumulation": Working until you can't work any more, thereby earning far more than you need. In a series of lab experiments, the researchers write, they've isolated this tendency toward "overearning" and found hints of a possible cure.
In their first experiment, 29 women and 26 men were each put in front of a computer monitor with a headset, on which pleasant piano music would play. For the next five minutes, the volunteer had a choice: Keep listening (i.e., enjoy a bit of leisure time), or push a key and hear an irritating sound for a fifth of a second. For this annoying task (i.e., work), they would be rewarded with miniature Dove bars. Half the group was told it would take 20 noises to earn one chocolate; the other half got a much lower wage: 120 noises for one chocolate. In the second half of the experiment, the volunteers got their "pay" and could eat as much of it as they pleased. But, as in life, they couldn't take any chocolate with them when they left. So the volunteers had a clear incentive to "work" for as much chocolate as they could enjoy in the lab, and no reason to work for more.
Nonetheless, those in the high-wage group (one chocolate for 20 noises) "overearned" by a wide margin: As a whole, they worked enough to get nearly 11 Dove candies per person, even though they actually ate less than five per person. (There was an outlier—one hungry loon who earned 50 chocolates and ate 28 of them—which created some odd statistics but didn't alter the overall results.) Meanwhile, the low-wage people (one chocolate for 120 noises) earned only an average of two and a half Dove bars each. This was, nonetheless, more than they wanted to eat; they too left some chocolate on the table.
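That outlier is worth a second look: a single extreme earner can inflate a group's mean while leaving the typical participant's behavior almost untouched, which is why it produced "odd statistics" without changing the conclusions. Here is a minimal sketch of the effect, using invented numbers rather than the study's data:

```python
# Illustrative only: hypothetical per-person chocolate earnings for a
# high-wage group, with one extreme outlier like the participant who
# earned 50 bars. These numbers are invented, not the study's data.
from statistics import mean, median

earnings = [8, 9, 10, 7, 11, 9, 8, 10, 50]  # last value is the outlier

print(mean(earnings))    # ~13.6 -- the outlier drags the mean upward
print(median(earnings))  # 9 -- the median barely registers the outlier
```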
So, the experimenters write, they've shown that their volunteers will work to earn more than they need, piling up chocolate that they'll never eat. And this tendency was much more pronounced in the "high-wage" group.
You might think people simply overestimated how much chocolate they would want to eat, and worked accordingly. But that doesn't explain the results. Firstly, Hsee et al. told a second group of volunteers about the procedure and asked them to estimate how much chocolate they would work for if they were to do the experiment themselves. They pegged it at around four chocolates. That's pretty close to what the high-earner group actually ate, suggesting that people didn't misjudge how much they would need. Secondly, if everyone in the experiment had a tendency to overestimate their desire for Dove chocolate bars, then the low-wage people should have done so as well, and worked harder.
Another possible explanation (very common on the streets of New York, where I live) is that the workers just loved their job, and couldn't get enough of it. The experimenters addressed this by asking the group to rate the experience of listening to the nasty noise versus listening to the music. Idle music-listening was rated as far more pleasurable. So for this particular instance of "work," sheer joy was not the explanation.
In a second experiment, the researchers wanted to see what would happen if, while earning, people were made to think consciously about how they were piling up rewards they would never use. "One way to achieve this was to force participants to eat all of their earned chocolates," the authors write, maybe a little wistfully, "but doing so was not ethically justifiable." So the reward this time around was jokes. During their first three minutes in front of the computer, volunteers could earn the right to read one joke by subjecting themselves to five instances of noise. In the second three minutes, the jokes they had earned would be displayed on the computer screen. Here, though, they were warned about the Catch-22: With the strict time limit, a person who had earned too many jokes would watch them fly by too fast to catch the punch lines.
Again, the participants (19 women and 21 men) overearned. However, half the group, which had been asked beforehand to think about how many jokes they wanted to see, overdid it a great deal less. Most of these people, in fact, reached their target number of jokes and then stopped "working." Participants were again asked how happy they were doing their tasks, and those who had earned the most jokes were least happy (presumably from the jestus interruptus they kept suffering). In fact, the paper says, "the more jokes participants earned, the less happy they were."
What this suggests, write Hsee et al., is that overearning is a strong drive. People did it even when they knew that overshooting the mark would reduce their enjoyment of their reward. On the other hand, the drive to overearn does seem susceptible to a framing effect. Some volunteers, remember, were reminded before the task about what they considered adequate for their needs. Those people engaged in less "mindless accumulation" than those who just sat down and started to work.
With that in mind, Hsee et al. set up a third experiment, with an explicit "earning cap." As they worked for their wage (this time, one Hershey's kiss for every 10 instances of noise), the volunteers were notified when they had earned 12 chocolates. Half saw a message that told them they could not earn any more reward, though they were welcome to keep irritating themselves with noises if they liked. The others were told they could keep adding to their chocolate pile by continuing the chore.
Those people kept working away, earning an average of 14 and a half Hershey's kisses per person—even though they ended up eating only six and a half each. Those who saw the "earning cap" message, though, stopped working sooner, earned less, and thus left much less uneaten chocolate on the table. Moreover, those people pronounced themselves happier.
Hsee et al. acknowledge, of course, that they've created a highly simplified model of the working life—one where love of work, desire to pass on an inheritance, competitiveness with others, social norms and a host of other elements are not present. There is also a Henrich caveat here: The experimental groups were not very big, and they were made up entirely of people from a WEIRD (Western, Educated, Industrialized, Rich, Democratic) society. "Our priority," they write, "was minimalism and controllability rather than realism and external validity." Fair enough. You have to start somewhere.
Many have noted that higher incomes don't correlate with greater happiness and proposed that the tendency to earn more than we need might stem from status-seeking and envy of others. Hsee et al.'s work suggests there are other drivers worth considering. The authors think that "mindless accumulation" is simply a mismatch between human nature and our current abundance. Until recently, they note, scarcity made overearning impossible, and it made sense to work for a reward as long as one was on offer.
In any event, now that more and more of us are surrounded by material comfort, it does seem a good idea to think about how we can wean ourselves off the ancient habit of working till we drop. It will likely not be easy, as Keynes foresaw: "Yet there is no country and no people," he wrote, "who can look forward to the age of leisure and of abundance without a dread. For we have been trained too long to strive and not to enjoy."
Hsee, C. K., Zhang, J., Cai, C., & Zhang, S. (2013). Overearning. Psychological Science. DOI: 10.1177/0956797612464785
Illustration: The gold-obsessed dragon Fafner stands guard over his treasure. Arthur Rackham, via Wikimedia.
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 global pulses of intraplate magmatism
The dates provided the scientists a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
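The paper's statistics are more sophisticated, but the core idea is easy to sketch: lay the event ages on a timeline and ask which cycle length best explains how they bunch together. Below is a toy version in Python, using synthetic event dates and a simple phase-clustering score (the names and numbers are mine, not the study's):

```python
# Toy periodicity search: score how strongly event ages cluster at a given
# cycle length via the mean resultant vector of their phases (Rayleigh-style).
# Synthetic data for illustration; not the study's events or its method.
import numpy as np

rng = np.random.default_rng(0)

# Fake event ages (Myr ago): clusters near multiples of 27.5 Myr plus noise.
true_period = 27.5
events = np.concatenate(
    [k * true_period + rng.normal(0, 2, size=9) for k in range(1, 10)]
)

def cluster_score(ages, period):
    """Length of the mean phase vector: 1 = perfect clustering, 0 = uniform."""
    phases = 2 * np.pi * (ages % period) / period
    return np.hypot(np.cos(phases).mean(), np.sin(phases).mean())

periods = np.arange(10.0, 60.0, 0.1)
scores = [cluster_score(events, p) for p in periods]
print(f"Best-fitting cycle: {periods[int(np.argmax(scores))]:.1f} Myr")
# Lands near 27.5 for this synthetic timeline.
```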
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends neither on group member intelligence nor personality. Leaders emerge based on the quantity of speaking, not quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
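"Controlling for" those variables means the usual regression move: put speaking time into a model alongside the other measured traits and check whether its effect survives. A minimal sketch with ordinary least squares on synthetic data (the variable names and coefficients are assumptions for illustration, not the study's model):

```python
# Toy illustration of "controlling for" covariates with ordinary least squares:
# does speaking time predict leader nominations once intelligence and a
# personality score are in the model? Synthetic data, not the study's.
import numpy as np

rng = np.random.default_rng(1)
n = 200
speaking = rng.normal(0, 1, n)      # standardized minutes spoken
intelligence = rng.normal(0, 1, n)  # standardized test score
extraversion = rng.normal(0, 1, n)  # standardized personality score

# Assume nominations are driven mostly by speaking time, plus noise.
nominations = 0.8 * speaking + 0.1 * intelligence + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), speaking, intelligence, extraversion])
coefs, *_ = np.linalg.lstsq(X, nominations, rcond=None)
print(dict(zip(["intercept", "speaking", "intelligence", "extraversion"],
               np.round(coefs, 2))))
# The speaking coefficient stays large even with the other traits in the model.
```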
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.
- The recent passing of the great theoretical physicist Steven Weinberg brought back memories of how his book got me into the study of cosmology.
- Going back in time, toward the cosmic infancy, is a spectacular effort that combines experimental and theoretical ingenuity. Modern cosmology is an experimental science.
- The cosmic story is, ultimately, our own. Our roots reach down to the earliest moments after creation.
When I was a junior in college, my electromagnetism professor had an awesome idea. Apart from the usual homework and exams, we were to give a seminar to the class on a topic of our choosing. The idea was to gauge which area of physics we would be interested in following professionally.
Professor Gilson Carneiro knew I was interested in cosmology and suggested a book by Nobel Prize laureate Steven Weinberg: The First Three Minutes: A Modern View of the Origin of the Universe. I still have my original 1979 copy in Portuguese, which emanates a musty tropical smell, sitting on my bookshelf side by side with the American version, a Bantam edition from the same year.
Inspired by Steven Weinberg
Books can change lives. They can illuminate the path ahead. In my case, there is no question that Weinberg's book blew my teenage mind. I decided, then and there, that I would become a cosmologist working on the physics of the early universe. The first three minutes of cosmic existence — what could be more exciting for a young physicist than trying to uncover the mystery of creation itself and the origin of the universe, matter, and stars? Weinberg quickly became my modern physics hero, the one I wanted to emulate professionally. Sadly, he passed away July 23rd, leaving a huge void for a generation of physicists.
What excited my young imagination was that science could actually make sense of the very early universe, meaning that theories could be validated and ideas could be tested against real data. Cosmology, as a science, only really took off after Einstein published his paper on the shape of the universe in 1917, two years after his groundbreaking paper on the theory of general relativity, the one explaining how we can interpret gravity as the curvature of spacetime. Matter doesn't "bend" time, but it affects how quickly it flows. (See last week's essay on what happens when you fall into a black hole).
The Big Bang Theory
For most of the 20th century, cosmology lived in the realm of theoretical speculation. One model proposed that the universe started from a small, hot, dense plasma billions of years ago and has been expanding ever since — the Big Bang model; another suggested that the cosmos stands still and that the changes astronomers see are mostly local — the steady state model.
Competing models are essential to science, but so is data to help us discriminate among them. In the mid-1960s, a decisive discovery changed the game forever. Arno Penzias and Robert Wilson accidentally discovered the cosmic microwave background (CMB) radiation, a fossil from the early universe predicted to exist by George Gamow, Ralph Alpher, and Robert Herman in their Big Bang model. (Alpher and Herman later published a lovely account of that history.) The CMB is a bath of microwave photons that permeates the whole of space, a remnant from the epoch when the first hydrogen atoms were forged, some 400,000 years after the bang.
The existence of the CMB was the smoking gun confirming the Big Bang model. From that moment on, a series of spectacular observatories and detectors, both on land and in space, have extracted huge amounts of information from the properties of the CMB, a bit like paleontologists who excavate the remains of dinosaurs and dig for more bones to reconstruct the details of a past long gone.
How far back can we go?
Confirming the general outline of the Big Bang model changed our cosmic view. The universe, like you and me, has a history, a past waiting to be explored. How far back in time could we dig? Was there some ultimate wall we could not pass?
Because matter gets hot as it gets squeezed, going back in time meant looking at matter and radiation at higher and higher temperatures. There is a simple relation that connects the age of the universe and its temperature, measured in terms of the temperature of photons (the particles of visible light and other forms of invisible radiation). The fun thing is that matter breaks down as the temperature increases. So, going back in time means looking at matter at more and more primitive states of organization. After the CMB formed 400,000 years after the bang, there were hydrogen atoms. Before, there weren't. The universe was filled with a primordial soup of particles: protons, neutrons, electrons, photons, and neutrinos, the ghostly particles that cross planets and people unscathed. Also, there were very light atomic nuclei, such as deuterium and tritium (both heavier cousins of hydrogen), helium, and lithium.
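A standard back-of-envelope version of that relation, the one Weinberg's book made famous, is that in the radiation-dominated era the temperature falls as the inverse square root of time: roughly 10^10 kelvin when the universe is one second old, dropping as t^(-1/2) from there. The sketch below applies that rough scaling; it ignores changes in the particle content and breaks down long before the CMB forms, so treat the numbers as order-of-magnitude only:

```python
# Rough radiation-era scaling: T(t) ~ 1e10 K * (t / 1 s)**-0.5.
# An order-of-magnitude approximation (ignores changes in particle species),
# good enough to see why "the first three minutes" is nuclear-physics territory.
def temperature_kelvin(t_seconds):
    return 1e10 * (t_seconds / 1.0) ** -0.5

for label, t in [("0.01 s", 0.01), ("1 s", 1.0), ("3 min", 180.0)]:
    print(f"{label:>7}: ~{temperature_kelvin(t):.1e} K")
# 0.01 s: ~1e11 K (too hot for any nucleus to hold together)
#    1 s: ~1e10 K (protons and neutrons still roam free)
#  3 min: ~7.5e8 K (cool enough for light nuclei to survive and form)
```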
So, to study the universe after 400,000 years, we need to use atomic physics, at least until large clumps of matter aggregate due to gravity and start to collapse to form the first stars, a few million years later. What about earlier on? The cosmic history is broken down into chunks of time, each the realm of different kinds of physics. Before atoms form, all the way back to about a second after the Big Bang, it's nuclear physics time. That's why Weinberg brilliantly titled his book The First Three Minutes. It is during the interval between one-hundredth of a second and three minutes that the light atomic nuclei (made of protons and neutrons) formed, a process called, with poetic flair, primordial nucleosynthesis. Protons collided with neutrons and, sometimes, stuck together due to the attractive strong nuclear force. Why did only a few light nuclei form then? Because the expansion of the universe made it hard for the particles to find each other.
What about the nuclei of heavier elements, like carbon, oxygen, calcium, gold? The answer is beautiful: all the elements of the periodic table after lithium were made and continue to be made in stars, the true cosmic alchemists. Hydrogen eventually becomes people if you wait long enough. At least in this universe.
In this article, we got all the way up to nucleosynthesis, the forging of the first atomic nuclei when the universe was minutes old. What about earlier on? How close to the beginning, to t = 0, can science get? Stay tuned, and we will continue next week.
To Steven Weinberg, with gratitude, for all that you taught us about the universe.
Long before Alexandria became the center of Egyptian trade, there was Thônis-Heracleion. But then it sank.