To be happier, try a 'rougher' schedule
A new study suggests using "rough scheduling" for leisure activities to avoid making even fun feel like work.
"Lost yesterday, somewhere between sunrise and sunset, two golden hours, each set with sixty diamond minutes. No reward is offered for they are gone forever." — Horace Mann
We have so much to do and so little time in which to do it. Days seem to go by fast—weeks, months, and even years race by ever more quickly. There seems to be no choice but to get organized and become adept at scheduling everything we need or want to do, even when it's supposed to be something fun. But as we multitask our way through calendars and to-do lists, we somehow rarely gain any sense of pleasure—we're just hanging on for dear life even as we engage in fun activities meant to refresh and balance us. According to a new paper, there are things we can do to be optimally productive and enjoy our busy lives more, including a method the authors dub "rough scheduling."
The basic conflict
The study, "Activity Versus Outcome Maximization in Time Management," by time-management experts Selin A. Malkoc and Gabriela Tonietto, frames the problem as a conflict between two competing goals:
- activity maximization — whose goal is getting as much done as possible.
- outcome maximization — whose goal is getting things done as satisfactorily as possible.
The new paper is a follow-up to Tonietto and Malkoc's 2016 study, "The Calendar Mindset: Scheduling Takes the Fun Out and Puts the Work In," which documented 13 surveys the authors performed. While activity maximization makes perfect sense for tasks related to obligations and responsibilities, the researchers found that it can be the enemy of outcome maximization. They concluded:
- When leisure activities are scheduled, they take on qualities of work, leading to lower utility.
- This effect is only observed when the scheduling is specific.
- The effect is unique to leisure vs. work activities.
Avoiding time famine
Rough scheduling is intended as a remedy for what's called “time famine," the overwhelming feeling that there's simply not enough time to get everything done. And, as the new study notes, “This feeling of time scarcity is linked to many undesirable outcomes, from insomnia to worsening physical health to stingy wallets." Add to that list an increasing difficulty in doing even work-related tasks well.
There's an entire industry built around getting more things done. After all, telling others how busy you are is an image-builder, suggestive of success and productivity. There are books, videos, systems — like the aptly named GTD® (for "Getting Things Done") method — and a host of apps out there whose sole purpose is increasing your efficiency. And some of this works quite well, too, from the outcome maximization perspective. But with every minute of time accounted for in advance, our leisure activities end up having to be slotted into these hard-won, strategically parsed timetables. And that can ruin them.
Achieving a happier balance
The study suggests that by better balancing activity and outcome maximization we can achieve better performance when we're working as well as more rejuvenating pleasure when we play.
Increasing your performance during work activities
Instead of packing as many activities as possible into a period of time, prioritize. It can be heartbreaking to let less-important tasks slide, but by allotting a more reasonable amount of time to your high-priority activities, they're more likely to get done well, since you're under less pressure from unrealistic time constraints as you perform them. (Maybe you can delegate the discarded low-priority jobs to someone else.)
Perform one task at a time
You may be proud of your perceived ability to handle multiple small tasks as you work on larger ones, but studies are clear that multitasking doesn't really work. It just divides your attention, leaving less care, focus, and creativity for what you're supposed to be doing, and therefore producing lower-quality work.
Space deadlines evenly
While externally imposed deadlines have been shown to increase performance, when setting your own deadlines, spread them apart evenly to budget your attention more sensibly. As the paper puts it, “Deadlines that are evenly spaced increase performance relative to less staggered ones. For example, students with three evenly spaced deadlines throughout the semester obtained higher grades than those with all three deadlines at the end of the semester."
Increasing your enjoyment during leisure activities
Schedule more roughly
"People schedule leisure activities to ensure their participation, implicitly assuming that participation in an activity automatically leads to its enjoyment," asserts the study. Alas, the authors' previous research reveals that it's just not so. The reason is that "the strict beginning and end times imposed by scheduling disrupts the free-flowing nature of leisure activities," making them feel less enjoyable and more like work.
The solution, Malkoc and Tonietto say, is to schedule roughly, by which they mean to schedule leisure activities only into windows of time that have neither a pre-planned beginning nor a pre-planned end. An example would be planning to go out to dinner "after work" instead of "at 6 pm." Their research suggests that leisure activities planned this way are just as satisfying as those that are completely spontaneous.
Avoid hard stops
Similarly problematic is the imposition of strict end times on enjoyable events, such as when they're to be followed by scheduled work activities. After all, this violates rough scheduling by requiring the fun to end at a specific time. "The hard stop posed by the scheduled activity creates time pressure that may undermine enjoyment during the preceding time," the study says. It takes you out of the enjoyable moment, distracting you with what's coming next.
A better idea is to leave an unscheduled buffer between fun and work, or, better yet, reschedule the work altogether when possible so as to leave leisure activities open-ended.
Focus on the now
As you might imagine from the section above, to really get the most out of leisure activities, it's imperative that you be present as they occur, even if what's pulling you away is something you're looking forward to. In the authors' prior study, “participants enjoyed a comedic video less when they knew that they would next watch another enjoyable video compared to those who were unaware of the future activity." As in many other areas in life, “being more in-the-moment, or mindful, improves enjoyment," says the current study.
As you seek a more satisfying way to juggle the many tasks before you, try to remain aware of maintaining a balance between getting as much done as possible and having an enjoyable daily life. Both goals are worth striving for, and the study's suggestions make it less likely that you'll someday reach the bottom of Life's To-Do List dissatisfied about what never got done and feeling burnt out. It's so much better to have enjoyed the action-packed ride.
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spreading
- 8 global pulses of intraplate magmatism
These dates gave the scientists a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
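To give a sense of how a cycle like this can be teased out of a list of event ages, here is a minimal, illustrative sketch in Python. It is not the study's actual method or data: it scans trial periods with a simple Rayleigh-style phase-clustering statistic over a made-up event catalog, and every number in it is a placeholder.

```python
# Illustrative only: a toy event catalog and a simple period scan.
# None of these numbers come from the study.
import numpy as np

def phase_clustering_power(ages, period):
    """Rayleigh-style statistic: 1.0 means events are perfectly in phase at this
    trial period; values near 0 mean the events are spread uniformly."""
    phases = 2 * np.pi * (np.asarray(ages) % period) / period
    return np.hypot(np.cos(phases).mean(), np.sin(phases).mean())

# Hypothetical event ages (millions of years ago), clustered every ~27.5 Myr.
rng = np.random.default_rng(0)
cluster_centers = np.arange(7, 260, 27.5)              # ten toy "busy periods"
ages = np.concatenate([c + rng.normal(0, 2.0, 6) for c in cluster_centers])

trial_periods = np.linspace(10, 60, 501)
power = [phase_clustering_power(ages, p) for p in trial_periods]
best = trial_periods[int(np.argmax(power))]
print(f"Best-fitting period: {best:.1f} Myr")          # near 27.5 for this toy data
```

Running the sketch prints a best-fitting period near 27.5 Myr for this toy catalog; the study's own analyses are, of course, far more careful, including tests of whether the clustering could arise by chance.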
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
Brain cells snap strands of DNA in many more places and cell types than researchers previously thought.
The urgency to remember a dangerous experience requires the brain to make a series of potentially dangerous moves: Neurons and other brain cells snap open their DNA in numerous locations — more than previously realized, according to a new study — to provide quick access to genetic instructions for the mechanisms of memory storage.
The extent of these DNA double-strand breaks (DSBs) in multiple key brain regions is surprising and concerning, says study senior author Li-Huei Tsai, Picower Professor of Neuroscience at MIT and director of The Picower Institute for Learning and Memory, because while the breaks are routinely repaired, that process may become more flawed and fragile with age. Tsai's lab has shown that lingering DSBs are associated with neurodegeneration and cognitive decline and that repair mechanisms can falter.
"We wanted to understand exactly how widespread and extensive this natural activity is in the brain upon memory formation because that can give us insight into how genomic instability could undermine brain health down the road," says Tsai, who is also a professor in the Department of Brain and Cognitive Sciences and a leader of MIT's Aging Brain Initiative. "Clearly, memory formation is an urgent priority for healthy brain function, but these new results showing that several types of brain cells break their DNA in so many places to quickly express genes is still striking."
In 2015, Tsai's lab provided the first demonstration that neuronal activity caused DSBs and that they induced rapid gene expression. But those findings, mostly made in lab preparations of neurons, did not capture the full extent of the activity in the context of memory formation in a behaving animal, and did not investigate what happened in cells other than neurons.
In the new study published July 1 in PLOS ONE, lead author and former graduate student Ryan Stott and co-author and former research technician Oleg Kritsky sought to investigate the full landscape of DSB activity in learning and memory. To do so, they gave mice little electrical zaps to the feet when they entered a box, to condition a fear memory of that context. They then used several methods to assess DSBs and gene expression in the brains of the mice over the next half-hour, particularly among a variety of cell types in the prefrontal cortex and hippocampus, two regions essential for the formation and storage of conditioned fear memories. They also made measurements in the brains of mice that did not experience the foot shock to establish a baseline of activity for comparison.
The creation of a fear memory doubled the number of DSBs among neurons in the hippocampus and the prefrontal cortex, affecting more than 300 genes in each region. Among 206 affected genes common to both regions, the researchers then looked at what those genes do. Many were associated with the function of the connections neurons make with each other, called synapses. This makes sense because learning arises when neurons change their connections (a phenomenon called "synaptic plasticity") and memories are formed when groups of neurons connect together into ensembles called engrams.
"Many genes essential for neuronal function and memory formation, and significantly more of them than expected based on previous observations in cultured neurons … are potentially hotspots of DSB formation," the authors wrote in the study.
In another analysis, the researchers confirmed through measurements of RNA that the increase in DSBs indeed correlated closely with increased transcription and expression of affected genes, including ones affecting synapse function, as quickly as 10-30 minutes after the foot shock exposure.
"Overall, we find transcriptional changes are more strongly associated with [DSBs] in the brain than anticipated," they wrote. "Previously we observed 20 gene-associated [DSB] loci following stimulation of cultured neurons, while in the hippocampus and prefrontal cortex we see more than 100-150 gene associated [DSB] loci that are transcriptionally induced."
Snapping with stress
In the analysis of gene expression, the neuroscientists looked at not only neurons but also non-neuronal brain cells, or glia, and found that they also showed changes in expression of hundreds of genes after fear conditioning. Glia called astrocytes are known to be involved in fear learning, for instance, and they showed significant DSB and gene expression changes after fear conditioning.
Among the most important functions of genes associated with fear conditioning-related DSBs in glia was the response to hormones. The researchers therefore looked to see which hormones might be particularly involved and discovered that it was glucocorticoids, which are secreted in response to stress. Sure enough, the study data showed that in glia, many of the DSBs that occurred following fear conditioning occurred at genomic sites related to glucocorticoid receptors. Further tests revealed that directly stimulating those hormone receptors could trigger the same DSBs that fear conditioning did and that blocking the receptors could prevent transcription of key genes after fear conditioning.
Tsai says the finding that glia are so deeply involved in establishing memories from fear conditioning is an important surprise of the new study.
"The ability of glia to mount a robust transcriptional response to glutocorticoids suggest that glia may have a much larger role to play in the response to stress and its impact on the brain during learning than previously appreciated," she and her co-authors wrote.
Damage and danger?
More research will have to be done to prove that the DSBs required for forming and storing fear memories are a threat to later brain health, but the new study only adds to evidence that it may be the case, the authors say.
"Overall we have identified sites of DSBs at genes important for neuronal and glial functions, suggesting that impaired DNA repair of these recurrent DNA breaks which are generated as part of brain activity could result in genomic instability that contribute to aging and disease in the brain," they wrote.
The National Institutes of Health, The Glenn Foundation for Medical Research, and the JPB Foundation provided funding for the research.
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
- A new study proposes the "babble hypothesis" of becoming a group leader.
- Researchers show that intelligence is not the most important factor in leadership.
- Those who talk the most tend to emerge as group leaders.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends neither on group member intelligence nor personality. Leaders emerge based on the quantity of speaking, not quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
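For readers unfamiliar with what "controlling for" other variables means in practice, here is a toy sketch in Python built on entirely synthetic data (nothing here comes from the study): a regression of leadership nominations on speaking time that also includes intelligence and a personality score as predictors, so the speaking-time coefficient reflects its association with nominations while the other measures are held fixed.

```python
# Synthetic illustration of statistical control; no real study data is used.
import numpy as np

rng = np.random.default_rng(1)
n = 256
speaking_time = rng.exponential(5.0, n)    # hypothetical minutes spoken per person
intelligence = rng.normal(100, 15, n)      # hypothetical intelligence score
extraversion = rng.normal(0, 1, n)         # hypothetical personality measure

# Simulated outcome in which nominations are driven mainly by speaking time.
nominations = (0.6 * speaking_time + 0.01 * intelligence
               + 0.1 * extraversion + rng.normal(0, 1, n))

# Ordinary least squares with an intercept; each coefficient estimates that
# predictor's association with nominations holding the other predictors constant.
X = np.column_stack([np.ones(n), speaking_time, intelligence, extraversion])
coefs, *_ = np.linalg.lstsq(X, nominations, rcond=None)
for name, value in zip(["intercept", "speaking_time", "intelligence", "extraversion"], coefs):
    print(f"{name}: {value:.3f}")
```

In this toy setup the speaking-time coefficient stays large even with the other predictors in the model, which is the pattern the researchers report for their real data.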
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.