Should we all be "taking life as it comes"?
It's good to be able to accept what there is. And although one does not need to apply this skill always or without moderation, it is one of the most important exercises for the human spirit.
There is a playful summary of the history of philosophy – written, I think, by Leszek Kołakowski – where every philosopher is given one fourth of a sentence. "Aristotle: stick to the middle and you will not die." "Hegel: God has dissolved throughout the world because he had to." "Thales: because, water." And the Stoics? "Stoics: it is good the way it is."
This quip is witty, accurate, and yet simultaneously problematic. Indeed, it is problematic precisely because it is accurate. For stoicism (at least the ancient variety) really does try to convince us that what there is, is good. In other words (one could say, with irony, that philosophy itself is based on such word play), stoicism is the art of convincing oneself that the way things are is good.
'It is good the way it is'. 'Come to terms with it'. 'Take life as it comes'. These phrases are wonderfully ambivalent – not in their meaning, but in the reflections they provoke. On the one hand, they contain a profound, universal wisdom. It is no coincidence that the themes of 'coming to terms' and 'acceptance' appear in a wide variety of schools and traditions of thought – from Stoics and Buddhists to pantheists and practitioners of all kinds of modern-day mindfulness. Yes, this is the profound, fundamental truth about human life, one of its mysteries; one way to live on this Earth and not go mad. It's good to be able to reconcile ourselves with, and accept, what there is. And although one does not need to apply this skill always and without moderation – and although it will not be useful for everyone – it is undoubtedly one of the most important exercises and perspectives for the human spirit.
On the other hand, it is, in essence, painfully banal. After all, what is easier, more trivial, more clichéd than saying: "You have to come to terms with what you cannot change"; "You have to accept the facts"; "You have to deal with what there is"? This truth has been explored through hundreds of generations, in thousands of languages. It is outdated, graphomaniac even. It is trivial not only because we've internalized this wisdom – we've profaned and McDonaldized it. It is also trivial in the sense that it is radically simple. The idea of 'accepting things as they are' is just so painfully simple. So simple, in fact, that it seems... empty. It is almost a tautology – there's nothing to talk about here.
And if one cannot talk about it, it's easy to sneer. The idea of 'acceptance', of 'reconciling with life', is constantly and regularly ridiculed as a kind of ornament that looks impressive but does not bring anything to the table, and perhaps even makes things worse. Online projects such as "Zdelegalizować coaching i rozwój osobisty" [Outlaw coaching and personal development] and "Magazyn Porażka" [Failure Magazine] supply healthy, if merciless, ridicule – sneering which often develops into solid social critique. A concrete example: at the beginning of this year (and the new decade – after a few months, it still sounds good), the news that Starbucks refused their employees a pay increase and instead offered them a meditation app was widely debated online. In short: we will not give you money, but we will give you a tool to come to terms with the fact that you have so little of it. The dissonance is obvious: something is not working here, something is disproportionate. The idea of 'reconciling' and 'accepting the world as it is', noble in its intention, is used here for an ugly, oppressive purpose.
We could say: yes to acceptance, no to pathologies. And yet we have to be careful not to slip into the said cliché. One precaution might be – attention! I am about to take a leap, hopefully not a salto mortale – to turn to Anselm of Canterbury and his proof of the existence of God. I do not mean the proof itself, but its status and context. This proof, today known as the ontological argument, was not devised (by a Christian thinker and bishop in 11th-century Europe) in order to really persuade anyone. The point of proving God's existence is not to convert anyone into a believer. The point – at least of Anselm's proof – was to show that, beginning with our faith in God and travelling the great and complicated paths of reason, we will finally arrive at the same faith from which we departed. Logical reasoning will confirm what is known through faith. The point of departure and arrival is trivial, if only because it is one and the same. All value lies in what we learn along the way.
And it is kind of similar with acceptance, reconciliation with life. Of course, this idea is trivial, radically simple, and because of this simplicity a bit graphomaniac. However, the secret lies in how we reach it. Entire volumes have already been written about these paths, and new ones are still being produced. I am still writing new ones myself, and if nothing bad happens along the way, they will land in bookstores sooner than the next issue of "Przekrój".
Let's take another leap now: from Anselm to... Sartre. Because it was Sartre who pointed out somewhere that we, humans, have a fundamental problem with objects and substances that drag, smudge, are sticky and are difficult to clean. This doesn't really require an explanation, especially for those of you for whom the abbreviation OCD is not some mysterious acronym (I see you, brothers and sisters!). Favourite trousers stained with grease, shampoo spilled in the toiletries bag, dog shit on a fluffy carpet. Nobody likes that kind of thing.
Why am I even writing about it? Sartre – if I am not mistaken – draws attention to the deeper sense of our aversion to stickiness. We don't like it because it blurs the difference between us and the outside world. In everyday life, we feel this difference quite strongly. My 'I' ends somewhere on the border of my skin, body, clothes. External things are beyond me, they are not mine. And, to some extent, I recognize that they are not mine, that they are not me, that if I want, I can get up, leave, and move away from them. The spatial dimension emphasizes this difference between an external object and me.
But viscosity negates this. If I sit on old chewing gum, I will not be able to clean it easily off my trousers. I will not be able to separate myself from it easily; there won't be a single pleasant moment of separation to give me the soothing confidence that the gum and I are separate. What is viscous is not only physically sticky – it also gets stuck to my 'I' and disturbs the pleasant awareness that my being is clearly delineated, its boundaries secure.
What does this have to do with anything? A lot! If 'acceptance' can indeed elude banality, it is not in objectively painful matters, but precisely in ambiguous, sticky ones. "Come to terms with the fact that you are mortal, with loss, with the fact that you will no longer fulfil your youthful dreams." These are all difficult, sad issues – often tragic, sometimes unmanageable. What connects them, however, is that we know – at least in theory – what coming to terms with them should look like; we know what it means to come to terms with death, separation or lost dreams. There are suitable prescriptions – perhaps bitter, but they are there.
However, are they still valid when things get sticky and ambiguous? That is the question! Death, loss, lost dreams – these are blows to the 'I' which (in principle) attack it from the outside. As long as the difference between 'I' and 'not I' is determined, I at least recognize the field of struggle. It is much harder when these differences begin to blur. Here, of course, I do not mean that the Stoic will find it harder to accept gum on his trousers or grease on his shirt. I mean situations that 'smudge', 'spill', and 'stick' – and, as such, undermine the boundary between 'I' and the outside world.
The experience of parenthood, with its crumbling piles of dozens of items, bags and parcels that need to be dragged around the world with you and the baby. The experience of a difficult family relationship that we would have ended a long time ago were it with a stranger, but which goes on and on, and doesn't end, because there is something of a life sentence about it. The experience of mental health problems, or the nightmare of depression, which does not attack me from the outside but breaks down my 'I' from the inside. Such situations are the most difficult to accept, because the boundary between who is doing the accepting and what is to be accepted is blurred. Here stoicism – but also, more broadly, every maximalist philosophy that sees things in black and white – finds a worthy challenge.
Translated from the Polish by Joanna Figiel
There is a neurological link between serotonin levels and the brain's ability to control impulses and patience levels.
- Prior research has suggested a possible link between a lack of serotonin receptors in the brain and impulsive behaviors.
- A recent study from the Neural Computation Unit at the OIST explored this further, producing evidence that there is in fact a neurological factor in the brain's ability to control impulses and manage patience.
- This research could reveal more data on how serotonin impacts regions of the brain, which could eventually lead to the development of new drug treatments for conditions such as depression and addiction, among others.
The old adage "patience is a virtue" is coming undone thanks to new research suggesting that patience (along with impulse control) can be linked to specific neurological systems. Rather than being purely a product of learned behavior, as previously thought, both patience and impulse control may be partly rooted in our biology.
A previous study involving mice showed a possible link between a lack of serotonin receptors in the brain and impulsive behaviors. As this link has been recently discovered and is not yet entirely understood, the new research team aimed to understand the neurological processes that control patience and impulsive behavior.
That same team of scientists published another study in the journal Nature Communications. This study pushed this theory further by researching the role of the dorsal raphe nucleus (DRN)—the part of the brain that contains serotonin-releasing neurons—in mice. It was during this study that they found a causal relationship between the action that serotonin has on this brain region and the patience for anticipated rewards.
Three areas of the brain can impact your patience and impulse control
What role does biology really play in our ability to be patient and control our impulses?
The Neural Computation Unit at the Okinawa Institute of Science and Technology Graduate University (OIST) ran the latest study, which focused on three parts of the brain:
- NAc - nucleus accumbens, which has been previously studied as a key region in the brain that mediates a variety of behaviors, including reward and satisfaction.
- OFC - orbitofrontal cortex, which is considered to have a role in higher-order cognition (like decision-making).
- mPFC - medial prefrontal cortex, which is among the brain regions that have the highest baseline metabolic activity. This part of the brain is also suggested to mediate decision-making.
According to Medical News Today, the team chose these regions of the brain because prior research has shown that damage to them leads to an increase in impulsive behaviors.
The team used mice that were genetically engineered to have specialized proteins that release serotonin on exposure to photostimulation. The mice were trained to poke their noses inside a hole (to wait for a food item). Then they underwent surgery in which researchers implanted an optic fiber into the DRN.
Seventy-five percent of the mice were put through the waiting task again while a serotonin release was activated through light stimulation. The other 25 percent went into an "omission" group that received no rewards or serotonin stimulation.
To take their research to the next level, the mice were later divided into groups: one group had optic fibers inserted into the NAc, one into the OFC, and the last into the mPFC. The team then observed how each group responded to serotonin stimulation.
The results of this study suggest that serotonin plays a role in patience and impulse control.
When the research team activated the serotonergic neurons in the DRN of the mice, they displayed improved patience when waiting for food rewards. Stimulating the OFC area was almost as effective as stimulating the DRN area in promoting these prolonged wait times in the mice. However, triggering the NAc had no impact.
A particularly interesting part of this study was that, upon stimulating the mPFC region of the brain in the mice, their ability to wait for the food reward was enhanced but only when they did not know the food's arrival time. These results suggest that serotonin in the mPFC can impact the animal's ability to evaluate the time required to wait for a reward. Meanwhile, the neurochemical's presence in the OFC assists in the overall assessment of a delayed reward.
"This confirmed the idea that these two brain areas are calculating the probability of a reward independently from each other and that these independent calculations are then combined to ultimately determine how long the mice will wait," Dr. Miyazaki told Medical News Today.
In some situations, asking "what if everyone did that?" is a common strategy for judging whether an action is right or wrong.
Imagine you sneak onto the train without paying your fare. It probably won't have a big impact on the financial well-being of your local transportation system. But now ask yourself, "What if everyone did that?" The outcome is much different — the system would likely go bankrupt and no one would be able to ride the train anymore.
Moral philosophers have long believed this type of reasoning, known as universalization, is the best way to make moral decisions. But do ordinary people spontaneously use this kind of moral judgment in their everyday lives?
In a study of several hundred people, MIT and Harvard University researchers have confirmed that people do use this strategy in particular situations called "threshold problems." These are social dilemmas in which harm can occur if everyone, or a large number of people, performs a certain action. The authors devised a mathematical model that quantitatively predicts the judgments they are likely to make. They also showed, for the first time, that children as young as 4 years old can use this type of reasoning to judge right and wrong.
"This mechanism seems to be a way that we spontaneously can figure out what are the kinds of actions that I can do that are sustainable in my community," says Sydney Levine, a postdoc at MIT and Harvard and the lead author of the study.
Other authors of the study are Max Kleiman-Weiner, a postdoc at MIT and Harvard; Laura Schulz, an MIT professor of cognitive science; Joshua Tenenbaum, a professor of computational cognitive science at MIT and a member of MIT's Center for Brains, Minds, and Machines and Computer Science and Artificial Intelligence Laboratory (CSAIL); and Fiery Cushman, an assistant professor of psychology at Harvard. The paper is appearing this week in the Proceedings of the National Academy of Sciences.
The concept of universalization has been included in philosophical theories since at least the 1700s. Universalization is one of several strategies that philosophers believe people use to make moral judgments, along with outcome-based reasoning and rule-based reasoning. However, there have been few psychological studies of universalization, and many questions remain regarding how often this strategy is used, and under what circumstances.
To explore those questions, the MIT/Harvard team asked participants in their study to evaluate the morality of actions taken in situations where harm could occur if too many people perform the action. In one hypothetical scenario, John, a fisherman, is trying to decide whether to start using a new, more efficient fishing hook that will allow him to catch more fish. However, if every fisherman in his village decided to use the new hook, there would soon be no fish left in the lake.
The researchers found that many subjects did use universalization to evaluate John's actions, and that their judgments depended on a variety of factors, including the number of people who were interested in using the new hook and the number of people using it that would trigger a harmful outcome.
To tease out the impact of those factors, the researchers created several versions of the scenario. In one, no one else in the village was interested in using the new hook, and in that scenario, most participants deemed it acceptable for John to use it. However, if others in the village were interested but chose not to use it, then John's decision to use it was judged to be morally wrong.
The researchers also found that they could use their data to create a mathematical model that explains how people take different factors into account, such as the number of people who want to do the action and the number of people doing it that would cause harm. The model accurately predicts how people's judgments change when these factors change.
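The article does not reproduce the model itself, but the qualitative pattern it describes can be sketched in a few lines. Everything below — the function name, the linear scoring scheme, and the fishing-village numbers — is an illustrative assumption, not the published model:

```python
# A toy sketch of threshold-based universalization, NOT the authors'
# published model. It captures only the qualitative pattern reported in
# the article: an action becomes less acceptable as the number of people
# interested in doing it approaches the point at which harm occurs.

def universalization_judgment(num_interested: int, harm_threshold: int) -> float:
    """Score an action from 1.0 (fully acceptable) down to 0.0 (wrong).

    Assumes every interested person would perform the action if it were
    judged acceptable, so universalizing it causes harm whenever
    num_interested reaches harm_threshold.
    """
    if num_interested >= harm_threshold:
        return 0.0  # universalized, the action triggers the harmful outcome
    # Acceptability falls linearly as interest approaches the threshold.
    return 1.0 - num_interested / harm_threshold

# Illustrative numbers for the fishing-hook scenario: suppose the fish
# population collapses if 10 or more villagers adopt the new hook.
only_john = universalization_judgment(1, 10)        # little interest: high score
many_interested = universalization_judgment(9, 10)  # near threshold: low score
over_threshold = universalization_judgment(12, 10)  # harm certain: judged wrong
```

Consistent with the findings above, the score moves with both factors the model tracks: holding the harm threshold fixed, more interested villagers push the judgment toward "wrong", while a higher threshold makes the same level of interest more permissible.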
In their last set of studies, the researchers created scenarios that they used to test judgments made by children between the ages of 4 and 11. One story featured a child who wanted to take a rock from a path in a park for his rock collection. Children were asked to judge if that was OK, under two different circumstances: In one, only one child wanted a rock, and in the other, many other children also wanted to take rocks for their collections.
The researchers found that most of the children deemed it wrong to take a rock if everyone wanted to, but permissible if there was only one child who wanted to do it. However, the children were not able to specifically explain why they had made those judgments.
"What's interesting about this is we discovered that if you set up this carefully controlled contrast, the kids seem to be using this computation, even though they can't articulate it," Levine says. "They can't introspect on their cognition and know what they're doing and why, but they seem to be deploying the mechanism anyway."
In future studies, the researchers hope to explore how and when the ability to use this type of reasoning develops in children.
In the real world, there are many instances where universalization could be a good strategy for making decisions, but it's not necessary because rules are already in place governing those situations.
"There are a lot of collective action problems in our world that can be solved with universalization, but they're already solved with governmental regulation," Levine says. "We don't rely on people to have to do that kind of reasoning, we just make it illegal to ride the bus without paying."
However, universalization can still be useful in situations that arise suddenly, before any government regulations or guidelines have been put in place. For example, at the beginning of the Covid-19 pandemic, before many local governments began requiring masks in public places, people contemplating wearing masks might have asked themselves what would happen if everyone decided not to wear one.
The researchers now hope to explore the reasons why people sometimes don't seem to use universalization in cases where it could be applicable, such as combating climate change. One possible explanation is that people don't have enough information about the potential harm that can result from certain actions, Levine says.
The research was funded by the John Templeton Foundation, the Templeton World Charity Foundation, and the Center for Brains, Minds, and Machines.
Researchers explore the "complex web of connections" in your brain that allows you to make split second decisions.
- Researchers at the University of Colorado discovered the cerebellum's role in split-second decision making.
- While it was previously thought that the cerebellum alone was in charge of these decisions, it turns out that a "complex web of connections" across the brain goes into how you make choices.
- If the decision is made within 100 milliseconds (of being presented with the choice), the change of mind will succeed in altering the original course of action.
You are driving down the highway listening to music and thinking about a beach vacation in Hawaii when you realize your exit is sooner than you thought. Do you quickly change lanes to try to make your turn or do you keep going and take the next exit?
This is a split decision. Researchers at the University of Colorado Anschutz Medical Campus wanted to explore how the brain makes these quick "go/no-go" decisions. Previous research on the same topic has been covered by Medical News Today.
"We wanted to know how this kind of decision making takes place," said the study's senior author Diego Restrepo, Ph.D., professor of cell and developmental biology at the University of Colorado School of Medicine.
While it was previously thought that the cerebellum alone was in charge of these decisions, it turns out that a "complex web of connections" across the brain goes into how you make a quick decision or choice.
Susan Courtney, a professor of psychological and brain sciences, found in her 2017 study that these quick decisions involve extremely fast coordination between an area of the premotor cortex and two areas in the prefrontal cortex.
Restrepo's team looked at the cerebellum's molecular layer interneurons (MLIs). In a Pavlovian twist, mice were rewarded with sugary water after smelling a "rewarded odorant." When an "unrewarded odorant" was released into the air, they were taught to avoid licking the spout. If they took a lick, they were sent to mouse timeout. After a few rounds, the mice learned the trick: lick to this smell, scurry away at that smell. Then Restrepo messed with their brains by throwing in chemogenetic agents that threw off their olfactory sense.
The cerebellum, or "little brain," sits just above the brainstem. Traditionally, it is responsible for the coordination of voluntary movements as well as motor functions like balance, posture, and coordination. It has also been linked to emotional control, such as in our fear and pleasure responses, and is associated with non-motor conditions such as autism spectrum disorders.
Research shows that damage to the cerebellum results in a variety of issues, including difficulty balancing, errors in the force and speed of movement, gait impairment, and even decreased muscle tone. As the cerebellum changes with age, it's especially important to exercise in order to keep that region functioning optimally for as long as possible.
This understanding of the cerebellum's role in decision-making is new. Because the mice became less confident in their choices after the release of those agents, it appears the cerebellum is partly responsible for quick decision-making responses.
Restrepo notes that the cerebellum is responsible for a lot of learning—perhaps unsurprisingly, given its proximity to the spinal cord and its influence on motor patterns. Split-second decisions are an old evolutionary necessity and would have started evolving quite early on. As he says,
"We found an entire subset of brain cells that change after learning. It sheds further light on how the cerebellum functions and the complex web of connections that go into quick decision making."
How long does it take to make a split decision and have good results?
The researchers in Susan Courtney's study highlighted that timing is everything when it comes to these quick decisions. If the decision to change is made within 100 milliseconds of being presented with the choice, the change of mind will succeed in altering the original course of action. However, if it takes 200 milliseconds or more, the chances of the change succeeding are significantly lower.
Why do you feel the way you feel, think the way you think and behave the way you do? Here are 5 possible explanations.
- Psychology is the scientific study of the mind and behavior, but did you know there are actually 5 different perspectives to psychology?
- The earliest study of human psychology can be traced back to 400-500 BC.
- The biological approach, the psychodynamic approach, the behavioral approach, the cognitive approach, and the humanistic approach offer valid yet opposing ideas on why humans behave the way we do.
Human beings are fascinating. Have you ever wondered why some people can remember certain dates really well and others can't? Or why you are a home-body while your best friend is usually the life of the party? What makes us think, feel, and behave the way we do?
According to Simply Psychology, psychology is "the scientific study of the mind and behavior." The art of studying the psychology of humans can be traced back to as early as 400-500 BC, while modern psychology is said to have started in 1879 when Wilhelm Wundt opened the first psychology lab.
Wundt's laboratory would become a focus for those with a serious interest in psychology, first opening its doors to German philosophers and psychology students, then to American and British students as well. Wundt's aim was to record thoughts and sensations and to analyze them into their constituent elements, in much the same way a chemist would analyze chemical compounds, in order to get to the underlying structure.
Psychology’s five major perspectives: Why are you the way you are?
There are five approaches to human psychology - which one do you trust most?
The study of psychology has progressed greatly, thanks to Wundt and other pioneers. Over the years, psychologists began to study all aspects of human behavior from personality traits to brain functions. Eventually, the studies began to look at the same human behaviors from various angles including biological, psychodynamic, behavioral, cognitive, and humanistic perspectives. These became known as the "five major perspectives" in psychology.
The biological approach
The biological approach to psychology focuses on examining our thoughts, feelings, and behaviors from a strictly biological point of view. In this approach, all thoughts, feelings, and behaviors would have a biological cause.
This approach is relevant to the study of psychology in three ways:
- Comparative method: different species of animals can be studied and then compared to each other. This helps us better understand human behavior.
- Physiology: the study of how the nervous system and hormones work, how the brain functions, how changes in the structure and/or function can affect our behavior. For example, how prescribed drugs to treat depression can affect our behavior through their interaction with the nervous system.
- Investigation of inheritance: the study of what we inherit from our parents (through genetics). For example, whether high intelligence is inherited from one generation to the next.
Each of these is inherently important to how we study human psychology from a biological point of view, and it's suggested that behavior can be largely explained through biology.
The psychodynamic approach
The psychodynamic approach to psychology is most well-known for its ties to Sigmund Freud and his followers. This approach includes all theories in psychology that see human functioning as based on the interaction of drives and forces within the person — particularly unconscious ones — and between the different structures of the personality.
Freud developed a collection of theories (most of which were based on what his patients told him during therapy) that formed the basis of the psychodynamic approach.
The psychodynamic approach is best described by its basic assumptions:
- Our behavior and feelings are powerfully affected by unconscious motives.
- Our behavior and feelings as adults are rooted in childhood experiences.
- All behavior has a cause, and that cause is usually an unconscious one.
- Personality is made of three parts (the id, ego, and superego).
The behavioral approach
The behavioral approach to psychology focuses on how one's environment and external stimuli impact a person's mental states and development. More importantly, it focuses on how these factors specifically "train" us for the behaviors we exhibit later on.
People who support this approach to psychology over others may believe that the concept of "free will" is an illusion, because all behaviors are learned and based on our past experiences. In other words, we've been conditioned to act the way we act, so nothing is ever truly our own choice.
The cognitive approach
The cognitive approach to psychology shifts away from conditioned behavior and psychoanalytical notions to the study of how our mind works, how we process information, and how we use that processed information to drive our behaviors.
This approach focuses on:
- The mediational processes that occur between a stimulus and our response to it.
- Human beings are information processors and all learning is based on the relationships we form with various stimuli.
- Internal mental behavior can be scientifically studied using experiments that show us how we react to certain stimuli.
In other words, the cognitive approach focuses on how our brains react to the environment around us and how our cognitive brain has very specific ways of processing certain stimuli which can explain why we think, feel and behave in certain ways.
The humanistic approach
The humanistic approach to psychology was considered something of a rebellion against what psychologists saw as the limitations of the behaviorist and psychodynamic theories of psychology. It's the idea that we should approach psychological studies uniquely for each individual because we are all so vastly different.
This approach focuses on:
- The idea that we all have free will.
- The idea that people are all basically good and that we have an innate need to make ourselves and the world better.
- That we are motivated to self-actualize, grow, and thrive.
- That our experiences are what drive us.
This approach puts emphasis on the uniqueness of every person and every situation, suggesting that the other studies can never be fully accurate as there is such a wide range of thoughts, feelings, and human behaviors that can adapt and change as we do.