Big Think's Top 25 +1 Videos


Mama, Don't Let Your Babies Grow Up To Deny Evolution

If adults want to deny evolution, sure. That’s fine. Whatever. But those adults had better not make their kids follow suit, because we as a society need them to be better. Bill Nye, everyone's favorite Science Guy, explains the importance of promoting evolution education for America's future voters and lawmakers.

My Man, Sir Isaac Newton

Are you at least 26 years old? If so, you are older than Isaac Newton was when he invented calculus... on a dare! (If you're younger than 26, better hurry up.) Big Think expert and overall cool guy Neil deGrasse Tyson explains why Newton is the greatest physicist who has ever lived, and likely ever will.

Will Mankind Destroy Itself?

Theoretical physicist Michio Kaku sees two major trends today. One eventually leads to a multicultural, scientific, tolerant society that will expand beyond Earth in the name of human progress. The other trend leads to fundamentalism, monoculturalism, and -- eventually -- civilizational ruin. Whichever of these two trends wins out will determine the fate of mankind. No pressure, everyone.

Ricky Gervais on the Principles of Comedy

Comedy isn't just about making people laugh, says actor Ricky Gervais. It's about making people think. And while different forms of comedy require different approaches, the crux of any good performance will always be rhythm.

Reading the Bible (Or the Koran, Or the Torah) Will Make You an Atheist

Author and magician Penn Jillette was asked to leave his Christian youth group by a pastor who told his parents: "He's no longer learning about the Bible from me. He is now converting everyone in the class to atheism." The reason? Jillette did his homework and was turned off by the hostility he found in the text. It can be intimidating to come out as an atheist, especially in a religious community. Jillette found that having "out" atheist role models helped him feel less alone.

Henry Rollins: The One Decision that Changed My Life Forever

Punk legend Henry Rollins describes the biggest turning point in his life: the moment he decided to leave his job as manager of a Häagen-Dazs store and eventually become the lead singer of Black Flag. It was the courage to take a risk, plus a whole lot of luck, that got Rollins to where he is today.

5 Programming Languages Everyone Should Know

Java is "heavyweight, verbose, and everyone loves to hate it," but programmer Larry Wall still thinks you should know it. In this video, he offers suggestions for people interested in learning languages, as well as suggestions for those significantly less invested in computer programming.

The Importance of Unbelief

If you assume there’s no afterlife, Stephen Fry says, you’ll likely have a fuller, more interesting "now" life. The actor and comedian details the positive influence philosophers have had on his life, as well as his journey of understanding both what he believes and why he believes it.

Why be happy when you could be interesting?

We don't really want what we think we desire, says philosopher Slavoj Žižek.

James Gleick on the Common Character Traits of Geniuses



The personalities of Isaac Newton and Richard Feynman were, on one level, extremely different. Biographer and former New York Times reporter James Gleick says Newton was argumentative, had few friends, and likely died a virgin. Feynman, on the other hand, loved dancing and going to parties, and had many friends in the scientific community. But when it came to their working habits, both men were solitary and could concentrate with an intensity that is hard for mortals to grasp. At bottom, Gleick says, geniuses tend to have a yearning for solitude which, though fruitful for their professional work, makes the task of daily living more burdensome.

The Importance of Doing Useless Things

From poetry and ballet to mathematics and being clever, life is laden with frivolous pursuits that have no bearing on our ability to survive. Yet, insists Richard Dawkins, if it weren’t for the development of these impractical activities, we wouldn’t be here.

Why monogamy is ridiculous

Dan Savage: the idea that one instance of infidelity should ruin a relationship is a new—and misguided—notion.

Dan Harris: Hack Your Brain's Default Mode With Meditation

Dan Harris explains the neuroscience behind meditation, but reminds us that the ancient practice isn't magic and likely won't send one floating into the cosmic ooze. He predicts that the exercise will soon become regularly scheduled maintenance, as commonplace as brushing your teeth or eating your veggies. Harris, an ABC News correspondent, was turned on to meditation after a live, on-air panic attack. His latest book is 10% Happier: How I Tamed the Voice in My Head, Reduced Stress Without Losing My Edge, and Found Self-Help That Actually Works--A True Story.

How Intellectuals Betrayed the Poor

For 40 years academics were duped into idolizing the idea of unfettered markets, says Cornel West, and now our society is paying a terrible price.

Why Some Races Outperform Others

A psychologist explains the latest research into education disparity.

Why It's So Hard for Scientists to Believe in God

Some scientists see religion as a threat to the scientific method that should be resisted. But faith "is really asking a different set of questions," says Francis Collins.

Why Facebook Isn't Free

Internet pioneer Jaron Lanier argues that free technologies like Facebook come with a hidden and heavy cost – the livelihoods of their consumers.

How to Tell if You’re a Writer

For John Irving, the need for a daily ration of solitude was his strongest "pre-writing" moment as a child.

Your Behavior Creates Your Gender

Nobody is born one gender or the other, says philosopher Judith Butler. "We act and walk and speak and talk in ways that consolidate an impression of being a man or being a woman."

Are You a Liberal Snob? Take The Quiz

Charles Murray designed this quiz to bring to people’s attention the degree to which they live in a bubble that seals them off from an awful lot of their fellow American citizens.

Why You Should Watch Filth

John Waters defends the creation and consumption of obscene films, and recommends some of his personal favorites.

What Are You Worth? Getting Past Status Anxiety.

Writer Alain de Botton says that status anxiety is more pernicious and destructive than most of us can imagine, and recommends getting out of the game altogether.

Sheila Heen on the Psychology of Happiness and Feedback

Sheila Heen, a Partner at Triad Consulting Group and a lecturer on Law at Harvard Law School, explains the psychology behind feedback and criticism. Heen is co-author of "Thanks for the Feedback: The Science and Art of Receiving Feedback Well."

Are You a Psychopath? Take the Test.

Psychologist Kevin Dutton presents the classic psychological test known as "the trolley problem" with a variation. Take the test and measure your response on the psychopathic spectrum.

Here's How to Catch a Liar, If You Really Want To

It’s complicated, whether or not we really want to catch a liar. We think we do. What if we find out that both of our presidential candidates are lying? Then what do we do? I’m not saying they are; I never comment on anyone in office or running for office. Only after they’re out of office are they fair game. . . . Clinton said, "I didn’t have sex with that woman" and then gave her name. Calling her "that woman" put her at a distance from himself.

Why I Came Out at Age 81

As a teenager in the '40s, James Randi "would have gotten stoned" for being gay. But when he outed himself to the world in 2010, the reaction was "wonderful."

  • Polarization and extreme partisanship have been on the rise in the United States.
  • Political psychologist Diana Mutz argues that we need more deliberation, not political activism, to keep our democracy robust.
  • Despite increased polarization, Americans still have more in common than we appear to.


Imagine everyday citizens engaging in the democratic process. What images spring to mind? Maybe you thought of town hall meetings where constituents address their representatives. Maybe you imagined mass sit-ins or marches in the streets to protest unpopular legislation. Maybe it's grassroots organizations gathering signatures for a popular referendum. Though they vary in means and intensity, all these have one thing in common: participation.

Participatory democracy is a democratic model that emphasizes civic engagement as paramount for a robust government. For many, it's both the "hallmark of social movements" and the gold standard of democracy.

But all that glitters may not be gold. While we can all point to historical moments in which participatory democracy was critical to necessary change, such activism can have deleterious effects on the health of a democracy, too. One such byproduct, political psychologist Diana Mutz argues, can be a lessening of political tolerance.

Participation or deliberation?

In her book Hearing the Other Side: Deliberative Versus Participatory Democracy, Mutz argues that participatory democracy is best supported by close-knit groups of like-minded people. Political activism requires fervor to rouse people to action. To support such passions, people surround themselves with others who believe in the cause and view it as unassailable.

Alternative voices and ideologies — what Mutz calls "cross-cutting exposures" — are counterproductive to participation because they don't reinforce the group's beliefs and may soften the image of the opposing side. This can dampen political zeal and discourage participation, particularly among those averse to conflict. To prevent this from happening, groups can become increasingly intolerant of the other side.

"You can have a coup and maximize levels of participation, but that wouldn't be a great thing to do. It wouldn't be a sign of health and that things were going well."

As the book's title suggests, deliberative democracy fosters a different outlook for those who practice it. This model looks toward deliberation, communication, compromise, and consensus as the signs of a resilient democracy. While official deliberation is the purview of politicians and members of the court, it's worth noting that deliberative democracy doesn't mean inactivity from constituents. It's a philosophy we can use in our daily lives, from community memberships to interactions on social media.

"The idea is that people learn from one another," Mutz tells Big Think. "They learn arguments from the other side as well as learn more about the reasons behind their own views. [In turn], they develop a respect for the other side as well as moderate their own views."

Mutz's analysis leads her to support deliberation over activism in U.S. politics. She notes that the homogeneous networks required for activism can lead to positive changes — again, there are many historical examples to choose from. But such networks also risk developing intolerance and extremism within their ranks, examples of which are also readily available on both the right and left.

Meanwhile, the cross-cutting networks required for deliberative democracy offer a bounty of benefits, with the only risk being lowered levels of participation.

As Mutz writes: "Hearing the other side is also important for its indirect contributions to political tolerance. The capacity to see that there is more than one side to an issue, that political conflict is, in fact, a legitimate controversy with rationales on both sides, translates to greater willingness to extend civil liberties to even those groups whose political views one dislikes a great deal."

Of politics and summer camp

[Photo: a boxing bout between two members of a schoolboys' summer camp at Pendine, South Wales, inside a ring of cheering campmates. Fox Photos/Getty Images]

Of course, listening openly and honestly to the other side doesn't come naturally. Red versus blue. Religious versus secular. Rural versus cosmopolitan. We divide ourselves into polarized groups that seek to silence cross-cutting communication in the pursuit of political victory.

"The separation of the country into two teams discourages compromise and encourages an escalation of conflict," Lilliana Mason, assistant professor of Government and Politics at the University of Maryland, writes in her book Uncivil Agreement: How Politics Became Our Identity. "The cooperation and compromise required by democracy grow less attainable as partisan isolation and conflict increase."

Mason likens the current situation to Muzafer Sherif's famous Robbers Cave Experiment.

In the early 1950s, Sherif gathered a group of boys for a fun summer camp at Robbers Cave State Park, Oklahoma. At least, that was the pretense. In reality, Sherif and his counselors were performing an experiment in intergroup conflict that would now be considered unethical.

The 20 boys were divided into two groups, the Rattlers and the Eagles. For a while, the counselors kept the groups separate, allowing the boys to bond only with their assigned teammates. Then the two groups were brought together to compete in a tournament. They played competitive games, such as baseball and tug-of-war, with the winning team promised the summer camp trophy.

Almost immediately, the boys identified members of the other team as intruders. As the tournament continued, the conflict escalated beyond sport. The Eagles burned a Rattlers flag. The Rattlers raided the Eagles' cabin. When asked to describe the other side, both groups showed in-group favoritism and out-group aggression.

Most troubling, the boys wholly assumed the identity of an Eagle or Rattler despite having never been either before that very summer.

"We, as modern Americans, probably like to think of ourselves as more sophisticated and tolerant than a group of fifth-grade boys from 1954. In many ways, of course, we are," Mason writes. "But the Rattlers and the Eagles have a lot more in common with today's Democrats and Republicans than we would like to believe."

Like at Robbers Cave, signs of incendiary conflict are easy to spot in U.S. politics today.

"Political Polarization in the American Public", Pew Research Center, Washington, D.C. (June 12, 2014)

A 2014 Pew survey found that Democrats and Republicans are further apart ideologically than in the past. More Republicans now lie to the right of the moderate Democrat than before, and vice versa. The survey also found that partisan animosity had doubled since 1994.

In her book, Mason points to research that shows an "increasing number of partisans don't want party leaders to compromise," blame "the other party for all incivility in government," and abhor the idea of dating someone from outside their ideological group.

And let's not forget Congress, which has grown increasingly divided along ideological lines over the past 60 years.

A dose of daily deliberation

[Image: Horace, Virgil and Varius at the house of Maecenas, painted by Charles François Jalabert (1819-1901), 1846. Beaux-Arts museum, Nîmes, France. Photo by Leemage/Corbis via Getty Images]

A zero-sum mindset may be inevitable in a summer camp tournament, but it's detrimental if taken into wider society and politics. Yet if participatory democracy leads to the silencing of oppositional voices, a zero-sum mindset is exactly what we get. Conversely, creating networks that tolerate and support differing opinions offers non-zero-sum benefits, like greater tolerance and an improved understanding of complicated issues.

Mutz wrote her book in 2006, but as she told us in our interview, the intervening years have only strengthened her resolve that deliberation improves democratic health:

"Right now, I'm definitely on the side of greater deliberation rather than just do whatever we can to maximize levels of participation. You can have a coup and maximize levels of participation, but that wouldn't be a great thing to do. It wouldn't be a sign of health and that things were going well. Democracy [must be] able to absorb differences in opinion and funnel them into a means of governing that people were okay with, even when their side didn't win."

Unfortunately, elected officials and media personalities play up incivility and the sense of national crisis for ratings and attention, respectively. That certainly doesn't help promote deliberation, but as Mutz reminded us, people perceive political polarization to be much higher than it actually is. In our daily lives, deliberative democracy is more commonplace than we realize and something we can promote in our communities and social groups.

Remember that 2014 Pew survey that found increased levels of partisan animosity? Its results showed the divide to be strongest among those most engaged and active in politics. The majority of those surveyed did not hold uniform left or right views, did not see the opposing party as an existential threat, and believed in the deliberative process in government. In other words, the extremes were pulling hard at the poles.

Then there's social media. The popular narrative is that social media is a morass of political hatred and clashing identities. But most social media posts have nothing to do with politics. An analysis of Facebook posts from September 2016, the middle of an election year, found the most popular topics centered on football, Halloween, Labor Day, country music, and slow cookers.

And what of political partisanship and prejudice? In an analysis of polarization and ideological identity, Mason found that labels like "liberal" and "conservative" had less to do with values and policy attitudes – as the majority of Americans agree on a substantial number of issues – and more to do with social group identification.

Yes, we all know those maps that media personalities dust off every election year, the ones that show the U.S. carved up into competing camps of red and blue. The reality is far more intricate and complex, and Americans' intolerance for the other side varies substantially from place to place and across demographics.

So while participation has its place, a healthy democracy requires deliberation, a recognition of the other side's point of view, and the willingness to compromise. Tolerance may not make for good TV or catchy political slogans, but it's something we all can foster in our own social groups.


Imagine you're dining out at a casual restaurant with some friends. After looking over the menu, you decide to order the steak. But then, after a dinner companion orders a salad for their main course, you declare: "I'll have the salad too."


This kind of situation – making choices that you probably otherwise wouldn't make were you alone – probably happens more often than you think in a wide variety of settings, from eating out to shopping and even donating to charity. And it's not just a matter of you suddenly realizing the salad sounds more appetizing.

Prior research has shown that people have a tendency to mimic the choices and behaviors of others. But other work suggests people sometimes want to do the exact opposite: signal their uniqueness within a group by making a different choice from everyone else.

As scholars who examine consumer behavior, we wanted to resolve this discrepancy: What makes people more likely to copy others' behavior, and what leads them to do their own thing?

A social signal

We developed a theory that how and why people match or mimic others' choices depends a lot on the attributes of the thing being selected.

Choices have what we call "ordinal" attributes that can be ranked objectively – such as size or price – as well as "nominal" attributes that are not as easily ranked – such as flavor or shape. We hypothesized that ordinal attributes have more social influence, alerting others to what may be seen as "appropriate" in a given context.

Nominal attributes, on the other hand, would seem to be understood as a reflection of one's personal preferences.

So we performed 11 studies to test our theory.

Size may be social, but flavor remains a personal choice.

One scoop or two

In one study conducted with 190 undergraduate students, we told participants that they were on their way to an ice cream parlor with a friend to get a cone. We then told our would-be ice cream consumers that their companion was getting either one scoop of vanilla, one scoop of chocolate, two scoops of vanilla or two scoops of chocolate. We then asked participants what they wanted to order.

We found that people were much more likely to order the same size as their companion but not the same flavor.

The participants seemed to interpret the number of scoops the companion ordered as an indication of what's appropriate. For example, ordering two scoops might signal "permission" to indulge or seem the more financially savvy – if less healthy – choice, since it usually costs only marginally more than one. Or a single scoop might suggest "let's enjoy some ice cream – but not too much."

The choice of chocolate or vanilla, on the other hand, is readily understood as a personal preference and thus signals nothing about which is better or more appropriate. I like vanilla, you like chocolate – everyone's happy.

We also asked participants to rate how important avoiding social discomfort was in their decision. Those who ordered the same number of scoops as their companion rated it as more important than those who picked a different amount.
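
To make the ordinal/nominal distinction concrete, here is a minimal simulation sketch in Python. It is not the authors' analysis, and every number in it is invented: hypothetical diners copy the ordinal attribute (scoop count) about 75 percent of the time while picking the nominal attribute (flavor) independently, and we then measure the resulting match rates.

```python
# Illustrative simulation only -- all numbers are hypothetical, not study data.
import random

random.seed(0)

SIZES = [1, 2]                      # ordinal attribute: number of scoops
FLAVORS = ["vanilla", "chocolate"]  # nominal attribute: flavor

def participant_choice(companion_size, companion_flavor):
    """Toy choice model: size is socially matched ~75% of the time;
    flavor is picked independently of the companion."""
    size = companion_size if random.random() < 0.75 else random.choice(SIZES)
    flavor = random.choice(FLAVORS)
    return size, flavor

pairs = []
for _ in range(1000):
    c_size, c_flavor = random.choice(SIZES), random.choice(FLAVORS)
    pairs.append(((c_size, c_flavor), participant_choice(c_size, c_flavor)))

size_match = sum(c[0] == p[0] for c, p in pairs) / len(pairs)
flavor_match = sum(c[1] == p[1] for c, p in pairs) / len(pairs)
print(f"size match rate:   {size_match:.2f}")   # well above the 0.5 chance level
print(f"flavor match rate: {flavor_match:.2f}") # near the 0.5 chance level
```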

Examining other contexts

In the other studies, we replicated our results using different products, in various settings and with a variety of ordinal and nominal attributes.

For example, in another experiment, we gave participants US$1 to buy one of four granola bars from a mock store we set up inside the University of Pittsburgh's Katz/CBA Business Research Center. As the ordinal attribute, we used brand prestige: They could pick either a more expensive well-known national brand or a cheaper one sold by a grocery store under its own label. Our nominal attribute was chocolate or peanut butter.

Before making the choice, a "store employee" stationed behind the checkout register told participants she or he had tested out a granola bar, randomly specifying one of the four – without saying anything about how it tasted. We rotated which granola bar the employee mentioned every hour during the five-day experiment.

Similar to the ice cream study, participants tended to choose the brand that the employee said he or she had chosen – whether it was the cheaper or pricier one – but ignored the suggested flavor.

Moving away from food, we also examined influences on charitable donations. In this study, we recruited online participants who were paid for their time. In addition, we gave each participant 50 cents to either keep or donate to charity.

If they chose to donate the money, they could give all of it or half to a charity focused on saving either elephants or polar bears. Before they made their choice, we told them what another participant had supposedly decided to do with their money – randomly based on one of the four possibilities.

The results were the same as in all our other studies, including ones we conducted involving different brands and shapes of pasta and varieties and taste profiles of wine. People matched the ordinal attribute – in this case the amount – but paid little heed to the nominal attribute – the chosen charity – which remained a personal preference.
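
Given raw counts from studies like these, a standard way to check whether matching depends on attribute type is a chi-square test of independence. A minimal sketch, assuming scipy and using invented counts rather than the paper's figures:

```python
# Invented counts for illustration -- not data from the actual studies.
# Rows: attribute type; columns: matched the other person / did not.
from scipy.stats import chi2_contingency

table = [
    [140, 60],   # ordinal attribute (e.g., donation amount)
    [105, 95],   # nominal attribute (e.g., chosen charity)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value would suggest the match rate differs by attribute type.
```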

These kinds of social cues regarding others' choices are everywhere, from face-to-face interactions with friends to online tweets or Instagram posts, making it difficult to escape the influence of what others do on our own consumption choices.

And if we believe we're making our companions feel more comfortable while still choosing something we like, what's the harm in that?

Kelly L. Haws, Associate Professor of Marketing, Vanderbilt University; Brent McFerran, W. J. VanDusen Associate Professor, Marketing, Simon Fraser University, and Peggy Liu, Assistant Professor of Business Administration, University of Pittsburgh

This article is republished from The Conversation under a Creative Commons license. Read the original article.

  • A Twitter account claiming to represent Google employees interested in the strike says more than 400 workers have so far pledged to join the protests.
  • Global Climate Strike is a worldwide protest calling for urgent action on climate change.
  • Google has recently faced criticism for its partnerships with oil and gas companies.


Google employees plan to join fellow tech workers from Amazon and Microsoft on Friday to protest the fossil fuel industry as part of the Global Climate Strike.

Global Climate Strike is a climate change protest scheduled to occur in at least 150 countries from Sept. 20 to 27, coinciding with the United Nations' 2019 Climate Action Summit on Sept. 23. The protests are calling for "climate justice for everyone" and an end to the use of fossil fuels.

"We need to act right now to stop burning fossil fuels and ensure a rapid energy revolution with equity, reparations and climate justice at its heart," reads a statement on the Global Climate Strike website.

As of Tuesday afternoon, more than 400 Google employees had pledged to walk out on Friday, according to the Google Workers for Action on Climate Change Twitter account. It's unclear exactly what the employees are demanding from their company, but the account has shown protest signs calling for zero contracts with oil and gas companies.

Despite its progressive public stance on climate change, Google has faced criticism in recent years for partnering with oil and gas companies in an attempt to use technology to help automate the industry. The company has also established an oil, gas and energy division, headed by Darryl Willis, a 25-year veteran of BP. The Wall Street Journal described this division as "part of a new group Google has created to court the oil and gas industry." Microsoft and Amazon have made similar partnerships.

Jack Kelly, who runs the nonprofit Open Climate Fix and formerly worked as a research engineer at Google's DeepMind, tweeted:

"I'm so pleased this is happening. Google Cloud's enthusiastic sales pitch to upstream oil & gas producers heavily influenced my decision to leave Google."

Amazon employees, meanwhile, have tied three key demands to Friday's walkout, according to Amazon Employees for Climate Justice:

  • Zero emissions by 2030: Pilot electric vehicles first in communities most impacted by our pollution
  • Zero custom Amazon Web Services (AWS) contracts for fossil fuel companies to accelerate oil and gas extraction
  • Zero funding for climate denying lobbyists and politicians
The upcoming walkout comes less than a year after more than 20,000 Google employees walked off the job to protest the company's decision to give seven-figure exit packages to male executives accused of sexual misconduct. As big tech companies see their employees engage in activism more often, Silicon Valley workers may feel increasingly empowered to voice philosophical and political complaints against these behemoth organizations without fear of consequences.
After all, research continues to show that people are willing to make significant sacrifices in order to pursue meaningful work. The question that tech employees are increasingly raising is: Are their employers willing to do the same?
  • Along with Maslow, Carl Rogers helped pioneer the field of humanistic psychology.
  • Although most associate the term "self-actualization" with Maslow, it's a concept that's frequently found in humanistic psychological literature.
  • What's the difference between Maslow's and Rogers' versions of self-actualization, and what can we learn from Rogers?

One could be forgiven for thinking that the term "self-actualization" was developed entirely by Abraham Maslow. Today, there are very few contexts where one can hear the term outside of Maslow's famous hierarchy of needs. But in fact, the twentieth century featured many humanistic psychologists who used the term to mean one thing or another. It was first coined by psychologist Kurt Goldstein, who used it to refer to something very similar to what Maslow would later focus on: the tendency for human beings to become all that they can, that "what a man can be, he must be."

But this isn't the only take on self-actualization. Carl Rogers, a peer of Maslow's, thought of humanistic psychology and self-actualization in an entirely distinct way.

Rogers' theory of personality and behavior

[Image: a sketch of Carl Rogers. Jan Rieckhoff/ullstein bild via Getty Images]

Along with Maslow, Rogers was one of the pioneers of humanistic psychology. Specifically, Rogers' greatest contribution was to the practice of psychotherapy, particularly in the development of what's known as "person-centered therapy," which is thought of today as one of the major approaches to therapy, along with cognitive behavioral therapy, psychoanalysis, and so on.

At the core of this therapeutic approach was Rogers' theory of personality and behavior. Just as Maslow had his hierarchy of needs, with self-actualization at the top, Rogers had his own model of human development, although self-actualization played a very different role in Rogers' system. Rogers actually had 19 separate propositional statements upon which he built his theory, but we'll just summarize the major components.

In Rogers' theory, reality for an individual (which he refers to as an organism) is the sum of subjective perceptions that the organism experiences. A developing organism will take some of these perceptions and separate them, labeling them as the self. As an example, you might perceive your body and a box of paperclips on your desk, but you would only consider your perception of your body to fall under the designation of "self." This happens with concepts and beliefs, too. Some of these things become part of the self, while others are perceived as belonging to the environment.

This idea of what counts as the self and what doesn't isn't fixed; it's fluid. Different concepts, perceptions, and experiences occur as a result of interacting with the environment, and the organism has to sort out how to relate their identity to these experiences.

Naturally, this isn't a smooth process. As a result of these interactions, most of us invent an "ideal" self, the person we think we should be, rather than the person we actually are. In Rogers' system, the broader the gap between the real self and the ideal self, the greater the sense of incongruence. All sorts of behaviors and experiences might occur that seem unacceptable to who we think we are. If this incongruence is severe enough, the organism might develop a psychopathology. If, on the other hand, the person we actually are and the person we think we should be are congruent with one another, we become more open to experiences and have to do less work defending ourselves from the outside world.

Where does self-actualization fit into all of this?

Where Maslow had self-actualization at the very top of a hierarchy of motivations, Rogers argued that self-actualization was the only motivation and that it was constantly driving the organism forward. "The organism has one basic tendency and striving — to actualize, maintain, and enhance the experiencing organism," wrote Rogers. For Rogers, every behavior and motivation is directed in pursuit of actualization, of this constant negotiation between the self and the perceptual field that composes an individual's reality.

Immediately, we can see that Maslow's version of self-actualization is a lot more aspirational. In Rogers' system, self-actualization is sort of just the default way of life — the only way of life, in fact. And where Maslow's version is sort of an endpoint, Rogers saw self-actualization as a never-ending process. But Rogers does have his own version of an ideal way of living, which he called, appropriately enough, "the good life."

Living the good life

In order to live the good life, an organism must symbolically assimilate all experiences into a consistent relationship with the self. To be fair, that's not exactly an intuitive definition. Consider, for instance, a narcissist hearing criticism. The narcissist perceives themselves to be perfect, and criticism is a threat that cannot be assimilated into their concept of their perfect self. Somebody living the "good life," however, might take that criticism as potentially true — potentially false, too, but worth considering at the very least.

In this regard, somebody living the good life matches up neatly with Maslow's idea of the self-actualized individual. Like Maslow, Rogers also believed that individuals living the good life would exemplify certain characteristics that would make them distinct from the fragile, neurotic, rank-and-file folks that most of us are. According to Rogers, the fully functioning person living the good life would have these characteristics:

  • An increasing openness to experience, as no experience would threaten the individual's self-concept;
  • An increasingly existential and present lifestyle, since they wouldn't need to distort the present in a way that fits with their self-concept;
  • Greater trust in their own values rather than those imposed upon them by, say, their parents or their society;
  • Openness to a wide variety of choices, as they wouldn't be restricted by possible threats to their self-concept (such as a narcissist might be if they engaged in some activity that could make them appear foolish);
  • More creativity, as they wouldn't feel the need to conform;
  • Behavior that is more often constructive than destructive;
  • And living a rich and full life.

Seems like a pretty good life, all in all. But Rogers also warned that not everybody is ready for the good life. He wrote,

"This process of the good life is not, I am convinced, a life for the faint-hearted. It involves the stretching and growing of becoming more and more of one's potentialities. It involves the courage to be. It means launching oneself fully into the stream of life."

On Becoming a Person: A Therapist's View of Psychotherapy


  • "Super-agers" seem to escape the decline in cognitive function that affects most of the elderly population.
  • New research suggests this is because of higher functional connectivity in key brain networks.
  • It's not clear what the specific reason for this is, but research has uncovered several activities that encourage greater brain health in old age.

At some point in our 20s or 30s, something starts to change in our brains. They begin to shrink a little bit. The myelin that insulates our nerves begins to lose some of its integrity. Fewer and fewer chemical messages get sent as our brains make fewer neurotransmitters.

As we get older, these processes intensify. Brain weight decreases by about 5 percent per decade after 40. The frontal lobe and hippocampus — areas related to memory encoding — begin to shrink, mostly starting around age 60 or 70. But this is just an unfortunate reality; you can't always be young, and things will begin to break down eventually. That's part of the reason why some individuals think that we should all hope for a life that ends by 75, before the worst effects of time set in.
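
As a quick sanity check on that figure: if the roughly 5-percent-per-decade loss compounds (the article doesn't specify compounding versus linear), the cumulative decline from age 40 to 90 works out to about 23 percent. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope only; assumes the ~5%-per-decade loss compounds.
weight = 1.0  # normalized brain weight at age 40
for age in range(50, 91, 10):
    weight *= 0.95
    print(f"age {age}: ~{weight:.2f} of the age-40 weight")
# Ends near 0.77 at age 90, i.e. roughly a 23% cumulative decline.
```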

But this might be a touch premature. Some lucky individuals seem to resist these destructive forces working on our brains. In cognitive tests, these 80-year-old "super-agers" perform just as well as individuals in their 20s.

Just as sharp as the whippersnappers

To find out what's behind the phenomenon of super-agers, researchers conducted a study examining the brains and cognitive performances of two groups: 41 young adults between the ages of 18 and 35 and 40 older adults between the ages of 60 and 80.

First, the researchers administered a series of cognitive tests, like the California Verbal Learning Test (CVLT) and the Trail Making Test (TMT). Seventeen members of the older group scored at or above the mean scores of the younger group. That is, these 17 could be considered super-agers, performing at the same level as the younger study participants. The rest of the older group tended to perform less well on the cognitive tests.

Then, the researchers scanned all participants' brains in an fMRI scanner, paying special attention to two portions of the brain: the default mode network and the salience network.
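
The selection rule described here, older adults scoring at or above the younger group's mean, is easy to express in code. A minimal sketch with synthetic scores (numpy assumed; the real study used batteries like the CVLT and TMT rather than one composite number):

```python
# Synthetic scores for illustration; not the study's data.
import numpy as np

rng = np.random.default_rng(42)
young = rng.normal(loc=100, scale=10, size=41)  # 41 adults aged 18-35
old = rng.normal(loc=90, scale=12, size=40)     # 40 adults aged 60-80

threshold = young.mean()             # the younger group's mean score
super_agers = old[old >= threshold]  # "super-agers" meet or beat it
print(f"threshold (young mean): {threshold:.1f}")
print(f"super-agers: {super_agers.size} of {old.size} older adults")
```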

The default mode network is, as its name might suggest, a series of brain regions that are active by default — when we're not engaged in a task, they tend to show higher levels of activity. It also appears to be closely tied to thinking about one's self, thinking about others, and aspects of memory and imagining the future.

The salience network is another network of brain regions, so named because it appears deeply linked to detecting and integrating salient emotional and sensory stimuli. (In neuroscience, saliency refers to how much an item "sticks out"). Both of these networks are also extremely important to overall cognitive function, and in super-agers, the activity in these networks was more coordinated than in their peers.
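
In fMRI work, "more coordinated" activity is typically quantified as functional connectivity: the correlation between regions' activity over time. Here is a toy numpy sketch of that core calculation, using synthetic signals rather than anything from the study's actual pipeline:

```python
# Toy functional-connectivity calculation on synthetic time series.
# Real pipelines add preprocessing and parcellation; this is just the core idea.
import numpy as np

rng = np.random.default_rng(0)
T = 200                        # number of timepoints

shared = rng.normal(size=T)    # fluctuation common to one network
region_a = shared + 0.5 * rng.normal(size=T)
region_b = shared + 0.5 * rng.normal(size=T)
region_c = rng.normal(size=T)  # a region outside the network

def connectivity(x, y):
    """Pearson correlation between two regional time series."""
    return np.corrcoef(x, y)[0, 1]

print(f"A-B (same network):      {connectivity(region_a, region_b):.2f}")  # high
print(f"A-C (different network): {connectivity(region_a, region_c):.2f}")  # near zero
```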

[Image: the brain regions associated with the default mode network. Wikimedia Commons]

How to ensure brain health in old age

While prior research has identified some genetic influences on how "gracefully" the brain ages, there are likely activities that can encourage brain health. "We hope to identify things we can prescribe for people that would help them be more like a superager," said Bradford Dickerson, one of the researchers in this study, in a statement. "It's not as likely to be a pill as more likely to be recommendations for lifestyle, diet, and exercise. That's one of the long-term goals of this study — to try to help people become superagers if they want to."

To date, there is some preliminary evidence of ways that you can keep your brain younger for longer. For instance, more education and a cognitively demanding job predict higher cognitive abilities in old age. Generally speaking, the adage of "use it or lose it" appears to hold true; having a cognitively active lifestyle helps to protect your brain in old age. So, it might be tempting to fill your golden years with beer and reruns of CSI, but it's unlikely to help you keep your edge.

Aside from these intuitive ways to keep your brain healthy, regular exercise appears to boost cognitive health in old age, as Dickerson mentioned. Diet is also a protective factor, especially for diets delivering omega-3 fatty acids (which can be found in fish oil), polyphenols (found in dark chocolate!), vitamin D (egg yolks and sunlight), and the B vitamins (meat, eggs, and legumes). There's also evidence that having a healthy social life in old age can protect against cognitive decline.

For many, the physical decline associated with old age is an expected side effect of a life well-lived. But the idea that our intellect will also degrade can be a much scarier reality. Fortunately, the existence of super-agers shows that at the very least, we don't have to accept cognitive decline without a fight.