9 self-actualized historical figures
When he was developing his famous hierarchy of needs, Abraham Maslow cited 9 historical figures that achieved self-actualization.
- In order to develop his model of self-actualization, Abraham Maslow interviewed friends, colleagues, students, and historical figures.
- These 9 historical figures demonstrate different aspects of self-actualization that Maslow believed all self-actualized individuals possessed to one degree or another.
- By studying these figures, we can come to a better understanding of what self-actualization really is.
Most, by now, are familiar with Abraham Maslow's hierarchy of needs. The model describes a series of successive, basic needs that must be satisfied before a human being can concern themselves with the next level. One needs to eat before one can worry about safety, one needs to feel safe before seeking out belonging, one needs to feel love and belonging before one can establish self-esteem, and one needs self-esteem before one can reach the pinnacle of the hierarchy: self-actualization.
In his most comprehensive book on the subject, Motivation and Personality, Maslow described self-actualization as the "full use and exploitation of talents, capacities, etc. Such people seem to be fulfilling themselves and to be doing the best that they are capable of doing. […] They are people who have developed or are developing to the full stature of which they are capable."
To develop this definition, Maslow studied friends, colleagues, college students, as well as 9 historical figures that he believed had become self-actualized. The qualities of these figures, he argued, could shed light on the qualities of self-actualized individuals in general. Though they all share characteristics of self-actualized people to one degree or another, some stand out more than others.
1. Abraham Lincoln
Abraham Lincoln could be said to represent many of the qualities of self-actualized people, but Maslow called him out for one in particular: a philosophical, unhostile sense of humor. "Probably," wrote Maslow, "Lincoln never made a joke that hurt anybody else; it is also likely that many or even most of his jokes had something to say, had a function beyond just producing a laugh. They often seemed to be education in a more palatable form, akin to parables or fables."
In Reminiscences of Abraham Lincoln, the humorist David R. Locke wrote, "But with all the humor in his nature, which was more than humor because it was humor with a purpose (that constituting the difference between humor and wit) […] His flow of humor was a sparkling spring gushing out of a rock – the flashing water had a somber background which made it all the brighter."
2. Thomas Jefferson
Today, Thomas Jefferson's historical legacy is mixed: the man who argued that all men are created equal was himself a slave owner. Still, Maslow considered Jefferson to be a self-actualized person, perhaps because of Jefferson's "democratic character structure," though this judgment may owe something to how 20th-century historians treated Jefferson's slaveholding.
Self-actualized people, wrote Maslow, possess a "hard-to-get-at tendency to give a certain quantum of respect to any human being just because he is a human individual; our subjects seem not to wish to go beyond a certain minimum point, even with scoundrels, of demeaning, of derogating, of robbing of dignity."
This is certainly reflected in Jefferson's most famous piece of writing, the Declaration of Independence, which contended that all men possess unalienable rights. It is, however, more difficult to square with his ambivalent position on slavery. Throughout his life, Jefferson expressed his dislike of slavery and introduced anti-slavery legislation, yet he owned over 600 slaves and freed only 7. He also believed Black people to be inferior; in this regard, Maslow may have picked a poor candidate.
3. Albert Einstein
Maslow argued that self-actualized people are firmly grounded in the real world, rather than the miasma of stereotypes, abstractions, expectations, and biases that most of us experience. "They are therefore far more apt to perceive what is there rather than their own wishes, hopes, fears, anxieties, their own theories and beliefs, or those of their cultural group," he wrote.
Maslow argued that many excellent scientists possess this quality and that it drives them to learn more about the unknown, the ambiguous, and the unstructured. Most people like stability and are disturbed when reality doesn't seem to reflect that desired stability. In this regard, Einstein is very much the opposite; he once said "The most beautiful thing we can experience is the mysterious. It is the source of all art and science."
4. Eleanor Roosevelt
Eleanor Roosevelt, wife of Franklin Delano Roosevelt and First Lady of the United States from 1933 to 1945, holds up the Universal Declaration of Human Rights.
Eleanor Roosevelt best exemplified the quality that Maslow called Gemeinschaftsgefühl, a kind of psychologically healthy social connectedness and concern for others' well-being, even (or especially) when others' behavior is disgraceful or disappointing. Roosevelt was an extremely productive humanitarian and much loved for it. She has been described as "the First Lady of the World" and "the object of almost universal respect," and for good reason. Roosevelt was one of the earliest advocates for the civil rights of African Americans, spoke out against the discrimination faced by Japanese Americans after Pearl Harbor, and oversaw the drafting of the Universal Declaration of Human Rights.
5. Jane Addams
As an early feminist, social worker, and pacifist, Jane Addams best represents the sense of morality that Maslow believed self-actualized people possess. To Maslow, self-actualized individuals "rarely showed in their day-to-day living the chaos, the confusion, the inconsistency, or the conflict that are so common in the average person's ethical dealings."
Addams fought for women's right to vote, documented the impact of typhoid fever on the poor, and worked diligently to bring an end to World War I, despite considerable criticism from the public after the U.S. joined the war. Rather than succumb to public pressure, however, Addams maintained her position, in part due to the innate moral compass that self-actualized individuals possess. Because of her work, she was awarded the Nobel Peace Prize in 1931.
6. William James
Known as the "father of American psychology," William James serves as an example of self-actualized people's ability to accept the self, nature, and others. In 1875, James offered the very first U.S. course in psychology. Prior to James, serious research into the function of the human mind was scant in the U.S.
As a young man, James experienced depression and often contemplated suicide. "I originally studied medicine in order to be a physiologist," wrote James, "but I drifted into psychology and philosophy from a sort of fatality." In seeking to understand the human mind as it is, James exemplified the self-actualized person's ability to accept the world without bias or prejudice. Maslow wrote that self-actualized individuals "see human nature as it is and not as they would prefer it to be. Their eyes see what is before them without being strained through spectacles of various sorts to distort or shape or color the reality."
The nineteenth century is often referred to as the "asylum era," when large numbers of mentally ill individuals were confined, largely to be ignored and forgotten. The work of early psychologists like James helped to dismantle this practice.
7. Albert Schweitzer
Self-actualized people, wrote Maslow, "customarily have some mission in life, some task to fulfill, some problem outside themselves which enlists much of their energies." Polymath and Nobel Peace Prize recipient Albert Schweitzer best exemplifies this quality.
In addition to being an accomplished theologian, Schweitzer was a driven medical missionary, returning to what is now the country of Gabon (then a French colony) twice to establish a functional hospital. The hospital was desperately needed, as Schweitzer saw more than 2,000 patients in his first nine months there, treating leprosy, yellow fever, malaria, and many other diseases.
The fact that Maslow selected Schweitzer as indicative of the superlative qualities of self-actualized people reflects mid-century American attitudes, too: Schweitzer would later be criticized as having a somewhat racist, paternalistic attitude towards the Africans he treated, reflected through statements like "The African is indeed my brother, but my junior brother." Though the good Schweitzer brought to the world is indisputable, his personal attitudes may not truly reflect those of the self-actualized individual.
8. Aldous Huxley
Another quality that Maslow argued self-actualized people presented was frequent "peak" or "mystical" experiences. These were moments of ecstasy and awe that conveyed "the feeling of being simultaneously more powerful and also more helpless than one ever was before" and "the conviction that something extremely important and valuable had happened."
For the writer Aldous Huxley, pursuing mystical experiences was central to his work. Not only did his most famous novel, Brave New World, criticize the pursuit of superficial pleasures, but Huxley himself also sought out deep experiences through psychedelic drugs like mescaline and LSD. He wrote about these experiences in The Doors of Perception. Regarding them, Huxley wrote: "The mystical experience is doubly valuable; it is valuable because it gives the experiencer a better understanding of himself and the world and because it may help him to lead a less self-centered and more creative life."
9. Baruch Spinoza
Baruch Spinoza was a 17th century philosopher who demonstrated the kind of autonomy and independence of culture that Maslow claimed self-actualized individuals possess. "Self-actualizing people," he wrote, "are not dependent for their main satisfactions on the real world, or other people or culture or means to ends or, in general, on extrinsic satisfactions. Rather they are dependent for their own development and continued growth on their own potentialities and latent resources."
Spinoza worked against the grain of the dominant culture at the time. For his rationalist philosophy and theological criticism, the Jewish community issued a cherem against him, similar to excommunication in Christianity.
His philosophical works are today considered foundational to metaphysics, epistemology, and ethics, though his greatest work, Ethics, was published only after his death in 1677. That book established him as one of the Enlightenment's great thinkers, yet even though he was already a somewhat famous philosopher in his lifetime, Spinoza lived a modest life as a lens grinder. He turned down being named the heir of his friend Simon de Vries, declined a prestigious academic position at the University of Heidelberg, and doggedly persisted in writing a work of biblical criticism that advocated for a secular, constitutional government, despite a possible threat to his life. Although he was despised by many in his own time, even his enemies admitted that he lived "a saintly life."
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 pulses of global intraplate magmatism
These dates gave the scientists a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
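The clustering analysis described above can be sketched in a few lines of code. The toy example below is not the study's actual method (Rampino and colleagues used circular spectral analysis with Monte Carlo significance testing); it simply generates synthetic event ages bunched on a 27.5-million-year cycle and scans trial periods with a Rayleigh statistic, a standard measure of how tightly event phases cluster. All names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: event ages (in Myr) clustered around a 27.5 Myr cycle
# over the last 260 Myr, with Gaussian scatter, mimicking the 89
# dated events in the study.
true_period = 27.5
centers = np.arange(0, 260, true_period)
ages = np.concatenate([c + rng.normal(0.0, 2.0, size=9) for c in centers])

def rayleigh_power(ages, period):
    """Rayleigh statistic: large when event phases bunch up at this trial period."""
    phases = 2 * np.pi * ages / period
    return (np.cos(phases).sum() ** 2 + np.sin(phases).sum() ** 2) / len(ages)

# Scan trial periods and pick the one with the strongest clustering.
periods = np.linspace(10, 60, 2001)
power = np.array([rayleigh_power(ages, p) for p in periods])
best = periods[np.argmax(power)]
print(f"Best-fit period: {best:.1f} Myr")
```

In the real study, the height of such a peak was judged against randomized event catalogs; a peak alone does not establish periodicity, which is why the authors' statistical significance tests matter.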
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends neither on group member intelligence nor personality. Leaders emerge based on the quantity of speaking, not quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.
- The recent passing of the great theoretical physicist Steven Weinberg brought back memories of how his book got me into the study of cosmology.
- Going back in time, toward the cosmic infancy, is a spectacular effort that combines experimental and theoretical ingenuity. Modern cosmology is an experimental science.
- The cosmic story is, ultimately, our own. Our roots reach down to the earliest moments after creation.
When I was a junior in college, my electromagnetism professor had an awesome idea. Apart from the usual homework and exams, we were to give a seminar to the class on a topic of our choosing. The idea was to gauge which area of physics we would be interested in following professionally.
Professor Gilson Carneiro knew I was interested in cosmology and suggested a book by Nobel Prize Laureate Steven Weinberg: The First Three Minutes: A Modern View of the Origin of the Universe. I still have my original copy in Portuguese, from 1979, that emanates a musty tropical smell, sitting on my bookshelf side-by-side with the American version, a Bantam edition from 1979.
Inspired by Steven Weinberg
Books can change lives. They can illuminate the path ahead. In my case, there is no question that Weinberg's book blew my teenage mind. I decided, then and there, that I would become a cosmologist working on the physics of the early universe. The first three minutes of cosmic existence — what could be more exciting for a young physicist than trying to uncover the mystery of creation itself and the origin of the universe, matter, and stars? Weinberg quickly became my modern physics hero, the one I wanted to emulate professionally. Sadly, he passed away July 23rd, leaving a huge void for a generation of physicists.
What excited my young imagination was that science could actually make sense of the very early universe, meaning that theories could be validated and ideas could be tested against real data. Cosmology, as a science, only really took off after Einstein published his paper on the shape of the universe in 1917, two years after his groundbreaking paper on the theory of general relativity, the one explaining how we can interpret gravity as the curvature of spacetime. Matter doesn't "bend" time, but it affects how quickly it flows. (See last week's essay on what happens when you fall into a black hole).
The Big Bang Theory
For most of the 20th century, cosmology lived in the realm of theoretical speculation. One model proposed that the universe started from a small, hot, dense plasma billions of years ago and has been expanding ever since — the Big Bang model; another suggested that the cosmos stands still and that the changes astronomers see are mostly local — the steady state model.
Competing models are essential to science but so is data to help us discriminate among them. In the mid 1960s, a decisive discovery changed the game forever. Arno Penzias and Robert Wilson accidentally discovered the cosmic microwave background radiation (CMB), a fossil from the early universe predicted to exist by George Gamow, Ralph Alpher, and Robert Herman in their Big Bang model. (Alpher and Herman later published a lovely account of that history.) The CMB is a bath of microwave photons that permeates the whole of space, a remnant from the epoch when the first hydrogen atoms were forged, some 400,000 years after the bang.
The existence of the CMB was the smoking gun confirming the Big Bang model. From that moment on, a series of spectacular observatories and detectors, both on land and in space, have extracted huge amounts of information from the properties of the CMB, a bit like paleontologists that excavate the remains of dinosaurs and dig for more bones to get details of a past long gone.
How far back can we go?
Confirming the general outline of the Big Bang model changed our cosmic view. The universe, like you and me, has a history, a past waiting to be explored. How far back in time could we dig? Was there some ultimate wall we cannot pass?
Because matter gets hot as it gets squeezed, going back in time meant looking at matter and radiation at higher and higher temperatures. There is a simple relation that connects the age of the universe and its temperature, measured in terms of the temperature of photons (the particles of visible light and other forms of invisible radiation). The fun thing is that matter breaks down as the temperature increases. So, going back in time means looking at matter at more and more primitive states of organization. After the CMB formed 400,000 years after the bang, there were hydrogen atoms. Before, there weren't. The universe was filled with a primordial soup of particles: protons, neutrons, electrons, photons, and neutrinos, the ghostly particles that cross planets and people unscathed. Also, there were very light atomic nuclei, such as deuterium and tritium (both heavier cousins of hydrogen), helium, and lithium.
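That "simple relation" can be written down explicitly. In rough back-of-the-envelope form (my gloss, valid in the early, radiation-dominated era and ignoring changes in the particle content of the soup), the photon temperature falls as the inverse square root of time:

```latex
T(t) \approx 10^{10}\,\mathrm{K} \times \left(\frac{1\ \mathrm{second}}{t}\right)^{1/2}
```

So at one-hundredth of a second the universe was at roughly a hundred billion kelvin, and by three minutes it had cooled to about a billion kelvin, spanning exactly the window in which light atomic nuclei could form and survive.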
So, to study the universe after 400,000 years, we need to use atomic physics, at least until large clumps of matter aggregate due to gravity and start to collapse to form the first stars, a few million years later. What about earlier on? The cosmic history is broken down into chunks of time, each the realm of different kinds of physics. Before atoms formed, all the way back to about a second after the Big Bang, it's nuclear physics time. That's why Weinberg brilliantly titled his book The First Three Minutes. It is during the interval between one-hundredth of a second and three minutes that the light atomic nuclei (made of protons and neutrons) formed, a process called, with poetic flair, primordial nucleosynthesis. Protons collided with neutrons and, sometimes, stuck together due to the attractive strong nuclear force. Why did only a few light nuclei form then? Because the expansion of the universe made it hard for the particles to find each other.
What about the nuclei of heavier elements, like carbon, oxygen, calcium, gold? The answer is beautiful: all the elements of the periodic table after lithium were made and continue to be made in stars, the true cosmic alchemists. Hydrogen eventually becomes people if you wait long enough. At least in this universe.
In this article, we got all the way up to nucleosynthesis, the forging of the first atomic nuclei when the universe was a minute old. What about earlier on? How close to the beginning, to t = 0, can science get? Stay tuned, and we will continue next week.
To Steven Weinberg, with gratitude, for all that you taught us about the universe.
Long before Alexandria became the center of Egyptian trade, there was Thônis-Heracleion. But then it sank.