9 self-actualized historical figures
When he was developing his famous hierarchy of needs, Abraham Maslow cited 9 historical figures that achieved self-actualization.
- In order to develop his model of self-actualization, Abraham Maslow interviewed friends, colleagues, students, and historical figures.
- These 9 historical figures demonstrate different aspects of self-actualization that Maslow believed all self-actualized individuals possessed to one degree or another.
- By studying these figures, we can come to a better understanding of what self-actualization really is.
Most, by now, are familiar with Abraham Maslow's hierarchy of needs. The model describes a series of successive, basic needs that must be satisfied before a human being can concern themselves with the next level. One needs to eat before one can worry about safety, one needs to feel safe before seeking out belonging, one needs to feel love and belonging before one can establish self-esteem, and one needs self-esteem before one can reach the pinnacle of the hierarchy: self-actualization.
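The stepwise logic described above can be sketched as a toy model in Python. This is purely illustrative: the level names follow the article, and the `current_concern` function is a hypothetical helper, not anything drawn from Maslow's own writing.

```python
# Toy model of Maslow's hierarchy: a need becomes the active concern
# only once every lower level is satisfied. Level names follow the
# article; the "satisfied" flags below are hypothetical examples.
HIERARCHY = ["physiological", "safety", "love/belonging",
             "self-esteem", "self-actualization"]

def current_concern(satisfied):
    """Return the lowest unsatisfied need, i.e. the person's focus."""
    for level in HIERARCHY:
        if not satisfied.get(level, False):
            return level
    return "self-actualization"  # all lower needs are met

# Example: someone who is fed and safe, but lacks belonging
print(current_concern({"physiological": True, "safety": True}))
# -> love/belonging
```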
In his most comprehensive book on the subject, Motivation and Personality, Maslow described self-actualization as the "full use and exploitation of talents, capacities, etc. Such people seem to be fulfilling themselves and to be doing the best that they are capable of doing. […] They are people who have developed or are developing to the full stature of which they are capable."
To develop this definition, Maslow studied friends, colleagues, and college students, as well as 9 historical figures he believed had become self-actualized. The qualities of these figures, he argued, could shed light on the qualities of self-actualized individuals in general. Though they all share characteristics of self-actualized people to one degree or another, some stand out more than others.
1. Abraham Lincoln
Abraham Lincoln could be said to represent many of the qualities of self-actualized people, but Maslow called him out for one in particular: a philosophical, unhostile sense of humor. "Probably," wrote Maslow, "Lincoln never made a joke that hurt anybody else; it is also likely that many or even most of his jokes had something to say, had a function beyond just producing a laugh. They often seemed to be education in a more palatable form, akin to parables or fables."
In the book Reminiscences of Abraham Lincoln, David R. Locke wrote, "But with all the humor in his nature, which was more than humor because it was humor with a purpose (that constituting the difference between humor and wit) […] His flow of humor was a sparkling spring gushing out of a rock – the flashing water had a somber background which made it all the brighter."
2. Thomas Jefferson
Today, Thomas Jefferson's historical legacy is mixed. Having argued that all men are created equal, his position as a slave-owner seems contradictory. Still, Maslow considered Jefferson a self-actualized person, perhaps because of Jefferson's "democratic character structure," though that assessment may reflect how 20th-century historians regarded Jefferson's slaveholding.
Self-actualized people, wrote Maslow, possess a "hard-to-get-at tendency to give a certain quantum of respect to any human being just because he is a human individual; our subjects seem not to wish to go beyond a certain minimum point, even with scoundrels, of demeaning, of derogating, of robbing of dignity."
This is certainly reflected in Jefferson's most famous piece of writing, the Declaration of Independence, which contended that all men possess unalienable rights. It is, however, more difficult to square with his ambivalent position on slavery. Throughout his life, Jefferson expressed his dislike of slavery and introduced anti-slavery legislation, yet he owned over 600 slaves and freed only 7. He also believed blacks to be inferior — in this regard, Maslow may have picked a poor candidate.
3. Albert Einstein
Maslow argued that self-actualized people are firmly grounded in the real world, rather than the miasma of stereotypes, abstractions, expectations, and biases that most of us experience. "They are therefore far more apt to perceive what is there rather than their own wishes, hopes, fears, anxieties, their own theories and beliefs, or those of their cultural group," he wrote.
Maslow argued that many excellent scientists possess this quality and that it drives them to learn more about the unknown, the ambiguous, and the unstructured. Most people like stability and are disturbed when reality doesn't seem to reflect that desired stability. In this regard, Einstein is very much the opposite; he once said "The most beautiful thing we can experience is the mysterious. It is the source of all art and science."
4. Eleanor Roosevelt
Eleanor Roosevelt, wife of Franklin Delano Roosevelt and First Lady of the United States from 1933 to 1945, holds up the Universal Declaration of Human Rights.
Eleanor Roosevelt best exemplified the quality that Maslow called Gemeinschaftsgefühl, a kind of psychologically healthy social connectedness and concern for others' well-being, even — or especially — when others' behavior is disgraceful or disappointing. Roosevelt was an extremely productive humanitarian and much loved for it. She has been described as "the First Lady of the World" and "the object of almost universal respect," and for good reason. Roosevelt was one of the earliest advocates for the civil rights of African Americans, spoke out against the discrimination of Japanese Americans after Pearl Harbor, and oversaw the drafting of the Universal Declaration of Human Rights.
5. Jane Addams
As an early feminist, social worker, and pacifist, Jane Addams best represents the sense of morality that Maslow believed self-actualized people to possess. To Maslow, the self-actualized individual "rarely showed in their day-to-day living the chaos, the confusion, the inconsistency, or the conflict that are so common in the average person's ethical dealings."
Addams fought for women's right to vote, documented the impact of typhoid fever on the poor, and worked diligently to bring an end to World War I, despite considerable criticism from the public after the U.S. joined the war. Rather than succumb to public pressure, however, Addams maintained her position, in part due to the innate moral compass that self-actualized individuals possess. Because of her work, she was awarded the Nobel Peace Prize in 1931.
6. William James
Known as the "father of American psychology," William James serves as an example of self-actualized people's ability to accept the self, nature, and others. In 1875, James offered the very first U.S. course in psychology. Prior to James, serious research into the function of the human mind was scant in the U.S.
As a young man, James experienced depression himself and often contemplated suicide. "I originally studied medicine in order to be a physiologist," wrote James, "but I drifted into psychology and philosophy from a sort of fatality." In seeking to understand the human mind, James fits the bill for self-actualized people's ability to accept the world around them without bias or prejudice. Maslow wrote that self-actualized individuals "see human nature as it is and not as they would prefer it to be. Their eyes see what is before them without being strained through spectacles of various sorts to distort or shape or color the reality."
The nineteenth century is often referred to as the "asylum era," when large numbers of mentally ill individuals were locked up, mainly to be ignored and forgotten. The work of early psychologists like James helped to dismantle this practice.
7. Albert Schweitzer
Self-actualized people, wrote Maslow, "customarily have some mission in life, some task to fulfill, some problem outside themselves which enlists much of their energies." Polymath and Nobel Peace Prize recipient Albert Schweitzer best exemplifies this quality.
In addition to being an accomplished theologian, Schweitzer was a driven medical missionary, returning to what is now the country of Gabon (then a French colony) twice to establish a functional hospital. The hospital was desperately needed, as Schweitzer saw more than 2,000 patients in his first nine months there, treating leprosy, yellow fever, malaria, and many other diseases.
The fact that Maslow selected Schweitzer as indicative of the superlative qualities of self-actualized people reflects mid-century American attitudes, too: Schweitzer would later be criticized as having a somewhat racist, paternalistic attitude towards the Africans he treated, reflected through statements like "The African is indeed my brother, but my junior brother." Though the good Schweitzer brought to the world is indisputable, his personal attitudes may not truly reflect those of the self-actualized individual.
8. Aldous Huxley
Another quality that Maslow argued self-actualized people presented was frequent "peak" or "mystical" experiences. These were moments of ecstasy and awe that conveyed "the feeling of being simultaneously more powerful and also more helpless than one ever was before" and "the conviction that something extremely important and valuable had happened."
For science fiction writer Aldous Huxley, pursuing mystical experiences was central to his work. Not only did his most famous work, Brave New World, criticize the pursuit of superficial pleasures, but Huxley also pursued profound experiences through the use of psychedelic drugs like mescaline and LSD. He wrote about his psychedelic experiences in The Doors of Perception. Regarding these experiences, Huxley wrote, "The mystical experience is doubly valuable; it is valuable because it gives the experiencer a better understanding of himself and the world and because it may help him to lead a less self-centered and more creative life."
9. Baruch Spinoza
Baruch Spinoza was a 17th century philosopher who demonstrated the kind of autonomy and independence of culture that Maslow claims self-actualized individuals to possess. "Self-actualizing people," he wrote, "are not dependent for their main satisfactions on the real world, or other people or culture or means to ends or, in general, on extrinsic satisfactions. Rather they are dependent for their own development and continued growth on their own potentialities and latent resources."
Spinoza worked against the grain of the dominant culture at the time. For his rationalist philosophy and theological criticism, the Jewish community issued a cherem against him, similar to excommunication in Christianity.
His works in philosophy are today considered foundational to metaphysics, epistemology, and ethics, though his greatest work, Ethics, was published only after his death in 1677. It established him as one of the Enlightenment's great thinkers, yet despite being a somewhat famous philosopher during his lifetime, Spinoza lived a modest life as a lens grinder. He turned down being named the heir of his friend Simon de Vries, declined a prestigious academic position at the University of Heidelberg, and doggedly persisted in writing a work of biblical criticism that advocated for a secular, constitutional government, despite a possible threat to his life. Although he was despised by many in his own time, even his enemies admitted that he lived "a saintly life."
So much for resting in peace.
- Australian scientists found that bodies kept moving for 17 months after being pronounced dead.
- Researchers used time-lapse photography, capturing images at 30-minute intervals during the day, to record the movement.
- This study could help better identify time of death.
We're learning new things about death every day. Much has been said and theorized about the great divide between life and the Great Beyond. While every person and every culture has their own philosophies and unique ideas on the subject, we're beginning to learn a lot of new scientific facts about the deceased corporeal form.
An Australian scientist has found that human bodies move for more than a year after being pronounced dead. These findings could have implications for fields ranging from pathology to criminology.
Dead bodies keep moving
Researcher Alyson Wilson studied and photographed the movements of corpses over a 17-month timeframe. She recently told Agence France-Presse about the shocking details of her discovery.
Reportedly, she and her team trained a camera on a corpse for 17 months at the Australian Facility for Taphonomic Experimental Research (AFTER), taking images every 30 minutes during the day. For the entire 17-month duration, the corpse continually moved.
"What we found was that the arms were significantly moving, so that arms that started off down beside the body ended up out to the side of the body," Wilson said.
The researchers mostly expected some kind of movement during the very early stages of decomposition, but Wilson further explained that their continual movement completely surprised the team:
"We think the movements relate to the process of decomposition, as the body mummifies and the ligaments dry out."
During one of the studies, arms that had started next to the body eventually ended up splayed out at its sides.
The team's subject was one of the bodies stored at the "body farm," which sits on the outskirts of Sydney. (Wilson took a flight every month to check in on the cadaver.) Her findings were recently published in the journal Forensic Science International: Synergy.
Implications of the study
The researchers believe that understanding these postmortem movements and the rate of decomposition could help better estimate the time of death. Police, for example, could benefit from this, as they'd be able to establish a timeframe for a missing person's death and link it to an unidentified corpse. According to the team:
"Understanding decomposition rates for a human donor in the Australian environment is important for police, forensic anthropologists, and pathologists for the estimation of PMI to assist with the identification of unknown victims, as well as the investigation of criminal activity."
While scientists haven't found any evidence of necromancy, the discovery offers a curious new understanding of what happens to the body after we die.
Metal-like materials have been discovered in a very strange place.
- Bristle worms are odd-looking, spiky, segmented worms with super-strong jaws.
- Researchers have discovered that the jaws contain metal.
- It appears that biological processes could one day be used to manufacture metals.
Bristle worms, also known as polychaetes, have been around for an estimated 500 million years. Scientists believe these super-resilient creatures have survived five mass extinctions, and there are some 10,000 species of them.
Be glad if you haven't encountered a bristle worm. Getting stung by one is an extremely itchy affair, as people who own saltwater aquariums can tell you after they've accidentally touched a bristle worm that hitchhiked into a tank aboard a live rock.
Bristle worms are typically one to six inches long when found in a tank, but they are capable of growing up to 24 inches long. All polychaetes have a segmented body, with each segment possessing a pair of leg-like appendages, or parapodia, covered in tiny bristles. ("Polychaete" is Greek for "much hair.") The parapodia and their bristles can shoot outward to snag prey, which is then transferred to a bristle worm's eversible mouth.
The jaws of one bristle worm — Platynereis dumerilii — are super-tough, virtually unbreakable. It turns out, according to a new study from researchers at the Technical University of Vienna, this strength is due to metal atoms.
Metals, not minerals
Fireworm, a type of bristle worm. Credit: prilfish / Flickr
This is pretty unusual. The study's senior author Christian Hellmich explains: "The materials that vertebrates are made of are well researched. Bones, for example, are very hierarchically structured: There are organic and mineral parts, tiny structures are combined to form larger structures, which in turn form even larger structures."
The bristle worm jaw, by contrast, replaces the minerals from which other creatures' bones are built with atoms of magnesium and zinc arranged in a super-strong structure. It's this structure that is key. "On its own," he says, "the fact that there are metal atoms in the bristle worm jaw does not explain its excellent material properties."
Just deformable enough
What makes conventional metal so strong is not just its atoms but the interactions between the atoms and the ways in which they slide against each other. The sliding allows for a small amount of elastoplastic deformation when pressure is applied, endowing metals with just enough malleability not to break, crack, or shatter.
Co-author Florian Raible of Max Perutz Labs surmises, "The construction principle that has made bristle worm jaws so successful apparently originated about 500 million years ago."
Raible explains, "The metal ions are incorporated directly into the protein chains and then ensure that different protein chains are held together." This leads to the creation of three-dimensional shapes the bristle worm can pack together into a structure that's just malleable enough to withstand a significant amount of force.
"It is precisely this combination," says the study's lead author Luis Zelaya-Lainez, "of high strength and deformability that is normally characteristic of metals."
So the bristle worm jaw is both metal-like and yet not. As Zelaya-Lainez puts it, "Here we are dealing with a completely different material, but interestingly, the metal atoms still provide strength and deformability there, just like in a piece of metal."
Observing the creation of a metal-like material from biological processes is a bit of a surprise and may suggest new approaches to materials development. "Biology could serve as inspiration here," says Hellmich, "for completely new kinds of materials. Perhaps it is even possible to produce high-performance materials in a biological way — much more efficiently and environmentally friendly than we manage today."
Dealing with rudeness can nudge you toward cognitive errors.
- Anchoring is a common bias that makes people fixate on one piece of data.
- A study showed that those who experienced rudeness were more likely to anchor themselves to bad data.
- In some simulations with medical students, this effect led to higher mortality rates.
Cognitive biases are funny little things. Everyone has them, nobody likes to admit it, and they can range from minor to severe depending on the situation. Biases can be influenced by factors as subtle as our mood or various personality traits.
A new study soon to be published in the Journal of Applied Psychology suggests that experiencing rudeness can be added to the list. More disturbingly, the study's findings suggest that it is a strong enough effect to impact how medical professionals diagnose patients.
Life hack: don't be rude to your doctor
The team of researchers behind the project tested to see if participants could be influenced by the common anchoring bias, defined by the researchers as "the tendency to rely too heavily or fixate on one piece of information when making judgments and decisions." Most people have experienced it. One of its more common forms involves being given a particular value, say in negotiations on price, which then becomes the center of reasoning even when reason would suggest that number should be ignored.
It can also pop up in medicine. As co-author Dr. Trevor Foulk explains, "If you go into the doctor and say 'I think I'm having a heart attack,' that can become an anchor and the doctor may get fixated on that diagnosis, even if you're just having indigestion. If doctors don't move off anchors enough, they'll start treating the wrong thing."
Lots of things can make somebody more or less likely to anchor themselves to an idea. The authors of the study, who have several papers on the effects of rudeness, decided to see if that could also cause people to stumble into cognitive errors. Past research suggested that exposure to rudeness can limit people's perspective — perhaps anchoring them.
In the first version of the study, medical students were given a hypothetical patient to treat and access to information on their condition alongside an (incorrect) suggestion on what the condition was. This served as the anchor. In some versions of the tests, the students overheard two doctors arguing rudely before diagnosing the patient. Later variations switched the diagnosis test for business negotiations or workplace tasks while maintaining the exposure to rudeness.
Across all iterations of the test, those exposed to rudeness were more likely to anchor themselves to the initial, incorrect suggestion despite the availability of evidence against it. The effect was weaker for participants who scored higher on a test of how wide a perspective they tended to take. The disposition of these participants, who answered in the affirmative to questions like, "Before criticizing somebody, I try to imagine how I would feel if I were in his/her place," effectively negated the narrowing effects of rudeness.
What this means for you and your healthcare
The effects of anchoring when a medical diagnosis is on the line can be substantial. Dr. Foulk explains that, in some simulations, exposure to rudeness can raise the mortality rate as doctors fixate on the wrong problems.
The authors of the study suggest that managers take a keener interest in ensuring civility in workplaces and giving employees the tools they need to avoid judgment errors after dealing with rudeness. These steps could help prevent anchoring.
Also, you might consider being nicer to people.