Creativity: The science behind the madness
Human brains evolved for creativity. We just have to learn how to access it.
RAINN WILSON: Creativity is absolutely for everyone. I firmly believe this. I think even if you're the driest accountant with the plastic pocket pen protector, it's in how you interact with the world. There's artistry in everything that we do.
ANTHONY BRANDT: The fact of the matter is we all are born with a creative license. We have this software running in our brains.
DAVID EAGLEMAN: What is it that's special about the human brain that allows creativity to happen? Because when you look at us compared to all the other species on the Earth, we have very similar brains. I mean, obviously we're cousins with our nearest neighbors, and all throughout the animal kingdom it's a continuous family tree, but we're running around the planet doing something unbelievable. You don't have squirrels going to the moon or dogs inventing the Internet or cows doing theater plays for one another, or any of the gajillion things that we do. What is behind all of that? What is the basic cognitive software running in the human brain that takes ideas in and smushes them up and crunches them? It's like a food processor that's constantly spitting out new ideas.
SCOTT BARRY KAUFMAN: So, many of you might have heard of the left brain right brain myth about creativity, that the left brain is not related to creativity much at all because it's really boring and logical and super serious and analytical, and that the right brain is where all the artistic beauty comes out and it's very poetic. Well, the reality is that creativity involves an interaction of lots of different brain networks that rely on both the left side and the right side of the brain.
WENDY SUZUKI: It really is true that the most creative people are using both sides of the brain together. So, this is an important concept: the brain is subdivided into two major hemispheres, and almost all the structures of our brain are paired, so we have two of each structure. So the idea is, well, one side of the brain is for certain things and the other side is important for other things, and the one thing we can say for sure is that, yes, language is on the left side of the brain. But for creativity, it actually makes more sense to me that, with a function as broad as that, you would benefit from having the most crosstalk possible between all parts of your brain. In fact, that's what the neuroscience is showing.
KAUFMAN: When you have lots of different parts of the brain communicating with each other to solve a certain task, that's called a brain network. And you find that creativity draws on multiple interacting brain networks. In particular, it draws on three brain networks that seem to be absolutely essential to creativity across whatever field it is, whether it's science or art. One of those brain networks is what's called the executive attention network. The executive attention network allows you to integrate lots of information in your head at one time, hold stuff in your working memory, and maintain the strategies you're currently working on, so you don't forget what your strategy is, or forget what you already did and then redo it. The executive attention network is also helpful for inhibiting the obvious responses, the first things that come to your mind. And since creativity requires access to remote associations, the executive attention network helps inhibit the most immediate, obvious things that come to mind. For people who are very good improv artists, for instance, the first thing that comes to mind is usually not the most creative, so they tend to wait for the second or third thing; that's one of the improv exercises. The second major brain network that's important is the default mode network, but I like to call it the imagination network, because it's highly active every time we turn our focus of attention inward: when we focus on our daydreams, on our future goals, or on trying to take the perspective of someone else. It's very important for having compassion for someone else, because it allows us to imagine what that person is thinking or feeling. So that's the imagination brain network.
And then the third major brain network that's important for creativity, and one I think is very underrated, is called the salience brain network. It's associated with what is most salient in our environment: what is most interesting to us. Before we consciously think through a creative activity, and even before we activate our imagination, there's a subconscious process in which the salience brain network tags things in our environment as interesting or not interesting and feeds them either to our imagination network or to our executive attention network to pay attention to. Creativity involves the interaction of all three: it's when we're captivated by the moment, we're mindful, but we're also imaginative, and we're also motivated and passionate to engage in the creative activity.
EAGLEMAN: What's special about the human brain is that, during the evolution of the cortex, we got a lot more space between input and output. Other animals have these much closer together, so when they get some stimulus they make an essentially reflexive response. In humans, as the cortex expanded, there's a lot more room there, which means inputs can come in and sort of percolate around and get stored and get thought about, and then maybe you make an output or maybe you don't. And there's one other thing that happened with the expansion of the cortex: we got a much bigger prefrontal cortex, the part right behind the forehead, and that is what allows us to simulate what-ifs, to separate ourselves from our location in space and time and think about possibilities. What if I did that? What if I had done that? What if I could do that? We do this all of the time, and the amazing part is that now there are almost 8 billion brains running around the planet, and as a result the creativity of our species has gone up in this mad, amazing way, because there's so much raw material to draw on and there are so many of us constantly saying: what if this, what if that?
BRANDT: When I look at my heroes in composition, they are all incredible risk-takers. And it's a constant reminder that you can introduce something new to the world and never be certain of the results. And so tolerating the risk, living with the risk, even enjoying it is, again, part of being a creative person.
KAUFMAN: Creativity requires both intelligence and imagination. Creativity requires the ability to know what has come before, so we can stand on the shoulders of giants, and it also requires the foresight and vision to imagine the world the way it could be. When we combine the two, I think that makes creativity much more likely.
ETHAN HAWKE: The beauty of jazz music is that there's no plan. Well, there is a plan, there's an architecture. Let's take something obvious like "My Favorite Things," John Coltrane's "My Favorite Things." If people know one jazz thing, often they'll know that one. He takes this famous song, and the musicians all start riffing on it, and they find a new melody inside it, and it changes and it changes and then mysteriously comes back around again. Spontaneity mixed with discipline and intelligence evolves into something you cannot plan, something more sophisticated and more interesting than anything the intellectual mind can plan. When you're really being creative at your best, you've used your discipline to open up your subconscious.
WILSON: If it's a pure expression of yourself no matter what it is or what medium, it's going to shine. It's going to resonate. You could look inside of yourself and you can have a canvas and you can paint a dot in it, but if that's where your creative purpose is taking you then it needs to be that dot.
EAGLEMAN: We are vessels of our own space and time, so the particular things we create have to do with what we have absorbed. If you compare 19th-century Japanese music to 19th-century French music to 19th-century Kenyan music and so on, you'll see these are extremely different. But it's not that a composer over here couldn't have done what a composer over there was doing; it's simply that it wouldn't have stuck in their culture: it would have been strange and wouldn't have made sense. Why? Because what we're doing is building on the foundations of what has come before us.
HAWKE: In a way you're channeling yourself, channeling your own questions and your own seeking, which is deeply connected to your own essence. We all have it. We all have an essence, a center that is us. We have it the day we're born, and when you can access it, then you can access the subconscious, and that's going to be more powerful and more true than anything your intellectual mind has to say.
BEAU LOTTO: Because nothing interesting begins with knowing; it begins with not knowing. Uncertainty is such a difficult, dangerous thing that evolution has created a brain that tries to avoid it altogether, to the extent that we have things like confirmation bias: we start looking for evidence to confirm what we already assume to be true. We would rather hold onto assumptions that we know don't work, because we think that is safer than questioning them and stepping into a place we don't actually know. We do almost everything to avoid uncertainty, and yet the irony is that it's the only place we can go if we're ever going to see things differently. And that's why creativity, seeing differently, always begins in the same way: it begins with a question, it begins with not knowing, it begins with a why, it begins with a what if.
EAGLEMAN: What good creators do is cover the spectrum (this is as true of individuals as it is of companies): they do some things that are sort of nearby and some things that are wackier and wackier, and this is how they feel out the border of the possible, how they figure out what's going to stick with their society. Because the thing about any creative act is that you never know what's going to stick, what will actually make a difference in your society.
SUZUKI: Then the question is, well, how do I up my creativity? That's what everybody is interested in.
EAGLEMAN: The key is that humans are really different from one another and for one person taking a hot shower might work and for another person a cold shower, one person works well in the morning and another person at night, for one writer they should go and sit in the coffee shop where it's loud and another writer it works better for them to sit alone in their quiet office and write. So, I suspect there's no single piece of advice that's going to apply to everyone.
WILSON: When people have "creative blocks" (and I know my share of friends do as well), when they're at some stuck point and not sure what to do with their lives or their writing or their photography or their filmmaking or whatever it is they're doing, I think the best advice is to change your life up completely: go on a trip, go spend a year being of service, be willing to take some major, drastic action to get out of your comfort zone, and go inside, not outside. I think our society is all about focusing on the externals: oh, these people like me; I'm successful because of these people; they view me as being good. We need to take that vision and, instead of expanding it outwards, look inside ourselves.
- An all-star cast of Big Thinkers—actors Rainn Wilson and Ethan Hawke; composer Anthony Brandt; neuroscientists David Eagleman, Wendy Suzuki, and Beau Lotto; and psychologist Scott Barry Kaufman—share how they define creativity and explain how our brains uniquely evolved for the phenomenon.
- According to Eagleman, during evolution there was an increase in space between our brain's input and output that allows information more time to percolate. We also grew a larger prefrontal cortex which "allows us to simulate what ifs, to separate ourselves from our location in space and time and think about possibilities."
- Scott Barry Kaufman details three brain networks involved in creative thinking, and Wendy Suzuki busts the famous left-brain, right-brain myth.
A recent study used fMRI to compare the brains of psychopathic criminals with a group of 100 well-functioning individuals, finding striking similarities.
- The study used psychological inventories to assess a group of violent criminals and healthy volunteers for psychopathy, and then examined how their brains responded to watching violent movie scenes.
- The fMRI results showed that the brains of healthy subjects who scored high in psychopathic traits reacted similarly to those of the psychopathic criminal group. Both groups also showed atrophy in brain regions involved in regulating emotion.
- The study adds complexity to common conceptions of what differentiates a psychopath from a "healthy" individual.
When considering what precisely makes someone a psychopath, the lines can be blurry.
Psychological research has shown that many people in society have some degree of malevolent personality traits, such as those described by the "dark triad": narcissism (entitled self-importance), Machiavellianism (strategic exploitation and deceit), and psychopathy (callousness and cynicism). But while people who score high in these traits are more likely to end up in prison, most of them are well functioning and don't engage in extreme antisocial behaviors.
Now, a new study published in Cerebral Cortex found that the brains of psychopathic criminals are structurally and functionally similar to many well-functioning, non-criminal individuals with psychopathic traits. The results suggest that psychopathy isn't a binary classification, but rather a "constellation" of personality traits that "vary in the non-incarcerated population with normal range of social functioning."
Assessing your inner psychopath
The researchers used functional magnetic resonance imaging (fMRI) to compare the brains of violent psychopathic criminals to those of healthy volunteers. All participants were assessed for psychopathy through commonly used inventories: the Hare Psychopathy Checklist-Revised and the Levenson Self-Report Psychopathy Scale.
Experimental design and sample stimuli: the subjects viewed a compilation of 137 movie clips with variable violent and nonviolent content. (Nummenmaa et al.)
Both groups watched a 26-minute-long medley of movie scenes that were selected to portray a "large variability of social and emotional content." Some scenes depicted intense violence. As participants watched the medley, fMRI recorded how various regions of their brains responded to the content.
The goal was to see whether the brains of psychopathic criminals looked and reacted similarly to the brains of healthy subjects who scored high in psychopathic traits. The results showed similar reactions: When both groups viewed violent scenes, the fMRI revealed strong reactions in the orbitofrontal cortex and anterior insula, brain regions associated with regulating emotion.
These similarities manifested as a positive association: The more psychopathic traits a healthy subject displayed, the more their brains responded like the criminal group. What's more, the fMRI revealed a similar association between psychopathic traits and brain structure, with those scoring high in psychopathy showing lower gray matter density in the orbitofrontal cortex and anterior insula.
There were some key differences between the groups, however. The researchers noted that the structural abnormalities in the healthy sample were mainly associated with primary psychopathic traits, which are: inclination to lie, lack of remorse, and callousness. Meanwhile, the functional responses of the healthy subjects were associated with secondary psychopathic traits: impulsivity, short temper, and low tolerance for frustration.
Overall, the study further illuminates some of the biological drivers of psychopathy, and it adds nuance to common conceptions of the differences between psychopathy and being "healthy."
Why do some psychopaths become criminals?
The million-dollar question remains unanswered: Why do some psychopaths end up in prison, while others who score high in psychopathic traits lead well-functioning lives? The researchers couldn't give a definitive answer, but they did note that psychopathic criminals had lower connectivity within "key nodes of the social and emotional brain networks, including amygdala, insula, thalamus, and frontal pole."
"Thus, even though there are parallels in the regional responsiveness of the brain's affective circuit in the convicted psychopaths and well-functioning subjects with psychopathic traits, it is likely that the disrupted functional connectivity of this network is specific to criminal psychopathy."
Counterintuitively, directly combating misinformation online can spread it further. A different approach is needed.
- Like the coronavirus, engaging with misinformation can inadvertently cause it to spread.
- Social media has a business model based on getting users to spend increasing amounts of time on their platforms, which is why they are hesitant to remove engaging content.
- The best way to fight online misinformation is to drown it out with the truth.
A year ago, the Center for Countering Digital Hate warned of the parallel pandemics — the biological contagion of COVID-19 and the social contagion of misinformation, aiding the spread of the disease. Since the outbreak of COVID-19, anti-vaccine accounts have gained 10 million new social media followers, while we have witnessed arson attacks against 5G masts, hospital staff abused for treating COVID patients, and conspiracists addressing crowds of thousands.
Many have refused to follow guidance issued to control the spread of the virus, motivated by beliefs in falsehoods about its origins and effects. The reluctance we see in some to get the COVID vaccine is greater amongst those who rely on social media rather than traditional media for their information. In a pandemic, lies cost lives, and it has felt like a new conspiracy theory has sprung up online every day.
How we, as social media users, behave in response to misinformation can either enable or prevent it from being seen and believed by more people.
The rules are different online
If a colleague mentions in the office that Bill Gates planned the pandemic, or a friend at dinner tells the table that the COVID vaccine could make them infertile, the right thing to do is often to challenge their claims. We don't want anyone to be left believing these falsehoods.
But digital is different. The rules of physics online are not the same as they are in the offline world. We need new solutions for the problems we face online.
Now, imagine that in order to reply to your friend, you must first hand him a megaphone so that everyone within a five-block radius can hear what he has to say. It would do more damage than good, but this is essentially what we do when we engage with misinformation online.
Think about misinformation as being like the coronavirus — when we engage with it, we help to spread it to everyone else with whom we come into contact. If a public figure with a large following responds to a post containing misinformation, they ensure the post is seen by hundreds of thousands or even millions of people with one click. Social media algorithms also push content into more users' newsfeeds if it appears to be engaging, so lots of interactions from users with relatively small followings can still have unintended negative consequences.
Additionally, whereas we know our friend from the office or dinner, most of the misinformation we see online will come from strangers. They often will be from one of two groups — true believers, whose minds are made up, and professional propagandists, who profit from building large audiences online and selling them products (including false cures). Both of these groups use trolling tactics, that is, seeking to trigger people to respond in anger, thus helping them reach new audiences and thereby gaming the algorithm.
On the day the COVID vaccine was approved in the UK, anti-vaccine activists were able to provoke pro-vaccine voices into posting about thalidomide, exposing new audiences to a reason to distrust the medical establishment. Those who spread misinformation understand the rules of the game online; it's time those of us on the side of enlightenment values of truth and science did too.
How to fight online misinformation
Of course, it is much easier for social media companies to take on this issue than for us citizens. Research from the Center for Countering Digital Hate and Anti-Vax Watch last month found that 65% of anti-vaccine content on social media is linked to just twelve individuals and their organizations. Were the platforms to simply remove the accounts of these superspreaders, it would do a huge amount to reduce harmful misinformation.
The problem is that social media platforms are resistant to doing so. These businesses have been built by constantly increasing the amount of time users spend on their platforms. Getting rid of the creators of engaging content that has millions of people hooked is antithetical to the business model. It will require intervention from governments to force tech companies to finally protect their users and society as a whole.
So, what can the rest of us do, while we await state regulation?
Instead of engaging, we should be outweighing the bad with the good. Every time you see a piece of harmful misinformation, share advice or information from a trusted source, like the WHO or BBC, on the same subject. The trend of people celebrating and posting photos of themselves or loved ones receiving the vaccine has been far more effective than any attempt to disprove a baseless claim about Bill Gates or 5G mobile technology. In the attention economy that governs tech platforms, drowning out is a better strategy than rebuttal.
Imran Ahmed is CEO of the Center for Countering Digital Hate.
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been nothing short of the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most have never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also earn a place on this morbid list as the year the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." Even creepier, he wrote: "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
Because of our ability to think about thinking, "the gap between ape and man is immeasurably greater than the one between amoeba and ape."
- Self-awareness — namely, our capacity to think about our thoughts — is central to how we perceive the world.
- Without self-awareness, education, literature, and other human endeavors would not be possible.
- Striving toward greater self-awareness is the spiritual goal of many religions and philosophies.
The following is an excerpt from Dr. Stephen Fleming's forthcoming book Know Thyself. It is reprinted with permission from the author.
I now run a neuroscience lab dedicated to the study of self-awareness at University College London. My team is one of several working within the Wellcome Centre for Human Neuroimaging, located in an elegant town house in Queen Square in London. The basement of our building houses large machines for brain imaging, and each group in the Centre uses this technology to study how different aspects of the mind and brain work: how we see, hear, remember, speak, make decisions, and so on. The students and postdocs in my lab focus on the brain's capacity for self-awareness. I find it a remarkable fact that something unique about our biology has allowed the human brain to turn its thoughts on itself.
Until quite recently, however, this all seemed like nonsense. As the nineteenth-century French philosopher Auguste Comte put it: "The thinking individual cannot cut himself in two — one of the parts reasoning, while the other is looking on. Since in this case the organ observed and the observing organ are identical, how could any observation be made?" In other words, how can the same brain turn its thoughts upon itself?
Comte's argument chimed with scientific thinking at the time. After the Enlightenment dawned on Europe, an increasingly popular view was that self-awareness was special and not something that could be studied using the tools of science. Western philosophers were instead using self-reflection as a philosophical tool, much as mathematicians use algebra in the pursuit of new mathematical truths. René Descartes relied on self-reflection in this way to reach his famous conclusion, "I think, therefore I am," noting along the way that "I know clearly that there is nothing that can be perceived by me more easily or more clearly than my own mind." Descartes proposed that a central soul was the seat of thought and reason, commanding our bodies to act on our behalf. The soul could not be split in two — it just was. Self-awareness was therefore mysterious and indefinable, and off-limits to science.
We now know that the premise of Comte's worry is false. The human brain is not a single, indivisible organ. Instead, the brain is made up of billions of small components — neurons — that each crackle with electrical activity and participate in a wiring diagram of mind-boggling complexity. Out of the interactions among these cells, our entire mental life — our thoughts and feelings, hopes and dreams — flickers in and out of existence. But rather than being a meaningless tangle of connections with no discernible structure, this wiring diagram also has a broader architecture that divides the brain into distinct regions, each engaged in specialized computations. Just as a map of a city need not include individual houses to be useful, we can obtain a rough overview of how different areas of the human brain are working together at the scale of regions rather than individual brain cells. Some areas of the cortex are closer to the inputs (such as the eyes) and others are further up the processing chain. For instance, some regions are primarily involved in seeing (the visual cortex, at the back of the brain), others in processing sounds (the auditory cortex), while others are involved in storing and retrieving memories (such as the hippocampus).
In a reply to Comte in 1865, the British philosopher John Stuart Mill anticipated the idea that self-awareness might also depend on the interaction of processes operating within a single brain and was thus a legitimate target of scientific study. Now, thanks to the advent of powerful brain imaging technologies such as functional magnetic resonance imaging (fMRI), we know that when we self-reflect, particular brain networks indeed crackle into life and that damage or disease to these same networks can lead to devastating impairments of self-awareness.
I often think that if we were not so thoroughly familiar with our own capacity for self-awareness, we would be gobsmacked that the brain is able to pull off this marvelous conjuring trick. Imagine for a moment that you are a scientist on a mission to study new life-forms found on a distant planet. Biologists back on Earth are clamoring to know what they're made of and what makes them tick. But no one suggests just asking them! And yet a Martian landing on Earth, after learning a bit of English or Spanish or French, could do just that. The Martians might be stunned to find that we can already tell them something about what it is like to remember, dream, laugh, cry, or feel elated or regretful — all by virtue of being self-aware.
I find it a remarkable fact that something unique about our biology has allowed the human brain to turn its thoughts on itself.
But self-awareness did not just evolve to allow us to tell each other (and potential Martian visitors) about our thoughts and feelings. Instead, being self-aware is central to how we experience the world. We not only perceive our surroundings; we can also reflect on the beauty of a sunset, wonder whether our vision is blurred, and ask whether our senses are being fooled by illusions or magic tricks. We not only make decisions about whether to take a new job or whom to marry; we can also reflect on whether we made a good or bad choice. We not only recall childhood memories; we can also question whether these memories might be mistaken.
Self-awareness also enables us to understand that other people have minds like ours. Being self-aware allows me to ask, "How does this seem to me?" and, equally importantly, "How will this seem to someone else?" Literary novels would become meaningless if we lost the ability to think about the minds of others and compare their experiences to our own. Without self-awareness, there would be no organized education. We would not know who needs to learn or whether we have the capacity to teach them. The writer Vladimir Nabokov elegantly captured this idea that self-awareness is a catalyst for human flourishing:
"Being aware of being aware of being. In other words, if I not only know that I am but also know that I know it, then I belong to the human species. All the rest follows — the glory of thought, poetry, a vision of the universe. In that respect, the gap between ape and man is immeasurably greater than the one between amoeba and ape."
In light of these myriad benefits, it's not surprising that cultivating accurate self-awareness has long been considered a wise and noble goal. In Plato's dialogue Charmides, Socrates has just returned from fighting in the Peloponnesian War. On his way home, he asks a local boy, Charmides, if he has worked out the meaning of sophrosyne — the Greek word for temperance or moderation, and the essence of a life well lived. After a long debate, the boy's cousin Critias suggests that the key to sophrosyne is simple: self-awareness. Socrates sums up his argument: "Then the wise or temperate man, and he only, will know himself, and be able to examine what he knows or does not know… No other person will be able to do this."
Likewise, the ancient Greeks were urged to "know thyself" by a prominent inscription carved into the stone of the Temple of Apollo at Delphi. For them, self-awareness was a work in progress and something to be striven toward. This view persisted into medieval religious traditions: for instance, the Italian priest and philosopher Saint Thomas Aquinas suggested that while God knows Himself by default, we need to put in time and effort to know our own minds. Aquinas and his monks spent long hours engaged in silent contemplation. They believed that only by participating in concerted self-reflection could they ascend toward the image of God.
A similar notion of striving toward self-awareness is seen in Eastern traditions such as Buddhism. The spiritual goal of enlightenment is to dissolve the ego, allowing more transparent and direct knowledge of our minds acting in the here and now. The founder of Chinese Taoism, Lao Tzu, captured this idea that gaining self-awareness is one of the highest pursuits when he wrote, "To know that one does not know is best; Not to know but to believe that one knows is a disease."
Today, there is a plethora of websites, blogs, and self-help books that encourage us to "find ourselves" and become more self-aware. The sentiment is well meant. But while we are often urged to have better self-awareness, little attention is paid to how self-awareness actually works. I find this odd. It would be strange to encourage people to fix their cars without knowing how the engine worked, or to go to the gym without knowing which muscles to exercise. This book aims to fill this gap. I don't pretend to give pithy advice or quotes to put on a poster. Instead, I aim to provide a guide to the building blocks of self-awareness, drawing on the latest research from psychology, computer science, and neuroscience. By understanding how self-awareness works, I aim to put us in a position to answer the Athenian call to use it better.