Our society mostly emphasizes developing logical, procedural thinking skills, but this isn't the only way to come up with great ideas. Neglecting to develop our lateral thinking skills may mean missing out on unexpected innovations.
- Lateral thinking is a way of approaching problems. It deliberately forgoes obvious approaches in favor of oblique or unexpected ones.
- Deliberately ignoring perfectly good but straightforward solutions enables us to find hidden innovations we would otherwise miss.
- Edward de Bono, who developed the concept of lateral thinking, lays out four specific lateral thinking techniques: awareness, random stimulation, alternatives, and alteration.
Here's a puzzle: A man walks into a bar and asks the bartender for a glass of water. Instead, the bartender pulls out a gun, cocks it, and points it at the man. The man thanks the bartender and walks out. Why did the man thank the bartender?
There's no way to arrive at the answer (which appears at the end* of this article) without asking questions, testing the different elements of the story to see what missing information hasn't been provided. It's an example of a lateral thinking puzzle, a type of puzzle that requires creative, sometimes oblique thinking to arrive at the answer. In essence, lateral thinking is a method of approaching a problem by deliberately forgoing obvious methods of reasoning. It requires one to consider a given issue from unlikely angles, uncovering innovative solutions as a result.
Traditional thinking is vertical, moving step-by-step to a logical conclusion based on the available data. Lateral thinking, by contrast, is horizontal, putting the emphasis on generating many ideas while de-emphasizing the details of how those ideas could be implemented. The two modes are complementary: Without lateral thinking, vertical thinking would be too narrow-minded; without vertical thinking, lateral thinking would produce many possible solutions but no plans to implement them.
Despite their complementary nature, our society values and cultivates vertical thinking almost exclusively. We believe that adequate training in specific techniques and systems will produce a talented engineer, lawyer, or doctor. But when it comes to professions that rely on creative, generative, lateral skills, we tend to assume that only those born with innate talent can excel. Even in more vertically minded professions like engineering, creativity is seen as a desirable bonus that great engineers are simply born with.
Two stages of thinking
Psychologist Edward de Bono, who developed the concept of lateral thinking, argued that the brain thinks in two stages: The first is a perceiving stage, where the brain chooses to frame its environment in a certain way, identifying a particular pattern. The second stage uses that pattern, that particular way of looking at the environment, and builds upon it to reach a conclusion. No matter how effective we are at the vertical thinking of the second stage, better vertical thinking can never correct errors that have arisen in the first stage. In order to more accurately perceive patterns in our environment, we have to develop our lateral thinking skills.
In the video below, author David Epstein illustrates this principle through the case of Japanese repairman Gunpei Yokoi. Yokoi wasn't a particularly gifted engineer, but he perceived his environment in a way that his more talented and specialized peers could not. Because they had specialized so narrowly, these more traditionally talented engineers could only frame their environment in terms of the specific technologies they knew best. Yokoi, on the other hand, saw how various older — and therefore overlooked — pieces of technology could work together. The result was the Nintendo Game Boy.
Lateral thinking: The reason you’ve heard of Nintendo and Marvel
Learning to think laterally is, almost by definition, counterintuitive. Fortunately, de Bono developed some practical techniques for developing this overlooked capability. In his paper, "Information Processing and New Ideas — Lateral and Vertical Thinking," de Bono described four such techniques. Here they are:
- Awareness: Being aware of the way the brain processes information is the first step to improving the lateral thinking process. It's important to recognize the brain's tendency to rely on established patterns of thinking before starting to work on a new problem.
- Random stimulation: Often when we're trying to think about some issue, we shut out all outside stimuli so we can focus. However, allowing unplanned, outside stimuli can disrupt our reliance on imperfect frameworks. Paying attention to randomness can propel our thinking to new insights.
- Alternatives: de Bono argued that even if there is an apparently suitable solution to a problem, it can be useful to set it aside and deliberately consider alternative approaches, regardless of how ridiculous they might seem. Doing so will help you to consider a problem from all possible angles.
- Alteration: This technique consists of the deliberate alteration of available options, like doing the opposite of an implied direction or reversing any relationship between elements of the problem. This can include denying elements that are taken for granted, breaking large patterns down into tiny fragments, or translating a relationship to an analogy and then translating it back again just to see what changed. Arbitrarily altering elements of the problem space can produce novel tools to build a solution with.
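Of the four techniques, random stimulation is concrete enough to sketch in code. The snippet below is a minimal illustration only, not anything from de Bono's paper: the prompt words and the pairing question are assumptions invented for the example.

```python
import random

# Hypothetical prompt words for the "random stimulation" exercise.
# Any list of unrelated nouns would serve the same purpose.
PROMPTS = ["bridge", "mirror", "seed", "clock", "river", "thread"]

def random_stimulus(problem, rng=random):
    """Pair a problem with an arbitrary prompt word, forcing an
    unexpected connection the solver must then make sense of."""
    word = rng.choice(PROMPTS)
    return f"How is '{problem}' like a {word}?"

print(random_stimulus("reducing meeting overload"))
```

The point of the exercise is that the stimulus is deliberately unrelated to the problem: the solver's attempt to bridge the two disrupts the established pattern of thinking.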
*The man had the hiccups and was hoping to cure them with a glass of water. Seeing this, the bartender decided to scare the man to cure his hiccups instead. Realizing he no longer had the hiccups, the man thanked the bartender and left.
Most elderly individuals' brains degrade over time, but some match — or even outperform — younger individuals on cognitive tests.
- "Super-agers" seem to escape the decline in cognitive function that affects most of the elderly population.
- New research suggests this is because of higher functional connectivity in key brain networks.
- It's not clear what the specific reason for this is, but research has uncovered several activities that encourage greater brain health in old age.
At some point in our 20s or 30s, something starts to change in our brains. They begin to shrink a little bit. The myelin that insulates our nerves begins to lose some of its integrity. Fewer and fewer chemical messages get sent as our brains make fewer neurotransmitters.
As we get older, these processes accelerate. Brain weight decreases by about 5 percent per decade after 40. The frontal lobe and hippocampus — areas related to memory encoding — begin to shrink around age 60 or 70. But this is just an unfortunate reality; you can't stay young forever, and things will eventually begin to break down. That's part of the reason why some argue that we should all hope for a life that ends by 75, before the worst effects of time set in.
But this might be a touch premature. Some lucky individuals seem to resist these destructive forces working on our brains. In cognitive tests, these 80-year-old "super-agers" perform just as well as individuals in their 20s.
Just as sharp as the whippersnappers
To find out what's behind the phenomenon of super-agers, researchers conducted a study examining the brains and cognitive performances of two groups: 41 young adults between the ages of 18 and 35 and 40 older adults between the ages of 60 and 80.
First, the researchers administered a series of cognitive tests, like the California Verbal Learning Test (CVLT) and the Trail Making Test (TMT). Seventeen members of the older group scored at or above the mean scores of the younger group. That is, these 17 could be considered super-agers, performing at the same level as the younger study participants. Aside from these individuals, members of the older group tended to perform less well on the cognitive tests. Then, the researchers scanned all participants' brains in an fMRI, paying special attention to two portions of the brain: the default mode network and the salience network.
The default mode network is, as its name might suggest, a series of brain regions that are active by default — when we're not engaged in a task, they tend to show higher levels of activity. It also appears to be closely involved in thinking about oneself, thinking about others, and in aspects of memory and imagining the future.
The salience network is another network of brain regions, so named because it appears deeply linked to detecting and integrating salient emotional and sensory stimuli. (In neuroscience, saliency refers to how much an item "sticks out"). Both of these networks are also extremely important to overall cognitive function, and in super-agers, the activity in these networks was more coordinated than in their peers.
An image of the brain highlighting the regions associated with the default mode network.
How to ensure brain health in old age
While prior research has identified some genetic influences on how "gracefully" the brain ages, there are likely activities that can encourage brain health. "We hope to identify things we can prescribe for people that would help them be more like a superager," said Bradford Dickerson, one of the researchers in this study, in a statement. "It's not as likely to be a pill as more likely to be recommendations for lifestyle, diet, and exercise. That's one of the long-term goals of this study — to try to help people become superagers if they want to."
To date, there is some preliminary evidence of ways you can keep your brain younger for longer. For instance, more education and a cognitively demanding job predict higher cognitive abilities in old age. Generally speaking, the adage "use it or lose it" appears to hold true: a cognitively active lifestyle helps protect your brain in old age. So, while it might be tempting to fill your golden years with beer and reruns of CSI, doing so is unlikely to help you keep your edge.
Aside from these intuitive measures, regular exercise appears to boost cognitive health in old age, as Dickerson mentioned. Diet is also a protective factor, especially diets rich in omega-3 fatty acids (found in fish oil), polyphenols (found in dark chocolate!), vitamin D (egg yolks and sunlight), and the B vitamins (meat, eggs, and legumes). There's also evidence that a healthy social life in old age can protect against cognitive decline.
For many, the physical decline associated with old age is an expected side effect of a life well-lived. But the idea that our intellect will also degrade can be a much scarier reality. Fortunately, the existence of super-agers shows that at the very least, we don't have to accept cognitive decline without a fight.