
Almost 200 cognitive biases rule our everyday thinking. A new codex boils them down to 4.

Credit: Benson/Manoogian III
  • Nearly 200 cognitive biases affect our decision-making.
  • The sheer number of biases should teach us humility.
  • And we should recognize the essential role they play in life, as well.

Aside from mythical spiritual figures and biblical kings, humans are not objective in how they react to the world. As much as we would like to be fair and impartial about how we deal with the situations that arise on a daily basis, we process them through a complex series of internal biases before deciding how to react. Even the most self-aware of us cannot escape the full spectrum of internal prejudices.

Brain biases can quickly become a hall of mirrors. How you understand and retain knowledge about cognitive shortcuts will determine what, if any, benefits you can derive from the substantial psychological science that's been done around them. Here we take a look at different ways of understanding cognitive biases, and different approaches to learning from them. Enjoy!

The Peter Baumann approach

Originally a pioneer of German electronic dance music, Peter Baumann now devotes himself to exploring the science and philosophy of the human experience. To him, cognitive biases are everything and nothing.

There is nothing that is not a bias.

We prefer sweet food to bitter food, solid ground to unstable ground, and are imbued with cultural assumptions that help us live more peacefully in society. Noting that biases exist in the biological domain, Baumann frees cognitive bias from the trap of being viewed as an entirely mental phenomenon.

Biases do not obstruct a healthy or positive life.

Biases are shortcuts we've inherited from past generations, designed to help us survive. Confirmation bias, for example, solves the problem of not being able to take in all the world's information each time we make a decision. Of course, being closed off to new information is equally hazardous in modern society, where information is the currency of our knowledge-driven world.

Baumann's favorite bias?

The uniqueness bias amuses him the most because it's a bias that each person necessarily has. We all think of ourselves as unique because each person is at the center of their own existence. But interestingly, there are circles of uniqueness: people you have close relationships with seem more unique than people you don't know. Which, of course, has some obvious limitations as a reliable point of view.

What to do about biases

Listen better, says Baumann. Understanding the predispositions we bring to the table should make us more open to understanding other people's points of view. If you're not so special, not so right, not so perfect all the time, there's a greater likelihood that you have something valuable to learn from others.

The Buster Benson approach

Benson/Manoogian III

Buster Benson (a marketing manager at Slack) decided to organize 175 known biases into a giant codex.

Benson, with illustrations by John Manoogian III, sorted the biases to remove duplicates and grouped them into four larger categories, each called a "conundrum" or "problem." All four limit our intelligence, but they are actually trying to be helpful. According to Benson, "Every cognitive bias is there for a reason — primarily to save our brains time or energy." But the end result of using such mental shortcuts, however useful they often are, is that they also introduce errors into our thinking. By becoming aware of how our minds make decisions, we can be mindful of the inherent inaccuracies and fallacies and hopefully act with more fairness and grace.

Here's how Benson divides the cognitive biases into four problems that they actually help address:

The world is a set of information that's just too enormous for your brain to handle.

If something is already in our memory and we're used to seeing an issue a certain way, that's how our brain is likely to react to it again. The biases that stem from this are plentiful. The Attentional bias, for example, leads us to perceive events through whatever thoughts are recurring at the time, which prevents us from considering alternate paths and possibilities.

Biases that result from this kind of thinking include the context effect, the mood-congruent memory bias, and the empathy gap, which makes us underestimate the influence of visceral drives on our attitudes and actions.

We pay more attention to how much something has changed than to its new value on its own. Cue the Focusing effect, Money illusion, Conservatism, or Distinction bias.

And because you can't grasp everything, you'll always be missing a lot of essential information.

We utilize stereotypes and quick fill-in-the-gaps thinking to make decisions about something when we don't know everything about it. Mental mistakes like the Group attribution error, Ultimate attribution error, Stereotyping, Essentialism, the Bandwagon effect and the Placebo effect all arise from such a cognitive approach.

According to Benson, and probably to your own life experience, we also tend to prefer the things and the people we know over those we don't. In this grouping, we'd find the Cheerleader effect and the Positivity effect, among others.

You need to act fast, so you'll be relying on a limited set of information.

These cognitive issues arise from having to make decisions without having all the time and information you'd prefer. We often have to decide on a course of action quickly, relying on biases and instinct rather than all the possible facts.

One way to make decisions quickly is to do it with confidence, convincing yourself that what you're doing is important. Because of this, we often get overconfident, leading to such biases as the Dunning-Kruger effect, in which people overestimate their abilities, as well as the Optimism bias and the Armchair fallacy.

When we have to just go for it, we also tend to "favor the immediate," writes Benson. The thing in front of us is worth much more than something potential and distant.

You need to remember some things. But it's impossible (and totally undesirable) to remember everything.

So much information permeates our daily lives that we are constantly forced to choose what to address and what to forget. This overload leads us toward generalizations and other biases that help us deal with the data onslaught.

Some of the tactics we rely on include creating false memories or discarding specifics in favor of stereotypes and prejudices. Unfortunately, it's often simply easier to function that way.

We also tend to reduce events and lists to their commonalities, choosing a small number of items to stand for the whole. And we store memories based on how we experienced them, letting the circumstances of the experience affect the value we place on them. This grouping includes such biases as the Tip of the tongue phenomenon: the feeling that we are just about to remember something but can't quite retrieve it. You know that feeling.

Another fun modern bias of this kind is the Google effect, also called "digital amnesia": the tendency to quickly forget information that can easily be found online with a search engine like Google. Let's see if that happens with this article.
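Benson's codex is, at bottom, a classification: each bias is filed under one of the four problems. The sketch below illustrates that structure as a simple lookup table, using Benson's four category names and only the handful of biases mentioned above; the real codex contains far more entries, so treat this as an illustration of the grouping, not the codex itself.

```python
# An illustrative sketch of Benson's four-problem grouping, populated only
# with biases named in this article (the full codex has ~200 entries).
CODEX = {
    "Too much information": [
        "Attentional bias", "Context effect", "Mood-congruent memory bias",
        "Empathy gap", "Focusing effect", "Money illusion",
        "Conservatism", "Distinction bias",
    ],
    "Not enough meaning": [
        "Group attribution error", "Ultimate attribution error",
        "Stereotyping", "Essentialism", "Bandwagon effect",
        "Placebo effect", "Cheerleader effect", "Positivity effect",
    ],
    "Need to act fast": [
        "Dunning-Kruger effect", "Optimism bias",
    ],
    "What should we remember?": [
        "Tip of the tongue phenomenon", "Google effect",
    ],
}

def problems_for(bias: str) -> list[str]:
    """Return which of the four problems a given bias helps address."""
    return [problem for problem, biases in CODEX.items() if bias in biases]

print(problems_for("Google effect"))  # ['What should we remember?']
```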

You can buy the codex (now featuring 188 biases) here. Hang it on your wall (and hopefully let some of it inform your thinking)!

"You look at this overwhelming array of cognitive biases and distortions, and realize how there are so many things that come between us and objective reality," Manoogian told The Huffington Post. "One of the most overwhelming things to me that came out of this project is humility."

The reductive approach

While nearly 200 cognitive biases frame our decision-making each day, some deserve particular attention. At Business Insider, Samantha Lee put together a great infographic showing 20 cognitive biases that can get in the way of solid decision-making.

The Julia Galef approach

Julia Galef, President of the Center for Applied Rationality, says that looking at issues as an outsider is a surefire approach to outwit the commitment bias and the sunk-cost fallacy. Intel famously used this approach to leave behind a faltering memory-chip product for more lucrative ventures.

Iron Age discoveries uncovered outside London, including a ‘murder’ victim

A man's skeleton, found facedown with his hands bound, was unearthed near an ancient ceremonial circle during a high speed rail excavation project.

Photo Credit: HS2
Culture & Religion
  • A skeleton representing a man who was tossed face down into a ditch nearly 2,500 years ago with his hands bound in front of his hips was dug up during an excavation outside of London.
  • The discovery was made during a high speed rail project that has been a bonanza for archaeology, as the area is home to more than 60 ancient sites along the planned route.
  • An ornate grave of a high status individual from the Roman period and an ancient ceremonial circle were also discovered during the excavations.

Are we really addicted to technology?

Fear that new technologies are addictive isn't a modern phenomenon.

Credit: Rodion Kutsaev via Unsplash
Technology & Innovation

This article was originally published on our sister site, Freethink, which has partnered with the Build for Tomorrow podcast to go inside new episodes each month. Subscribe here to learn more about the crazy, curious things from history that shaped us, and how we can shape the future.

In many ways, technology has made our lives better. Through smartphones, apps, and social media platforms we can now work more efficiently and connect in ways that would have been unimaginable just decades ago.

But as we've grown to rely on technology for a lot of our professional and personal needs, most of us are asking tough questions about the role technology plays in our own lives. Are we becoming too dependent on technology to the point that it's actually harming us?

In the latest episode of Build for Tomorrow, host and Entrepreneur Editor-in-Chief Jason Feifer takes on the thorny question: is technology addictive?

Popularizing medical language

What makes something addictive rather than just engaging? It's a meaningful distinction because if technology is addictive, the next question could be: are the creators of popular digital technologies, like smartphones and social media apps, intentionally creating things that are addictive? If so, should they be held responsible?

To answer those questions, we've first got to agree on a definition of "addiction." As it turns out, that's not quite as easy as it sounds.

"Over the past few decades, a lot of effort has gone into destigmatizing conversations about mental health, which of course is a very good thing," Feifer explains. It also means that medical language has entered our vernacular: we're now more comfortable using clinical words outside of a specific diagnosis.

"We've all got that one friend who says, 'Oh, I'm a little bit OCD' or that friend who says, 'Oh, this is my big PTSD moment,'" Liam Satchell, a lecturer in psychology at the University of Winchester and guest on the podcast, says. He's concerned about how the word "addiction" gets tossed around by people with no background in mental health. An increased concern surrounding "tech addiction" isn't actually being driven by concern among psychiatric professionals, he says.

"These sorts of concerns about things like internet use or social media use haven't come from the psychiatric community as much," Satchell says. "They've come from people who are interested in technology first."

The casual use of medical language can lead to confusion about what is actually a mental health concern. We need a reliable standard for recognizing, discussing, and ultimately treating psychological conditions.

"If we don't have a good definition of what we're talking about, then we can't properly help people," Satchell says. That's why, according to Satchell, any definition of addiction we use needs to include the psychiatric criteria: experiencing distress or significant family, social, or occupational disruption.

Too much reading causes... heat rashes?

But as Feifer points out in his podcast, both popularizing medical language and the fear that new technologies are addictive aren't totally modern phenomena.

Take, for instance, the concept of "reading mania."

In the 18th century, an author named J. G. Heinzmann claimed that people who read too many novels could experience something called "reading mania." This condition, Heinzmann explained, could cause many symptoms, including: "weakening of the eyes, heat rashes, gout, arthritis, hemorrhoids, asthma, apoplexy, pulmonary disease, indigestion, blocking of the bowels, nervous disorder, migraines, epilepsy, hypochondria, and melancholy."

"That is all very specific! But really, even the term 'reading mania' is medical," Feifer says.

"Manic episodes are not a joke, folks. But this didn't stop people a century later from applying the same term to wristwatches."

Indeed, an 1889 piece in the Newcastle Weekly Courant declared: "The watch mania, as it is called, is certainly excessive; indeed it becomes rabid."

Similar concerns have echoed throughout history about the radio, telephone, TV, and video games.

"It may sound comical in our modern context, but back then, when those new technologies were the latest distraction, they were probably really engaging. People spent too much time doing them," Feifer says. "And what can we say about that now, having seen it play out over and over and over again? We can say it's common. It's a common behavior. Doesn't mean it's the healthiest one. It's just not a medical problem."

Few today would argue that novels are in-and-of-themselves addictive — regardless of how voraciously you may have consumed your last favorite novel. So, what happened? Were these things ever addictive — and if not, what was happening in these moments of concern?

There's a risk of pathologizing normal behavior, says Joel Billieux, professor of clinical psychology and psychological assessment at the University of Lausanne in Switzerland, and guest on the podcast. He's on a mission to understand how we can suss out what is truly addictive behavior versus what is normal behavior that we're calling addictive.

For Billieux and other professionals, this isn't just a rhetorical game. He uses the example of gaming addiction, which has come under increased scrutiny over the past half-decade. The language used around the subject of gaming addiction will determine how behaviors of potential patients are analyzed — and ultimately what treatment is recommended.

"For a lot of people you can realize that the gaming is actually a coping [mechanism] for social anxiety or trauma or depression," says Billieux.

"[In] those cases, of course, you will not necessarily target gaming per se. You will target what caused depression. And then as a result, if you succeed, gaming will diminish."

In some instances, a person might legitimately be addicted to gaming or technology, and require the corresponding treatment — but that treatment might be the wrong answer for another person.

"None of this is to discount that for some people, technology is a factor in a mental health problem," says Feifer.

"I am also not discounting that individual people can use technology such as smartphones or social media to a degree where it has a genuine negative impact on their lives. But the point here to understand is that people are complicated, our relationship with new technology is complicated, and addiction is complicated — and our efforts to simplify very complex things, and make generalizations across broad portions of the population, can lead to real harm."

Behavioral addiction is a notoriously complex thing for professionals to diagnose — even more so since the latest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the book professionals use to classify mental disorders, introduced a new idea about addiction in 2013.

"The DSM-5 grouped substance addiction with gambling addiction — this is the first time that substance addiction was directly categorized with any kind of behavioral addiction," Feifer says.

"And then, the DSM-5 went a tiny bit further — and proposed that other potentially addictive behaviors require further study."

This might not sound like that big of a deal to laypeople, but its effect was massive in medicine.

"Researchers started launching studies — not to see if a behavior like social media use can be addictive, but rather, to start with the assumption that social media use is addictive, and then to see how many people have the addiction," says Feifer.

Learned helplessness

The assumption that a lot of us are addicted to technology may itself be harming us by undermining our autonomy and belief that we have agency to create change in our own lives. That's what Nir Eyal, author of the books Hooked and Indistractable, calls 'learned helplessness.'

"The price of living in a world with so many good things in it is that sometimes we have to learn these new skills, these new behaviors to moderate our use," Eyal says. "One surefire way to not do anything is to believe you are powerless. That's what learned helplessness is all about."

So if it's not an addiction that most of us are experiencing when we check our phones 90 times a day or are wondering about what our followers are saying on Twitter — then what is it?

"A choice, a willful choice, and perhaps some people would not agree or would criticize your choices. But I think we cannot consider that as something that is pathological in the clinical sense," says Billieux.

Of course, for some people technology can be addictive.

"If something is genuinely interfering with your social or occupational life, and you have no ability to control it, then please seek help," says Feifer.

But for the vast majority of people, thinking about our use of technology as a choice — albeit not always a healthy one — can be the first step to overcoming unwanted habits.

For more, be sure to check out the Build for Tomorrow episode here.

Why the U.S. and Belgium are culture buddies

The Inglehart-Welzel World Cultural map replaces geographic accuracy with closeness in terms of values.

According to the latest version of the Inglehart-Welzel World Cultural Map, Belgium and the United States are now each other's closest neighbors in terms of cultural values.

Credit: World Values Survey, public domain.
Strange Maps
  • This map replaces geography with another type of closeness: cultural values.
  • Although the groups it depicts have familiar names, their shapes are not.
  • The map makes for strange bedfellows: Brazil next to South Africa and Belgium neighboring the U.S.