The Common Belief Fallacy
Back when Shakespeare said you were the paragon of animals, both noble in reason and infinite in faculties, he did so during a time when physicians believed the body was filled with black bile, yellow bile, phlegm, and blood, and all sickness and health depended on the interaction of those fluids. Lethargic and lazy? Well, that’s because you are full of phlegm. Feeling sick? Maybe you’ve got too much blood and should go see a barber to get drained. Yes, the creator of some of the greatest works of the English language believed you could cure a fever with a knife.
It’s easy to laugh at the very wrong things that people once believed, but try not to feel too superior. My friend Susannah Gregg was living in South Korea and working there as an English teacher when she first learned about fan death, a common belief among people in that country that oscillating desk fans are among the most deadly inventions known to man. She was stepping out for a beer with a friend when he noticed, to his horror, she had left her fan running with her pet rabbit still inside her house. Her friend, a twenty-eight-year-old college graduate, refused to leave until she turned off the fan. He explained to her that everyone knows you can’t leave a fan running inside a room with the windows shut. That would mean certain death. It was shocking to him that she was unaware of something so simple and potentially life-threatening. Susannah thought he was kidding. It took several conversations to convince him it wasn’t true and that in her country, in most countries, no one believed such a thing. She successfully avoided absorbing the common belief not because she was smarter than her friend but because she had already done the experiments necessary to disprove the myth. She had slept in a house with a fan running many times and lived to tell about it. Since then, she has asked many friends and coworkers there about fans, and the response has been mixed. Some people think it is silly, and some think fan death is real. In 2013, despite the debunking power of a few Google searches, the belief that you shouldn’t fall asleep or spend too much time in a room with a running electric fan is so pervasive in South Korea that Susannah told me you can’t buy one within their borders without a safety device that turns it off after a set amount of time. The common belief is so deep and strong that fan manufacturers must include a safety switch to soothe the irrational fears of most consumers.
Your ancestors may not have had the toolset you do when it came to avoiding mental stumbling blocks or your immense cultural inheritance, but their minds worked in much the same way. The people who thought the world rested on the back of a great tortoise or who thought dancing would make it rain had the same brain as you; that is to say, they had the same blueprint in their DNA for making brains. So a baby born into their world was about the same as one born into yours. Evolution is so slow that not enough has changed in the way brains are made to tell much of a difference between you and a person from ten thousand years ago. That means that from gods in burning chariots to elves making cookies in trees, people long ago believed in all sorts of silly things thanks to the same faulty reasoning you deal with today. They, too, were fueled by a desire to make sense of reality and to answer the age-old question: “What, exactly, is happening here?” Instead of letting that question hang in the air, your distant relatives tended to go ahead and answer it, and they kept answering it over and over again, with newer yet equally dumb ideas because of one of the most profoundly difficult obstacles humans have faced since we started chipping away at flint to make heads for spears. This malfunction of the mind is called the common belief fallacy.
In Latin, it is argumentum ad populum, or “appeal to the people,” which should clue you in that this is something your species has worried about for a long time. The fallacy works like this: If most people believe something is true, you are more likely to believe it is true the first time you hear about it. You then pass along that mistaken belief, and on and on it goes.
Being a social creature, the first thing you do in a new job, new school, new country, or any other novel situation is ask people who are familiar with the environment to help you get acquainted with the best way to do things, the best places to eat, the hand gestures that might get you beheaded, etc. The problem, of course, is that your info is now based on opinions that are based on things such as conformity and emotions and norms and popularity, and if you’ve spent any time in a high school, on a dance floor, or at a rave, you know that what is popular is not always what is good or true. It isn’t exactly something we’ve overcome, but at least we now have a strategy for dealing with it.
Before we had a method for examining reality, the truth was a slippery fish, which is why your ancestors were so dumb. So dumb, in fact, that for a very long time people got smarter in a slow, meandering, and unreliable sort of way until human beings finally invented and adopted a tool with which to dig their way out of the giant hole of stupid into which they kept falling. The hole here is a metaphor for self-delusion. Your great-great-great-grandparents didn’t really keep falling into giant holes, at least not in numbers large enough to justify a book on the topic.
The tool here is also a metaphor. I’m talking about the scientific method. Your ancestors invented the scientific method because the common belief fallacy renders your default strategies for making sense of the world generally awful and prone to error. Why do bees like flowers? What causes snow? Where do babies come from? Every explanation in every tribe, city, and nation was as good as the next, even if it was completely made up. Even worse, once an explanation was woven into a culture, it would often become the official explanation for many lifetimes. “What is thunder?” a child might have asked. “Oh, that’s the giant snow crab in the sky falling off his bed,” a shaman would have explained, and that would have been good enough for everyone until they all had their own kids and eventually died of dysentery. That hamster wheel of limited knowledge kept spinning until the scientific method caught on. Even then, there was a long way to go and lots of cobwebs to be cleared from common sense.
Scholars used to believe that life just sort of happened sometimes. Learned people going all the way back to Aristotle truly believed that if you left meat outside long enough it would spontaneously generate new life in the form of maggots and flies. The same people thought that if you piled up dirty rags and left them alone for a while they would magically turn into mice. Seriously. The idea started to fade in 1668 when a physician named Francesco Redi tested the hypothesis by placing meat and eggs in both sealed and unsealed containers and then checked back to see which ones contained life. The sealed containers didn’t spontaneously generate flies, and thus the concept began to die. Other thinkers contested his discoveries at first, and it took Louis Pasteur’s great fame and his own experiments to put the idea away forever some two centuries later.
People learned that science, as a tool, as a lens to create an upside-down way of looking at the world, made life better. Your natural tendency is to start from a conclusion and work backward to confirm your assumptions, but the scientific method drives down the wrong side of the road and tries to disconfirm your assumptions. A couple of centuries back people began to catch on to the fact that looking for disconfirming evidence was a better way to conduct research than proceeding from common belief. They saw that eliminating suspicions caused the outline of the truth to emerge. Once your forefathers and foremothers realized that this approach generated results, in a few generations your species went from burning witches and drinking mercury to mapping the human genome and playing golf on the moon.
The twisting path to becoming less dumb has led to many stops and starts, yet humans persist. Sure, scientists are just people, prone to the same delusions as anyone else, but the enterprise, the process, slowly but surely grinds away human weakness. It is a self-correcting system that is always closer to the truth today than it was yesterday.
The people who came before you invented science because your natural way of understanding and explaining what you experience is terrible. When you believe in something, you rarely seek out evidence to the contrary to see how it matches up with your assumptions. That’s the source of urban legends, folklore, superstitions, and all the rest. Skepticism is not your strong suit. In the background, while you crochet and golf and browse cat videos, people using science are fighting against your stupidity. No other human enterprise is fighting as hard, or at least not fighting and winning.
When you have zero evidence, every assumption is basically equal. You prefer to see causes rather than effects, signals in the noise, patterns in the randomness. You prefer easy-to-understand stories, and thus turn everything in life into a narrative so that complicated problems become easy. Scientists work to remove the narrative, to boil it away, leaving behind only the raw facts. Those data sit there naked and exposed so they can be reflected upon and rearranged by each new visitor.
Scientists will speculate, and they will argue, but the data they extract from observation will not budge. They may not even make sense for a hundred years or more, but thanks to the scientific method, the stories, full of biases and fallacies, will crash against the facts and recede into history.
This is an edited, shortened, and slightly altered chapter from the book “You Are Now Less Dumb.”
Reprinted by arrangement with Gotham Books, a member of Penguin Group (USA) LLC, A Penguin Random House Company. Copyright © David McRaney, 2013.
Is information the fifth form of matter?
- Researchers have been trying for over 60 years to detect dark matter.
- There are many theories about it, but none are supported by evidence.
- The mass-energy-information equivalence principle combines several theories to offer an alternative to dark matter.
The “discovery” of dark matter
We can tell how much matter is in the universe by the motions of the stars. In the 1920s, physicists attempting to do so discovered a discrepancy and concluded that there must be more matter in the universe than is detectable. How can this be?
In 1933, Swiss astronomer Fritz Zwicky, while observing the motion of galaxies in the Coma Cluster, began wondering what kept them together. There wasn't enough mass to keep the galaxies from flying apart. Zwicky proposed that some kind of dark matter provided cohesion. But since he had no evidence, his theory was quickly dismissed.
Then, in 1968, astronomer Vera Rubin made a similar discovery. She was studying the Andromeda Galaxy at Kitt Peak Observatory in the mountains of southern Arizona when she came across something that puzzled her. Rubin was examining Andromeda's rotation curve, or the speed at which the stars around the center rotate, and realized that the stars on the outer edges moved at the exact same rate as those at the interior, violating Newton's laws of motion. This meant there was more matter in the galaxy than was detectable. Her punch card readouts are today considered the first evidence of the existence of dark matter.
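The Newtonian expectation Rubin was testing can be sketched with a toy calculation. Treating a galaxy's visible mass as if it were concentrated near the center (a deliberate simplification; the mass and radii below are illustrative round numbers, not Rubin's data), Newton predicts orbital speed should fall off as 1/√r, whereas Rubin measured a roughly flat curve:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # one solar mass, kg
KPC = 3.086e19         # one kiloparsec in metres

def keplerian_speed(r_m, enclosed_mass_kg):
    """Newtonian orbital speed around the mass enclosed within
    radius r: v = sqrt(G * M / r)."""
    return math.sqrt(G * enclosed_mass_kg / r_m)

# Toy galaxy: ~1e11 solar masses of visible matter.
visible_mass = 1e11 * M_SUN

# Predicted speeds fall as 1/sqrt(r); observed curves stay flat.
for r_kpc in (5, 10, 20, 40):
    v = keplerian_speed(r_kpc * KPC, visible_mass)
    print(f"r = {r_kpc:>2} kpc  ->  predicted v = {v / 1000:6.1f} km/s")
```

Doubling the radius should cut the predicted speed by a factor of √2; the flat observed curves instead imply extra, unseen mass at large radii.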
Many other galaxies were studied throughout the '70s. In each case, the same phenomenon was observed. Today, dark matter is thought to comprise up to 27% of the universe. "Normal" or baryonic matter makes up just 5%. That's the stuff we can detect. Dark energy, which we can't detect either, makes up 68%.
Dark energy is what drives the accelerating expansion of the universe. Dark matter, on the other hand, affects how "normal" matter clumps together. It stabilizes galaxy clusters. It also affects the shape of galaxies, their rotation curves, and how stars move within them. Dark matter even affects how galaxies influence one another.
Leading theories on dark matter
NASA writes: 'This graphic represents a slice of the spider-web-like structure of the universe, called the "cosmic web." These great filaments are made largely of dark matter located in the space between galaxies.'
Credit: NASA, ESA, and E. Hallman (University of Colorado, Boulder)
Since the '70s, astronomers and physicists have been unable to directly detect dark matter. One theory is that it's all tied up in space-bound objects called MACHOs (Massive Compact Halo Objects). These include black holes, supermassive black holes, brown dwarfs, and neutron stars.
Another theory is that dark matter is made up of a type of non-baryonic matter called WIMPs (Weakly Interacting Massive Particles). Baryonic matter is the kind made up of baryons, such as protons and neutrons and everything composed of them, which is anything with an atomic nucleus. Electrons, neutrinos, muons, and tau particles aren't baryons, however, but a class of particles called leptons. Even though the (hypothetical) WIMPs would have ten to a hundred times the mass of a proton, their interactions with normal matter would be weak, making them hard to detect.
Then there are those aforementioned neutrinos. Did you know that giant streams of them pass from the Sun through the Earth each day, without us ever noticing? They're the focus of another theory, which holds that dark matter is made up of neutral neutrinos that interact with normal matter only through gravity. Other candidates include two theoretical particles: the neutral axion and the uncharged photino.
Now, one theoretical physicist posits an even more radical notion. What if dark matter didn't exist at all? Dr. Melvin Vopson of the University of Portsmouth, in the UK, has a hypothesis he calls the mass-energy-information equivalence. It states that information is the fundamental building block of the universe, and it has mass. This accounts for the missing mass within galaxies, thus eliminating the hypothesis of dark matter entirely.
To be clear, the idea that information is an essential building block of the universe isn't new. Classical Information Theory was first posited by Claude Elwood Shannon, the "father of the digital age," in the mid-20th century. The mathematician and engineer, well known in scientific circles but less so outside them, had a stroke of genius back in 1940. He realized that Boolean algebra mapped perfectly onto telephone switching circuits. Soon, he proved that mathematics could be employed to design electrical systems.
Shannon was hired at Bell Labs to figure out how to transfer information over a system of wires. He wrote the bible on using mathematics to set up communication systems, thereby laying the foundation for the digital age. Shannon was also the first to define one unit of information as a bit.
There was perhaps no greater proponent of information theory than another unsung paragon of science, John Archibald Wheeler. Wheeler was part of the Manhattan Project, worked out the "S-Matrix" with Niels Bohr, and worked with Einstein on his attempt at a unified theory of physics. In his later years, he proclaimed, "Everything is information." Then he went about exploring connections between quantum mechanics and information theory.
He also coined the phrase "it from bit" or that every particle in the universe emanates from the information locked inside it. At the Santa Fe Institute in 1989, Wheeler announced that everything, from particles to forces to the fabric of spacetime itself "… derives its function, its meaning, its very existence entirely … from the apparatus-elicited answers to yes-or-no questions, binary choices, bits."
Part Einstein, part Landauer
Vopson takes this notion one step further. He says that not only is information the essential unit of the universe but also that it is energy and has mass. To support this claim, he combines special relativity with the Landauer Principle, named after Rolf Landauer. In 1961, Landauer predicted that erasing even one bit of information would release a tiny, calculable amount of heat. This, he argued, proves information is more than just a mathematical quantity; it connects information to energy. Experimental testing over the years has borne the Landauer Principle out.
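Landauer's figure is straightforward to reproduce: erasing one bit at temperature T must release at least k_B · T · ln 2 of heat. A minimal sketch, using the standard value of the Boltzmann constant:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, by SI definition)

def landauer_energy(temp_kelvin):
    """Minimum heat released by erasing one bit: E = k_B * T * ln 2."""
    return k_B * temp_kelvin * math.log(2)

# At roughly room temperature (300 K) the limit is a few zeptojoules,
# which is why the effect took decades to verify experimentally.
E = landauer_energy(300.0)
print(f"Erasing one bit at 300 K releases at least {E:.2e} J")
```

The value comes out near 2.9 × 10⁻²¹ J per bit, and it scales linearly with temperature, which is why the principle ties information directly to thermodynamics.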
Vopson says, "He [Landauer] first identified the link between thermodynamics and information by postulating that logical irreversibility of a computational process implies physical irreversibility." This indicates that information is physical, Vopson says, and demonstrates the link between information theory and thermodynamics.
In Vopson's theory, information, once created, has "finite and quantifiable mass." So far the principle applies only to digital systems, but it could well apply to analogue and biological ones too, and even to quantum or relativistic-moving systems. "Relativity and quantum mechanics are possible future directions of the mass-energy-information equivalence principle," he says.
In the paper published in the journal AIP Advances, Vopson outlines the mathematical basis for his hypothesis. "I am the first to propose the mechanism and the physics by which information acquires mass," he said, "as well as to formulate this powerful principle and to propose a possible experiment to test it."
The fifth state of matter
To measure the mass of digital information, you start with an empty data storage device. Next, you measure its total mass with a highly sensitive apparatus. Then, you fill the device with data and measure its mass again. Finally, you erase one file and measure a third time. The trouble is, the "ultra-accurate mass measurement" device the paper describes doesn't exist yet. It would be an interferometer, something similar to LIGO, or perhaps an ultrasensitive weighing machine akin to a Kibble balance.
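The scale of the effect such an instrument would need to resolve can be estimated from the mass-energy-information equivalence: per bit, m = k_B · T · ln 2 / c², i.e. the Landauer energy converted to mass via E = mc². (The 1 TB drive below is an illustrative example, not a figure from Vopson's paper.)

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.998e8          # speed of light, m/s

def bit_mass(temp_kelvin):
    """Mass of one stored bit under the mass-energy-information
    equivalence: m = k_B * T * ln(2) / c^2."""
    return k_B * temp_kelvin * math.log(2) / c**2

m_bit = bit_mass(300.0)          # room temperature
bits_in_1_tb = 8e12              # 1 terabyte of data, in bits
m_drive = m_bit * bits_in_1_tb

print(f"mass per bit at 300 K : {m_bit:.2e} kg")
print(f"mass of 1 TB of data  : {m_drive:.2e} kg")
```

At room temperature each bit weighs in around 3 × 10⁻³⁸ kg, so even a full terabyte of data would change a drive's mass by far less than 10⁻²⁴ kg, well beyond any current balance, which is why Vopson describes the experiment as a large and costly facility rather than a benchtop measurement.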
"Currently, I am in the process of applying for a small grant, with the main objective of designing such an experiment, followed by calculations to check if detection of these small mass changes is even possible," Vopson says. "Assuming the grant is successful and the estimates are positive, then a larger international consortium could be formed to undertake the construction of the instrument." He added, "This is not a workbench laboratory experiment, and it would most likely be a large and costly facility." If eventually proved correct, Vopson will have discovered the fifth form of matter.
So, what's the connection to dark matter? Vopson says, "M.P. Gough published an article in 2008 in which he worked out … the number of bits of information that the visible universe would contain to make up all the missing dark matter. It appears that my estimates of information bit content of the universe are very close to his estimates."
The experience of life flashing before one's eyes has been reported for well over a century, but where's the science behind it?
At the age of 16, when Tony Kofi was an apprentice builder living in Nottingham, he fell from the third story of a building. Time seemed to slow down massively, and he saw a complex series of images flash before his eyes.
As he described it, “In my mind's eye I saw many, many things: children that I hadn't even had yet, friends that I had never seen but are now my friends. The thing that really stuck in my mind was playing an instrument.” Then Tony landed on his head and lost consciousness.
When he came to at the hospital, he felt like a different person and didn't want to return to his previous life. Over the following weeks, the images kept flashing back into his mind. He felt that he was “being shown something” and that the images represented his future.
Later, Tony saw a picture of a saxophone and recognized it as the instrument he'd seen himself playing. He used his compensation money from the accident to buy one. Now, Tony Kofi is one of the UK's most successful jazz musicians, having won the BBC Jazz awards twice, in 2005 and 2008.
Though Tony's belief that he glimpsed his future is unusual, it's by no means uncommon for people to report witnessing multiple scenes from their past during split-second emergency situations. After all, this is where the phrase “my life flashed before my eyes” comes from.
But what explains this phenomenon? Psychologists have proposed a number of explanations, but I'd argue the key to understanding Tony's experience lies in a different interpretation of time itself.
When life flashes before our eyes
The experience of life flashing before one's eyes has been reported for well over a century. In 1892, a Swiss geologist named Albert Heim fell from a precipice while mountain climbing. In his account of the fall, he wrote it was “as if on a distant stage, my whole past life [was] playing itself out in numerous scenes.”
More recently, in July 2005, a young woman called Gill Hicks was sitting near one of the bombs that exploded on the London Underground. In the minutes after the explosion, she hovered on the brink of death where, as she describes it: “my life was flashing before my eyes, flickering through every scene, every happy and sad moment, everything I have ever done, said, experienced.”
In some cases, people don't see a review of their whole lives, but a series of past experiences and events that have special significance to them.
Explaining life reviews
Perhaps surprisingly, given how common it is, the “life review experience” has been studied very little. A handful of theories have been put forward, but they're understandably tentative and rather vague.
For example, a group of Israeli researchers suggested in 2017 that our life events may exist as a continuum in our minds, and may come to the forefront in extreme conditions of psychological and physiological stress.
Another theory is that, when we're close to death, our memories suddenly “unload” themselves, like the contents of a skip being dumped. This could be related to “cortical disinhibition” – a breaking down of the normal regulatory processes of the brain – in highly stressful or dangerous situations, causing a “cascade” of mental impressions.
But the life review is usually reported as a serene and ordered experience, completely unlike the kind of chaotic cascade of experiences associated with cortical disinhibition. And none of these theories explain how it's possible for such a vast amount of information – in many cases, all the events of a person's life – to manifest themselves in a period of a few seconds, and often far less.
Thinking in 'spatial' time
An alternative explanation is to think of time in a “spatial” sense. Our commonsense view of time is as an arrow that moves from the past through the present towards the future, in which we only have direct access to the present. But modern physics has cast doubt on this simple linear view of time.
Indeed, since Einstein's theory of relativity, some physicists have adopted a “spatial” view of time. They argue we live in a static “block universe” in which time is spread out in a kind of panorama where the past, the present and the future co-exist simultaneously.
The modern physicist Carlo Rovelli – author of the best-selling The Order of Time – also holds the view that linear time doesn't exist as a universal fact. This idea reflects the view of the philosopher Immanuel Kant, who argued that time is not an objectively real phenomenon, but a construct of the human mind.
This could explain why some people are able to review the events of their whole lives in an instant. A good deal of previous research – including my own – has suggested that our normal perception of time is simply a product of our normal state of consciousness.
In many altered states of consciousness, time slows down so dramatically that seconds seem to stretch out into minutes. This is a common feature of emergency situations, as well as states of deep meditation, experiences on psychedelic drugs and when athletes are “in the zone”.
The limits of understanding
But what about Tony Kofi's apparent visions of his future? Did he really glimpse scenes from his future life? Did he see himself playing the saxophone because somehow his future as a musician was already established?
There are obviously some mundane interpretations of Tony's experience. Perhaps, for instance, he became a saxophone player simply because he saw himself playing it in his vision. But I don't think it's impossible that Tony did glimpse future events.
If time really does exist in a spatial sense – and if it's true that time is a construct of the human mind – then perhaps in some way future events may already be present, just as past events are still present.
Admittedly, this is very difficult to make sense of. But why should everything make sense to us? As I have suggested in a recent book, there must be some aspects of reality that are beyond our comprehension. After all, we're just animals, with a limited awareness of reality. And perhaps more than any other phenomenon, this is especially true of time.
Might as well face it, you're addicted to love.
- Many writers have commented on the addictive qualities of love. Science agrees.
- The reward system of the brain reacts similarly to both love and drugs.
- Someday, it might be possible to treat "love addiction."
Since people started writing, they've written about love. The oldest love poem known dates back to the 21st century BCE. For most of that time, writers also apparently have been of two (or more) minds about it, announcing that love can be painful, impossible to quit, or even addictive — while also mentioning how nice it is.
The idea of love as an addiction is one that is both familiar and unsettling. Surely it can't be the case that our mutual love with our partner — a thing that can produce euphoria, consumes a great deal of our time, and which we fear losing — can be compared to a drug habit? But indeed, many scientists have turned their attention to the idea of "love addiction" and how your brain on drugs might resemble your brain in love.
Love and other drugs
In a 2017 article published in the journal Philosophy, Psychiatry, & Psychology, a team of neuroethicists considered the idea that love is addictive and held it up to scientific scrutiny.
They point out that the leading model of addiction rests on the notion of a drug causing the brain to release an unnatural level of reward chemicals, such as dopamine, effectively hijacking the brain's reward system. This phenomenon isn't strictly limited to drugs, though they are more effective at this process than other things. Rats can get a similar rush from sugar as from cocaine, and they can have terrible withdrawal symptoms when the sugar crash kicks in.
On the structural level, there is a fair amount of overlap between the parts of the brain that handle love and pair-bonding and the parts that deal with addiction and reward processing. When inside an MRI machine and asked to think about the person they love romantically, the reward centers of people's brains light up like Broadway.
Love as an addiction
These facts lead the authors to consider two ideas, dubbed the "narrow" and "broad" views of love as an addiction.
The narrow view holds that addiction is the result of abnormal brain processes that simply don't exist in non-addicts. Under this paradigm, "food-seeking or love-seeking behaviors are not truly the result of addiction, no matter how addiction-like they may outwardly appear." It could be that abnormal processes cause the brain's reward system to misfire when exposed to love and to react to it excessively.
If this model is accurate, love addiction would be a rare thing — one study puts it around five to ten percent of the population — but could be considered a disorder similar to others and caused by faulty wiring in the brain. As with other addictions, this malfunction of the reward system could lead to an inability to fully live a typical life, difficulty having healthy relationships, and a number of other negative consequences.
The broad view looks at addiction differently, perhaps even radically.
It begins with the idea that addiction exists on a spectrum of motivations. All of our appetites, including those for food and water, exist on this spectrum and activate similar parts of the brain when satisfied. We can have appetites for anything that taps into our reward system, including food, gambling, sex, drugs, and love. For most people most of the time, our appetites are fairly temperate, if recurring. I might be slightly "addicted" to food — I do need some a few times per day — but that "addiction" doesn't have any negative effects on my health.
An appetite for cocaine, however, is rarely temperate and usually dangerous. Likewise, a person's appetite for love could reach addiction levels, and a person could be considered "hooked" on relationships (or on a particular person). This would put love addiction at the extreme end of the spectrum.
None of this is to say that the authors think that love is bad for you just because it can resemble an addiction. Love addiction is not the same as cocaine addiction at the neurological level: important differences, like how long it takes for the desire for another "hit" to occur, do exist. Rather, the authors see this as an opportunity to reconsider our approach to addiction in general and to think about how we can help the heartsick when they just can't seem to get over their last relationship.
Is "love addiction" a treatable disorder?
Hypothetically, a neurological basis for an addiction to love could point toward interventions that "correct" for it. If the narrow view of addiction is accurate, perhaps some people will be able to seek treatment for love addiction in the same way that others seek help to quit smoking. If the broad view of addiction is correct, the treatment of love addiction would be unlikely as it may be difficult to properly identify where the cutoff of acceptability on a spectrum should be.
Either way, since love is generally held in high regard by all cultures and doesn't quite seem to be in the same category as a bad cocaine habit in terms of social undesirability, the authors doubt we'll be treating anyone for "love addiction" anytime soon.
A brief passage from a recent UN report describes what could be the first known case of an autonomous weapon, powered by artificial intelligence, killing on the battlefield.