Cancel culture vs. toleration: The consequences of punishing dissent
When we limit the clash of ideas, we ultimately hinder progress for the entire society.
- Pluralism is the idea that different people, traditions, and beliefs not only can coexist in the same society but also should, because society benefits from the vibrant workshopping of ideas.
- Cancel culture is a threat to a liberal society because it seeks to shape the available information rather than seek truth.
- Practicing toleration does not mean merely putting up with ideas we dislike but acknowledging them with an open spirit, says Chandran Kukathas, professor at Singapore Management University.
"Cancel culture now poses a real threat to intellectual freedom in the United States," Jonathan Rauch, distinguished fellow at the Institute for Humane Studies, writes in Persuasion. Rauch cites a Cato Institute poll that found a third of Americans worry their careers will be harmed if they express their real political opinions. Canceling is different than healthy criticism, Rauch writes, because canceling "is about shaping the information battlefield, not seeking truth; and its intent—or at least its predictable outcome—is to coerce conformity[.]"
And conformity is a death knell for liberalism. In a homogenous society—one in which everyone has roughly the same background, religion, values, and goals—people will generally agree on what it means to be a good person and live a good life. But a key tenet of liberalism is pluralism: the idea that different people, traditions, and beliefs not only can coexist in the same society but also should, because society benefits from vibrant heterogeneity.
"Liberal thinking really arises out of a reflection on the fact that people disagree substantially about things," Chandran Kukathas, professor at Singapore Management University, says in a Big Think video on pluralism and toleration. "They have different ways of life."
Throughout history, men and women who've changed the world have been living examples of pluralism—people whose lives and minds were unique products of a diverse, interconnected world. Alexander Hamilton was, as the musical Hamilton says, "a bastard, orphan, son of a whore and a Scotsman, dropped in the middle of a forgotten spot in the Caribbean" before he came to the colonies. Marie Curie (née Skłodowska) was the daughter of two Polish teachers, one atheist and one Catholic, and attended an underground university in Warsaw before immigrating to Paris. Sergey Brin was born in the Soviet Union to Jewish parents before his family fled persecution and came to the United States, where Brin co-founded Google.
A pluralistic society, in which diverse people with unique life experiences develop and share ideas, nourishes innovation and progress. If people stayed in discrete, homogenous communities, how many world-changing lives and ideas would never have existed?
Critics might say: It's one thing to welcome people from diverse backgrounds into your society; it's another to welcome diverse ideas, even if some are offensive or harmful.
But our vibrant, evolving world depends on diverse ideas and cultures. In a homogenous society, ideas and customs can be stagnant for generations. But in a pluralistic society, ideas and customs evolve by being brought into constant contact with alternative ideas and customs. In On Liberty, John Stuart Mill writes:
…the peculiar evil of silencing the expression of an opinion is that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth: if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error.
For humanity to benefit from pluralism—to benefit from the exchange of cultures and the collision of ideas—we must practice toleration. We must respect the rights of our colleagues and neighbors to think and live differently than we do.
When someone practices toleration, Kukathas says, they don't just put up with something but actually acknowledge it "with a kind of open spirit." Intentional, meaningful tolerance includes making an effort to understand others' points of view. We don't have to agree, but we should seek to understand. And, ultimately, we have to tolerate ideas we disagree with if we want to live in a flourishing and peaceful society.
This is what cancel culture robs society of—the healthy and essential practice of toleration, without which pluralism and a peaceful society cannot be sustained.
Information may not seem like something physical, yet it has become a central concern for physicists. A wonderful new book explores the importance of the "dataome" for the physical, biological, and human worlds.
- The most important current topic in physics relates to a subject that hardly seems physical at all — information, which is central to thermodynamics and perhaps the universe itself.
- The "dataome" is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls.
- The dataome is vast and growing every day, sucking up an ever-increasing share of the energy humans produce.
Physics is a field that is supposed to study real stuff. By real, I mean things like matter and energy. Matter is, of course, the kind of stuff you can hold in your hand. Energy may seem a little more abstract, but its reality is pretty apparent, appearing in the form of motion or gravity or electromagnetic fields.
What has become apparent recently, however, is the importance to physics of something that seems somewhat less real: information. From black holes to quantum mechanics to understanding the physics of life, information has risen to become a principal concern of many physicists in many domains. This new centrality of information is why you really need to read astrophysicist Caleb Scharf's new book The Ascent of Information: Books, Bits, Machines, and Life's Unending Algorithms.
Scharf is currently the director of the Astrobiology Program at Columbia University. He is also the author of four other books as well as a regular contributor to Scientific American.
(Full disclosure: Scharf and I have been collaborators on a scientific project involving the Fermi Paradox, so I was a big fan before I read this new book. Of course, the reason why I collaborated with him is because I really like the way he thinks, and his creativity in tackling tough problems is on full display in The Ascent of Information.)
What is the dataome?
In his new book, Scharf is seeking a deeper understanding of what he calls the "dataome." This is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls. The book opens with a compelling exploration of how Shakespeare's works, which began as scribbles on a page, have gone on to have lives of their own in the dataome. Through reprintings in different languages, recordings of performances, movie adaptations, comic books, and so on, Shakespeare's works are now a permanent part of the vast swirling ensemble of information that constitutes the human dataome.
But the dataome does not just live in our heads. Scharf takes us on a proper physicist's journey through the dataome, showing us how information can never be divorced from energy. Your brain needs the chemical energy from food you ate this morning to read, process, and interpret these words. One of the most engaging parts of the book is when Scharf details just how much energy and real physical space our data-hungry world consumes as it adds to the dataome. For example, the Hohhot Data Center in the Inner Mongolia Autonomous Region of China is made of vast "farms" of data processing servers covering 245 acres of real estate. A single application like Bitcoin, Scharf tells us, draws about 7.7 gigawatts of power, equivalent to the output of half a dozen nuclear reactors!
Information is everywhere
But the dataome is not just about energy. Entropy is central to the story as well. Scharf takes the reader through a beautifully crafted discussion of information and the science of thermodynamics. This is where the links between energy, entropy, the limits of useful work, and probability all become profoundly connected to the definition of information.
The second law of thermodynamics tells us that you cannot use all of a given amount of energy to do useful work. Some of that energy must be wasted by getting turned into heat. Entropy is the physicist's way of measuring that waste (which can also be thought of as disorder). Scharf takes the reader through the basic relations of thermodynamics and then shows how entropy became intimately linked with information. It was Claude Shannon's brilliant work in the 1940s that showed how information — bits — could be defined for communication and computation as an entropy associated with the redundancy of strings of symbols. That was the link tying the physical world of physics explicitly to the informational and computational world of the dataome.
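Shannon's measure is concrete enough to compute in a few lines. Here is a minimal sketch (the function name and example strings are illustrative, not from the book) of entropy in bits per symbol for a string, where redundant strings score low and varied strings score high:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i)),
    where p_i is the frequency of each distinct symbol."""
    n = len(text)
    probs = [count / n for count in Counter(text).values()]
    return -sum(p * log2(p) for p in probs)

# The more redundant the string, the less information each symbol carries;
# a fully repetitive string like "aaaa" scores 0 bits per symbol.
print(shannon_entropy("abababab"))  # 1.0 (two symbols, even odds)
print(shannon_entropy("abcdefgh"))  # 3.0 (eight distinct symbols)
```

This is the sense in which a bit is "an entropy": it quantifies how unpredictable, and therefore how informative, each symbol in a message is.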
The best parts of the book are where Scharf unpacks how information makes its appearance in biology. From the data storage and processing that occurs with every strand of DNA, to the tangled pathways that define evolutionary dynamics, Scharf demonstrates how life is what happens to physics and chemistry when information matters. I found gems in these parts of the book that forced me to put the volume down and stare into space for a time to deal with their impact.
The physics of information
There are a lot of popular physics books out there about black holes and exoplanets and other cool stuff. But right now, I feel like the most important topic in physics relates to a subject that hardly seems physical at all. Information is a relatively new addition to the physics bestiary, making it even more compelling. If you are looking for a good introduction to how that is so, The Ascent of Information is a good place to start.
A new study tested to what extent dogs can sense human deception.
Is humanity's best friend catching on to our shenanigans? Researchers at the University of Vienna discovered that dogs can in certain cases know when people are lying.
The scientists carried out a study with hundreds of dogs to determine to what extent dogs could spot deception. The team's new paper, published in Proceedings of the Royal Society B, outlined experiments that tested whether dogs, like humans, have some inner sense of how to assess truthfulness.
As the researchers wrote in their paper, "Among non-primates, dogs (Canis familiaris) constitute a particularly interesting case, as their social environment has been shared with humans for at least 14,000 years. For this reason, dogs have been considered as a model species for the comparative investigation of socio-cognitive abilities." The investigation focused specifically on understanding if dogs were "sensitive to some mental or psychological states of humans."
The experiments involved 260 dogs, which were made to listen to advice from a human "communicator" whom they did not know. The human told them which one of two bowls had a treat hidden inside by touching it and saying, "Look, this is very good!" If the dogs took the person's advice, they would get the treat.
Once they had established the dogs' trust, the researchers complicated the experiment by letting the dogs watch a second unfamiliar human move the treat from one bowl to the other. In some trials the original communicator was present to watch the switch; in others she was not.
The findings revealed that half of the dogs did not follow the advice of the communicator if that person was not present when the food was switched to a different bowl. The dogs had a sense that this human could not have known the true location of the treat. Furthermore, two-thirds of the dogs ignored the human's suggestion if she did see the food switch but pointed to the wrong bowl. The dogs figured out the human was lying to them.
Photos of experiments showing the dog, human communicator, and person hiding the treat. Credit: Lucrezia Lonardo et al / Proceedings of the Royal Society B.
"We thought dogs would behave like children under age five and apes, but now we speculate that perhaps dogs can understand when someone is being deceitful," co-author Ludwig Huber from the University of Vienna told New Scientist. "Maybe they think, 'This person has the same knowledge as me, and is nevertheless giving me the wrong [information].' It's possible they could see that as intentionally misleading, which is lying."
This is not the first time such experiments have been carried out. Previously, children under age five, macaques, and chimps were tested in a similar way. It turned out that children and other animals were more likely than dogs to listen to the advice of the liars. Notably, among the dogs, terriers were found to be more like children and apes, more eagerly following false suggestions.
When we rely on the conscious mind alone, we lose; but when we listen to the body, we gain a winning edge.
- Our surroundings contain far more information than our conscious minds can process.
- Our non-conscious minds are constantly gathering information and identifying patterns.
- By being interoceptively attuned — that is, aware of the inner state of the body — we can tap into what our non-conscious mind is trying to tell us.
The following is an adapted excerpt from the book The Extended Mind. It is reprinted with permission of the author.
If you'd like to make smarter choices and sounder decisions — and who doesn't? — you might want to take advantage of a resource you already have close at hand: your interoception. Interoception is, simply stated, an awareness of the inner state of the body. Just as we have sensors that take in information from the outside world (retinas, cochleas, taste buds, olfactory bulbs), we have sensors inside our bodies that send our brains a constant flow of data from within. These sensations are generated in places all over the body — in our internal organs, in our muscles, even in our bones — and then travel via multiple pathways to a structure in the brain called the insula. Such internal reports are merged with several other streams of information — our active thoughts and memories, sensory inputs gathered from the external world — and integrated into a single snapshot of our present condition, a sense of "how I feel" in the moment, as well as a sense of the actions we must take to maintain a state of internal balance.
To understand the role interoception can play in smart decision-making, it's important to know that the world is full of far more information than our conscious minds can process. However, we are also able to collect and store the volumes of information we encounter on a non-conscious basis. As we proceed through each day, we are continuously apprehending and storing regularities in our experience, tagging them for future reference. Through this information-gathering and pattern-identifying process, we come to know things — but we're typically not able to articulate the content of such knowledge or to ascertain just how we came to know it. This trove of data remains mostly under the surface of consciousness, and that's usually a good thing. Its submerged status preserves our limited stores of attention and working memory for other uses.
A study led by cognitive scientist Pawel Lewicki demonstrates this process in microcosm. Participants in Lewicki's experiment were directed to watch a computer screen on which a cross-shaped target would appear, then disappear, then reappear in a new location; periodically they were asked to predict where the target would show up next. Over the course of several hours of exposure to the target's movements, the participants' predictions grew more and more accurate. They had figured out the pattern behind the target's peregrinations. But they could not put this knowledge into words, even when the experimenters offered them money to do so. The subjects were not able to describe "anything even close to the real nature" of the pattern, Lewicki observes. The movements of the target operated according to a pattern too complex for the conscious mind to accommodate — but the capacious realm that lies below consciousness was more than roomy enough to contain it.
"Nonconscious information acquisition," as Lewicki calls it, along with the ensuing application of such information, is happening in our lives all the time. As we navigate a new situation, we're scrolling through our mental archive of stored patterns from the past, checking for ones that apply to our current circumstances. We're not aware that these searches are under way; as Lewicki observes, "The human cognitive system is not equipped to handle such tasks on the consciously controlled level." He adds, "Our conscious thinking needs to rely on notes and flowcharts and lists of 'if-then' statements — or on computers — to do the same job which our non-consciously operating processing algorithms can do without external help, and instantly."
But — if our knowledge of these patterns is not conscious, how then can we make use of it? The answer is that, when a potentially relevant pattern is detected, it's our interoceptive faculty that tips us off: with a shiver or a sigh, a quickening of the breath or a tensing of the muscles. The body is rung like a bell to alert us to this useful and otherwise inaccessible information. Though we typically think of the brain as telling the body what to do, just as much does the body guide the brain with an array of subtle nudges and prods. (One psychologist has called this guide our "somatic rudder.") Researchers have even captured the body in mid-nudge, as it alerts its inhabitant to the appearance of a pattern that she may not have known she was looking for.
Such interoceptive prodding was visible during a gambling game that formed the basis of an experiment led by neuroscientist Antonio Damasio, a professor at the University of Southern California. In the game, presented on a computer screen, players were given a starting purse of two thousand "dollars" and were shown four decks of digital cards. Their task, they were told, was to turn the cards in the decks face-up, choosing which decks to draw from such that they would lose the least amount of money and win the most. As they started clicking to turn over cards, players began encountering rewards — bonuses of $50 here, $100 there — and also penalties, in which small or large amounts of money were taken away. What the experimenters had arranged, but the players were not told, was that decks A and B were "bad" — they held lots of large penalties in store — and decks C and D were "good," bestowing more rewards than penalties over time.
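The structure of the rigged decks is easy to see in a short simulation. The payoff numbers below are assumptions chosen to match the described pattern (bad decks pay bigger rewards but carry larger average penalties; good decks the reverse), not figures from Damasio's study:

```python
import random

# Hypothetical payoff schedule: (reward per card, penalty size, penalty probability).
# Decks A and B lure players with $100 rewards but lose money on average;
# decks C and D pay only $50 but come out ahead over time.
DECKS = {
    "A": (100, 250, 0.5),
    "B": (100, 1250, 0.1),
    "C": (50, 50, 0.5),
    "D": (50, 250, 0.1),
}

def draw(deck: str) -> int:
    """Turn one card: collect the reward, sometimes pay a penalty."""
    reward, penalty, p = DECKS[deck]
    return reward - (penalty if random.random() < p else 0)

def expected_value(deck: str) -> float:
    """Average net gain per card from a deck."""
    reward, penalty, p = DECKS[deck]
    return reward - p * penalty

for name in DECKS:
    print(name, expected_value(name))
# With these numbers, A and B average -$25 per card while C and D average
# +$25, so a player who drifts toward C and D steadily comes out ahead.
```

The point of the design is that no single card reveals which decks are bad; only the accumulated pattern of penalties does, which is exactly the kind of statistical regularity the body registers before the conscious mind catches up.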
As they played the game, the participants' state of physiological arousal was monitored via electrodes attached to their fingers; these electrodes kept track of their level of "skin conductance." When our nervous systems are stimulated by an awareness of potential threat, we start to perspire in a barely perceptible way. This slight sheen of sweat momentarily turns our skin into a better conductor of electricity. Researchers can thus use skin conductance as a measure of nervous system arousal. Looking over the data collected by the skin sensors, Damasio and his colleagues noticed something interesting: after the participants had been playing for a short while, their skin conductance began to spike when they contemplated clicking on the bad decks of cards. Even more striking, the players started avoiding the bad decks, gravitating increasingly to the good decks. As in the Lewicki study, subjects got better at the task over time, losing less and winning more.
Yet interviews with the participants showed that they had no awareness of why they had begun choosing some decks over others until late in the game, long after their skin conductance had started flaring. By card 10 (about forty-five seconds into the game), measures of skin conductance showed that their bodies were wise to the way the game was rigged. But even ten turns later — on card 20 — "all indicated that they did not have a clue about what was going on," the researchers noted. It took until card 50 was turned, and several minutes had elapsed, for all the participants to express a conscious hunch that decks A and B were riskier. Their bodies figured it out long before their brains did. Subsequent studies supplied an additional, and crucial, finding: players who were more interoceptively aware were more apt to make smart choices within the game. For them, the body's wise counsel came through loud and clear.
Damasio's fast-paced game shows us something important. The body not only grants us access to information that is more complex than what our conscious minds can accommodate. It also marshals this information at a pace that is far quicker than our conscious minds can handle. The benefits of the body's intervention extend well beyond winning a card game; the real world, after all, is full of dynamic and uncertain situations, in which there is no time to ponder all the pros and cons. When we rely on the conscious mind alone, we lose — but when we listen to the body, we gain a winning edge.
Annie Murphy Paul is a science writer who covers research on learning and cognition. She is the author of The Extended Mind: The Power of Thinking Outside the Brain, from which this article is adapted.