The philosophy of bullsh*t and how to avoid stepping in it

A philosopher's guide to detecting nonsense and getting around it.

A compass pointing to truth

Olivier Le Moal/Shutterstock
  • A professor in Sweden has a bold idea on what BS, pseudoscience, and pseudophilosophy actually are.
  • He suggests they are defined by a lack of "epistemic conscientiousness" rather than merely being false.
  • He offers suggestions on how to avoid producing nonsense and how to identify it on sight.

There is a lot of BS going around these days. Fake cures for disease are being passed off by unscrupulous hacks, the idea that the world is flat has a shocking amount of sincere support online, and plenty of people like to suggest there isn't a scientific consensus on climate change. It can be hard to keep track of it all.

Even worse, it can be difficult to easily define all of it in a way that lets people know what they're encountering is nonsense right away. Luckily for us, Dr. Victor Moberger recently published an essay in Theoria on what counts as bullsh*t, how it interacts with pseudoscience and pseudophilosophy, and what to do about it.

The Unified Theory of B.S.  

The essay "Bullshit, Pseudoscience and Pseudophilosophy" considers much of the nonsense we encounter and offers a definition that allows us to move forward in dealing with it.

Dr. Moberger argues that what makes something bullshit is a "lack of epistemic conscientiousness," meaning that the person arguing for it takes no care to ensure the truth of their statements. This typically manifests in systematic errors in reasoning and the frequent use of logical fallacies such as ad hominem, red herring, false dilemma, and cherry picking, among others.

This makes bullsh*t different from lying, which involves caring what the truth is and purposely moving away from it, and from mere indifference to truth, since it is quite possible for people pushing nonsense to care about their nonsense being true. It also differs from making the occasional mistake in reasoning: occasional errors are not the same as a systematic reliance on them.

Importantly, nonsense is defined by the epistemic unconscientiousness of the person pushing it rather than by its content alone. This means some of it may end up being true (consider cases where a person's personality does match up with their star sign), but it ends up being true for reasons unrelated to the bad reasoning used by its advocates.

Lots of things can justly be deemed "bullshit" under this understanding, such as astrology, homeopathy, climate change denialism, flat-Earthism, creationism, and the anti-vaccine movement.

Two subcategories: pseudoscience and pseudophilosophy  

Two commonly encountered kinds of bullsh*t are pseudoscience and pseudophilosophy. They can be defined simply as "bullshit with scientific pretensions" and "bullshit with philosophical pretensions." A few examples will clarify exactly what these definitions mean.

Flat-Earthism is a form of pseudoscience. While it takes on scientific pretensions and can be, and has been, proven false, supporters of the idea that the Earth is flat are well known for handwaving away any evidence that falsifies their stance and dismissing good arguments against their worldview.

An amusing and illustrative example is the case of the flat-Earthers who devised two experiments to determine whether the Earth was flat or spherical. When their experiments produced results exactly consistent with a spherical Earth, they refused to accept them and concluded that something had gone wrong, despite having no reason to do so. Clearly, these fellows lack epistemic conscientiousness.

Pseudophilosophy is less frequently considered, but can be explained with examples of its two most popular forms.

The first is dubbed "obscurantist pseudophilosophy." It often takes the form of nonsense posing as philosophy, using copious jargon and arcane, frequently erroneous reasoning to connect a mundane truth to an exciting, fantastic falsehood.

As an example, there are more than a few cases where people have argued that physical reality is a social construct. This idea is based on the perhaps trivial notion that our beliefs about reality are social constructs. Often, when challenged on the fantastic claim, its advocates will retreat to the trivial one, as it is less controversial, and claim the issue was one of linguistic confusion caused by their obscure terminology. When the coast is clear, they frequently return to the original stance.

Dr. Moberger suggests that the humanities and social sciences have a particular weakness for such seemingly profound pseudophilosophies, without being nonsensical fields themselves.

The second is "scientistic pseudophilosophy," often seen in popular science writing. It frequently appears when the questions being addressed are really topics of philosophy rather than science. Because science writers are often not trained in philosophy, they may produce pseudophilosophy when trying to engage with these questions.

A famous example is Sam Harris' attempt to reduce the problems of moral philosophy to scientific problems. His book "The Moral Landscape" is infamously littered with strawman arguments, a failure to engage with the relevant philosophical literature, and bad philosophy in general.

In all of these cases, we see that the supporters of some kind of nonsense think that what they are supporting is true, but they are willing to ignore the basic rules of scientific and philosophical reasoning in order to maintain that belief.

Okay, so there is plenty of nonsense in the world. What do we do about it?

While the first step to dealing with this nonsense is to understand what it is, many people would like to go a little further than that.

Dr. Moberger explained that sometimes, the best thing we can do is show a little humility:


"One of the main points of the essay is that there is no sharp boundary between bullshit and non-bullshit. Pseudoscience, pseudophilosophy and other kinds of bullshit are very much continuous with the kind of epistemic irresponsibility or unconscientiousness that we all display in our daily lives. We all have biases and we all dislike cognitive dissonance, and so without realizing it we cherry-pick evidence and use various kinds of fallacious reasoning. This tendency is especially strong when it comes to emotionally sensitive areas, such as politics, where we may have built part of our sense of identity and worth around a particular stance. Well-educated, smart people are no exception. In fact, they are sometimes worse, since they are more adept at using sophistry to rationalize their biases. Thus, the first thing to realize, I think, is that all of us are prone to produce bullshit and that it is much easier to spot other people's bullshit than our own. Intellectual humility is first and foremost. To me it does not come naturally and I struggle with it all the time."

He also advises that people take the time to develop their critical thinking skills:

"I think it is also very helpful to develop the kind of critical thinking skills that are taught to undergraduates in philosophy. The best book I know of in the genre is Richard Feldman's 'Reason and Argument.' It provides the basic conceptual tools necessary for thinking clearly about philosophical issues, but those tools are certainly useful outside of philosophy too."

Lastly, he reminds us that looking at the facts of the matter can clear things up:

"Finally, no degree of intellectual humility or critical thinking skills is a substitute for gathering relevant information about the issue at hand. And this is where empirical science comes in. If we want to think rationally about any broadly speaking empirical issue, we need to inform ourselves about what empirical science has to say about it. We also need to remember that individual scientists are often unreliable and that scientific consensus is what we should look for. (Indeed, it is a common theme in pseudoscience to appeal to individual scientists whose views do not reflect scientific consensus.)"

A great deal of the pseudoscience and pseudophilosophy we encounter is characterized not by being false or even unfalsifiable, but by a lack of concern, on the part of the person pushing it, for ensuring that what they say is true. Oftentimes, it is presented with fairly common logical fallacies and bold claims of rejecting the scientific consensus.

While having this definition doesn't remove bullshit from the world, it might help you avoid stepping in it. In the end, isn't that what matters?
