A philosopher's guide to detecting nonsense and getting around it.
- A professor in Sweden has a bold idea of what BS, pseudoscience, and pseudophilosophy actually are.
- He suggests they are defined by a lack of "epistemic conscientiousness" rather than merely being false.
- He offers suggestions on how to avoid producing nonsense and how to identify it on sight.
The Unified Theory of B.S.<p>The essay "<a href="https://onlinelibrary.wiley.com/doi/full/10.1111/theo.12271?af=R" target="_blank">Bullshit, Pseudoscience and Pseudophilosophy</a>" considers much of the nonsense we encounter and offers a definition that allows us to move forward in dealing with it.</p><p>Dr. Moberger argues that what makes something bullshit is a "lack of epistemic conscientiousness," meaning that the person arguing for it takes no care to ensure the truth of their statements. This typically manifests in systematic errors in reasoning and the frequent use of <a href="https://bigthink.com/scotty-hendricks/ten-logical-mistakes-you-make-everyday-and-what-to-instead" target="_blank">logical fallacies</a> such as <a href="https://yourlogicalfallacyis.com/ad-hominem" target="_blank">ad hominem</a>, red herring, <a href="https://yourlogicalfallacyis.com/black-or-white" target="_blank">false dilemma</a>, and <a href="https://yourlogicalfallacyis.com/the-texas-sharpshooter" target="_blank">cherry picking</a>, among <a href="https://yourlogicalfallacyis.com/" target="_blank">others</a>.</p><p>This makes bullshit different from lying, which involves caring what the truth is and purposely moving away from it, and from mere indifference to truth, as it is quite possible for people pushing nonsense to care about their nonsense being true. It is also different from making the occasional reasoning mistake, since occasional errors differ from a systematic reliance on them.</p><p>Importantly, nonsense is defined by the epistemic unconscientiousness of the person pushing it rather than by its content alone.
This means some of it may end up being true (consider cases where a person's personality does match their star sign), but it ends up being true for reasons unrelated to the bad reasoning used by its <a href="https://yourlogicalfallacyis.com/the-fallacy-fallacy" target="_blank">advocates</a>.</p><p>Lots of things can justly be deemed "bullshit" under this understanding, such as <a href="https://medium.com/the-philosophers-stone/dismantling-astrology-and-pseudoscience-arguments-628411bc26af" target="_blank">astrology</a>, <a href="https://www.theguardian.com/commentisfree/2015/mar/12/no-scientific-case-homeopathy-remedies-pharmacists-placebos" target="_blank">homeopathy</a>, climate change denialism, <a href="https://www.popsci.com/10-ways-you-can-prove-earth-is-round/" target="_blank">flat-Earthism</a>, <a href="https://www.scientificamerican.com/article/15-answers-to-creationist/" target="_blank">creationism</a>, and the anti-vaccine movement.</p>
Two subcategories: pseudoscience and pseudophilosophy<iframe width="730" height="430" src="https://www.youtube.com/embed/PCdcluiAOKU" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><p>Two commonly encountered kinds of bullshit are pseudoscience and pseudophilosophy. They can be defined simply as "bullshit with scientific pretensions" and "bullshit with philosophical pretensions." A few examples will clarify exactly what these terms mean.</p><p>A form of pseudoscience is flat-Earthism. While it takes on scientific pretensions and can be, <a href="https://www.space.com/38931-kids-can-prove-earth-round.html" target="_blank">and has been,</a> proven false, supporters of the idea that the Earth is flat are well known for handwaving away any evidence that falsifies their stance and dismissing good arguments against their worldview.</p><p>An amusing and illustrative example is the case of the flat-Earthers who devised two experiments to determine whether the Earth was flat or spherical.
When their experiments produced results exactly consistent with the Earth being <a href="https://www.newsweek.com/behind-curve-netflix-ending-light-experiment-mark-sargent-documentary-movie-1343362" target="_blank">spherical</a>, they refused to accept the results and concluded that something had gone wrong, despite having no reason to <a href="https://www.newscientist.com/article/mg24132210-900-feedback-flat-earthers-accidentally-prove-themselves-wrong/" target="_blank">do so</a>. Clearly, these fellows lack epistemic conscientiousness.</p><p>Pseudophilosophy is less frequently considered, but it can be explained with examples of its two most popular forms.</p><p>The first is dubbed "obscurantist pseudophilosophy." It often takes the form of nonsense posing as philosophy, using copious amounts of jargon and arcane, frequently erroneous reasoning to connect a mundane truth to an exciting, fantastic falsehood.</p><p>As an example, there are more than a few cases where people have argued that physical reality is a social construct. This idea is based on the perhaps trivial notion that our beliefs about reality are social <a href="https://philpapers.org/archive/shatvo-2.pdf" target="_blank">constructs</a>. Often in cases like this, when challenged on the former claim, advocates of the more fantastic point will retreat to the latter, as it is less controversial, and claim the issue was one of linguistic confusion caused by their <a href="https://yourlogicalfallacyis.com/ambiguity" target="_blank">obscure terminology</a>. When the coast is clear, they frequently return to the original stance.</p><p>Dr. Moberger suggests that the humanities and social sciences seem to have a weakness for these seemingly profound pseudophilosophies without being nonsensical fields themselves.</p><p>The second is "scientistic pseudophilosophy" and is often seen in popular science writing.
It frequently manifests when questions considered in scientific writing are topics of philosophy rather than science. Because science writers are often not trained in philosophy, they may produce pseudophilosophy when trying to engage with these questions.</p><p>A famous example is Sam Harris' attempt at reducing the problems of moral philosophy to scientific problems. His book "The Moral Landscape" is infamously littered with <a href="https://yourlogicalfallacyis.com/strawman" target="_blank">strawman arguments</a>, a failure to engage with the relevant philosophical literature, and <a href="https://www.prospectmagazine.co.uk/magazine/blackburn-ethics-without-god-secularism-religion-sam-harris" target="_blank">bad philosophy in general</a>.</p><p>In all of these cases, we see that the supporters of some kind of nonsense think that what they are supporting is true, but they are willing to ignore the basic rules of science and philosophical reasoning in order to defend it.</p>
Okay, so there is plenty of nonsense in the world. What do we do about it?<iframe width="730" height="430" src="https://www.youtube.com/embed/omTJxZJgSKk" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><p>While the first step to dealing with this nonsense is to understand what it is, many people would like to go a little farther than that.</p><p>Dr. Moberger explained that sometimes the best thing we can do is show a little humility:</p><p style="margin-left: 20px;"><em>"One of the main points of the essay is that there is no sharp boundary between bullshit and non-bullshit. Pseudoscience, pseudophilosophy and other kinds of bullshit are very much continuous with the kind of epistemic irresponsibility or unconscientiousness that we all display in our daily lives. We all have biases and we all dislike cognitive dissonance, and so without realizing it we cherry-pick evidence and use various kinds of fallacious reasoning. This tendency is especially strong when it comes to emotionally sensitive areas, such as politics, where we may have built part of our sense of identity and worth around a particular stance. Well-educated, smart people are no exception. In fact, they are sometimes worse, since they are more adept at using sophistry to rationalize their biases. Thus, the first thing to realize, I think, is that all of us are prone to produce bullshit and that it is much easier to spot other people's bullshit than our own. Intellectual humility is first and foremost.
To me it does not come naturally and I struggle with it all the time."</em></p><p>He also advises that people take the time to develop their critical thinking skills:</p><p style="margin-left: 20px;"><em>"I think it is also very helpful to develop the kind of critical thinking skills that are taught to undergraduates in philosophy. The best book I know of in the genre is Richard Feldman's '<a href="http://www.susanpetrilli.com/files/II.-Reason-and-Argument,-R.-Feldman.pdf" target="_blank">Reason and Argument</a>.' It provides the basic conceptual tools necessary for thinking clearly about philosophical issues, but those tools are certainly useful outside of philosophy too."</em></p><p>Lastly, he reminds us that looking at the facts of the matter can clear things up:</p><p style="margin-left: 20px;"><em>"Finally, no degree of intellectual humility or critical thinking skills is a substitute for gathering relevant information about the issue at hand. And this is where empirical science comes in. If we want to think rationally about any broadly speaking empirical issue, we need to inform ourselves about what empirical science has to say about it. We also need to remember that individual scientists are often unreliable and that scientific consensus is what we should look for. (Indeed, it is a common theme in pseudoscience to appeal to individual scientists whose views do not reflect scientific consensus.)"</em></p><p>A great deal of the pseudoscience and pseudophilosophy we encounter is characterized not by being false or even unfalsifiable, but by the pusher's lack of concern for ensuring that what they say is true. Oftentimes, it is presented with familiar logical fallacies and bold claims of rejecting the scientific consensus.</p><p>While having this definition doesn't remove bullshit from the world, it might help you avoid stepping in it. In the end, isn't that what matters?</p>
In new experiments, researchers at Cornell found that people will overlook dishonesty if it benefits them and the group they identify with.
- New studies suggest that in competitive settings, group loyalty leads members to display more dishonesty.
- Research at Cornell found a fundamental link between dishonesty and loyalty when it comes to groupthink.
- Dishonesty in politics, an ever-present and timeless phenomenon, is most likely due to this dynamic.
Merits of the dishonesty study<img class="rm-lazyloadable-image rm-shortcode" type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xODcxMDA1Ni9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzMDAyOTYxMX0.iUKYLdiXZfjMp6SH1rbijZUN0sIMR5HkrrELUZYVsKk/img.jpg?width=980" id="8bdb8" width="1024" height="683" data-rm-shortcode-id="75cd1b8dad246118f4ab406bbe731cd8" data-rm-shortcode-name="rebelmouse-image" />
Getty Images<p>Angus Hildreth, a management professor at Cornell, set up an experiment to explore the <a href="http://psycnet.apa.org/record/2016-00795-003" target="_blank">tumultuous relationship between truthfulness, or the lack thereof, and loyalty</a>. Hildreth and his team selected groups of random students, fraternity brothers, and other volunteers, then asked them to solve a number of puzzles and word games.</p><p>The rules of the game were simple: if the team performed well on these tasks, the whole team would make more money.</p><p>The subjects were able to self-report their results and lie about puzzles they didn't complete. They didn't know, however, that the researchers could tell when they were lying: some failed or incomplete worksheets were retrieved from the trash, and some of the puzzles were intentionally impossible.</p><p>Throughout the study, subjects often felt emboldened and even righteous about lying when it benefited themselves and their group.</p><p>When subjects pledged loyalty to a group in order to face off against other teams, more than 60 percent of them lied. Those who pledged loyalty but weren't spurred by competition against other groups lied far less, at 15 to 20 percent.</p>
Political takeaways from the study<p>The researchers saw loyalty as a cause of much political corruption. They stated:</p><blockquote>Loyalty often drives corruption. Corporate scandals, political machinations, and sports cheating highlight how loyalty's pernicious nature manifests in collusion, conspiracy, cronyism, nepotism, and other forms of cheating.</blockquote><p>But at the same time, loyalty is a fundamental ethical principle that drives a great deal of our behavior. Even so, the results showed that it is an implicit factor in lying.</p><blockquote>Across nine studies, we found that individuals primed with loyalty cheated less than those not primed (Study 1A and 1B). Members more loyal to their fraternities (Study 2A) and students more loyal to their study groups (Study 2B) also cheated less than their less loyal counterparts due to greater ethical salience when they pledged their loyalty (Studies 3A and 3B). Importantly, competition moderated these effects: when competition was high, members more loyal to their fraternities (Study 4) or individuals primed with loyalty (Studies 5A and 5B) cheated more.</blockquote><p>Competition, which is the name of the game in the political realm, will always breed lying and discontent between factions.</p>
The truth is a messy business, but an information revolution is coming. Danny Hillis and Peter Hopkins discuss knowledge, fake news and disruption at NeueHouse in Manhattan.
- In 2005, Danny Hillis co-founded Freebase, an open-source knowledge database that was acquired by Google in 2010. Freebase formed the foundation of Google's famous Knowledge Graph, which enhances its search engine results and powers Google Assistant and Google Home.
- Hillis is now building The Underlay, a new knowledge database and future search engine app that is meant to serve the common good rather than private enterprise. He calls it his "penance for having sold the other one to Google."
- Powerful collections of machine-readable knowledge are becoming exceedingly important, but most are privatized and serve commercial goals.
- Decentralizing knowledge and making information provenance transparent will be a revolution in the so-called "post-truth age." The Underlay is being developed at MIT by Danny Hillis, SJ Klein, and Travis Rich.
Do we really believe everything we say? Are you always trying to establish the truth when you argue? This thought experiment will help answer these questions.
Most of us have views on politics, current events, religion, society, morality and sport, and we spend a lot of time expressing these views, whether in conversation or on social media. We argue for our positions, and get annoyed if they are challenged. Why do we do this? The obvious answer is that we believe the views we express (i.e., we think they are true), and we want to get others to believe them too, because they are true. We want the truth to prevail. That's how it seems. But do we really believe everything we say? Are you always trying to establish the truth when you argue, or might there be other motives at work?
Middle America is tired of those latte-sipping liberals and their "elite media" hanging out in New York City, but Ariel Levy makes the case that Americans aren't as different from one another as they'd like to think.
Middle America is tired of those latte-sipping liberals and their "elite media" hanging out in New York City, but author and New Yorker staff writer Ariel Levy makes the case that Americans aren't as different from one another as they'd like to think—and in fact they are all bound by one thing: truth. "No little falsehood is okay, ever, and we take that very seriously," says Levy, speaking of the allegiance to truth and extreme fact-checking that happens at The New Yorker. Journalists are human, and therein lie inevitable errors, but to claim that fake news is coming from the liberal media or that climate science is liberal propaganda is very much off base, she says. Here she delves into what the journalist's mandate is, and why there's no point making up facts: reality gets you in the end. Ariel Levy's memoir, The Rules Do Not Apply, is out now.