The foundations of mathematics are unproven

Philosopher and logician Kurt Gödel upended our understanding of mathematics and truth.

Credit: Peter Macdiarmid via Getty Images
• In 1900, mathematician David Hilbert laid down 23 problems for the mathematics world to solve, the biggest of which concerned how to prove the consistency of mathematics itself.
• Far from solving the issue, Kurt Gödel showed just how groundless the axioms of mathematics are.
• Gödel's theorem does not devalue mathematics but reveals that some truths are unprovable.

Everything's a bit crazy at the moment. We're drowning in a sea of lies, half-truths, polarization, debate, argument, and uncertainty. But at least there's math, right? That one sanctuary of truth and certainty. It's the algebraic flotsam we can grip on to, before we're swept away.

Well… look away now if you like it like that, because Kurt Gödel might be about to snatch even that away. His incompleteness theorems shook the foundations of the (math) universe. In fact, he rather did away with those foundations altogether.

Math problems

In the early 20th century, the famous mathematician David Hilbert laid down 23 problems for the mathematics world to solve. Some of them are particularly esoteric, but the big ones concerned the issues of math's consistency and completeness. Hilbert hated the fact that the whole of mathematics depended on certain "axioms" that were, themselves, not proven. He wanted no loose ends, paradoxes, or unproven items. This was math after all!

Gödel, though, rather quashed all that.

Gödel would have hated what the postmodernists made of his work.

To see how, we have to first know that "axioms" are those statements we accept as true before we go about doing math. They're like the letters needed to make words. For example, A + B = B + A is an axiom, as are the basic rules of arithmetic, and so on. Simply put, axioms are the building blocks of mathematics. They're as true for Euclid, drawing squares in ancient Greek dust, as they are for a pained 15-year-old, frowning over some calculus.

The problem is that these axioms are not proven. They're true because they always work, and we observe them as true all of the time. But they're not proven.

Gödel's challenge

Imagine the whole of mathematics as a huge sack, and inside are all the possible things math can do. It's a mighty big sack, indeed. What Gödel proved is that, first, there exist in this sack statements that can neither be proven nor disproven, the axioms among them. Second, there is no possible way to prove these axioms from within that sack. It's impossible for math, on its own, to prove its own axioms.

Essentially, it's a problem of self-reference. It's an issue seen, too, in Russell's paradox about sets. More famously, the liar paradox imagines a sentence like, "This sentence is false." When you examine it closely, it creates a logical circularity. If the sentence is true, then it's false; but then if it's false, it's true. It's enough to make a robot's brain explode.

Credit: ROBYN BECK via Getty Images

Gödel applied a similar logic to the whole system of mathematics. He took the sentence, "This statement is unprovable," and converted it into a statement about numbers (using a coding scheme known as "Gödel numbering"). He discovered that this proposition cannot be proven within that system.
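The coding trick can be sketched in a few lines of Python. This is only a toy illustration, not Gödel's actual symbol assignment: each symbol gets a made-up code number, and a formula becomes the product of successive primes raised to those codes.

```python
# A toy sketch of Godel numbering. The symbol codes below are
# invented for illustration, not Godel's actual assignment.
SYMBOLS = {"0": 1, "S": 2, "=": 3, "(": 4, ")": 5, "+": 6}

def primes(n):
    """Return the first n primes, by trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(formula):
    """Encode a formula as one integer: the i-th prime raised to
    the code of the i-th symbol, all multiplied together."""
    n = 1
    for prime, symbol in zip(primes(len(formula)), formula):
        n *= prime ** SYMBOLS[symbol]
    return n

# "0=0" becomes 2^1 * 3^3 * 5^1 = 270
print(godel_number("0=0"))
```

Because prime factorizations are unique, the original formula can always be recovered from its number by factoring, which is what lets statements *about formulas* become statements *about numbers*.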

Going even further than this, Gödel concluded that in every system that's rich enough to allow for arithmetic, there will be a proposition within it that cannot be proven by its own tools. We need some kind of "meta language" to prove the rules by which a system operates. It's a bit like how we can't see our own eyes or draw around the hand that's holding the pencil.

How postmodernists weaponized Gödel

Gödel has been misrepresented, even in his lifetime. For instance, certain postmodernist philosophers used him to say, "There is no truth! Even math is groundless!" They wanted to show how everything was meaningless, and truth amounted only to opinion.

But this isn't the point. Gödel only showed that truth does not always need to be proven. This is, of course, no small thing. To pull apart truth and provability, to allow for "unproven truths," seems highly counterintuitive.

Gödel, himself, thought there were objective truths. His theorems only went to show the limitations of mathematics, not that it was flawed in any way. He would have hated what the postmodernists made of his work.

Jonny Thomson teaches philosophy in Oxford. He runs a popular Instagram account called Mini Philosophy (@philosophyminis). His first book is Mini Philosophy: A Small Book of Big Ideas.

The hardest question ever asked: What is truth?

Scientists believe they have the answer, but philosophers prove them wrong.

Credit: Ecce Homo, 1871 by Antonio Ciseri via Public Domain / Wikipedia
• Science is based on the correspondence theory of truth, which claims that truth corresponds with facts and reality.
• Various philosophers have put forth substantive challenges to the truth claims made by science.
• While science is the best tool to understand the material universe, it has nothing substantive to say about the things that matter most, like love, beauty, and purpose.

In the Gospel According to John, the author retells a conversation between Jesus of Nazareth, who is on trial, and Pontius Pilate, the governor of the Roman province of Judaea. Toward the end of the interrogation, Jesus tells Pilate, "Everyone on the side of truth listens to me." Infamously, Pilate responds, "What is truth?"

Pilate's tone is not clear. Was he asking a genuine question out of curiosity? Was he being sarcastic? Or was he asking the question in desperation, following a lifelong, exhausting search for the truth? We don't know. What we do know is that he didn't stick around for an answer.

So, what is truth?

Philosophers have struggled with this question since the dawn of time, perhaps because it's the hardest question ever asked. Epistemology is the subdiscipline of philosophy that grapples with it, along with the nature of knowledge itself. The question, "What do we know and how do we know it?" occupies the mind of the epistemologist.

The prevailing theory of truth, at least among the public and certainly among scientists, is the correspondence theory, which states that truth corresponds with facts and reality. It's a good theory, especially since it's practical and governs our day-to-day interactions. If I'm holding a tart, reddish-yellow, spherical fruit, I'm holding a Cosmic Crisp apple. There is no alternative theory of truth that could convince me that it's a limousine. Likewise, business contracts, the judicial system, and society as a whole are built around the idea that truth corresponds to reality.

Science still can't answer the biggest — and arguably, most important — questions in life.

Many scientists would take this a step further and argue that the scientific method is the foremost system for determining facts. Therefore, science is the best tool to determine reality and truth. But this is where things start to get tricky.

Philosophers vs. Scientists

At least two philosophers have presented substantial challenges to the epistemic privilege of science. In An Enquiry Concerning Human Understanding (1748), David Hume argues that inductive logic is unjustified. Inductive logic is the process of making observations and then drawing larger conclusions from limited data. When astrophysicists make a claim like, "All stars are flaming balls of hydrogen and helium," that grand, sweeping claim is based on observing lots and lots of stars and observing the same thing over and over again. But they haven't observed all the stars in the universe. Further, there is no guarantee that future stars will resemble past stars. So how can they really know for sure?

That might sound like a childish objection, but consider this: At one time, Europeans believed that all swans are white. After all, everywhere they looked, they saw white swans. The swans on the river, the swans on the lake — all white. But then, one intrepid European (Willem de Vlamingh) went to Australia in 1697 and saw black swans. In this instance, inductive logic failed. This is the basis of Hume's argument that inductive logic is unjustified.

Immanuel Kant. Credit: Public domain via Wikipedia

In his Critique of Pure Reason (1781), Immanuel Kant provides another challenge: It is impossible for humans to distinguish between reality (what he called the "noumenon") and our perception of reality (what he called the "phenomenon"). The reason is because our experience of reality is filtered through our minds. When I look at a basketball and see that it is orange, how do I know that it really is orange? The photons that bounce off the ball and stimulate the cells in my retina trigger a series of electrochemical reactions in my nervous system that results in my brain interpreting the color as orange. But how do I know that my brain is right? What if basketballs are actually green, but our brains misinterpret the color as orange?

Though Karl Popper's theory of falsification is a really good counterargument, there are no dispositive responses to these challenges, which is why scientists generally respond with, "Buzz off, philosophers." Stephen Hawking claimed that philosophy is dead (apparently unaware that the scientific method is rooted in epistemology). To bolster their case, scientists boast that they put people on the moon and gave us really cool things like iPads, non-stick skillets, and Viagra. Sure, Hume and Kant made some clever remarks a long time ago, but science works.

Science can't answer the big questions

Fair enough. Science has demonstrated capably that it is the best way to understand the material universe. But science still can't answer the biggest — and arguably, most important — questions in life. It certainly can't answer the questions that we care most about. Consider the following:

• Is the economy doing well?
• Does your family really love you?
• Why is there hatred in the world?
• Is the Mona Lisa beautiful?
• What is the purpose of life?
• Who is the best football player ever?
• Are you having a good day?
• Does this dress make me look fat?

How does one scientifically answer any of these questions? Even the first question, which is the most (dismally) scientific of the bunch, does not have a clear-cut answer. How do we determine the health of the economy? Do we use GDP? The unemployment rate? The poverty rate? Median household income? The minimum wage? Stock market indices? Gross national happiness? There is no inherently correct metric to measure the economy's health.

Mandelbrot set, a type of fractal. Credit: Wolfgang Beyer via Wikipedia, licensed under CC BY-SA 3.0

The questions only get harder from there. Love, beauty, purpose — science has nothing substantive to say about any of these. Yet, they are the driving forces behind most human behavior. We have friends and families because we love others. We ponder art, listen to music, and read poetry because we appreciate beauty. We have jobs because we must fulfill our purpose (in addition to putting food on the table).

While science is largely silent on topics like love, beauty, and purpose, philosophy (as well as religion) has plenty to say. The most meaningful understanding of reality — and therefore, our best attempt to grasp the truth — will happen only when science and philosophy unite. May we all be students of both.

Does science tell the truth?

It is impossible for science to arrive at ultimate truths, but functional truths are good enough.

Credit: Sergey Nivens via Adobe Stock / 202871840
• What is truth? This is a very tricky question, trickier than many would like to admit.
• Science does arrive at what we can call functional truth, that is, when it focuses on what something does as opposed to what something is. We know how gravity operates, but not what gravity is, a notion that has changed over time and will probably change again.
• The conclusion is that there are no absolute, final truths, only functional truths that are agreed upon by consensus. The essential difference is that scientific truths are agreed upon by factual evidence, while most other truths are based on belief.

Does science tell the truth? The answer to this question is not as simple as it seems, and my 13.8 colleague Adam Frank took a look at it in his article about the complementarity of knowledge. There are many levels of complexity to what truth is or means to a person or a community. Why?

It's complicated

First, "truth" itself is hard to define or even to identify. How do you know for sure that someone is telling you the truth? Do you always tell the truth? In groups, what may be considered true to a culture with a given set of moral values may not be true in another. Examples are easy to come by: the death penalty, abortion rights, animal rights, environmentalism, the ethics of owning weapons, etc.

At the level of human relations, truth is very convoluted. Living in an age where fake news has taken center stage only corroborates this obvious fact. However, not knowing how to differentiate between what is true and what is not leads to fear, insecurity, and ultimately, to what could be called worldview servitude — the subservient adherence to a worldview proposed by someone in power. The results, as the history of the 20th century has shown extensively, can be catastrophic.

Proclamations of final or absolute truths, even in science, shouldn't be trusted.

The goal of science, at least on paper, is to arrive at the truth without recourse to any belief or moral system. Science aims to go beyond the human mess so as to be value-free. The premise here is that Nature doesn't have a moral dimension, and that the goal of science is to describe Nature the best possible way, to arrive at something we could call the "absolute truth." The approach is a typical heir to the Enlightenment notion that it is possible to take human complications out of the equation and have an absolute objective view of the world. However, this is a tall order.

It is tempting to believe that science is the best pathway to truth because, to a spectacular extent, science does triumph at many levels. You trust driving your car because the laws of mechanics and thermodynamics work. NASA scientists and engineers just managed to have the Ingenuity Mars Helicopter — the first man-made device to achieve powered flight on another planet — hover above the Martian surface all by itself.

We can use the laws of physics to describe the results of countless experiments to amazing levels of accuracy, from the magnetic properties of materials to the position of your car in traffic using GPS locators. In this restricted sense, science does tell the truth. It may not be the absolute truth about Nature, but it's certainly a kind of pragmatic, functional truth at which the scientific community arrives by consensus based on the shared testing of hypotheses and results.

What is truth?

Credit: Sergey Nivens via Adobe Stock / 242235342

But at a deeper level of scrutiny, the meaning of truth becomes intangible, and we must agree with the pre-Socratic philosopher Democritus, who declared, around 400 BCE, that "truth is in the depths." (Incidentally, Democritus predicted the existence of the atom, something that certainly exists in the depths.)

A look at a dictionary reinforces this view. "Truth: the quality of being true." Now, that's a very circular definition. How do we know what is true? A second definition: "Truth: a fact or belief that is accepted as true." Acceptance is key here. A belief may be accepted to be true, as is the case with religious faith. There is no need for evidence to justify a belief. But note that a fact as well can be accepted as true, even if belief and facts are very different things. This illustrates how the scientific community arrives at a consensus of what is true by acceptance: a statement is accepted as true when sufficient factual evidence supports it. (Note that what counts as sufficient factual evidence is also accepted by consensus.) At least until we learn more.

Take the example of gravity. We know that an object in free fall will hit the ground, and we can calculate when it does using Galileo's law of free fall (in the absence of friction). This is an example of "functional truth." If you drop one million rocks from the same height, the same law will apply every time, corroborating the factual acceptance of a functional truth, that all objects fall to the ground at the same rate irrespective of their mass (in the absence of friction).
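Galileo's law is simple enough to check in code. A minimal sketch, assuming constant gravitational acceleration near Earth's surface and no air resistance:

```python
import math

G = 9.81  # gravitational acceleration near Earth's surface, m/s^2

def fall_time(height_m):
    """Time for an object to fall from rest, ignoring friction:
    h = (1/2) * g * t^2  =>  t = sqrt(2h / g)."""
    return math.sqrt(2 * height_m / G)

# A 45 m drop takes about 3.03 seconds.
print(round(fall_time(45.0), 2))
```

Notice that the object's mass never appears in the formula, which is exactly the functional truth in question: a pebble and a boulder dropped from the same height (absent friction) land at the same time.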

But what if we ask, "What is gravity?" That's an ontological question about what gravity is and not what it does. And here things get trickier. To Galileo, it was an acceleration downward; to Newton, a force between two massive bodies, proportional to the product of their masses and inversely proportional to the square of the distance between them; to Einstein, the curvature of spacetime due to the presence of mass and/or energy. Does Einstein have the final word? Probably not.

Is there an ultimate scientific truth?

Final or absolute scientific truths assume that what we know of Nature can be final, that human knowledge can make absolute proclamations. But we know that this can't really work, for the very nature of scientific knowledge is that it is incomplete and contingent on the accuracy and depth with which we measure Nature with our instruments. The more accuracy and depth our measurements gain, the more they are able to expose the cracks in our current theories, as I illustrated last week with the muon magnetic moment experiments.

So, we must agree with Democritus, that truth is indeed in the depths and that proclamations of final or absolute truths, even in science, shouldn't be trusted. Fortunately, for all practical purposes — flying airplanes or spaceships, measuring the properties of a particle, the rates of chemical reactions, the efficacy of vaccines, or the blood flow in your brain — functional truths do well enough.

The philosophy of bullsh*t and how to avoid stepping in it

A philosopher's guide to detecting nonsense and getting around it.

Olivier Le Moal/Shutterstock
• A professor in Sweden has a bold idea on what BS, pseudoscience, and pseudophilosophy actually are.
• He suggests they are defined by a lack of "epistemic conscientiousness" rather than merely being false.
• He offers suggestions on how to avoid producing nonsense and how to identify it on sight.

There is a lot of BS going around these days. Fake cures for disease are being passed off by unscrupulous hacks, the idea that the world is flat has a shocking amount of sincere support online, and plenty of people like to suggest there isn't a scientific consensus on climate change. It can be hard to keep track of it all.

Even worse, it can be difficult to easily define all of it in a way that lets people know what they're encountering is nonsense right away. Luckily for us, Dr. Victor Moberger recently published an essay in Theoria on what counts as bullsh*t, how it interacts with pseudoscience and pseudophilosophy, and what to do about it.

The Unified Theory of B.S.

The essay "Bullshit, Pseudoscience and Pseudophilosophy" considers much of the nonsense we encounter and offers a definition that allows us to move forward in dealing with it.

Dr. Moberger argues that what makes something bullshit is a "lack of epistemic conscientiousness," meaning that the person arguing for it takes no care to ensure the truth of their statements. This typically manifests in systematic errors in reasoning and the frequent use of logical fallacies such as ad hominem, red herring, false dilemma, and cherry picking, among others.

This makes bullsh*t different from lying, which involves caring what the truth is and purposely moving away from it, and from mere indifference to truth, as it is quite possible for people pushing nonsense to care about their nonsense being true. It is also different from making the occasional mistake in reasoning: occasional errors differ from a systematic reliance on them.

Importantly, nonsense is also defined by the epistemic unconscientiousness of the person pushing it rather than by its content alone. This means some of it may end up being true (consider cases where a person's personality does match up with their star sign), but such claims end up being true for reasons unrelated to the bad reasoning used by their advocates.

Lots of things can justly be deemed "bullshit" under this understanding: astrology, homeopathy, climate change denialism, flat-Earthism, creationism, and the anti-vaccine movement.

Two subcategories: pseudoscience and pseudophilosophy

Two commonly encountered kinds of bullsh*t are pseudoscience and pseudophilosophy. They can be easily defined as "bullshit with scientific pretensions" and "bullshit with philosophical pretensions." Here are a few examples which will clarify exactly what these terms mean.

A form of pseudoscience would be flat-Earthism. While it takes on scientific pretensions and can be, and has been, proven false, supporters of the idea that the Earth is flat are well known for handwaving away any evidence that falsifies their stance and dismissing good arguments against their worldview.

An amusing and illustrative example is the case of the flat-Earthers who devised two experiments to determine whether the Earth was flat or spherical. When their experiments produced results exactly consistent with the Earth being spherical, they refused to accept the results and concluded that something had gone wrong, despite having no reason to do so. Clearly, these fellows lack epistemic conscientiousness.

Pseudophilosophy is less frequently considered, but can be explained with examples of its two most popular forms.

The first is dubbed "obscurantist pseudophilosophy." It often takes the form of nonsense posing as philosophy using copious amounts of jargon and arcane, frequently erroneous reasoning connecting a mundane truth to an exciting, fantastic falsehood.

As an example, there are more than a few cases where people have argued that physical reality is a social construct. This idea is based on the perhaps trivial notion that our beliefs about reality are social constructs. Often in cases like this, when challenged on the former point, advocates of the more fantastic claim will retreat to the latter, as it is less controversial, and claim the issue was one of linguistic confusion caused by their obscure terminology. When the coast is clear, they frequently return to the original stance.

Dr. Moberger suggests that the humanities and social sciences seem to have a weakness for these seemingly profound pseudophilosophies without being nonsensical fields themselves.

The second is "scientistic pseudophilosophy," often seen in popular science writing. It frequently arises when the questions taken up in science writing are really topics of philosophy rather than science. Because science writers are often not trained in philosophy, they may produce pseudophilosophy when tackling these questions.

A famous example is Sam Harris' attempt at reducing the problems of moral philosophy to scientific problems. His book "The Moral Landscape" is infamously littered with strawman arguments, a failure to interact with relevant philosophical literature, and bad philosophy in general.

In all of these cases, we see that the supporters of some kind of nonsense think that what they are supporting is true, but that they are willing to ignore the basic rules of science and philosophical reasoning in order to do so.

Okay, so there is plenty of nonsense in the world. What do we do about it?

While the first step to dealing with this nonsense is to understand what it is, many people would like to go a little farther than that.

Dr. Moberger explained that sometimes, the best thing we can do is show a little humility:

"One of the main points of the essay is that there is no sharp boundary between bullshit and non-bullshit. Pseudoscience, pseudophilosophy and other kinds of bullshit are very much continuous with the kind of epistemic irresponsibility or unconscientiousness that we all display in our daily lives. We all have biases and we all dislike cognitive dissonance, and so without realizing it we cherry-pick evidence and use various kinds of fallacious reasoning. This tendency is especially strong when it comes to emotionally sensitive areas, such as politics, where we may have built part of our sense of identity and worth around a particular stance. Well-educated, smart people are no exception. In fact, they are sometimes worse, since they are more adept at using sophistry to rationalize their biases. Thus, the first thing to realize, I think, is that all of us are prone to produce bullshit and that it is much easier to spot other people's bullshit than our own. Intellectual humility is first and foremost. To me it does not come naturally and I struggle with it all the time."

He also advises that people take the time to develop their critical thinking skills:

"I think it is also very helpful to develop the kind of critical thinking skills that are taught to undergraduates in philosophy. The best book I know of in the genre is Richard Feldman's 'Reason and Argument.' It provides the basic conceptual tools necessary for thinking clearly about philosophical issues, but those tools are certainly useful outside of philosophy too."

Lastly, he reminds us that looking at the facts of the matter can clear things up:

"Finally, no degree of intellectual humility or critical thinking skills is a substitute for gathering relevant information about the issue at hand. And this is where empirical science comes in. If we want to think rationally about any broadly speaking empirical issue, we need to inform ourselves about what empirical science has to say about it. We also need to remember that individual scientists are often unreliable and that scientific consensus is what we should look for. (Indeed, it is a common theme in pseudoscience to appeal to individual scientists whose views do not reflect scientific consensus.)"

A great deal of the pseudoscience and pseudophilosophy we deal with is characterized not by being false, or even unfalsifiable, but by its pushers' lack of concern for ensuring that what they say is true. Oftentimes, it is presented with fairly common logical fallacies and bold rejections of the scientific consensus.

While having this definition doesn't remove bullshit from the world, it might help you avoid stepping in it. In the end, isn't that what matters?

Why voters value loyalty over honesty in politics

Researchers at Cornell found through new experiments that people will overlook dishonesty if it benefits them and the group they identify with.

• New studies suggest that in competitive settings, group loyalty leads members to display more dishonest tendencies.
• Research at Cornell found a fundamental link between dishonesty and loyalty when it comes to groupthink.
• Dishonesty in politics, an ever-present and timeless feature, is most likely due to this phenomenon.

No matter what side of the political spectrum you're on, someone is bound to be complaining that the other side is horrible, or that they're slanderous liars. This vitriol and heartfelt damnation of the other team is nothing new. In politics, we've always tended to group together with like-minded individuals, even when it goes too far.

Recent research out of Cornell proposes that we really aren't as averse to lying as we proclaim to be, especially if the lies told benefit our side or whatever group we profess to belong to.

Comedian George Carlin once quipped that, "If honesty were suddenly introduced to politics, it would throw everything off — the whole system would collapse."

Carlin said this during the Clinton administration, and as you might have guessed, things haven't changed much. The holier-than-thou crowds flinging accusations of lies, untruths, and doublespeak at the opposing side need to realize one thing: they're all liars to some degree.

Merits of the dishonesty study

Getty Images

Angus Hildreth, a management professor at Cornell, set up an experiment to explore the tumultuous relationship between truthfulness, or the lack thereof, and loyalty. Hildreth and his team selected groups of random students, fraternity brothers, and other volunteers, then asked them to solve a number of puzzles and word games.

The rules of the game were simple. If the team performed well on these tasks, then the whole team would make more money.

The subjects were able to self-report their results and thus lie about puzzles they didn't complete. What they didn't know was that the researchers could tell when they were lying: some failed or incomplete worksheets were retrieved from the trash, and some of the puzzles were intentionally impossible.

Throughout the study, subjects often felt encouraged, even righteous, about their lying when it benefited themselves and their group.

Later on, when these subjects pledged loyalty to a group to face off against other teams, more than 60 percent of them lied. Those who pledged loyalty but weren't spurred on by competition against other groups lied far less, at 15 to 20 percent.

Political takeaways from the study

Researchers felt that loyalty was the cause of a lot of political corruption. They stated that:

Loyalty often drives corruption. Corporate scandals, political machinations, and sports cheating highlight how loyalty's pernicious nature manifests in collusion, conspiracy, cronyism, nepotism, and other forms of cheating.

But at the same time, loyalty is a fundamental ethical principle that drives a lot of our behavior. Even so, the results showed that it is an implicit factor when it comes to lying.

Across nine studies, we found that individuals primed with loyalty cheated less than those not primed (Study 1A and 1B). Members more loyal to their fraternities (Study 2A) and students more loyal to their study groups (Study 2B) also cheated less than their less loyal counterparts due to greater ethical salience when they pledged their loyalty (Studies 3A and 3B). Importantly, competition moderated these effects: when competition was high, members more loyal to their fraternities (Study 4) or individuals primed with loyalty (Studies 5A and 5B) cheated more.

Competition, which is the name of the game in the political realm, will always breed lying and discontent between factions.