Does science tell the truth?
It is impossible for science to arrive at ultimate truths, but functional truths are good enough.
Marcelo Gleiser is a professor of natural philosophy, physics, and astronomy at Dartmouth College. He is a Fellow of the American Physical Society, a recipient of the Presidential Faculty Fellows Award from the White House and NSF, and was awarded the 2019 Templeton Prize. Gleiser has authored five books and is the co-founder of 13.8, where he writes about science and culture with physicist Adam Frank.
- What is truth? This is a very tricky question, trickier than many would like to admit.
- Science does arrive at what we can call functional truth, that is, when it focuses on what something does as opposed to what something is. We know how gravity operates, but not what gravity is, a notion that has changed over time and will probably change again.
- The conclusion is that there are no absolute final truths, only functional truths that are agreed upon by consensus. The essential difference is that scientific truths are agreed upon by factual evidence, while most other truths are based on belief.
Does science tell the truth? The answer to this question is not as simple as it seems, and my 13.8 colleague Adam Frank took a look at it in his article about the complementarity of knowledge. There are many levels of complexity to what truth is or means to a person or a community. Why?
First, "truth" itself is hard to define or even to identify. How do you know for sure that someone is telling you the truth? Do you always tell the truth? In groups, what may be considered true to a culture with a given set of moral values may not be true in another. Examples are easy to come by: the death penalty, abortion rights, animal rights, environmentalism, the ethics of owning weapons, etc.
At the level of human relations, truth is very convoluted. Living in an age where fake news has taken center stage only corroborates this obvious fact. However, not knowing how to differentiate between what is true and what is not leads to fear, insecurity, and ultimately, to what could be called worldview servitude — the subservient adherence to a worldview proposed by someone in power. The results, as the history of the 20th century has shown extensively, can be catastrophic.
Proclamations of final or absolute truths, even in science, shouldn't be trusted.
The goal of science, at least on paper, is to arrive at the truth without recourse to any belief or moral system. Science aims to go beyond the human mess so as to be value-free. The premise here is that Nature doesn't have a moral dimension, and that the goal of science is to describe Nature the best possible way, to arrive at something we could call the "absolute truth." The approach is a typical heir to the Enlightenment notion that it is possible to take human complications out of the equation and have an absolute objective view of the world. However, this is a tall order.
It is tempting to believe that science is the best pathway to truth because, to a spectacular extent, science does triumph at many levels. You trust driving your car because the laws of mechanics and thermodynamics work. NASA scientists and engineers just managed to have the Ingenuity Mars Helicopter — the first man-made device to fly over another planet — hover above the Martian surface all by itself.
We can use the laws of physics to describe the results of countless experiments to amazing levels of accuracy, from the magnetic properties of materials to the position of your car in traffic using GPS locators. In this restricted sense, science does tell the truth. It may not be the absolute truth about Nature, but it's certainly a kind of pragmatic, functional truth at which the scientific community arrives by consensus based on the shared testing of hypotheses and results.
What is truth?
Credit: Sergey Nivens via Adobe Stock / 242235342
But at a deeper level of scrutiny, the meaning of truth becomes intangible, and we must agree with the pre-Socratic philosopher Democritus who declared, around 400 BCE, that "truth is in the depths." (Incidentally, Democritus predicted the existence of the atom, something that certainly exists in the depths.)
A look at a dictionary reinforces this view. "Truth: the quality of being true." Now, that's a very circular definition. How do we know what is true? A second definition: "Truth: a fact or belief that is accepted as true." Acceptance is key here. A belief may be accepted to be true, as is the case with religious faith. There is no need for evidence to justify a belief. But note that a fact as well can be accepted as true, even if belief and facts are very different things. This illustrates how the scientific community arrives at a consensus of what is true by acceptance: a statement is accepted as true when sufficient factual evidence supports it. (Note that what defines sufficient factual evidence is also accepted by consensus.) At least until we learn more.
Take the example of gravity. We know that an object in free fall will hit the ground, and we can calculate when it does using Galileo's law of free fall (in the absence of friction). This is an example of "functional truth." If you drop one million rocks from the same height, the same law will apply every time, corroborating the factual acceptance of a functional truth, that all objects fall to the ground at the same rate irrespective of their mass (in the absence of friction).
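The law itself is simple enough to put in code. Here is a minimal illustrative sketch (my own, not from the article) of Galileo's free-fall law, using g ≈ 9.81 m/s²; notice that the mass of the falling object never appears:

```python
import math

G = 9.81  # gravitational acceleration near Earth's surface, m/s^2

def fall_time(height_m: float) -> float:
    """Time for an object to fall from rest through height_m meters,
    ignoring air resistance. Mass does not appear anywhere: a pebble
    and a boulder dropped from the same height land together."""
    return math.sqrt(2 * height_m / G)

# A 10-meter drop takes the same time for any object:
t = fall_time(10.0)
print(f"{t:.2f} s")  # about 1.43 s
```

Dropping a million rocks just reruns this same function a million times with the same result, which is exactly what "functional truth" means here.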
But what if we ask, "What is gravity?" That's an ontological question about what gravity is and not what it does. And here things get trickier. To Galileo, it was an acceleration downward; to Newton, a force between two or more massive bodies, inversely proportional to the square of the distance between them; to Einstein, the curvature of spacetime due to the presence of mass and/or energy. Does Einstein have the final word? Probably not.
Is there an ultimate scientific truth?
Final or absolute scientific truths assume that what we know of Nature can be final, that human knowledge can make absolute proclamations. But we know that this can't really work, for the very nature of scientific knowledge is that it is incomplete and contingent on the accuracy and depth with which we measure Nature with our instruments. The more accuracy and depth our measurements gain, the more they are able to expose the cracks in our current theories, as I illustrated last week with the muon magnetic moment experiments.
So, we must agree with Democritus, that truth is indeed in the depths and that proclamations of final or absolute truths, even in science, shouldn't be trusted. Fortunately, for all practical purposes — flying airplanes or spaceships, measuring the properties of a particle, the rates of chemical reactions, the efficacy of vaccines, or the blood flow in your brain — functional truths do well enough.
The research also raises an intriguing question: Can we get around the Heisenberg uncertainty principle?
- New experiments with vibrating drums push the boundaries of quantum mechanics.
- Two teams of physicists create quantum entanglement in larger systems.
- Critics question whether the study gets around the famous Heisenberg uncertainty principle.
Recently published research pushes the boundaries of key concepts in quantum mechanics. Studies from two different teams used tiny drums to show that quantum entanglement, an effect generally linked to subatomic particles, can also be applied to much larger macroscopic systems. One of the teams also claims to have found a way to evade the Heisenberg uncertainty principle.
One question that the scientists were hoping to answer pertained to whether larger systems can exhibit quantum entanglement in the same way as microscopic ones. Quantum mechanics proposes that two objects can become "entangled," whereby the properties of one object, such as position or velocity, can become connected to those of the other.
An experiment performed at the U.S. National Institute of Standards and Technology in Boulder, Colorado, led by physicist Shlomi Kotler and his colleagues, showed that a pair of vibrating aluminum membranes, each about 10 micrometers long, can be made to vibrate in sync, in such a way that they can be described as quantum entangled. Kotler's team amplified the signal from their devices to "see" the entanglement much more clearly. Measurements of their positions and velocities returned the same numbers, indicating that they were indeed entangled.
Tiny aluminium membranes used by Kotler's team. Credit: Florent Lecoq and Shlomi Kotler/NIST
Evading the Heisenberg uncertainty principle?
Another experiment with quantum drums — each one-fifth the width of a human hair — by a team led by Prof. Mika Sillanpää at Aalto University in Finland, attempted to find what happens in the area between quantum and non-quantum behavior. Like the other researchers, they also achieved quantum entanglement for larger objects, but they also made a fascinating inquiry into getting around the Heisenberg uncertainty principle.
The team's theoretical model was developed by Dr. Matt Woolley of the University of New South Wales. Photons in the microwave frequency were employed to create a synchronized vibrating pattern as well as to gauge the positions of the drums. The scientists managed to make the drums vibrate in opposite phases to each other, achieving "collective quantum motion."
The study's lead author, Dr. Laure Mercier de Lepinay, said: "In this situation, the quantum uncertainty of the drums' motion is canceled if the two drums are treated as one quantum-mechanical entity."
This effect allowed the team to measure both the positions and the momentum of the virtual drumheads at the same time. "One of the drums responds to all the forces of the other drum in the opposing way, kind of with a negative mass," Sillanpää explained.
Theoretically, this should not be possible under the Heisenberg uncertainty principle, one of the most well-known tenets of quantum mechanics. Proposed in the 1920s by Werner Heisenberg, the principle generally says that when dealing with the quantum world, where particles also act like waves, there's an inherent uncertainty in measuring both the position and the momentum of a particle at the same time. The more precisely you measure one variable, the greater the uncertainty in the measurement of the other. In other words, it is not possible to simultaneously pinpoint the exact values of the particle's position and momentum.
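Quantitatively, the principle says the product of the two uncertainties can never drop below half the reduced Planck constant, Δx·Δp ≥ ħ/2. A quick illustrative calculation (my own sketch, not from the papers) shows how confining a particle's position forces a floor on its momentum spread:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, in joule-seconds

def min_momentum_uncertainty(delta_x: float) -> float:
    """Smallest momentum uncertainty (kg*m/s) that Heisenberg's
    relation allows for a given position uncertainty (meters)."""
    return HBAR / (2 * delta_x)

# Pinning a particle's position down to 1 nanometer forces a
# momentum spread of at least ~5.3e-26 kg*m/s:
dp = min_momentum_uncertainty(1e-9)
print(f"{dp:.2e} kg*m/s")
```

Halve the position uncertainty and the minimum momentum uncertainty doubles; that trade-off is the principle in a nutshell.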
Big Think contributor astrophysicist Adam Frank, known for the 13.8 podcast, called this "a really fascinating paper as it shows that it's possible to make larger entangled systems which behave like a single quantum object. But because we're looking at a single quantum object, the measurement doesn't really seem to me to be 'getting around' the uncertainty principle, as we know that in entangled systems an observation of one part constrains the behavior of other parts."
Ethan Siegel, also an astrophysicist, commented, "The main achievement of this latest work is that they have created a macroscopic system where two components are successfully quantum mechanically entangled across large length scales and with large masses. But there is no fundamental evasion of the Heisenberg uncertainty principle here; each individual component is exactly as uncertain as the rules of quantum physics predicts. While it's important to explore the relationship between quantum entanglement and the different components of the systems, including what happens when you treat both components together as a single system, nothing that's been demonstrated in this research negates Heisenberg's most important contribution to physics."

The papers, published in the journal Science, could help create new generations of ultra-sensitive measuring devices and quantum computers.
As bad as this sounds, a new essay suggests that we live in a surprisingly egalitarian age.
- A new essay depicts 700 years of economic inequality in Europe.
- The only stretch of time more egalitarian than today was the period between 1350 and roughly 1700.
- Data suggest that, without intervention, inequality does not decrease on its own.
Economic inequality is a constant topic. No matter the cycle — boom or bust — somebody is making a lot of money, and the question of fairness is never far behind.
A recently published essay in the Journal of Economic Literature by Professor Guido Alfani adds an intriguing perspective to the discussion by showing the evolution of economic inequality in Europe over the last several hundred years. As it turns out, we currently live in a comparatively egalitarian epoch.
Seven centuries of economic history
Figure 8 from Guido Alfani, Journal of Economic Literature, 2021.
This graph shows the amount of wealth controlled by the top ten percent in certain parts of Europe over the last seven hundred years. Archival documentation similar to — and often of a similar quality as — modern economic data allows researchers to get a glimpse of what economic conditions were like centuries ago. Sources like property tax records and documents listing the rental value of homes can be used to determine how much a person's estate was worth. (While these methods leave out those without property, the data is not particularly distorted.)
The first part of the line, shown in black, represents work by Prof. Alfani and represents the average inequality level of the Sabaudian State in northern Italy, the Florentine State, the Kingdom of Naples, and the Republic of Venice. The latter part, in gray, is based on the work of French economist Thomas Piketty and represents an average of inequality in France, the United Kingdom, and Sweden during that time period.
Despite the shift in location, the level of inequality and rate of increase are very similar between the two data sets.
Apocalyptic events cause decreases in inequality
Note that there are two substantial declines in inequality. Both are tied to truly apocalyptic events. The first is the Black Death, the common name for the bubonic plague pandemic in the 14th century, which killed off anywhere between 30 and 50 percent of Europe. The second, at the dawn of the 20th century, was the result of World War I and the many major events in its aftermath.
The 20th century as a whole was a time of tremendous economic change, and the periods not featuring major wars are notable for having large experiments in distributive economic policies, particularly in the countries Piketty considers.
The slight stall in the rise of inequality during the 17th century is the result of the Thirty Years' War, a terrible religious conflict that ravaged Europe and left eight million people dead, and of major plagues that affected southern Europe. However, the recurrent outbreaks of the plague after the Black Death no longer had much effect on inequality. This was due to a number of factors, not the least of which was the adaptation of European institutions to handle pandemics without causing such a shift in wealth.
In 2010, the last year covered by the essay, inequality levels were similar to those of 1340, with 66 percent of the wealth of society being held by the top ten percent. Inequality levels were also continuing to rise, a trend that has not ended since. As Prof. Alfani explained in an email to BigThink:
"During the decade preceding the Covid pandemic, economic inequality has shown a slow tendency towards further inequality growth. The Great Recession that began in 2008 possibly contributed to slow down inequality growth, especially in Europe, but it did not stop it. However, the expectation is that Covid-19 will tend to increase inequality and poverty. This, because it tends to create a relatively greater economic damage to those having unstable occupations, or who need physical strength to work (think of the effects of the so-called "long-Covid," which can prove physically invalidating for a long time). Additionally, and thankfully, Covid is not lethal enough to force major leveling dynamics upon society."
Can only disasters change inequality?
That is the subject of some debate. While inequality can occur in any economy, even one that doesn't grow all that much, some things appear to make it more likely to rise or fall.
Thomas Piketty suggested that the cause of changes in inequality levels is the difference in the rate of return on capital and the overall growth rate of the economy. Since the return on capital is typically higher than the overall growth rate, this means that those who have capital to invest tend to get richer faster than everybody else.
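A toy simulation (my own illustration, not Piketty's full model) makes the mechanism concrete: compound one pool of wealth at the return on capital r and the rest of the economy at the growth rate g, and watch the capital owners' share climb whenever r > g:

```python
def capital_share(r: float, g: float, years: int,
                  capital: float = 1.0, other: float = 1.0) -> float:
    """Fraction of total wealth held by capital owners after compounding
    their wealth at rate r while everyone else's grows at rate g.
    Both pools start equal by default (a 50/50 split)."""
    for _ in range(years):
        capital *= 1 + r
        other *= 1 + g
    return capital / (capital + other)

# With a 5% return on capital against 2% economic growth, the capital
# owners' share rises from 50% to about 81% over 50 years:
share = capital_share(r=0.05, g=0.02, years=50)
print(f"{share:.0%}")
```

When r equals g the split stays fixed at 50/50, which is why the persistent gap between the two rates, rather than either rate alone, drives the concentration.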
While this does explain a great deal of the graph after 1800, his model fails to explain why inequality fell after the Black Death. Indeed, since the plague destroyed human capital and left material goods alone, we would expect the ratio of wealth over income to increase and for inequality to rise. His model can, however, provide explanations for the decline in inequality in the decades after the pandemic: it is possible that the abundance of capital could have lowered returns over a longer time span.
The catastrophe theory put forth by Walter Scheidel suggests that the only force strong enough to wrest economic power from those who have it is a world-shattering event like the Black Death, the fall of the Roman Empire, or World War I. While each event changed the world in a different way, they all had a tremendous leveling effect on society.
But not even this explains everything in the above graph. Pandemics subsequent to the Black Death had little effect on inequality, and inequality continued to fall for decades after World War II ended. Prof. Alfani suggests that we remember the importance of human agency through institutional change. He attributes much of the post-WWII decline in inequality to "the redistributive policies and the development of the welfare states from the 1950s to the early 1970s."
What does this mean for us now?
As Professor Alfani put it in his email:
"[H]istory does not necessarily teach us whether we should consider the current trend toward growth in economic inequality as an undesirable outcome or a problem per se (although I personally believe that there is some ground to argue for that). Nor does it teach us that high inequality is destiny. What it does teach us, is that if we do not act, we have no reason whatsoever to expect that inequality will, one day, decline on its own. History also offers abundant evidence that past trends in inequality have been deeply influenced by our collective decisions, as they shaped the institutional framework across time. So, it is really up to us to decide whether we want to live in a more, or a less unequal society."
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
- A new study suggests that tabs can cause people to be flustered as they try to keep track of every website.
- The reason is that tabs are unable to properly organize information.
- The researchers are plugging a browser extension that aims to fix the problem.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, tabs are now included on all major web browsers, and most users have had access to them for a little over a decade. They've remained pretty much the same since they came out, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people on their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly if it's information that is needed again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this action to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
However, people can become emotionally invested in their tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome, that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information they contained.
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me, while I close some of the 87 tabs I currently have open.