- Maintaining standards of evidence is the most important and least appreciated idea in science.
- Modern science was established in the late Renaissance when networks of researchers began working out best practices for linking evidence with conclusions.
- In the face of science denial and attempts to create a post-truth society, we have to protect the primacy of standards of evidence in science and society.
I talk a lot about science to people who are not scientists. It’s generally a lot of fun because most folks are science-curious even if they don’t think about it a lot on their own time. But whether I’m talking about alien life, black holes, or the weirdnesses of quantum mechanics, there is always one really important idea that I try to get across that generally no one is interested in:
Standards of evidence. It’s the most important boring idea in the universe.
Networks of scientists led to scientific societies
The development of modern science was a long, slow process that drew on input from many of the world’s cultures, from ancient Greece and medieval Islam to India, China, and eventually Renaissance Europe.
One of the most critical elements in Europe was the gradual build-up of international communities of scholars. While we usually think of science as being driven forward through the inspiration of one singular genius after another, that’s only part of the story. For every Galileo and Newton there were hundreds of people you’ve never heard of. They formed a network of thinkers and tinkerers writing letters to each other and making visits across the continent. In this way, they exchanged notes on things like the best way to carry out an experiment on boiling liquids or a new way to consider the mathematics of problems in celestial mechanics.
While they might not have known it at the time, what these scholars were also doing was setting up the foundations for an international order of scientific knowledge that would rest upon mutually agreed standards of evidence.
Eventually these networks became formalized. Scientific academies started popping up in places like Italy, where the Academy of the Mysteries of Nature was founded in Naples in 1560. Later, the Royal Society in England, formally known as the Royal Society of London for Improving Natural Knowledge, was established in 1660. The French Academy of Sciences was formed just six years later. Over the years, these institutions and others would lead the way in establishing “best practices” for how to carry out scientific research and how to make sure that the conclusions a scientist drew from that research were supported by the evidence.
Scientific societies led to standards of evidence
I’m telling you this not because I think the history is so cool (though it is). Instead, what matters is seeing how the idea of standards of evidence was born in its scientific form. It came from people arguing in public over what should count as public facts, or better yet, public knowledge. Science didn’t drop out of the sky fully formed. It was, and is, the fruit of a very human, very collective effort. The goal of that effort was to determine the best way to ask nature questions and ensure that you’re getting correct answers.
This was not, by the way, a smooth process. There were lots of wrong turns in figuring out what counted as meaningful evidence and what was just another way of getting fooled. But over time, people figured out that there were standards for how to set up an experiment, how to collect data from it, and how to interpret that data. These standards now include things like isolating the experimental apparatus from spurious environmental effects, understanding how data collection devices respond to inputs, and accounting for systematic errors in analyzing the data. There are, of course, many more.
In this way, scientists figured out which standards were useful in linking evidence to conclusions.
Why standards matter
Science is now the most powerful force shaping human life. Without it, there could never be seven billion of us living on the planet at the same time. It has shaped and reshaped how we eat, how we travel, how we deal with sickness, how we communicate, and how we go to war. It is also how we are pushing Earth into new and dangerous (for us) climate states. But despite all this ubiquity and power, unless you are a scientist, you probably have very little idea of how science knows what it knows, or even more important, how it knows what it doesn’t know.
Most of us don’t understand what it means to have standards of evidence or how these standards get applied. That means we can’t see how the same methods that gave us our cell phones also gave us our understanding of climate change. When a pandemic hits, we can’t see how the science is going to be an evolving process as those standards of evidence get used to sort through the firehose of real-time data. And when it comes to things like UFOs or “Ancient Aliens,” we won’t see that holding fast to those standards is the only thing that can keep us from being fooled into accepting a conclusion we want to be true rather than the one that actually is true.
Admittedly, “standards of evidence” is not the most thrilling topic in the world. But it very well may be the most important.