What do unexpected experimental results actually tell us?
- It often happens that an experiment gives surprising results that don’t agree with theoretical predictions, and then one of two things happens: it either gets attention from everyone, or from practically no one.
- We know, from the past several decades of experience, that scientific revolutions can and do occur, but also that they’re rare. More often than not, these unexpected results simply don’t add up.
- It can be difficult, especially for non-experts, to know what’s worth paying attention to and what can be dismissed out of hand. Here’s what everyone should consider when they next encounter an unexpected result.
When you’re a scientist, getting an unexpected result can be a double-edged sword. The best prevailing theories of the day can tell you what sort of data you ought to expect to acquire as you ask nature questions about itself, but only by confronting your predictions with real-world scientific inquiry — involving experiments, measurements, and observations — can you put those theories to the test. Most commonly, the results agree with what the leading theories predict; after all, that’s why they became the leading theories in the first place. Still, it’s important to keep pushing the frontiers of even the most well-established theories into new and untested regimes, because if there’s ever going to be a new scientific breakthrough, the first hints of it will come from experiments and observations probing conditions that nature has never been subjected to before.
That’s why it’s so compelling when, every once in a while, scientists get a result that conflicts with our theoretical expectations. In general, when this happens in physics, most people default to the most skeptical of explanations: that there’s a problem with the experiment, the data, or the analysis. The general assumption is that there’s either:
- an unintentional mistake,
- or a delusional self-deception,
- or an outright case of deliberate fraud.
But it’s also possible that something quite fantastic is afoot: we’re seeing the first signs of something new and unexpected in the Universe. It’s important to remain simultaneously both skeptical and open-minded, as these five examples from science history clearly illustrate.
Story 1: It’s the 1880s, and scientists have measured the speed of light to very good precision: 299,800 km/s or so, with an uncertainty of about 0.005%. That’s precise enough that, if light travels through the medium of a fixed and unchanging space, we should be able to tell when and whether that light is moving with, against, or at an angle to Earth’s motion (at 30 km/s) around the Sun.
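As a back-of-the-envelope check (a sketch of the reasoning above, not part of any historical analysis), we can compare Earth’s orbital speed, as a fraction of the speed of light, against the stated measurement precision:

```python
# Rough check of the numbers above: is Earth's orbital motion large enough,
# relative to the speed of light, to show up at 0.005% precision?
# Figures are taken from the text; this is an order-of-magnitude sketch.
c_kms = 299_800          # measured speed of light in the 1880s, km/s
v_earth_kms = 30         # Earth's orbital speed around the Sun, km/s
precision = 0.005 / 100  # fractional measurement uncertainty, ~5e-5

ratio = v_earth_kms / c_kms  # expected fractional shift, ~1e-4
print(f"v/c = {ratio:.1e}, precision = {precision:.1e}")
print("Above the precision floor?", ratio > precision)
```

Since the expected shift (about one part in 10,000) exceeds the measurement uncertainty (about five parts in 100,000), a directional dependence of light’s speed should have been detectable.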
The Michelson-Morley experiment was designed to test exactly this, anticipating that light would travel through the medium of space — then known as the aether — at different speeds depending on the direction of Earth’s motion relative to the apparatus. Yet, when the experiment was performed, it always gave the same result: the speed of light was a constant in all directions at all times. This constancy held regardless of how the apparatus was oriented or when in Earth’s orbit the measurements were taken. It was an unexpected result that flew in the face of the leading theory of the day, but the experiment was performed so exquisitely that the results were extremely compelling to the broader community of physicists investigating nature at a fundamental level.
Story 2: It’s the late 1920s, and scientists have now discovered three types of radioactive decay: alpha, beta, and gamma decays. In alpha decay, an unstable atomic nucleus emits an alpha particle (helium-4 nucleus), where the total energy and momentum of both “daughter” particles appear to be conserved, and equal the energy and momentum of the “parent” particle. In gamma decay, a gamma particle (photon) is emitted from an unstable atomic nucleus, where both energy and momentum are conserved from the initial to the final states as well. This conservation of energy and momentum has also been observed to hold for all non-decaying particles and reactions; these appear to be immutable laws of nature.
Then, however, there was beta decay. In the process of beta decay, a beta particle (electron) is emitted from an atomic nucleus, which transmutes into a different element on the periodic table: one element up. In beta decay, however, the total energy of the two observed daughter particles (the emitted electron and the new nucleus) is less than that of the parent particle (the old nucleus), and momentum is no longer conserved in this process. Energy and momentum were expected to always be conserved in particle interactions, so seeing a reaction where energy is lost and a net momentum appears out of nowhere violates both of those rules: rules never seen to be violated in any other particle reaction, collision, or decay.
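To make the “missing energy” concrete, consider the simplest case of beta decay: a free neutron decaying into a proton plus an electron. This is a minimal sketch using modern standard rest-mass values, not figures from the original 1920s experiments:

```python
# Energy bookkeeping for neutron beta decay (rest masses in MeV/c^2,
# modern standard values). If only the proton and electron were emitted,
# two-body kinematics would force the electron to carry a single, fixed
# energy; instead, experiments saw a continuous spectrum up to Q.
m_neutron = 939.565   # MeV/c^2
m_proton = 938.272    # MeV/c^2
m_electron = 0.511    # MeV/c^2

Q = m_neutron - m_proton - m_electron  # total energy released, ~0.782 MeV
print(f"Q-value: {Q:.3f} MeV")
```

The puzzle was that the electron almost never carried the full Q-value; the remainder seemed to simply vanish.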
Story 3: It’s the late 1990s, and scientists are working hard to measure exactly how the Universe is expanding. Not only to answer the question of, “How fast is the Universe expanding today?” but also to answer the complementary question of, “How has the Universe’s expansion rate changed and evolved throughout its history?” In theory — and this had been known since the 1920s — if you could answer both of those questions, you could determine precisely what all the various different types of matter-and-energy were that existed throughout the Universe, and what their energy densities were at every point in cosmic history.
A combination of ground-based observations and space-based ones (including the then-relatively new Hubble Space Telescope) were using every type of distance indicator to measure those two key parameters:
- the Hubble constant (the expansion rate today), and
- the deceleration parameter (how gravity is slowing the Universe’s expansion).
After years of carefully measuring the brightnesses and redshifts of many different type Ia supernovae at large distances, two teams of scientists tentatively published their results. From their data, they each reached the same conclusion: that “the deceleration parameter” is actually negative; instead of gravity slowing the Universe’s expansion, more distant galaxies appear to be speeding up in their apparent recession velocities as time goes on. In a Universe composed of normal matter, dark matter, radiation, neutrinos, and spatial curvature, this effect is theoretically impossible; either something was wrong with this data or how it was being interpreted, or some exotic form of energy must exist within our Universe.
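A minimal sketch shows why a negative deceleration parameter demands a new ingredient. For a flat universe containing matter plus a cosmological constant, the textbook result is q₀ = Ωₘ/2 − Ω_Λ; the density values below are illustrative round numbers, not the teams’ published fits:

```python
# Deceleration parameter for a flat universe of matter plus a
# cosmological constant (Lambda): q0 = Omega_m / 2 - Omega_Lambda.
# A positive q0 means the expansion is decelerating; negative, accelerating.
def q0(omega_m, omega_lambda):
    return omega_m / 2 - omega_lambda

print(q0(1.0, 0.0))  # matter only: q0 = +0.5, decelerating
print(q0(0.3, 0.7))  # with dark energy: q0 = -0.55, accelerating
```

Without some component like Λ (or another form of exotic energy), every combination of matter, radiation, and curvature gives q₀ ≥ 0: deceleration is unavoidable.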
Story 4: It’s 2011, and the Large Hadron Collider has only been operating for a short while. After it initially turned on in 2008, a faulty electrical connection caused a magnet quench and a leak of liquid helium, extensively damaging the machine and requiring over a year of repairs. Now that there are beams of fast-moving protons circulating within it at incredible speeds, just 3 m/s below the speed of light, the first science results are poised to come in. A variety of experiments that take advantage of CERN’s energetic particle beams are underway, seeking to measure a variety of aspects about the Universe. Some involve collisions of particles in one direction with particles moving equally fast in the other direction; others involve “fixed target” experiments elsewhere in CERN’s accelerator complex, where fast-moving protons are smashed into stationary targets.
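Taking the “3 m/s below the speed of light” figure at face value, here’s what it implies for a proton’s Lorentz factor and energy. This is an illustrative calculation, not an official LHC specification:

```python
import math

# Convert a speed deficit below c into a Lorentz factor and proton energy.
c = 299_792_458.0  # speed of light, m/s
m_p_mev = 938.272  # proton rest-mass energy, MeV
dv = 3.0           # how far below c the proton is moving, m/s

beta = 1 - dv / c
gamma = 1 / math.sqrt(1 - beta**2)   # Lorentz factor, ~7000
energy_tev = gamma * m_p_mev / 1e6   # total energy, MeV -> TeV

print(f"gamma ~ {gamma:.0f}, energy ~ {energy_tev:.1f} TeV per proton")
```

A Lorentz factor of roughly 7,000 corresponds to several TeV per proton, which is indeed the energy scale the LHC was built to reach.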
In this latter case, enormous numbers of particles are produced all moving in the same general direction: a particle shower. These so-called “daughter particles” proceed to travel at near-light speeds in the same direction that the original protons were moving in. Some of these daughter particles will quickly decay, producing neutrinos when they do. One experiment successfully measures these neutrinos from a downstream location that’s hundreds of kilometers away, reaching a startling conclusion: the particles are arriving tens of nanoseconds earlier than their predicted arrival time. If all particles, including neutrinos, are limited by the speed of light, this “early arrival time” should be theoretically impossible.
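A back-of-the-envelope calculation shows why “tens of nanoseconds early” was so startling. The baseline from CERN to the downstream detector at Gran Sasso (roughly 730 km) and the roughly 60-nanosecond early arrival are the figures reported at the time; here they’re converted into a fractional speed excess over light:

```python
# Convert the reported early arrival into an implied speed excess.
c = 299_792_458.0       # speed of light, m/s
baseline_m = 730_000.0  # approximate CERN -> Gran Sasso distance, m
early_s = 60e-9         # reported early arrival, ~60 nanoseconds

light_time = baseline_m / c    # light travel time, ~2.4 milliseconds
excess = early_s / light_time  # implied (v - c)/c, ~2.5e-5

print(f"light travel time: {light_time*1e3:.3f} ms")
print(f"implied (v - c)/c ~ {excess:.1e}")
```

In other words, the neutrinos appeared to outpace light by a few parts in 100,000: a small number, but one that should be exactly zero (or negative) if relativity holds.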
Story 5: It’s now well into the 2010s, and the Large Hadron Collider has been operating for years. The full results from its first run are now in, and the Higgs boson has been discovered: a Nobel Prize-winning discovery. The last undiscovered particle in the Standard Model has now been found, and many of the other Standard Model particles have been subjected to unprecedented tests of their properties, showing no discernible deviation from their predicted behaviors. With all the pieces of the Standard Model now firmly in place, and little to point to anything being out of the ordinary otherwise, particle physics seems secure as-is, and the Standard Model seems more robust than ever.
Nevertheless, there are a few anomalous “bumps” that appear in the data: extra events at certain energies where the Standard Model predicts there should be none. With two independent collaborations, CMS and ATLAS, analyzing collisions at the maximum energies the LHC can achieve, a sensible cross-check is to see whether both find evidence of bumps at the same energies and with the same level of significance. Remarkably, there’s at least one location where both experiments see the same “extra” signal, consistent with the same “bump” in the data, and that’s an incredibly suggestive piece of evidence. Whatever’s going on, it doesn’t match the predictions of our most successful theories of all time, making us wonder if we aren’t on the cusp of discovering a new fundamental particle, interaction, or physical phenomenon.
In each of these cases, it’s important to recognize what the possible outcomes are. In general, there are three possibilities for what’s going to occur.
1.) There is literally nothing to see here. What’s being touted as a potential new discovery is nothing more than an error of some sort. Whether it’s because of:
- an honest, unforeseen mistake,
- an erroneous setup,
- experimental incompetence,
- an act of sabotage,
- or a deliberate hoax or fraud perpetrated by a charlatan,
is irrelevant; the claimed effect is not real.
2.) The rules of physics, as we’ve conceived them up until now, are not as we believed them to be, and this result is a hint — perhaps the first key hint — that there’s something different about our Universe than we’ve thought up until this point. It’s going to require a new physical law, principle, or even a whole new conception of reality to set things right.
3.) There is a new component to the Universe — something not previously included in our theoretical expectations — whose effects are showing up in these new results, possibly for the first time.
If you yourself are a scientist, you recognize immediately that your default assumption should be the first one, and that it would require an overwhelming amount of additional supporting evidence to show us that either the second or third option, both of which would be revolutionary, is instead correct.
Of course, that’s not at all how reality plays out for most of us. Many of our scientific colleagues these days are quick to write papers putting forth novel, fringe ideas that either alter the rules of physics or propose new, additional particles or interactions as “leading explanations” for these results. Most of the discussion you’ll see in popular, even mainstream, media sources is about how some new evidence threatens to “break the Universe” or something equally sensationalistic. But these are not answers; these are merely examples of ambulance-chasing: where something loud, flashy, and new attracts all sorts of attention, particularly unscrupulous attention, from people who should ethically know better.
How will we actually determine which explanation is the correct one for these new observations? The scientific process demands just one thing: that we gather more data, better data, and independent data that either confirms or refutes what’s been seen. New ideas and theories that supersede the old ones ought to be considered, so long as they:
- reproduce the same successful results as the old theories where they work,
- explain the new results where the old theories do not, and
- make at least one new prediction, differing from the old theory, that can in principle be searched for and measured.
The correct first response to an unexpected result is to try to independently reproduce it and to compare these results with other, complementary results that should help us interpret this new result in the context of the full suite of evidence.
Each one of these five historical stories had a different ending, although each had the potential to revolutionize our understanding of the Universe. In order, here’s what happened:
- The speed of light, as further experiments demonstrated, turns out to be the same as measured by all observers in all reference frames. There is no aether necessary; instead, our conception of how things move through the Universe is governed by Einstein’s relativity, not Newton’s laws.
- Energy and momentum are actually both conserved, but only because there’s a new, unseen particle also emitted in beta decay: the neutrino, as proposed by Wolfgang Pauli in 1930. Neutrinos, a mere hypothesis for decades, were finally directly detected in 1956, two years before Pauli died.
- Initially met with skepticism, the two independent teams (the Supernova Cosmology Project and the High-Z Supernova Search Team) continued to gather data on the expansion of the Universe, but skeptics weren’t convinced until improved data from the cosmic microwave background and large-scale structure all supported the same inescapable conclusion: the Universe, in addition to the known forms of matter and radiation, also contains dark energy, the underlying cause of the observed accelerated expansion.
- Initially reported as a 6.8-sigma result by the OPERA collaboration, the anomaly could not be confirmed by other experiments using the same neutrino beam, such as ICARUS. Eventually, the OPERA team found the experimental error behind their anomalous results: a loose fiber-optic cable was giving an incorrect reading for the time-of-flight of these neutrinos. With the error fixed, the anomaly disappeared.
- Even with data from both CMS and ATLAS, the significance of these results (both the diboson and diphoton bumps) never crossed the vaunted 5-sigma threshold: the “gold standard” for statistical significance in particle physics. With more data, what was originally a “bump” simply regressed to the mean, showing that these initially promising results were mere statistical fluctuations. With much more data now in the LHC’s coffers, there is no evidence for either of these bumps any longer.
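To put a number on the 5-sigma “gold standard,” we can compute the probability that random noise alone produces a fluctuation at least that large (the one-sided Gaussian tail). This is a sketch; real analyses must also account for the “look-elsewhere effect” when scanning many energies for bumps:

```python
import math

# One-sided Gaussian tail probability: the chance that pure noise
# fluctuates at least `sigma` standard deviations above expectation.
def p_value(sigma):
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for sigma in (3, 5):
    p = p_value(sigma)
    print(f"{sigma}-sigma: p ~ {p:.2e} (about 1 in {1/p:,.0f})")
```

A 3-sigma “hint” happens by chance about once in 740 tries; a 5-sigma discovery, only about once in 3.5 million. That gap is why so many hints evaporate.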
On the other hand, there are a large number of collaborations that are too quick to make extraordinary claims based on a single anomalous observation. The DAMA collaboration claims to have directly detected dark matter, despite a whole slew of red flags and failed confirmation attempts. The Atomki anomaly, an unexpected result in the angular distribution of electron-positron pairs emitted in a specific nuclear decay, has been claimed as evidence for a new particle, the X17, with a series of unprecedented properties. There have been numerous claims that cold fusion has been achieved, defying the conventional rules of nuclear physics.
There have been claims of reactionless, thrustless engines, which defy the rules of momentum conservation. And there have been extraordinary claims made by real physicists, such as from the Alpha Magnetic Spectrometer or BICEP2, that had mundane, rather than extraordinary, explanations. More recently, there have been claims about room-temperature superconductivity surrounding a substance known as LK-99, now known not to superconduct at all, and the muon g-2 anomaly, which appears to be an experimental triumph but which comes alongside a theoretical calculation whose errors were badly underestimated.
Whenever you do a real, bona fide experiment, it’s important that you don’t bias yourself toward getting whatever result you anticipate or, worse, hope for. You, as the scientist, need to be the most skeptical of your own setup, the most honest about your errors and uncertainties, and the most forthcoming about your methodologies and their possible flaws. You’ll want to be as responsible as possible, doing everything you can to calibrate your instruments properly and understand all of your sources of error and uncertainty, but in the end, you have to report your results honestly, regardless of what you see. It’s going to be up to the rest of the scientific community to either validate or refute what you’ve found, and if you’ve been unscrupulous at any point along the way, you’re going to get exposed eventually.
There should be no penalty to collaborations for coming up with results that aren’t borne out by later experiments; the OPERA, ATLAS, and CMS collaborations in particular did admirable jobs in releasing their data with all the appropriate caveats. When the first hints of an anomaly arrive, unless there is a particularly glaring flaw with the experiment (or the experimenters), there is no way to know whether it’s an experimental flaw, evidence for an unseen component, or the harbinger of a new set of physical laws. Only with more, better, and independent scientific data can we hope to solve whatever puzzle our investigations reveal about the natural world.