Starts With A Bang

What do unexpected experimental results actually tell us?

Scientific surprises, driven by experiment, are often how science advances. But more often than not, they’re just bad science.
A view from above of the BES III detector at the electron-positron collider in Beijing, China. Exotic particles have newly been detected here, including X, Y, and Z mesons which don't fit the normal scheme of a quark-antiquark combination. With the X(2370) particle, we may have detected the first glueball in history.
Credit: Institute for High Energy Physics, Beijing
Key Takeaways
  • It often happens that an experiment is conducted, gives surprising results that don’t agree with theoretical predictions, and then one of two things follows: it gets attention either from practically everyone or from practically no one.
  • We know, from the past several decades of experience, that scientific revolutions can and do occur, but also that they’re rare. More often than not, these unexpected results simply don’t add up.
  • It can be difficult to know what’s worth paying attention to and what can be dismissed out of hand, especially among non-experts. Here’s what everyone should be considering when they next encounter one.

When you’re a scientist, getting an unexpected result can be a double-edged sword. The best prevailing theories of the day can tell you what sort of data you ought to expect to acquire as you ask nature questions about itself, but only by confronting your predictions with real-world scientific inquiry — involving experiments, measurements, and observations — can you put those theories to the test. Most commonly, the results agree with what the leading theories predict; after all, that’s why they became the leading theories in the first place. Still, it’s important to keep pushing the frontiers of even the most well-established theories in new and untested regimes, as if there’s ever going to be a new scientific breakthrough, the first hints of it will come from experiments and observations that nature has never been subjected to before.

That’s why it’s so compelling when, every once in a while, scientists get a result that conflicts with our theoretical expectations. In general, when this happens in physics, most people default to the most skeptical of explanations: that there’s a problem with the experiment, the data, or the analysis. The general assumption is that there’s:

  • an unintentional mistake,
  • or a delusional self-deception,
  • or an outright case of deliberate fraud.

But it’s also possible that something quite fantastic is afoot: we’re seeing the first signs of something new and unexpected in the Universe. It’s important to remain simultaneously skeptical and open-minded, as these five examples from science history clearly illustrate.

The idea behind a Michelson interferometer is that a source of light can be split into two by passing it through a device like a beam splitter, which sends half of the original light down each of two perpendicular paths. At the end of each path, a mirror bounces the light back the way it came, and then those two beams are recombined, producing an interference pattern on the screen. If the speed of light is different down each path, the interference pattern will change in response.
Credit: Polytec GmbH/Wikimedia Commons

Story 1: It’s the 1880s, and scientists have measured the speed of light to very good precision: 299,800 km/s or so, with an uncertainty of about 0.005%. That’s precise enough that, if light travels through the medium of a fixed and unchanging space, we should be able to tell when and whether that light is moving with, against, or at an angle to Earth’s motion (at 30 km/s) around the Sun.

The Michelson-Morley experiment was designed to test exactly this, anticipating that light would travel through the medium of space — then known as the aether — at different speeds dependent on the direction of Earth’s motion relative to the apparatus. Yet, when the experiment was performed, it always gave the same results: results that indicated that the speed of light was a constant in all directions at all times. This constancy was observed regardless of factors such as how the apparatus was oriented or when in Earth’s orbit the measurements were taken. This was an unexpected result that flew in the face of the leading theory of the day, but was performed so exquisitely that the results were extremely compelling to the broader community of physicists who were investigating nature at a fundamental level.
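For a sense of the precision involved, here’s a rough, back-of-the-envelope sketch of the fringe shift the experimenters anticipated. The effective arm length (roughly 11 meters, achieved via multiple reflections) and the wavelength of the light (roughly 500 nanometers) are assumed, commonly quoted values rather than figures taken from the passage above:

```python
# Rough sketch of the fringe shift the Michelson-Morley experiment anticipated,
# assuming an effective arm length of ~11 m and visible light of ~500 nm.
c = 299_800e3        # speed of light, m/s (the value quoted above)
v = 30e3             # Earth's orbital speed around the Sun, m/s
L = 11.0             # effective arm length, m (assumed)
wavelength = 500e-9  # wavelength of the light used, m (assumed)

beta_squared = (v / c) ** 2            # ~1e-8: the anticipated effect is second order in v/c
fringe_shift = 2 * L * beta_squared / wavelength
print(f"(v/c)^2 ~ {beta_squared:.1e}")
print(f"expected fringe shift upon a 90-degree rotation ~ {fringe_shift:.2f} fringes")
```

An anticipated shift of roughly 0.4 fringes was comfortably within the instrument’s resolution, yet no shift consistent with the aether hypothesis was ever observed, no matter the orientation of the apparatus or the time of year.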

Schematic illustration of nuclear beta decay in a massive atomic nucleus. Only if the (missing) neutrino energy and momentum are included can these quantities be conserved. The transition from a neutron to a proton (and an electron and an antielectron neutrino) is energetically favorable, with the additional mass getting converted into the kinetic energy of the decay products. The inverse reaction, of a proton, electron, and an antineutrino all combining to create a neutron, never occurs in nature.
Credit: Inductiveload/Wikimedia Commons

Story 2: It’s the late 1920s, and scientists have now discovered three types of radioactive decay: alpha, beta, and gamma decays. In alpha decay, an unstable atomic nucleus emits an alpha particle (a helium-4 nucleus), where the total energy and momentum of the two “daughter” particles appears to be conserved, equaling the energy and momentum of the “parent” particle. In gamma decay, a gamma particle (photon) is emitted from an unstable atomic nucleus, where energy and momentum are again conserved from the initial to the final states. This conservation of energy and momentum has been observed to hold for all non-decaying particles and reactions as well; they appear to be immutable laws of nature.

However, then there was beta decay. In the process of beta decay, a beta particle (an electron) is emitted from an atomic nucleus, which transmutes into a different element on the periodic table: one element up. In beta decay, however, the total energy of the two observed daughter particles (the emitted electron and the new nucleus) is less than that of the parent particle (the old nucleus), and momentum is not conserved in this process either. Energy and momentum are two quantities that were expected to always be conserved in particle interactions, so a reaction where energy is lost and a net momentum appears out of nowhere violates both of those rules: rules never seen to be violated in any other particle reaction, collision, or decay.
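To see just how sharp the conflict was, consider the standard two-body decay kinematics sketched below. This is a generic textbook result, not a calculation from the original experiments: if a parent nucleus at rest decayed into only a daughter nucleus plus an electron, momentum conservation would force the two products to recoil back-to-back, and energy conservation would then fix the electron’s energy at a single value, every single time.

```latex
% Electron energy in a hypothetical two-body beta decay of a parent nucleus
% (mass M, at rest) into a daughter nucleus (mass M') plus an electron (mass m_e):
\[
E_e \;=\; \frac{\left(M^{2} - M'^{2} + m_e^{2}\right)c^{2}}{2M}
\]
```

Instead, the emitted electrons were found with a whole range of energies below that value, exactly as though a third, undetected particle were silently carrying off the missing energy and momentum.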

Measuring back in time and distance (to the left of “today”) can inform how the Universe will evolve and accelerate/decelerate far into the future. By linking the expansion rate to the matter-and-energy contents of the Universe and measuring the expansion rate, we can come up with an estimate for the amount of time that’s passed since the start of the hot Big Bang. The supernova data in the late 1990s was the first set of data to indicate that we lived in a dark energy-rich Universe, rather than a matter-and-radiation dominated one; the data points, to the left of “today,” clearly drift from the standard “decelerating” scenario that had held sway through most of the 20th century.
Credit: Saul Perlmutter/UC Berkeley

Story 3: It’s the late 1990s, and scientists are working hard to measure exactly how the Universe is expanding: not only to answer the question, “How fast is the Universe expanding today?” but also to answer the complementary question, “How has the Universe’s expansion rate changed and evolved throughout its history?” In theory — and this had been known since the 1920s — if you could answer both of those questions, you could determine precisely which types of matter-and-energy existed throughout the Universe, and what their energy densities were at every point in cosmic history.
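That link between the expansion history and the Universe’s contents is captured by the first Friedmann equation. For components whose equation-of-state parameter w stays constant, a standard textbook form (shown here purely for orientation) is:

```latex
% Expansion rate H at redshift z, given today's expansion rate H_0 and the
% present-day density parameter Omega_i of each component, which dilutes with
% expansion according to its equation-of-state parameter w_i:
\[
H^{2}(z) \;=\; H_0^{2}\sum_i \Omega_i\,(1+z)^{3(1+w_i)}
\]
```

Each ingredient dilutes differently as the Universe expands (w = 0 for matter, +1/3 for radiation, effectively -1/3 for spatial curvature), so measuring how the expansion rate changes with redshift reveals how much of each component is, or was, present.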

A combination of ground-based and space-based observatories (including the then-relatively new Hubble Space Telescope) was using every type of distance indicator to measure those two key parameters:

  1. the Hubble constant (the expansion rate today), and
  2. the deceleration parameter (how gravity is slowing the Universe’s expansion).

After years of carefully measuring the brightnesses and redshifts of many different type Ia supernovae at large distances, two teams of scientists tentatively published their results. From their data, they each reached the same conclusion: that “the deceleration parameter” is actually negative; instead of gravity slowing the Universe’s expansion, more distant galaxies appear to be speeding up in their apparent recession velocities as time goes on. In a Universe composed of normal matter, dark matter, radiation, neutrinos, and spatial curvature, this effect is theoretically impossible; either something was wrong with this data or how it was being interpreted, or some exotic form of energy must exist within our Universe.
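A compact, textbook way to see why a negative deceleration parameter demands something exotic is to write it in terms of each component’s density parameter and equation of state:

```latex
% Deceleration parameter today, summed over components i: w = 0 for matter,
% +1/3 for radiation and (relativistic) neutrinos, while spatial curvature
% behaves as an effective w = -1/3 and so contributes nothing to q_0:
\[
q_0 \;=\; \frac{1}{2}\sum_i \Omega_i\left(1 + 3w_i\right)
\]
```

Matter, radiation, and neutrinos can only push q_0 toward positive values, and curvature contributes nothing, so a measured negative value requires a component with w < -1/3, such as a cosmological constant with w = -1.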

Sending any particles through hundreds of kilometers of space should always result in the particles arriving no faster than a photon would. The OPERA collaboration famously observed a faster result back in 2011. The neutrinos arrived tens of nanoseconds earlier than expected, as recorded in their detector, which translates into a speed exceeding the speed of light by about 0.002%.
Credit: CNGS layout/OPERA experiment

Story 4: It’s 2011, and the Large Hadron Collider has only been operating for a short while. After it initially turned on in 2008, a leak in the liquid helium system caused extensive damage to the machine, requiring more than a year of repairs. Now that there are beams of fast-moving protons circulating within it at incredible speeds, only about 10 m/s below the speed of light, the first science results are poised to come in. A variety of experiments that take advantage of CERN’s energetic particle beams are underway, seeking to measure many different aspects of the Universe. Some of them involve collisions of particles moving in one direction with particles moving equally fast in the other direction; others are “fixed target” experiments, where fast-moving particles are collided with stationary ones. (The neutrino beam relevant to this story was produced by protons from the Super Proton Synchrotron, the LHC’s injector, striking a fixed target.)

In this latter case, enormous numbers of particles are produced all moving in the same general direction: a particle shower. These so-called “daughter particles” proceed to travel at near-light speeds in the same direction that the original protons were moving in. Some of these daughter particles will quickly decay, producing neutrinos when they do. One experiment successfully measures these neutrinos from a downstream location that’s hundreds of kilometers away, reaching a startling conclusion: the particles are arriving tens of nanoseconds earlier than their predicted arrival time. If all particles, including neutrinos, are limited by the speed of light, this “early arrival time” should be theoretically impossible.
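A quick back-of-the-envelope check shows how small the claimed effect was. The baseline (roughly 730 km from CERN to Gran Sasso) and the roughly 60-nanosecond early arrival are the commonly quoted figures; treat them here as approximate, assumed inputs:

```python
# Back-of-the-envelope check of the OPERA anomaly, assuming the commonly quoted
# ~730 km baseline and ~60 ns early arrival time.
c = 299_792_458.0   # speed of light, m/s
baseline = 730e3    # approximate CERN-to-Gran Sasso distance, m (assumed)
early = 60e-9       # approximate reported early arrival, s (assumed)

light_travel_time = baseline / c             # roughly 2.4 milliseconds
fractional_excess = early / light_travel_time
print(f"light travel time ~ {light_travel_time * 1e3:.2f} ms")
print(f"implied fractional speed excess ~ {fractional_excess:.1e} (about 0.002%)")
```

A sixty-nanosecond offset out of a roughly 2.4-millisecond flight is a part-in-40,000 effect, which helps explain how a single faulty timing connection could end up accounting for the entire anomaly.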

The ATLAS and CMS diphoton bumps from 2015, displayed together, clearly correlating at ~750 GeV. This suggestive result was significant at more than 3-sigma, but went away entirely with more data. This is an example of a statistical fluctuation, one of the ‘red herrings’ of experimental physics that can easily lead scientists astray.
Credit: CERN, CMS + ATLAS collaborations, Matt Strassler

Story 5: It’s now well into the 2010s, and the Large Hadron Collider has been operating for years. The full results from its first run are now in, and the Higgs boson has been discovered: a Nobel Prize-winning discovery. The last undiscovered particle in the Standard Model has now been found, and many of the other Standard Model particles have been subjected to unprecedented tests of their properties, showing no discernible deviation from their predicted behaviors. With all the pieces of the Standard Model now firmly in place, and little to point to anything being out of the ordinary otherwise, particle physics seems secure as-is, and the Standard Model seems more robust than ever.

Nevertheless, there are a few anomalous “bumps” that appear in the data: extra events that appear at certain energies where the Standard Model predicts there should be none. With two competing collaborations colliding particles at the maximum energies that the LHC can achieve, both working independently, a sensible cross-check is to see whether both CMS and ATLAS find similar evidence of bumps occurring at the same energies and with the same level of significance. Remarkably, there’s at least one location where both experiments see the exact same “extra” signal, consistent with the same “bump” in the data, and that’s an incredibly suggestive piece of evidence. Whatever’s going on, it doesn’t match the theoretical predictions of our most successful theories of all time, making us wonder if we aren’t on the cusp of discovering a new fundamental particle, interaction, or physical phenomenon.
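To get a feel for why such a coincidence is suggestive but not yet decisive, here’s a small, self-contained statistical sketch. The number of independent places a bump could have appeared is a purely illustrative assumption, not a figure from either experiment:

```python
# How often does chance alone produce a ~3-sigma "bump" somewhere in a search?
# The number of independent mass bins below is an illustrative assumption.
from scipy.stats import norm

local_significance = 3.0
p_local = norm.sf(local_significance)        # one-sided p-value of a 3-sigma excess
n_bins = 100                                 # hypothetical independent places to look
p_anywhere = 1 - (1 - p_local) ** n_bins     # chance of at least one such excess

print(f"p-value of a 3-sigma excess in a single bin: {p_local:.1e}")
print(f"chance of one appearing somewhere among {n_bins} bins: {p_anywhere:.0%}")
print(f"for comparison, the 5-sigma 'gold standard' p-value: {norm.sf(5.0):.1e}")
```

Under these assumptions, a local 3-sigma excess will pop up by chance in roughly one search in eight, which is why physicists demand both the 5-sigma threshold and independent confirmation before claiming a discovery.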

To create fusion ignition, the National Ignition Facility’s laser energy is converted into X-rays inside the device holding the fusionable fuel, known as a hohlraum. Those X-rays then heat and compress the region surrounding a fuel capsule until it implodes, creating a high temperature, high pressure plasma where fusion occurs. While hot fusion has been achieved many times in the laboratory, cold fusion has never been robustly demonstrated, and is instead a pseudoscientific field rife with charlatans and incompetents.
Credit: Lawrence Livermore National Laboratory

In each of these cases, it’s important to recognize what the possible outcomes are. In general, there are three possibilities.

1.) There is literally nothing to see here. What’s being touted as a potential new discovery is nothing more than an error of some sort. Whether it’s because of:

  • an honest, unforeseen mistake,
  • an erroneous setup,
  • experimental incompetence,
  • an act of sabotage,
  • or a deliberate hoax or fraud perpetrated by a charlatan,

is irrelevant; the claimed effect is not real.

2.) The rules of physics, as we’ve conceived them up until now, are not as we believed them to be, and this result is a hint — perhaps the first key hint — that there’s something different about our Universe than we’ve thought up until this point. It’s going to require a new physical law, principle, or even a whole new conception of reality to set things right.

3.) There is a new component to the Universe — something not previously included in our theoretical expectations — whose effects are showing up in these new results, possibly for the first time.

If you yourself are a scientist, you recognize immediately that your default assumption should be the first one, and that it would require an overwhelming amount of additional supporting evidence to show us that either the second or third option, both of which would be revolutionary, is instead correct.

This graph shows the 1550 supernovae that are a part of the Pantheon+ analysis, plotted as a function of magnitude versus redshift. The supernova data, for more than two decades now (ever since 1998), has pointed toward a Universe that expands in a particular fashion that requires something beyond matter, radiation, and/or spatial curvature: a new form of energy that drives the expansion, known as dark energy. The supernovae all fall along the line that our standard cosmological model predicts, with even the highest-redshift, most far-flung type Ia supernovae adhering to this simple relation. Calibrating the relation without substantial error is of paramount importance.
Credit: D. Brout et al./Pantheon+, Astrophysical Journal, 2022

Of course, that’s not at all how reality plays out for most of us. Many of our scientific colleagues these days are quick to write papers putting forth novel, fringe ideas that either alter the rules of physics or propose new, additional particles or interactions as “leading explanations” for these results. Most of the discussions you’ll see in popular, even mainstream, media sources are about how some new evidence threatens to “break the Universe” or something equally sensationalistic. But these are not answers; these are merely examples of ambulance-chasing: where something loud, flashy, and new is attracting all sorts of attention, particularly unscrupulous attention, from people who should ethically know better.

How will we actually determine which explanation is the correct one for these new observations? The scientific process demands just one thing: that we gather more data, better data, and independent data that either confirms or refutes what’s been seen. New ideas and theories that supersede the old ones ought to be considered, so long as they:

  • reproduce the same successful results as the old theories where they work,
  • explain the new results where the old theories do not, and
  • make at least one new prediction that differs from the old theory that can be, in principle, looked for and measured.

The correct first response to an unexpected result is to try to independently reproduce it, and to compare it with other, complementary measurements that can help us interpret the new finding in the context of the full suite of evidence.

The neutrino was first proposed in 1930, but was not detected until 1956, from nuclear reactors. In the years and decades since, we’ve detected neutrinos from the Sun, from cosmic rays, and even from supernovae. Here, we see the construction of the tank used in the solar neutrino experiment in the Homestake gold mine from the 1960s. This technique, of building neutrino observatories deep underground, has been a hallmark of particle physics experiments for over 60 years.
Credit: Brookhaven National Laboratory

Each one of these five historical stories had a different ending, although they all had the potential to revolutionize the Universe. In order, here’s what happened:

  1. The speed of light, as further experiments demonstrated, turns out to be the same as measured by all observers in all reference frames. There is no aether necessary; instead, our conception of how things move through the Universe is governed by Einstein’s relativity, not Newton’s laws.
  2. Energy and momentum are actually both conserved, but that’s because there was a new, unseen particle also emitted in beta decay: the neutrino, as proposed by Wolfgang Pauli in 1930. Neutrinos, a mere hypothesis for decades, were finally directly detected in 1956, two years before Pauli died.
  3. Initially met with skepticism, the two independent teams (the Supernova Cosmology Project and the High-z Supernova Search Team) continued to gather data on the expansion of the Universe, but skeptics weren’t convinced until improved measurements of the cosmic microwave background and of large-scale structure all supported the same inescapable conclusion. The Universe, in addition to the known forms of matter and radiation, also contains dark energy, which is the underlying cause of the observed accelerated expansion.
  4. The anomaly was initially reported as a 6.8-sigma result by the OPERA collaboration, but other experiments using the same neutrino beam, such as ICARUS, failed to confirm it. Eventually, the OPERA team found an experimental error that was the cause of their anomalous results: a loose cable was giving an incorrect reading for the time-of-flight of these neutrinos. With the error fixed, the anomaly disappeared.
  5. Even with data from both CMS and ATLAS, the significance of these results (both the diboson and diphoton bumps) never crossed the vaunted 5-sigma threshold, the “gold standard” for statistical significance in particle physics. With more data, what was originally a “bump” in the data simply regressed to the mean, showing that these initially promising results were mere statistical fluctuations. With much more data now in the LHC’s coffers, there is no evidence for either of these bumps any longer.
In the LHC’s Run I data, the ATLAS collaboration saw evidence for a diboson “bump” at around 2,000 GeV, suggestive of a new particle, which many hoped was evidence for SUSY. Unfortunately, that signal went away and was found to be mere statistical noise with the accumulation of more data, as have all such fluctuations.
Credits: ATLAS and CMS collaborations

On the other hand, there are a large number of collaborations that are too quick to observe an anomaly and then make extraordinary claims based on that one observation. The DAMA collaboration claims to have directly detected dark matter, despite a whole slew of red flags and failed confirmation attempts. The Atomki collaboration, observing a specific nuclear decay, sees an unexpected result in the distribution of decay angles and claims it as evidence for a new particle, the X17, with a series of unprecedented properties. There have been numerous claims that cold fusion has been achieved, which defies the conventional rules of nuclear physics.

There have been claims of reactionless, thrustless engines, which defy the rules of momentum conservation. And there have been extraordinary claims made by real physicists, such as from the Alpha Magnetic Spectrometer or BICEP2, that had mundane, rather than extraordinary, explanations. More recently, there have been claims about room-temperature superconductivity surrounding a substance known as LK-99, now known not to superconduct at all, and the muon g-2 anomaly, which appears to be an experimental triumph but which comes alongside a theoretical calculation whose errors were badly underestimated.

The latest lattice QCD results concerning the theoretical prediction of the muon’s magnetic moment strongly disagree with the older R-ratio method’s predictions, and instead point to a strong agreement with experimental data. It looks like the earlier theoretical method has a flaw in it somewhere.
Credit: A. Boccaletti et al., arXiv:2407.10913, 2024

Whenever you do a real, bona fide experiment, it’s important that you don’t bias yourself toward getting whatever result you anticipate or, worse, hope for. You, as the scientist, need to be the most skeptical of your own setup, the most honest about your errors and uncertainties, and the most forthcoming about your methodologies and their possible flaws. You’ll want to be as responsible as possible, doing everything you can to calibrate your instruments properly and understand all of your sources of error and uncertainty, but in the end, you have to report your results honestly, regardless of what you see. It’s going to be up to the rest of the scientific community to either validate or refute what you’ve found, and if you’ve been unscrupulous at any point along the way, you’re going to get exposed eventually.

There should be no penalty to collaborations for coming up with results that aren’t borne out by later experiments; the OPERA, ATLAS, and CMS collaborations in particular did admirable jobs in releasing their data with all the appropriate caveats. When the first hints of an anomaly arrive, unless there is a particularly glaring flaw with the experiment (or the experimenters), there is no way to know whether it’s an experimental flaw, evidence for an unseen component, or the harbinger of a new set of physical laws. Only with more, better, and independent scientific data can we hope to solve whatever puzzle our investigations reveal about the natural world.

