Starts With A Bang

Ask Ethan: What Could Solve The Cosmic Controversy Over The Expanding Universe?


Two independent techniques give precise but incompatible answers. Here’s how to resolve it.


If you didn’t know anything about the Universe beyond our own galaxy, there are two different pathways you could take to figure out how it was changing. You could measure the light from well-understood objects at a wide variety of distances, and deduce how the fabric of our Universe changes as the light travels through space before arriving at our eyes. Alternatively, you could identify an ancient signal from the Universe’s earliest stages, and measure its properties to learn about how spacetime changes over time. These two methods are robust, precise, and in conflict with one another. Luc Bourhis wants to know what the resolution might be, asking:

As you pointed out in several of your columns, the cosmic [distance] ladder and the study of CMBR gives incompatible values for the Hubble constant. What are the best explanations cosmologists have come up with to reconcile them?

Let’s start by exploring the problem, and then seeing how we might resolve it.

First noted by Vesto Slipher back in 1917, some of the objects we observe show the spectral signatures of absorption or emission of particular atoms, ions, or molecules, but with a systematic shift towards either the red or blue end of the light spectrum. When combined with the distance measurements of Hubble, this data gave rise to the initial idea of the expanding Universe. (VESTO SLIPHER, (1917): PROC. AMER. PHIL. SOC., 56, 403)

The story of the expanding Universe goes back nearly 100 years, to when Edwin Hubble first discovered individual stars of a specific type — Cepheid variable stars — within the spiral nebulae seen throughout the night sky. All at once, this demonstrated that these nebulae were individual galaxies, allowed us to calculate the distance to them, and by adding one additional piece of evidence, revealed that the Universe was expanding.

That additional evidence was discovered a decade prior by Vesto Slipher, who noticed that the spectral lines of these same spiral nebulae were severely redshifted on average. Either they were all moving away from us, or the space between us and them was expanding, just as Einstein’s theory of spacetime predicted. As more and better data came in, the conclusion became overwhelming: the Universe was expanding.

The construction of the cosmic distance ladder involves going from our Solar System to the stars to nearby galaxies to distant ones. Each ‘step’ carries along its own uncertainties. While the inferred expansion rate could be biased towards higher or lower values if we lived in an underdense or overdense region, the amount required to explain this conundrum is ruled out observationally. There are enough independent methods used to construct the cosmic distance ladder that we can no longer reasonably fault one ‘rung’ on the ladder as the cause of our mismatch between different methods. (NASA, ESA, A. FEILD (STSCI), AND A. RIESS (STSCI/JHU))

Once we accepted that the Universe was expanding, it became apparent that the Universe was smaller, hotter, and denser in the past. Light, from wherever it’s emitted, must travel through the expanding Universe in order to arrive at our eyes. When we measure the light we receive from a well-understood object, we can determine its distance and also measure how much that light has redshifted.

This distance-redshift relation allows us to construct the expansion history of the Universe, as well as to measure its present expansion rate. The distance ladder method was thus born. At present, there are perhaps a dozen different objects we understand well enough to use as distance indicators — or standard candles — to teach us how the Universe has expanded over its history. The different methods all agree, and yield a value of 73 km/s/Mpc, with an uncertainty of just 2–3%.
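To get a feel for how the distance ladder works in practice, here is a minimal sketch of Hubble’s law, v = H0 × d: once you know an object’s true distance and measure its recession velocity from its redshift, the expansion rate falls right out. The galaxy, the numbers, and the helper functions below are purely illustrative, not data from any real survey or pipeline.

```python
# A minimal sketch of Hubble's law, v = H0 * d, with made-up numbers
# (illustrative only; not any team's actual pipeline or data).

H0_LADDER = 73.0  # km/s/Mpc, the distance-ladder value quoted above

def recession_velocity(distance_mpc, hubble_constant=H0_LADDER):
    """Recession velocity (km/s) of a galaxy at a given distance (Mpc)."""
    return hubble_constant * distance_mpc

def inferred_hubble_constant(velocity_km_s, distance_mpc):
    """Invert Hubble's law: one velocity + distance pair gives one estimate of H0."""
    return velocity_km_s / distance_mpc

# A hypothetical supernova host galaxy, 150 Mpc away:
print(recession_velocity(150.0))                 # ~10,950 km/s
print(inferred_hubble_constant(10950.0, 150.0))  # ~73 km/s/Mpc
```

In reality, each rung of the ladder (parallax, Cepheids, supernovae) contributes its own calibration and its own uncertainty; the 2–3% error bar is the combination of all of them.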

The pattern of acoustic peaks observed in the CMB from the Planck satellite effectively rules out a Universe that doesn’t contain dark matter, and also tightly constrains many other cosmological parameters. We arrive at a Universe that’s 68% dark energy, 27% dark matter, and just 5% normal matter from this and other lines of evidence, with a best-fit expansion rate of 67 km/s/Mpc. (P.A.R. ADE ET AL. AND THE PLANCK COLLABORATION (2015))

On the other hand, if we go all the way back to the earliest stages of the Big Bang, we know that the Universe contained not only normal matter and radiation, but a substantial amount of dark matter as well. While normal matter and radiation interact with one another through collisions and scattering interactions very frequently, the dark matter behaves differently, as its cross-section is effectively zero.

This leads to a fascinating consequence: the normal matter tries to gravitationally collapse, but the photons push it back out, whereas the dark matter has no ability to be pushed by that radiation pressure. The result is a series of peaks-and-valleys in the large-scale structure that arises on cosmic scales from these oscillations — known as baryon acoustic oscillations (BAO) — while the dark matter remains smoothly distributed, with that oscillatory pattern superimposed atop it.

The large-scale structure of the Universe changes over time, as tiny imperfections grow to form the first stars and galaxies, then merge together to form the large, modern galaxies we see today. Looking to great distances reveals a younger Universe, similar to how our local region was in the past. The temperature fluctuations in the CMB, as well as the clustering properties of galaxies throughout time, provide a unique method of measuring the Universe’s expansion history. (CHRIS BLAKE AND SAM MOORFIELD)

These fluctuations show up on a variety of angular scales in the cosmic microwave background (CMB), and also leave an imprint in the clustering of galaxies that occurs later on. These relic signals, originating from the earliest times, allow us to reconstruct how quickly the Universe is expanding, among other properties. From the CMB and BAO both, we get a very different value: 67 km/s/Mpc, with an uncertainty of only 1%.

Because there are many parameters of the Universe we don’t know a priori — such as the age of the Universe, the normal matter density, the dark matter density, or the dark energy density — we have to allow them all to vary together when constructing our best-fit models of the Universe. When we do, a number of possible pictures arise, but one thing remains unambiguously true: the distance ladder and early relic methods are mutually incompatible.
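To put a rough number on that incompatibility, here is a back-of-the-envelope estimate of the tension using only the rounded central values and percentage uncertainties quoted above, treating the two errors as independent and Gaussian. It’s a sketch, not either team’s published statistical analysis.

```python
# How discrepant are 73 km/s/Mpc (+/- ~2.5%) and 67 km/s/Mpc (+/- 1%)?
# A back-of-the-envelope estimate using the rounded values quoted in this article.

h0_ladder, err_ladder = 73.0, 73.0 * 0.025   # distance ladder: ~2-3% uncertainty
h0_relics, err_relics = 67.0, 67.0 * 0.01    # CMB + BAO: ~1% uncertainty

combined_error = (err_ladder**2 + err_relics**2) ** 0.5
tension_in_sigma = (h0_ladder - h0_relics) / combined_error

print(f"Tension: {tension_in_sigma:.1f} sigma")  # roughly 3 sigma with these inputs
```

Shrink both error bars, as upcoming instruments should, and the same 6 km/s/Mpc gap climbs toward the 5-sigma threshold mentioned at the end of this article.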

Modern measurement tensions from the distance ladder (red) with early signal data from the CMB and BAO (blue) shown for contrast. It is plausible that the early signal method is correct and there’s a fundamental flaw with the distance ladder; it’s plausible that there’s a small-scale error biasing the early signal method and the distance ladder is correct, or that both groups are right and some form of new physics (examples shown at top) is the culprit. But right now, we cannot be sure. (ADAM RIESS (PRIVATE COMMUNICATION))

The possibilities for why this discrepancy occurs are threefold:

  1. The “early relics” group is mistaken. There’s a fundamental error in their approach to this problem, and it’s biasing their results towards unrealistically low values.
  2. The “distance ladder” group is mistaken. There’s some sort of systematic error in their approach, biasing their results towards incorrect, high values.
  3. Both groups are correct, and there is some sort of new physics at play responsible for the two groups obtaining different results.

There are numerous good reasons to believe the results of both groups. If that’s the case, there has to be some sort of new physics involved to explain what we’re seeing. Not every idea can do the job: living in a local cosmic void is disfavored, as is adding in a few percentage points of spatial curvature. Instead, here are the five best explanations cosmologists are considering right now.

Measuring back in time and distance (to the left of “today”) can inform how the Universe will evolve and accelerate/decelerate far into the future. With the current data, we can learn that acceleration turned on about 7.8 billion years ago, and also that models of the Universe without dark energy have either Hubble constants that are too low or ages that are too young to match observations. If dark energy evolves with time, either strengthening or weakening, we will have to revise our present picture. (SAUL PERLMUTTER OF BERKELEY)

1.) Dark energy gets more powerfully negative over time. To the limits of our best observations, dark energy appears to be consistent with a cosmological constant: a form of energy inherent to space itself. As the Universe expands, more space gets created, and since the dark energy density remains constant, the total amount of dark energy contained within our Universe increases along with the Universe’s volume.

But this is not mandatory. Dark energy could either strengthen or weaken over time. If it’s truly a cosmological constant, there’s an absolute relationship between its energy density (ρ) and the negative pressure (p) it exerts on the Universe: p = -ρ. But there is some wiggle room, observationally: the pressure could be anywhere from -0.92ρ to about -1.18ρ. If the pressure gets more negative over time, this could yield a smaller value with the early relics method and a larger value with the distance ladder method. WFIRST should measure this relationship between ρ and p down to about the 1% level, which should constrain, rule out, or confirm this possibility.
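In the standard parametrization cosmologists use, that wiggle room is the equation-of-state parameter w ≡ p/ρ, which controls how the dark energy density changes with the cosmic scale factor a. The relation below is the textbook one; the numerical limits are simply the rounded values quoted above.

```latex
% Equation of state and density evolution for dark energy (standard parametrization):
\[
  w \equiv \frac{p}{\rho}, \qquad \rho_{\rm DE}(a) \propto a^{-3(1+w)}
\]
% w = -1 (a cosmological constant): the density stays constant as space expands.
% w < -1: the density grows with time; w > -1: it slowly dilutes away.
% The observational window quoted above corresponds to roughly -1.18 < w < -0.92.
```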

The early Universe was full of matter and radiation, and was so hot and dense that it prevented all composite particles from stably forming for the first fraction-of-a-second. As the Universe cools, antimatter annihilates away and composite particles get a chance to form and survive. Neutrinos are generally expected to stop interacting by the time the Universe is ~1 second old, but if there are more interactions than we realize, this could have huge implications for the expansion rate of the Universe. (RHIC COLLABORATION, BROOKHAVEN)

2.) Keeping neutrinos strongly coupled to matter and radiation for longer than expected. Conventionally, neutrinos interact with the other forms of matter and radiation in the Universe only until the Universe cools to a temperature of around 10 billion K. At temperatures cooler than this, their interaction cross-section is too low to be important. This is expected to occur just a second after the Big Bang begins.

But if the neutrinos stay strongly coupled to the matter and radiation for longer — for thousands of years in the early Universe instead of just ~1 second — this could accommodate a Universe with a faster expansion rate than the early relics teams normally infer. This could arise if there’s an additional neutrino self-interaction beyond what we presently expect, a compelling possibility considering that the Standard Model alone cannot explain the full suite of neutrino observations. Further neutrino studies at relatively low and intermediate energies could probe this scenario.
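For context, the standard textbook estimate of when neutrinos decouple comes from comparing their weak-interaction rate to the Universe’s expansion rate. The scalings below are approximate, with G_F the Fermi constant, M_Pl the Planck mass, and T the temperature; they are a sketch of the argument, not a precise calculation.

```latex
% Neutrinos decouple once their interaction rate drops below the expansion rate:
\[
  \Gamma_\nu \sim G_F^2\, T^5 \;\lesssim\; H \sim \frac{T^2}{M_{\rm Pl}}
\]
% This crossover happens at T ~ 1 MeV (roughly 10 billion K), about one second
% after the hot Big Bang begins. An extra neutrino self-interaction adds to
% \Gamma_\nu and delays the moment this condition is first met.
```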

An illustration of clustering patterns due to Baryon Acoustic Oscillations, where the likelihood of finding a galaxy at a certain distance from any other galaxy is governed by the relationship between dark matter and normal matter. As the Universe expands, this characteristic distance expands as well, allowing us to measure the Hubble constant, the dark matter density, and even the scalar spectral index. The results agree with the CMB data, and favor a Universe made up of 27% dark matter, as opposed to just 5% normal matter. Altering the size of the sound horizon would alter the expansion rate that this data implies. (ZOSIA ROSTOMIAN)

3.) The size of the cosmic sound horizon is different than what the early relics team has concluded. When we talk about photons, normal matter, and dark matter, there is a characteristic distance scale set by their interactions, the size/age of the Universe, and the rate at which signals can travel through the early Universe. Those acoustic peaks and valleys we see in the CMB and in the BAO data, for example, are manifestations of that sound horizon.

But what if we’ve miscalculated or incorrectly determined the size of that horizon? If you calibrate the sound horizon with distance ladder methods, such as Type Ia supernovae, you obtain a sound horizon that’s significantly smaller than the one you get if you calibrate the sound horizon traditionally: with CMB data. If the sound horizon actually evolves from the very early Universe to the present day, this could fully explain the discrepancy. Fortunately, next-generation CMB surveys, like the proposed SPT-3G, should be able to test whether such changes have occurred in our Universe’s past.
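For reference, the sound horizon is conventionally defined as the comoving distance a pressure wave can travel through the photon-baryon fluid before the baryons are released, around redshift z ≈ 1060. In the sketch below, c_s is the fluid’s sound speed and R is the baryon-to-photon energy density ratio; these are the standard quantities, not anything specific to this article.

```latex
% The comoving sound horizon at the baryon drag epoch (z_drag ~ 1060):
\[
  r_s = \int_{z_{\rm drag}}^{\infty} \frac{c_s(z)}{H(z)}\, dz,
  \qquad c_s = \frac{c}{\sqrt{3\,(1+R)}}, \quad R \equiv \frac{3\rho_b}{4\rho_\gamma}
\]
% Anything that changes H(z) or the baryon-to-photon ratio before z_drag changes r_s,
% and with it the expansion rate inferred from the CMB and BAO.
```

A smaller true sound horizon, for instance, would push the expansion rate inferred from the early relics method upward, toward the distance ladder value.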

If there were no oscillations due to matter interacting with radiation in the Universe, there would be no scale-dependent wiggles seen in galaxy clustering. The wiggles themselves, shown with the non-wiggly part subtracted out (bottom), depend on the impact of the cosmic neutrinos predicted to be present by the Big Bang. Standard Big Bang cosmology corresponds to β=1. Note that if there is a dark matter/neutrino interaction present, the perceived expansion rate could be altered. (D. BAUMANN ET AL. (2019), NATURE PHYSICS)

4.) Dark matter and neutrinos could interact with one another. Dark matter, according to every indication we have, only interacts gravitationally: it doesn’t collide with, annihilate with, or experience forces exerted by any other forms of matter or radiation. But in truth, we only have limits on possible interactions; we haven’t ruled them out entirely.

What if dark matter and neutrinos interact and scatter off of one another? If the dark matter is very massive, an interaction between a very heavy particle (like a dark matter particle) and a very light one (like a neutrino) could cause the light particles to speed up, gaining kinetic energy. This would function as a type of energy injection in the Universe. Depending on when and how it occurs, it could cause a discrepancy between early and late measurements of the expansion rate, perhaps even enough to fully account for the differing, technique-dependent measurements.

An illustrated timeline of the Universe’s history. If the value of dark energy is small enough to admit the formation of the first stars, then a Universe containing the right ingredients for life is pretty much inevitable. However, if dark energy comes and goes in waves, with an early amount of dark energy decaying away prior to the emission of the CMB, it could resolve this expanding Universe conundrum. (EUROPEAN SOUTHERN OBSERVATORY (ESO))

5.) Some significant amount of dark energy existed not only at late (modern) times, but also at early ones. If dark energy appears in the early Universe (at the level of a few percent) but then decays away prior to the emission of the CMB, this could fully explain the tension between the two methods of measuring the expansion rate of the Universe. Again, future improved measurements of both the CMB and of the large-scale structure of the Universe could help indicate whether this scenario describes our Universe.

Of course, this isn’t an exhaustive list; one could always choose any number of classes of new physics, from inflationary add-ons to modifying Einstein’s theory of General Relativity, to potentially explain this controversy. But in the absence of compelling observational evidence for one particular scenario, we have to look at the ideas that could be feasibly tested in the near-term future.

The viewing area of Hubble (top left) as compared to the area that WFIRST will be able to view, at the same depth, in the same amount of time. The wide-field view of WFIRST will allow us to capture a greater number of distant supernovae than ever before, and will enable us to perform deep, wide surveys of galaxies on cosmic scales never probed before. It will bring a revolution in science, regardless of what it finds, and provide the best constraints on how dark energy evolves over cosmic time. If dark energy varies by more than 1% of the value it’s anticipated to have, WFIRST will find it. (NASA / GODDARD / WFIRST)

The immediate problem with most solutions you can concoct to this puzzle is that the data from each of the two main techniques — the distance ladder technique and the early relics technique — already rule out almost all of them. If the five scenarios for new physics you just read seem like an example of desperate theorizing, there’s a good reason for that: unless one of the two techniques has a hitherto-undiscovered fundamental flaw, some type of new physics must be at play.

Based on the improved observations that are coming in, as well as novel scientific instruments that are presently being designed and built, we can fully expect the tension in these two measurements to reach the “gold standard” 5-sigma significance level within a decade. We’ll all keep looking for errors and uncertainties, but it’s time to seriously consider the fantastic: maybe this really is an omen that there’s more to the Universe than we presently realize.


Ethan Siegel is the author of Beyond the Galaxy and Treknology. You can pre-order his third book, currently in development: the Encyclopaedia Cosmologica.
