Starts With A Bang

How Scientists Use Hydrogen Gas, In Space And On Earth, To Measure The Big Bang

Even 13.8 billion years after the Big Bang, we can reconstruct the first 3 minutes.


About 100 years ago, we began to truly understand the nature of the Universe for the very first time. The grand spirals and ellipticals in the sky were determined to be enormous, distant collections of stars well outside of the Milky Way: galaxies unto themselves. They were receding from us, with more distant galaxies exhibiting faster recession speeds: evidence that the Universe was expanding. And if space is expanding today, that means the Universe was smaller, denser, and even hotter in the past. Extrapolate back far enough, and you’ll predict that the Universe began a finite amount of time ago in an event known as the hot Big Bang.

If the Universe was hotter and denser in the past, but cooled, that means there was once a time when neutral atoms couldn’t form, because things were too hot, but then could as the Universe cooled. That leads to a prediction of a now-cold, but mostly uniform, background of radiation: this was discovered in the 1960s, validating the picture of the hot Big Bang and ruling out many alternatives. But there’s an entirely independent way to validate the hot Big Bang: via the nuclear reactions that must have occurred when the Universe was just minutes old. Those predictions are imprinted in the hydrogen gas throughout our Universe, and help us understand the Big Bang as never before.

A visual history of the expanding Universe includes the hot, dense state known as the Big Bang and the growth and formation of structure subsequently. The full suite of data, including the observations of the light elements and the cosmic microwave background, leaves only the Big Bang as a valid explanation for all we see. As the Universe expands, it also cools, enabling ions, neutral atoms, and eventually molecules, gas clouds, stars, and finally galaxies to form. (NASA / CXC / M. WEISS)

If we were to go back to the very early stages of the hot Big Bang, to when the Universe was just a fraction-of-a-second old, we’d see a very different Universe from the one we recognize today. There were lots of free protons and neutrons, at temperatures and densities greater than we find in the Sun’s core. But there were no heavier nuclei, as the photons that were around at the time were so energetic that they’d immediately blast a heavier nucleus apart. In order to stably form them, we’d have to wait for the Universe to cool. As time went on:

  • electrons and positrons, the lightest charged particles, annihilated away, leaving only enough electrons to balance out the protons (and the electric charge) in the Universe,
  • neutrinos stopped interacting with protons and neutrons, causing them to “free stream,” or travel without colliding with (and potentially transmuting) other particles,
  • a fraction of the remnant free neutrons, which have a half-life of around 10 minutes, decayed into protons, electrons, and anti-electron neutrinos,
  • and finally, only after 3–4 minutes did the Universe cool enough to successfully take the first step in forming heavy elements: fusing a proton and a neutron into deuterium, the first heavy isotope of hydrogen.

Once the Universe cools enough to pass this “deuterium bottleneck,” nuclear fusion of these light elements can finally proceed unabated.
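As a rough sanity check on the timeline above, here is a minimal sketch of how many free neutrons survive to that moment. It assumes only exponential decay and the "around 10 minutes" half-life quoted above (taken here as the commonly cited figure of about 611 seconds):

```python
# Radioactive decay: the surviving fraction after time t is 0.5 ** (t / t_half).
# The free neutron's half-life is "around 10 minutes" as stated above;
# the commonly cited value of ~611 seconds is used here for illustration.
HALF_LIFE_S = 611.0

def surviving_fraction(t_seconds: float) -> float:
    """Fraction of free neutrons that have not yet decayed after t_seconds."""
    return 0.5 ** (t_seconds / HALF_LIFE_S)

# Fusion only proceeds once the deuterium bottleneck is passed, 3-4 minutes in:
for minutes in (3, 4):
    print(f"After {minutes} min, ~{surviving_fraction(minutes * 60):.0%} "
          "of the free neutrons remain")
```

Most neutrons therefore survive long enough to be locked up in helium-4; the fraction that decays first is one of the ingredients in predicting the final helium and deuterium abundances.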

The abundances of helium, deuterium, helium-3, and lithium-7 are highly dependent on only one parameter, the baryon-to-photon ratio, if the Big Bang theory is correct. The deuterium abundance of 0.0025% that we observe is needed to allow stars to form as massive as they do. (NASA, WMAP SCIENCE TEAM AND GARY STEIGMAN)

But by the time that 3-to-4 minutes have passed since the hot Big Bang, the Universe is a lot cooler and less dense than it once was. Temperatures are still high enough to initiate nuclear fusion, but the density — due to the expansion of the Universe — is only about 0.0000001% of what it is in the Sun’s center. As a result, most of the neutrons that still remain wind up combining with protons to form helium-4, with a small amount of helium-3, deuterium, tritium (which decays to helium-3), and isotopes of lithium and beryllium (which eventually decay to lithium) also remaining.

What’s remarkable about these predictions is how little they depend on. Given the Standard Model of particle physics, and how nuclear processes are known to work, there should be a particular ratio of the light elements that survive today dependent only on the ratio of baryons (protons and neutrons combined) to photons. Even completely independent of the radiation from the Cosmic Microwave Background, measuring the relative abundances of the light elements will tell us what the total amount of “normal matter” present in the Universe must be. In particular, we can see that measuring deuterium’s abundance, particularly if we can measure it precisely, will reveal to us the baryon-to-photon ratio of the Universe.

The absorption spectra of different populations of gas (L) allow us to derive the relative abundances of elements and isotopes (center). In 2011, two distant gas clouds containing no heavy elements and a pristine deuterium-to-hydrogen ratio (R) were discovered for the first time. (MICHELE FUMAGALLI, JOHN M. O’MEARA, AND J. XAVIER PROCHASKA, VIA HTTP://ARXIV.ORG/ABS/1111.2334)

The problem, of course, is that these are predictions for what the Universe was born with, but that’s not the Universe we see today. By the time we get to the stars and galaxies we can observe, the normal matter that exists has gone through processing: stars have formed, lived, burned through their nuclear fuel, transmuted light elements into heavy ones, and have recycled those processed elements back into the interstellar medium. When we look at stars today, they don’t exhibit these predicted ratios, but significantly altered ones. In addition to these light elements, there are also heavy ones showing up ubiquitously, like oxygen, carbon, and iron, among others.

In a Universe without pristine stars, how could you possibly try to reconstruct how much deuterium was present immediately following the Big Bang?

One method you might consider is to measure the ratios of elements in a variety of stellar populations. If you measure, say, the oxygen-to-hydrogen or iron-to-hydrogen ratios, and also measure the deuterium-to-hydrogen ratio, you could graph them together, and use that information to extrapolate backwards: to a zero oxygen or iron abundance. This is a pretty solid method, and gives us an estimate for how much deuterium would be present at a time before heavy elements, like oxygen or iron, had formed.

Distant sources of light — from galaxies, quasars, and even the cosmic microwave background — must pass through clouds of gas. The absorption features we see enable us to measure many features about the intervening gas clouds, including the abundances of the light elements inside. (ED JANSSEN, ESO)

But ideally, you’d want to probe the deuterium abundance directly: in as close to a pristine environment as possible. If you’ve already formed stars, you’ve probably made and/or destroyed deuterium via nuclear processes, which throws your conclusions into doubt. Instead, you’d want to find gas that was as close to pristine as possible, without the associated pollution of the stars themselves. You’d want to get high-precision measurements of clouds of gas — ideally very far away, corresponding to very far back in time — with no stars in them at all.

This seems like an impossibility, until you realize that clouds of gas can absorb light, imprinting their unique signature onto it. The brightest, most luminous light sources in the distant Universe are quasars: supermassive black holes that are actively feeding in galaxies at great distances. Everywhere there’s an intervening cloud of gas, a portion of that quasar light gets absorbed, as whatever atoms, molecules, or ions are present will absorb that light at the specific quantum frequencies particular to those species, shifted according to the redshift at which the cloud is located.

Despite the nearly identical physics governing them, the tiny difference in nuclear mass between deuterium and hydrogen leads to a small but measurable shift in the peak of their absorption features. Even with just ~0.002% of the abundance of hydrogen, deuterium in intervening gas clouds can be detected superimposed atop the hydrogen absorption features. (J. GEISS AND G. GLOECKLER (2005))

You might think that deuterium, being an isotope of hydrogen, would be indistinguishable from hydrogen itself. But the frequencies at which atoms emit or absorb light are determined by the energy levels of the electrons in that atom, which depend not just on the charge of the atomic nucleus, but on the ratio of the electron’s mass to the mass of the nucleus itself. Because deuterium carries an extra neutron in its nucleus, its absorption line overlaps with, but is slightly off-center from, the corresponding line of normal hydrogen.
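The size of that offset can be sketched with a quick reduced-mass calculation: hydrogenic energy levels scale with the electron-nucleus reduced mass, so deuterium's lines sit slightly blueward of hydrogen's. (The particle masses and the Lyman-alpha wavelength below are standard reference values, not figures taken from this article.)

```python
# Energy levels of a hydrogen-like atom scale with the reduced mass
# mu = m_e * M / (m_e + M), so wavelengths scale as 1/mu.
# Standard reference values, expressed in units of the electron mass:
M_E = 1.0
M_P = 1836.15   # proton (hydrogen nucleus)
M_D = 3670.48   # deuteron (deuterium nucleus)
LYMAN_ALPHA_H_NM = 121.567  # hydrogen Lyman-alpha wavelength, in nm

def reduced_mass(nucleus_mass: float) -> float:
    return M_E * nucleus_mass / (M_E + nucleus_mass)

# lambda_D / lambda_H = mu_H / mu_D < 1, so the deuterium line is blueshifted:
ratio = reduced_mass(M_P) / reduced_mass(M_D)
shift_nm = LYMAN_ALPHA_H_NM * (1.0 - ratio)

print(f"Fractional wavelength shift: {1.0 - ratio:.1e}")
print(f"Deuterium Lyman-alpha lies ~{shift_nm:.3f} nm blueward of hydrogen's")
```

A shift of a few hundredths of a nanometer is tiny, but it is exactly the kind of off-center peak that high-resolution quasar spectroscopy can resolve.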

By looking at the best quasar data we have in the Universe, and finding the closest-to-unpolluted gas clouds that exist along their lines-of-sight, we can reconstruct the primordial deuterium abundance to extreme precision. The latest results tell us that the amount of deuterium in the Universe was 0.00253% of the initial hydrogen abundance, with an uncertainty of only ±0.00004%.

This corresponds to a Universe that’s made up of about 4.9% normal matter: consistent within ~1% of what the Cosmic Microwave Background reveals, but completely independent of that result.
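As an illustrative bit of error propagation using the numbers just quoted, the ~1% precision follows directly. One assumption here is not from the article: near the measured value, the predicted deuterium-to-hydrogen ratio scales with the baryon-to-photon ratio roughly as a power law with exponent about -1.6, a standard rough figure from Big Bang Nucleosynthesis calculations.

```python
# Numbers quoted above: the primordial deuterium abundance and its uncertainty.
D_H = 2.53e-5       # 0.00253% relative to hydrogen
D_H_ERR = 0.04e-5   # +/- 0.00004%

# Assumption (a standard rough BBN scaling, not from the article):
# D/H varies with the baryon-to-photon ratio eta roughly as eta ** (-1.6),
# so fractional errors on D/H shrink by a factor of 1.6 when mapped onto eta.
SCALING_EXPONENT = 1.6

frac_err_dh = D_H_ERR / D_H
frac_err_baryons = frac_err_dh / SCALING_EXPONENT

print(f"Relative precision on D/H:               {frac_err_dh:.1%}")
print(f"Implied precision on the baryon density: {frac_err_baryons:.1%}")
```

A deuterium measurement good to under 2% thus pins down the normal-matter density at roughly the percent level, which is why it can be meaningfully compared against the Cosmic Microwave Background result.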

Three different types of measurements (distant stars and galaxies, the large-scale structure of the Universe, and the fluctuations in the CMB) tell us the expansion history of the Universe, and rule out alternatives to the Big Bang. (NASA/ESA HUBBLE (TOP L), SDSS (TOP R), ESA AND THE PLANCK COLLABORATION (BOTTOM))

But are we certain that we have the nuclear physics worked out correctly? After all, there’s a big difference between “we understand the laws of physics and how the equations work, and here’s what we predict,” and “we recreated the conditions that were present and demonstrated that the outcomes are in line with our theoretical predictions.” The first allows us to make a prediction — which we can then compare with our observations — but the second would experimentally confirm that our predictions are actually worth their weight in heavy isotopes.

The way we frequently approach problems like this is to identify which step in the process is the most uncertain, particularly if the uncertainty in that step is greater than the uncertainty in either:

  • the observational data we have to compare our results with,
  • or the desired precision of our end conclusion.

For the nuclear processes involved in both creating and burning deuterium, the most uncertain step is the one where deuterium fuses with a proton to form helium-3: an uncommon, light, but stable isotope of the element helium.

Beginning with just protons and neutrons, the Universe builds up helium-4 rapidly, with small but calculable amounts of deuterium, helium-3, and lithium-7 left over as well. Until the latest results from the LUNA collaboration, step 2a, where deuterium and a proton fuse into helium-3, had the largest uncertainty. That uncertainty has now dropped to just 1.6%, allowing for incredibly strong conclusions. (E. SIEGEL / BEYOND THE GALAXY)

Last year, at an underground laboratory in Italy, an experiment at the Laboratory for Underground Nuclear Astrophysics (LUNA) recreated the high temperatures and densities that were present during the hot Big Bang, and observed the reactions between deuterium and protons directly. It took three years of measurements, across many different conditions, to cover the necessary temperature ranges to high enough precision, but when all was said and done, they had the best measurement of this particular reaction rate ever: with an uncertainty of just 1.6%.

Most importantly, though, it confirmed our expectations. Although the uncertainties were previously larger, the central value didn’t shift by very much at all, meaning that our estimate of how the deuterium abundance translates into an overall matter density was actually extremely good. The Universe, as best as we can tell, really is made of about 5% normal matter, and no more than that.

Here, a proton beam is shot at a deuterium target in the LUNA experiment. The rate of nuclear fusion at various temperatures helped reveal the deuterium-proton cross-section, which was the most uncertain term in the equations used to compute and understand the net abundances that would arise at the end of Big Bang Nucleosynthesis. (LUNA COLLABORATION/GRAN SASSO)

This is a conclusion whose importance cannot be overstated. There’s an awful lot we don’t understand about our Universe today, including why we live in a Universe where so much of what exists lies beyond the reach of our observation. There are a lot of reasons to be skeptical of dark matter and dark energy, for instance: they’re tremendously counterintuitive. Just because the Cosmic Microwave Background tells us they must be there, for example, doesn’t mean they necessarily exist. If that one line of evidence is flawed — either from the data or our analysis — we don’t want our conclusions to suddenly be overturned.


That’s why we demand multiple, independent lines of evidence for a conclusion before we confidently accept it. The science of Big Bang Nucleosynthesis is one of those incredibly important cross-checks. It’s an independent test not only of the Big Bang model of the early Universe, but of our concordance cosmological model. It tells us, all on its own, what the total amount of normal matter in the Universe is. Since the other lines of evidence, like colliding galaxy clusters or the large-scale structure of the Universe, require far more matter than the early deuterium tells us can exist, we can be much more confident that dark matter is real.

This view of about 0.15 square degrees of space reveals many regions with large numbers of galaxies clustered together in clumps and filaments, with large gaps, or voids, separating them. This region of space is known as the ECDFS, as it images the same portion of the sky imaged previously by the Extended Chandra Deep Field South: a pioneering X-ray view of the same space. (NASA/SPITZER/S-CANDELS; ASHBY ET AL. (2015), ACKNOWLEDGMENT: KAI NOESKE)

When it comes to the Universe, simply starting from the known laws of physics and extrapolating back from our direct observations can get us extremely far. Start with redshifts and distances of galaxies, and General Relativity will give you the expanding Universe. Start with the expanding Universe, and the Cosmic Microwave Background can give you the Big Bang. Start with the Big Bang, and the nuclear physics of the light elements will give you the total amount of normal matter in the Universe. And take the normal matter and our astrophysical observations of how galaxies cluster and merge, and you get a Universe requiring dark matter.

If we want to confidently know what the Universe is made of, we have to ensure we test it in every plausible way. Although it was one of the earliest predictions to arise from the hot Big Bang scenario, the nucleosynthesis of the light elements has often been derided by portions of the community as being too imprecise to draw meaningful conclusions from. With the latest observations and experiments, it’s clear that those days have passed. The Universe has only 4.7–5.0% normal matter in it, and the rest, in some form or other, is truly dark.


Starts With A Bang is written by Ethan Siegel, Ph.D., author of Beyond The Galaxy, and Treknology: The Science of Star Trek from Tricorders to Warp Drive.

