Starts With A Bang

The only two “arrows of time” we have don’t match

Our thermodynamic arrow of time explains why the entropy of any isolated system never decreases. But it can’t explain what we perceive.
By examining this strobe image of a bouncing ball, you cannot tell for certain whether the ball is moving toward the right and losing energy with each bounce, or whether it's moving toward the left and getting an energetic kick with each bounce. The laws of physics are symmetric under time-reversal transformations, and yet we only ever perceive time's arrow as running in one particular (forward) direction. The reason why is not yet known.
Credit: MichaelMaggs Edit by Richard Bartz/Wikimedia Commons
Key Takeaways
  • One of the most extraordinary facts about our existence within this Universe is that we experience the flow of time in one direction and one direction only: forward, regardless of what else we do.
  • This “perceptive” arrow of time is one of the great mysteries of existence, as no other physical law or property explains why this is so.
  • The only other “arrow of time” we know of is the thermodynamic arrow of time, stating that entropy always increases. Yet the only two “arrows of time” we know of don’t match up, deepening the mystery.

Most of us, in our day to day lives, experience time as something that’s fixed: always ticking by, in the forward direction, at an easily measurable rate that all observers can agree on. But when two observers compare what they each experience, for themselves, as one second, they don’t always find themselves agreeing with one another. This was only explained in the early 1900s, with the arrival of Einstein’s theory of relativity: the surprise that time itself, long considered fundamental and universal, is actually relative. Different observers, so long as they move through space at different speeds or in different directions, will experience the flow of time differently from one another. Whether two events occur simultaneously or one-before-the-other depends entirely on the observer’s point of view.

And yet, despite how ambiguous time is, there are some facts about it that all observers can agree on. Perhaps the most fundamental of these facts — and yet, perhaps the most puzzling among them as well — is that everyone, in their own inertial reference frame, always sees time moving forward at the same rate: one second-per-second. This fact is known as the arrow of time, or specifically our perceptive arrow of time. There are many ideas as to what causes us to experience it the way we do, and one idea that’s been put forward is the only other “arrow of time” we know about: the thermodynamic arrow of time, as entropy always increases.

These two arrows, unfortunately, cannot be one and the same; the thermodynamic arrow of time is no good as an explanation for the arrow of time we perceive. Although some will argue the contrary, the science is very clear about this. Here’s why.

A “light clock” will appear to run differently for observers moving at different relative speeds, but this is due to the constancy of the speed of light. Einstein’s theory of special relativity governs how these time and distance transformations take place between different observers. However, each individual observer will see time pass at the same rate as long as they remain in their own reference frame: one second-per-second.
Credit: John D. Norton/University of Pittsburgh

With every moment that passes, no matter what is going on around us, we find ourselves experiencing the most basic, boring, and mundane form of time travel of all: the slow passage of time that ticks by as we progress into the future. With each moment that elapses, light continues to propagate in the direction it was moving, maintaining its constant speed (the speed of light), as it moves an appropriate distance in every given interval of time, regardless of what else is going on around it. At no point, and under no circumstances, does time ever appear to either stand still or reverse; it can only continue to progress into the future.

In other words, the arrow of time always points in the forward direction for anything that exists within our Universe. But this is a puzzle for fundamental physics, as there’s no explanation for why time behaves in this fashion. The laws of nature, with very, very few exceptions, are completely time-symmetric. From Newton to Einstein to Maxwell to Bohr to Dirac to Feynman, the equations that govern reality don’t have a preference for the direction of the flow of time. The behavior of any system can be described by equations that are just as valid in the forward direction as they are in the backward direction. Yet, we can move “backward” as well as forward in any of our three spatial dimensions. Somehow, time is different.

With all this considered, then, where does our arrow of time come from?

Still from a lecture on entropy by Clarissa Sorensen-Unruh. Entropy, as labeled by the quantity S, plays an enormously important role in physics and in thermodynamics in particular, and also has an arrow coincident with the arrow of time. But does the fact that entropy never decreases mean that entropy is responsible for the perceptive arrow of time? The assertion is dubious.
Credit: C. Sorensen-Unruh/YouTube

According to many, there appears to be a suggestive link between what we perceive as the arrow of time and a quantity called entropy. Although entropy is commonly described as “a measure of disorder” in a physical system, there are actually two more accurate descriptions of this physical quantity.

  1. Entropy can be viewed as the number of possible arrangements of the (quantum) state of your system. If you have more options for how you could arrange your system’s components so that its macroscopic properties remain identical, you have higher entropy than if there are fewer options. A room with 20 different regions at 20 different temperatures has a lower entropy than a room where every location has the same temperature; a room with a “hot” and a “cold” side separated by a partition has less entropy than a room that’s well-mixed after the divider has been removed.
  2. It’s also useful to think of entropy as a measure of how much of a system’s thermal (heat) energy is unavailable to be turned into useful, mechanical work. When you have lots of energy available to do work (such as a room with a hot source and a cold sink), you have a low-entropy system, as the flow of energy (or heat, or particles) can be used to extract energy and perform work. Contrariwise, if you have very little available energy (a near-equilibrium temperature room), you have a high-entropy system.
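The first, counting-based definition can be made concrete with a toy calculation. This is a minimal sketch, assuming a crude discretization of the room into cells (the cell count M and particle count N are arbitrary illustration values, not physical ones); it recovers the textbook result that removing a partition adds k ln 2 of entropy per particle:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def boltzmann_entropy(omega: int) -> float:
    """S = k_B * ln(Omega): entropy from the number of microstates."""
    return k_B * math.log(omega)

# Toy model: N particles in a box we imagine dividing into M cells per half.
N, M = 100, 1000

# Partition in place: each particle is confined to its own half (M cells each).
omega_partitioned = M ** N
# Partition removed: each particle may occupy any of the 2*M cells.
omega_mixed = (2 * M) ** N

delta_S = boltzmann_entropy(omega_mixed) - boltzmann_entropy(omega_partitioned)
print(delta_S / (N * k_B))  # ≈ ln 2 ≈ 0.693 per particle, in units of k_B
```

More accessible arrangements means higher entropy: the well-mixed room simply has vastly more microstates available to it than the partitioned one.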
Perpetual motion has long been a holy grail of tinkerers and inventors, but it violates the laws of physics, including Newton’s 3rd law and the laws of thermodynamics. In our Universe, entropy can never spontaneously decrease, which is enough to falsify all perpetual motion ideas.
Credit: Norman Rockwell/public domain

Whenever we discuss entropy, we have to keep in mind that we’re constrained by the laws of thermodynamics. The second law, in particular, is of extreme relevance, stating that the entropy of a closed-and-isolated (self-contained) system — a system that doesn’t allow the exchange of matter or energy with the outside environment — can only increase or stay the same over time; it can never go down. Even though the Universe is only approximately closed-and-isolated, that approximation is very, very good for almost all applications. In other words, over time, the entropy of the entire Universe must increase. It’s the only known law of physics that appears to exhibit a preferred direction for time.

Does that mean that it’s possible that we only experience time the way we do because of the second law of thermodynamics?

If so, it would suggest that there’s a fundamentally deep connection between the arrow of time and entropy. While many in the philosophy community (including physicists who tread into philosophy) think there might be such a connection, this is not a matter for philosophy alone. Instead, we can look to the physical evidence of systems where entropy increases, where entropy remains constant, and even where we manipulate the (no longer closed-and-isolated) system, externally, to artificially decrease the entropy inside. If the perceived arrow of time always runs forward, regardless of what happens to the entropy inside the system, this suggested link would then be falsified.

There is a large suite of scientific evidence that supports the expanding Universe and the Big Bang. At every moment throughout our cosmic history for the first several billion years, the expansion rate and the total energy density balanced precisely, enabling our Universe to persist and form complex structures. Whether the entropy of any part (or the whole) of the Universe increases, decreases, or remains the same, time always runs forward at the same rate.
Credit: NASA / GSFC

As it turns out, reversing the flow of entropy inside of most systems is easier said than done. Sure, you can scramble and cook an egg, which is a very easy process compared to the time-reversed one: uncooking and unscrambling an egg. Although it may be possible in principle, it’s a vanishingly unlikely prospect in practice: one that never occurs naturally in this Universe, and one that would require careful manipulation (at the molecular level) to induce. The same situation applies when you pour cream into your coffee and stir it; homogenizing your coffee/cream mixture is a lot easier than separating the mixture back into its individual constituents. Practically, the entropy-reversing process never spontaneously occurs.

Just as you might suspect, thermodynamics and entropy play an outsized role in both of these processes. We can measure a stark difference in entropy between the initial (unscrambled and uncooked, or unstirred and unmixed) and final (scrambled and cooked, or mixed) states, and find that, unsurprisingly, the final states are higher-entropy states than the initial ones. These cases are a specific example of entropy at work, where an initially lower-entropy state (with more available energy capable of performing work) transitions into a final, higher-entropy state (with less available energy to perform work). As you may have noticed, the direction of the thermodynamic arrow of time, where entropy increases, is indeed coincident with the perceived forward passage of time.

As ice melts in a drink, the system approaches an equilibrium configuration, where all of the molecules inside have the same temperature, as opposed to a pre-melting state, where the ice is often significantly colder than the liquid it’s placed in. Drinks never spontaneously heat up and form ice cubes; only the reverse occurs, where warmer drinks and cooler ice cubes move closer to their mutual thermal equilibrium.
Credit: Victor Blacus/Wikimedia Commons

Nature is full of examples such as the mixing of coffee and cream or the scrambling and cooking of an egg: what we conventionally call “irreversible reactions” in physics. Drop an ice cube into a warm drink, and the ice will melt, resulting in a cool drink that’s at a uniform, lower temperature than the drink was before you put the ice cube into it. On the other hand, a cool drink will never spontaneously separate into a warm drink and an ice cube; that’s something that’s forbidden by the second law of thermodynamics. Similarly, if you create a room with a barrier between two halves of it, with one half being kept hot and the other half being kept cold, you can anticipate what will happen if you then open a gate within the barrier: allowing the particles between the two halves to mix.

Over time, the room will equilibrate, and at some late enough time, you’ll find that both halves are now filled with intermediate-temperature particles. Never, no matter how (practically) long you wait, will the two halves spontaneously separate into a room that’s half-hot on one side of the gate and half-cold on the opposite side of the gate. This is the price that the laws of thermodynamics exact from the Universe over time: the total entropy of a closed-and-isolated system can never decrease. The interactions we’re describing here are not spontaneously reversible.
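The irreversibility of this equilibration can be checked with a short calculation. This sketch assumes two equal bodies with a temperature-independent heat capacity C, and integrates dS = C dT / T for each side as they approach the common final temperature; the function name and the temperature values are illustrative:

```python
import math

def equilibration_entropy_change(T_hot: float, T_cold: float, C: float = 1.0) -> float:
    """Total entropy change (in units of C, J/K) when two equal bodies at
    T_hot and T_cold (in K) relax to a common final temperature.
    Each side's change comes from integrating dS = C * dT / T."""
    T_final = 0.5 * (T_hot + T_cold)
    dS_hot = C * math.log(T_final / T_hot)    # negative: the hot side cools
    dS_cold = C * math.log(T_final / T_cold)  # positive: the cold side warms
    return dS_hot + dS_cold

# Hot half at 400 K, cold half at 200 K -> both end up at 300 K.
print(equilibration_entropy_change(400.0, 200.0))  # ≈ +0.1178 for C = 1
```

The cold side always gains more entropy than the hot side loses, so the sum is positive for any pair of unequal temperatures; running the process backward would require a net entropy decrease, which the second law forbids for a closed-and-isolated system.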

Except, if you throw “spontaneous” away along with “closed-and-isolated,” an individual system can be coaxed into experiencing entropy reversal after all.

A system set up in the initial conditions on the left and allowed to evolve will have less entropy if the door remains closed (left) than if the door is opened (right). If the particles are allowed to mix, there are more ways to arrange twice as many particles at the same equilibrium temperature than there are to arrange half of those particles, each, at two different temperatures, resulting in a much greater entropy for the system at right than the one at left.
Credit: Htkym & Dhollm/Wikimedia Commons

There’s a caveat that most people forget when it comes to the second law of thermodynamics and the seemingly inevitable, accompanying entropy increase: the law only holds when we apply it to a closed-and-isolated system. So long as no external energy is put into or extracted from the system, no particles are added to or taken away from it, and no entropy is exchanged with the outside world, the second law of thermodynamics is mandatory. There are no known exceptions to the second law of thermodynamics for closed-and-isolated systems.


But what if we set up our physical system in such a way that those conditions are violated? It turns out, when we input energy into the system, or when we add or remove matter from the system, it suddenly becomes possible for the entropy inside the system to decrease. Consider, again, a box with a hot side and a cold side, separated by a divider, versus that same box where the two sides are well-mixed, and both at the same temperature. Is there a way to reverse the “two halves of a box” reaction, where we can start from a well-mixed state and wind up with a “hot side” and a “cold side” to the box?

A representation of Maxwell’s demon, which can sort particles according to their energy on either side of a box. By opening and closing the divider between the two sides, the flow of particles can be intricately controlled, reducing the entropy of the system inside the box. However, the demon must exert energy to make this happen, and the overall entropy of the box+demon system still increases.
Credit: Htkym/Wikimedia Commons

Indeed, there is a way. It was first thought up by the great physicist James Clerk Maxwell way back in the 1870s. By positing an external entity that’s capable of quickly opening or closing a divider, or gate, between the two sides of the room at the critical, opportune moment, the “cold” molecules can be collected on one side and denied passage over to the opposite side, while the “hot” molecules are similarly collected and maintained on the other side. This idea is now known as Maxwell’s demon, and it enables you to decrease the entropy of the system after all, but only at a cost: the cost of expending the energy required to monitor the system and open-and-close the gate separating the two sides.

Following this procedure, however, doesn’t violate the second law of thermodynamics, as the box is no longer a closed-and-isolated system. Instead, you’d have to consider the total entropy of the box plus the entropy of the demon (or the actions of the demon), both added together, in order to find that their combined entropy always increases, just like you’d expect. It’s only if you look solely at a part of the system, like the box alone (while ignoring the demon and its actions), that you perceive a decrease in entropy.
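That combined-entropy accounting can be sketched numerically. This is a toy model, not Maxwell's original argument: it assumes each gate decision sorts one bit’s worth of molecular information, and it applies Landauer’s principle, which sets a floor of k_B ln 2 of entropy production for erasing each bit of the demon’s memory. The function name and decision count are illustrative:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def demon_entropy_budget(n_decisions: int):
    """Toy entropy bookkeeping for Maxwell's demon, assuming one bit of
    sorting information per gate decision and the Landauer minimum cost."""
    # Sorting molecules lowers the box's entropy by ~k_B ln 2 per decision...
    dS_box = -n_decisions * k_B * math.log(2)
    # ...but erasing the demon's records costs at least k_B ln 2 per bit.
    dS_demon = n_decisions * k_B * math.log(2)
    return dS_box, dS_demon, dS_box + dS_demon

box, demon, total = demon_entropy_budget(10**6)
print(total >= 0)  # True: the combined entropy never decreases
```

At the Landauer limit the combined change is exactly zero; any real demon is less efficient than that, so the total entropy of box plus demon strictly increases, just as described above.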

But the situation inside this box is exactly what we need to disprove the hypothetical connection between the thermodynamic arrow of time and the perceptive arrow of time. Even if you lived in the box and the demon was undetectable — similar to what you’d experience if you lived in a pocket of the Universe where the local entropy of your not closed-and-isolated system decreases — time would still run forward. This is sufficient to draw the big conclusion: the thermodynamic arrow of time does not determine our perceptive arrow of time.

Time is normally something we measure with clocks: devices that record its passage from one moment to the next. While there is an interesting philosophical case to be made that time is an illusion, the fact that we can measure and quantify its passage, but cannot stop it, strongly suggests that it truly exists.
Credit: LeArchitecto / Adobe Stock

In order to perform such an experiment responsibly, or to even set up such an experiment in a responsible fashion, you’d have to carefully control the energy and entropy inputs and outputs of your system. If you are allowed to input energy (or transfer particles) into your system, then all of the earlier reactions that we had previously labeled as irreversible can suddenly occur, including:

  • uncooking and unscrambling an egg,
  • unmixing coffee and cream,
  • separating a lukewarm drink into a hot drink and an ice cube,
  • or separating a uniform-temperature room into a hot half and a cold half.

Just as cleaning your room or ordering a mixed deck of cards can decrease its entropy, but only at the expense of the energy you inputted to create that order, these actions have to come at an external cost. However, whether the entropy within your system goes up, down, or remains the same — even if you make these reactions happen in a way that (locally) reverses entropy — all the clocks within your system will still run forward. In natural systems where the entropy remains constant, such as an adiabatically expanding cloud of collisionless matter, time still runs forward. Moreover, time doesn’t just run forward, but always runs forward at exactly the same rate for all observers, regardless of whether or how their entropy changes: at the rate of one second-per-second.

From inflation to the hot Big Bang, to the birth and death of stars, galaxies, and black holes, all the way to our ultimate dark energy fate, we know that entropy never decreases with time. But we still don’t understand why time itself flows forward. However, we’re pretty certain that entropy, and the thermodynamic arrow of time, cannot be the answer.
Credit: E. Siegel; ESA/Planck and the DOE/NASA/NSF Interagency Task Force on CMB research

As far as we can tell, the second law of thermodynamics is true: entropy never decreases for any closed-and-isolated system in the Universe, including for the entire observable Universe itself taken collectively. It’s also true that time, as anyone perceives it, always runs in one direction only: forward, for all observers, at the same experiential rate for everyone. What many don’t appreciate, however, is that these two types of arrows — the thermodynamic arrow of entropy and the perceptive arrow of time — are not interchangeable.

During the period of cosmic inflation that set up and preceded the hot Big Bang, where the entropy remains low and constant, time still runs forward. When the last star has burned out and the last black hole has decayed and the eventually-empty Universe is wholly dominated by dark energy, time will still run forward. And everywhere in between, regardless of what’s happening in the Universe or the entropy of any system within that Universe, time will still run forward at exactly that same, universal rate for all observers: one second-per-second. If you want to know why yesterday is in the immutable past, tomorrow will arrive in a day, and the present is what you’re experiencing right now, you’re in good company; nobody knows why time has these properties. What we do know, however, is that thermodynamics, interesting though it may be, doesn’t hold the solution to that puzzle.

