It’s true that we have a thermodynamic arrow of time, and entropy always increases. But that can’t explain what we perceive.
One of the enormous conceptual ideas that came along with Einstein’s theory of relativity was the surprise that time itself, long considered fundamental and universal, is actually relative. Different observers, moving through space at different speeds or in different directions, will experience the flow of time differently from one another. Even whether two events occur simultaneously or one-before-the-other can depend on the observer’s point of view.
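That relativity of elapsed time can be made concrete with a short calculation. Here is a minimal sketch in Python (the function names are my own, not from any library) of how much time passes on a moving clock, using the special-relativistic Lorentz factor:

```python
import math

C = 299_792_458.0  # speed of light, in meters per second

def lorentz_factor(v):
    """gamma = 1 / sqrt(1 - v^2 / c^2) for a speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def proper_time(coordinate_time, v):
    """Seconds elapsed on a clock moving at speed v, according to a
    stationary observer for whom coordinate_time seconds pass."""
    return coordinate_time / lorentz_factor(v)

# At 60% of the speed of light, gamma = 1.25: while 10 s pass for a
# stationary observer, only 8 s elapse on the moving clock.
print(round(proper_time(10.0, 0.6 * C), 6))
```

In its own frame, of course, the moving clock still ticks at one second-per-second; the disagreement is only between frames.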
And yet, despite how ambiguous time is, there are some facts about it that all observers can agree on. Perhaps the most fundamental — and yet perhaps the most puzzling as well — is that everyone, in their own inertial reference frame, always sees time moving forward at the same rate: one second-per-second. This fact is known as the arrow of time, and while there are many ideas as to what causes it, we know it isn’t thermodynamics. Here’s the science behind why.
With every moment that passes, no matter what is going on around us, we find ourselves traveling into the future. Light propagates at the speed of light in whatever direction it was emitted, covering the appropriate distance for any given amount of time, regardless of what else is going on. At no point, and under no circumstances, does time ever appear to either stand still or reverse.
In other words, the arrow of time always points in the forward direction for us. But this is a puzzle for physics, because the laws of nature, with very, very few exceptions, are completely time-symmetric. From Newton to Einstein to Maxwell to Bohr to Dirac to Feynman, the equations that govern reality don’t have a preference for the flow of time. The behavior of any system can be described by equations that are just as valid in the forward direction as they are in the backward direction.
So where, then, does our arrow of time come from?
According to many, there might be a link between what we perceive as the arrow of time and a quantity called entropy. Although entropy is commonly described as “a measure of disorder” in a physical system, there are actually two better ways to think about it.
- Entropy can be viewed as the number of possible arrangements of the (quantum) state of your system. If there are more ways to arrange your system’s components while it remains macroscopically identical, you have higher entropy than if there are fewer ways. A room with 20 different regions at 20 different temperatures has a lower entropy than a room where every location has the same temperature.
- It’s also useful to think of entropy as a measure of how much thermal (heat) energy could possibly be turned into useful, mechanical work. When you have lots of energy available to do work (such as a room with a hot source and a cold sink), you have a low-entropy system, whereas if you have very little available energy (a near-equilibrium temperature room), you have a high-entropy system.
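The first viewpoint — entropy as a count of arrangements — is captured by Boltzmann’s formula, S = k_B ln Ω. A minimal sketch (in units where k_B = 1, for a toy box of distinguishable particles) counts how many microstates correspond to each way of splitting particles between the two halves of a room:

```python
from math import comb, log

def microstates(n_particles, n_left):
    """Number of ways n_left of n_particles can occupy the left half
    of a box; each distinct assignment counts as one microstate."""
    return comb(n_particles, n_left)

def entropy(multiplicity):
    """Boltzmann entropy S = ln(Omega), in units of k_B."""
    return log(multiplicity)

# The evenly mixed macrostate (50 left / 50 right) has enormously
# more microstates, and hence more entropy, than a lopsided one.
for n_left in (100, 90, 50):
    omega = microstates(100, n_left)
    print(f"{n_left:3d} left: Omega = {omega:.3e}, S = {entropy(omega):.2f}")
```

The all-on-one-side macrostate has exactly one arrangement, so its entropy is zero; the even split is where the overwhelming majority of arrangements live.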
When we discuss entropy, one of the most important constraints of all comes from the science of thermodynamics. In particular, the second law is of extreme relevance, stating that the entropy of a closed (self-contained) system can only increase or stay the same over time; it can never go down. In other words, over time, the entropy of the entire Universe must increase. It’s the only known law of physics that appears to have a preferred direction for time.
So, does that mean that we only experience time the way we do because of the second law of thermodynamics? That there’s a fundamentally deep connection between the arrow of time and entropy? While many in the philosophy community (including physicists who tread into philosophy) think there might be, the physical evidence strongly indicates otherwise.
Sure, you can scramble and cook an egg, and that’s a very easy process compared to the time-reversed one; uncooking and unscrambling an egg is, shall we say, a practically impossible prospect. The same situation applies when you pour cream into your coffee and stir it; homogenizing your coffee-and-cream mixture is a lot easier than separating it back into its individual constituents.
Indeed, thermodynamics and entropy play an outsized role in both of these processes, showcasing a stark difference in entropy between the initial (unscrambled and uncooked, or unmixed) and final (scrambled and cooked, or mixed) states. These cases are a specific example of entropy at work, where an initially lower-entropy state (with more available energy capable of performing work) transitions into a final, higher-entropy state (with less available energy to perform work), coincident with the passage of time.
Nature is full of examples such as these: what we conventionally call irreversible reactions in physics. Drop an ice cube into a warm drink, and the ice will melt, resulting in a cool drink; a cool drink will never separate into a warm drink and an ice cube. Create a room with a barrier between two halves of it, one half hot and one half cold, and then open a gate allowing the particles between the two halves to mix.
Over time, the room will equilibrate, and both halves will be filled with intermediate temperature particles. Never, no matter how (practically) long you wait, will the two halves spontaneously separate into a room that’s half-hot and half-cold again. This is the price the Universe extracts over time: the total entropy of a system can never decrease. These interactions are not reversible.
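A toy simulation shows this one-way approach to equilibrium. Under the simplifying (and purely illustrative) assumption that any two colliding particles split their combined energy equally, the two halves’ mean energies converge toward the global average and never spontaneously separate again:

```python
import random

def equilibrate(hot, cold, steps, seed=0):
    """Merge a hot half and a cold half into one room and let randomly
    chosen particle pairs 'collide,' sharing their energy equally.
    Returns the mean energy of each original half afterward."""
    rng = random.Random(seed)
    gas = list(hot) + list(cold)
    for _ in range(steps):
        i, j = rng.randrange(len(gas)), rng.randrange(len(gas))
        gas[i] = gas[j] = (gas[i] + gas[j]) / 2.0  # equal energy sharing
    half = len(hot)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(gas[:half]), mean(gas[half:])

hot = [10.0] * 500   # high-energy particles
cold = [1.0] * 500   # low-energy particles
left, right = equilibrate(hot, cold, 50_000)
print(left, right)   # both approach the global mean of 5.5
```

Each “collision” conserves total energy but can only narrow the spread of energies, never widen it — which is why the simulation, like the room, only ever runs one way.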
Except, if you rig things just right, perhaps they can be reversed after all.
There’s a caveat that most people forget when it comes to the second law of thermodynamics and the inevitable, accompanying entropy increase: the law only holds when we apply it to a closed system. So long as no external energy is added to or extracted from the system, and no entropy is exchanged with the outside world, the second law of thermodynamics is mandatory.
But if we violate those conditions, we could seemingly violate the second law of thermodynamics after all. A way to reverse the “two halves of a box” reaction was first thought up by the great physicist James Clerk Maxwell way back in the 1870s. By positing an external entity capable of quickly opening or closing a divider between the two sides of the room at opportune moments, the “cold” molecules can be collected on one side, with the “hot” molecules collected on the other.
This idea is now known as Maxwell’s demon, and it enables you to decrease the entropy of the system after all, at the cost of expending the energy required to monitor the system and open and close the gate between the two sides.
Doing this doesn’t violate the second law of thermodynamics, as the entropy of the box and the entropy of the demon (including its actions) must be added together, and that combined entropy always increases. Only if you look at a part of the system, like the box alone (ignoring the demon and its actions), would you perceive a decrease in entropy.
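A toy version of the demon is easy to simulate. In the hypothetical sketch below, the demon measures each particle that approaches the gate and opens it only for crossings that send fast particles right and slow ones left. The box alone ends up sorted — lower entropy — but only at the price of the demon’s tallied measurements, whose processing carries its own entropy cost (Landauer’s principle):

```python
import random

def demon_sort(speeds, threshold, steps=20_000, seed=1):
    """Toy Maxwell's demon. Particles begin randomly assigned to the
    left ('L') or right ('R') half. Each step, one particle reaches
    the gate; the demon measures its speed and opens the gate only if
    crossing sends a fast particle right or a slow particle left.
    Returns (fraction correctly sorted, number of measurements)."""
    rng = random.Random(seed)
    side = [rng.choice("LR") for _ in speeds]
    measurements = 0
    for _ in range(steps):
        i = rng.randrange(len(speeds))  # a particle approaches the gate
        measurements += 1               # the demon must observe it
        if speeds[i] > threshold and side[i] == "L":
            side[i] = "R"               # fast particle: let it go right
        elif speeds[i] <= threshold and side[i] == "R":
            side[i] = "L"               # slow particle: let it go left
    ok = sum((speeds[i] > threshold) == (side[i] == "R")
             for i in range(len(speeds)))
    return ok / len(speeds), measurements

rng = random.Random(2)
speeds = [rng.uniform(0.0, 2.0) for _ in range(500)]
frac, cost = demon_sort(speeds, threshold=1.0)
print(frac, cost)  # nearly all particles sorted, at 20,000 measurements
```

Notice that even while the box’s entropy drops, the simulation’s steps still march strictly forward: locally reversing entropy does nothing to reverse the sequence of events.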
But this is exactly what we need to disprove the hypothetical connection between the thermodynamic arrow of time and the perceptive arrow of time. Even if you lived in the box and the demon were undetectable — similar to if you lived in a pocket of the Universe that saw an entropy decrease — time would still run forward for you. The thermodynamic arrow of time does not determine our perceptive arrow of time.
If you carefully control the energy and entropy inputs and outputs of your system, all of these reactions that we’d previously labeled as irreversible can actually occur, including:
- uncooking and unscrambling an egg,
- unmixing coffee and cream,
- separating a lukewarm drink into a hot drink and an ice cube,
- or separating a uniform-temperature room into a hot half and a cold half.
But even if you make those reactions happen in a way that (locally) reverses entropy, your clocks still run forward. In natural systems where the entropy remains constant, such as an adiabatically expanding cloud of collisionless matter, time still runs forward. Moreover, it always does so at exactly the same rate for all observers, regardless of whether or how their entropy changes: at the rate of one second-per-second.
As far as we can tell, the second law of thermodynamics is true: entropy never decreases for any closed system in the Universe, including for the entirety of the observable Universe itself. It’s also true that time always runs in one direction only, forward, for all observers. What many don’t appreciate is that these two types of arrows — the thermodynamic arrow of entropy and the perceptive arrow of time — are not interchangeable.
During inflation, where the entropy remains low and constant, time still runs forward. When the last star has burned out and the last black hole has decayed and the Universe is dominated by dark energy, time will still run forward. And everywhere in between, regardless of what’s happening in the Universe or with its entropy, time still runs forward at exactly that same, universal rate for all observers.
If you want to know why yesterday is in the immutable past, tomorrow will arrive in a day, and the present is what you’re experiencing right now, you’re in good company. But thermodynamics, interesting though it may be, won’t give you the answer. As of 2019, it’s still an unsolved mystery.
Starts With A Bang is now on Forbes, and republished on Medium thanks to our Patreon supporters. Ethan has authored two books, Beyond The Galaxy, and Treknology: The Science of Star Trek from Tricorders to Warp Drive.