Why information is central to physics and the universe itself

Information may not seem like something physical, yet it has become a central concern for physicists. A wonderful new book explores the importance of the “dataome” for the physical, biological, and human worlds.

Key Takeaways
  • The most important current topic in physics relates to a subject that hardly seems physical at all — information, which is central to thermodynamics and perhaps the universe itself.
  • The “dataome” is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls.
  • The dataome is vast and growing every day, consuming an ever-increasing share of the energy humans produce.

Physics is a field that is supposed to study real stuff. By real, I mean things like matter and energy. Matter is, of course, the kind of stuff you can hold in your hand. Energy may seem a little more abstract, but its reality is pretty apparent, appearing in the form of motion or gravity or electromagnetic fields.

What has become apparent recently, however, is the importance to physics of something that seems somewhat less real: information. From black holes to quantum mechanics to understanding the physics of life, information has risen to become a principal concern of many physicists in many domains. This new centrality of information is why you really need to read astrophysicist Caleb Scharf’s new book The Ascent of Information: Books, Bits, Machines, and Life’s Unending Algorithms.

Scharf is currently the director of the Astrobiology Program at Columbia University. He is also the author of four other books as well as a regular contributor to Scientific American.

(Full disclosure: Scharf and I have been collaborators on a scientific project involving the Fermi Paradox, so I was a big fan before I read this new book. Of course, the reason why I collaborated with him is because I really like the way he thinks, and his creativity in tackling tough problems is on full display in The Ascent of Information.)

What is the dataome?

In his new book, Scharf is seeking a deeper understanding of what he calls the “dataome.” This is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls. The book opens with a compelling exploration of how Shakespeare’s works, which began as scribbles on a page, have gone on to have lives of their own in the dataome. Through reprintings in different languages, recordings of performances, movie adaptations, comic books, and so on, Shakespeare’s works are now a permanent part of the vast swirling ensemble of information that constitutes the human dataome.

But the dataome does not just live in our heads. Scharf takes us on a proper physicist’s journey through the dataome, showing us how information can never be divorced from energy. Your brain needs the chemical energy from the food you ate this morning to read, process, and interpret these words. One of the most engaging parts of the book is when Scharf details just how much energy and real physical space our data-hungry world consumes as it adds to the dataome. For example, the Hohhot Data Center in the Inner Mongolia Autonomous Region of China is made of vast “farms” of data processing servers covering 245 acres of real estate. A single application like Bitcoin, Scharf tells us, draws roughly 7.7 gigawatts of power, equivalent to the output of half a dozen nuclear reactors!
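To put that in perspective, here is a quick back-of-the-envelope sketch. The 7.7-gigawatt figure is Scharf’s; the roughly 1.1 gigawatts of output for a single large reactor is my own ballpark, used only for illustration:

```python
# Back-of-the-envelope: what a continuous 7.7 GW draw adds up to in a year.
HOURS_PER_YEAR = 365 * 24

bitcoin_power_gw = 7.7   # continuous draw cited in the book (gigawatts)
reactor_power_gw = 1.1   # assumed output of one large nuclear reactor (gigawatts)

annual_energy_twh = bitcoin_power_gw * HOURS_PER_YEAR / 1000  # GWh -> TWh
reactor_equivalents = bitcoin_power_gw / reactor_power_gw

print(f"Annual energy: ~{annual_energy_twh:.0f} TWh")      # ~67 TWh
print(f"Reactor equivalents: ~{reactor_equivalents:.0f}")  # ~7 large reactors
```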

Information is everywhere

But the dataome is not just about energy. Entropy is central to the story as well. Scharf takes the reader through a beautifully crafted discussion of information and the science of thermodynamics. This is where the links between energy, entropy, the limits of useful work, and probability all become profoundly connected to the definition of information.

The second law of thermodynamics tells us that you cannot use all of a given amount of energy to do useful work. Some of that energy is inevitably lost as heat. Entropy is the physicist’s way of measuring that waste (which can also be thought of as disorder). Scharf takes the reader through the basic relations of thermodynamics and then shows how entropy became intimately linked with information. It was Claude Shannon’s brilliant work in the 1940s that showed how information — bits — could be defined for communication and computation as an entropy associated with the redundancy of strings of symbols. That was the link tying the physical world of thermodynamics explicitly to the informational and computational world of the dataome.
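To make that concrete, here is a minimal sketch of Shannon’s formula, H = -Σ p_i log2(p_i), applied to short strings of symbols. The example strings are my own, chosen only to show that a highly redundant string carries fewer bits per symbol than a varied one:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A redundant string needs fewer bits per symbol than a varied one.
print(shannon_entropy("aaaaaaab"))            # low: ~0.54 bits/symbol
print(shannon_entropy("to be or not to be"))  # higher: ~2.6 bits/symbol
```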

The best parts of the book are where Scharf unpacks how information makes its appearance in biology. From the data storage and processing that occurs with every strand of DNA, to the tangled pathways that define evolutionary dynamics, Scharf demonstrates how life is what happens to physics and chemistry when information matters. I found gems in these parts of the book that forced me to put the volume down and stare into space for a time to deal with their impact.
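As one back-of-the-envelope illustration of that biological data storage (the numbers here are mine, not Scharf’s): each DNA base is one of four letters, so it can encode two bits, and the roughly three billion base pairs of a human genome then amount to well under a gigabyte of raw data:

```python
import math

BASES = "ACGT"                         # four possible nucleotides per position
bits_per_base = math.log2(len(BASES))  # two bits per base
human_genome_bases = 3.1e9             # ~3.1 billion base pairs (approximate)

total_bits = bits_per_base * human_genome_bases
total_megabytes = total_bits / 8 / 1e6

print(f"{bits_per_base:.0f} bits per base")
print(f"~{total_megabytes:.0f} MB for a human genome")  # ~775 MB
```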

The physics of information

There are a lot of popular physics books out there about black holes, exoplanets, and other cool stuff. But right now, I feel like the most important topic in physics relates to a subject that hardly seems physical at all. Information is a relatively new addition to the physics bestiary, which makes it all the more compelling. If you are looking for an introduction to how that came to be, The Ascent of Information is a good place to start.
