Remodeling the Universe

David Z Albert is the Frederick E. Woodbridge Professor of Philosophy and Director of the M.A. Program in The Philosophical Foundations of Physics at Columbia University. He is the author of "Time and Chance" and "Quantum Mechanics and Experience," among other books. He received his B.S. in physics from Columbia College (1976) and his doctorate in theoretical physics from The Rockefeller University. He lives in New York City.



Question: Can you give a brief overview of quantum mechanics?

David Albert: Quantum mechanics is supposed to be a completely general account of the behavior of the physical world. The way quantum mechanics emerged historically was that around the end of the 19th century, the beginning of the 20th century, there were more and more reasons to be worried that the prevailing classical physics -- that is the physics of Newton and later of Maxwell, who incorporated electromagnetic phenomena into Newton's theory -- that the Newtonian/Maxwellian model of the world that we had was apparently going to be unable to account for atomic structure, was going to be unable to account for things as simple as the stability of matter.

A famous problem that very much worried people in the period immediately before quantum mechanics emerged was that experiments had shown that the atom apparently consisted of a small, positively charged core with electrons rotating around it, and it's easy to show from Maxwell's equations that electrons going in a circular orbit around a core like that are going to produce huge amounts of electromagnetic radiation; they're going to lose all of the energy of their orbits to this radiation; they should very quickly crash into the nucleus, and matter should cease to exist. So there was a very acute problem about how matter could be stable at all. The more this and other related problems were looked into at the beginning of the century, the less it looked as if there was any hope at all of an explanation of these phenomena along the lines of Newtonian and Maxwellian classical physics.
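To put a number on "very quickly": the standard textbook estimate (not given in the transcript itself) uses the Larmor radiation formula to compute how long a classical electron, starting at the Bohr radius, would take to spiral into the nucleus. A minimal sketch of that calculation, assuming the usual result t = a₀³ / (4 r₀² c), where r₀ is the classical electron radius:

```python
# Order-of-magnitude estimate of the classical collapse time of hydrogen:
# an electron radiating per the Larmor formula spirals from the Bohr
# radius a0 into the nucleus in roughly t = a0^3 / (4 * r0^2 * c).
# This is an illustrative textbook calculation, not part of the interview.

a0 = 5.29177e-11   # Bohr radius, in meters
r0 = 2.81794e-15   # classical electron radius, in meters
c = 2.99792458e8   # speed of light, in meters per second

t_collapse = a0**3 / (4 * r0**2 * c)
print(f"classical collapse time ~ {t_collapse:.1e} s")  # ~ 1.6e-11 s
```

On this classical picture the atom would survive for only about ten picoseconds, which is why the stability of matter was such an acute embarrassment for Newtonian/Maxwellian physics.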

It looked like this was going to call for nothing less than a global revolution in our fundamental theories of physics. And over a period of 10 years or so -- centered on the period between 1920 and 1930, say -- a new fundamental theory of the world was developed, called quantum mechanics. Quantum mechanics, then, aspires to be a complete replacement for Newtonian mechanics. It proposes to be understood as the fundamental laws of the evolution of the physical states of every kind of physical system. But, as it surely should have, quantum mechanics was engineered in such a way as to essentially reproduce the predictions of classical mechanics for those systems -- human-size systems, macroscopic and larger systems -- for which Newtonian mechanics was known to do a good job, okay? And the way it worked out was that the predictions of quantum mechanics differed significantly from those of Newtonian mechanics only for very small kinds of physical systems, subatomic systems, so on and so forth, which is exactly where we needed it to differ.

So it's important to separate how the theory was discovered and what it was originally developed for from what it proposes to be once it's there. It was originally developed to account for new phenomena that we had discovered at the subatomic level, okay? But it would be a serious misunderstanding to interpret it therefore as merely a specialized theory of subatomic objects. It's supposed to be the theory of the entire physical world, but its discovery was prompted by failures of the previously existing theory in the subatomic realm.