Is information the fifth form of matter?
- Researchers have been trying for over 60 years to detect dark matter.
- There are many theories about it, but none are supported by evidence.
- The mass-energy-information equivalence principle combines several theories to offer an alternative to dark matter.
The “discovery” of dark matter
We can tell how much matter is in the universe by the motions of the stars. In the 1920s, physicists attempting to do so discovered a discrepancy and concluded that there must be more matter in the universe than is detectable. How can this be?
In 1933, Swiss astronomer Fritz Zwicky, while observing the motion of galaxies in the Coma Cluster, began wondering what kept them together. There wasn't enough mass to keep the galaxies from flying apart. Zwicky proposed that some kind of dark matter provided cohesion. But since he had no evidence, his theory was quickly dismissed.
Then, in 1968, astronomer Vera Rubin made a similar discovery. She was studying the Andromeda Galaxy at Kitt Peak Observatory in the mountains of southern Arizona when she came across something that puzzled her. Rubin was examining Andromeda's rotation curve, the speed at which the stars rotate around the galactic center, and realized that the stars on the outer edges moved at the same rate as those in the interior, contrary to what Newton's laws predict given the galaxy's visible mass. This meant there was more matter in the galaxy than was detectable. Her punch card readouts are today considered the first evidence of the existence of dark matter.
Many other galaxies were studied throughout the '70s. In each case, the same phenomenon was observed. Today, dark matter is thought to comprise up to 27% of the universe, while "normal" or baryonic matter, the stuff we can actually detect, makes up just 5%. Dark energy, which we can't detect either, makes up the remaining 68%.
Dark energy drives the accelerating expansion of the universe, whose rate is given by the Hubble Constant. Dark matter, on the other hand, affects how "normal" matter clumps together. It stabilizes galaxy clusters. It also affects the shape of galaxies, their rotation curves, and how stars move within them. Dark matter even affects how galaxies influence one another.
Leading theories on dark matter
NASA writes: 'This graphic represents a slice of the spider-web-like structure of the universe, called the "cosmic web." These great filaments are made largely of dark matter located in the space between galaxies.'
Credit: NASA, ESA, and E. Hallman (University of Colorado, Boulder)
Since the '70s, astronomers and physicists have been unable to directly detect dark matter. One theory is that it's all tied up in space-bound objects called MACHOs (Massive Compact Halo Objects). These include black holes, supermassive black holes, brown dwarfs, and neutron stars.
Another theory is that dark matter is made up of a type of non-baryonic matter called WIMPs (Weakly Interacting Massive Particles). Baryonic matter is the kind made up of baryons, such as protons and neutrons and everything composed of them, which is to say anything with an atomic nucleus. Electrons, neutrinos, muons, and tau particles aren't baryons, however, but belong to a class of particles called leptons. Even though the (hypothetical) WIMPs would have ten to a hundred times the mass of a proton, their interactions with normal matter would be weak, making them hard to detect.
Then there are those aforementioned neutrinos. Did you know that giant streams of them pass from the Sun through the Earth each day without us ever noticing? They're the focus of another theory, which holds that dark matter is composed of sterile neutrinos, hypothetical cousins of ordinary neutrinos that interact with normal matter only through gravity. Other candidates include two more theoretical particles, the neutral axion and the uncharged photino.
Now, one theoretical physicist posits an even more radical notion: What if dark matter doesn't exist at all? Dr. Melvin Vopson of the University of Portsmouth, in the UK, has a hypothesis he calls the mass-energy-information equivalence. It states that information is the fundamental building block of the universe, and that information has mass. This would account for the missing mass within galaxies, eliminating the need for the dark matter hypothesis entirely.
To be clear, the idea that information is an essential building block of the universe isn't new. Classical information theory was first posited by Claude Elwood Shannon, the "father of the digital age," in the mid-20th century. The mathematician and engineer, well known in scientific circles but not so much outside of them, had a stroke of genius in 1937: he realized that Boolean algebra mapped perfectly onto telephone switching circuits. Soon, he proved that mathematics could be employed to design electrical systems.
Shannon was hired at Bell Labs to figure out how to transfer information over a system of wires. He wrote the bible on using mathematics to set up communication systems, thereby laying the foundation for the digital age. Shannon was also the first to define one unit of information as a bit.
There was perhaps no greater proponent of information theory than another unsung paragon of science, John Archibald Wheeler. Wheeler was part of the Manhattan Project, introduced the scattering matrix, or "S-matrix," worked out the theory of nuclear fission with Niels Bohr, and helped Einstein pursue a unified theory of physics. In his later years, he proclaimed, "Everything is information." He then went about exploring connections between quantum mechanics and information theory.
He also coined the phrase "it from bit": the idea that every particle in the universe emanates from the information locked inside it. At the Santa Fe Institute in 1989, Wheeler announced that everything, from particles to forces to the fabric of spacetime itself, "… derives its function, its meaning, its very existence entirely … from the apparatus-elicited answers to yes-or-no questions, binary choices, bits."
Part Einstein, part Landauer
Vopson takes this notion one step further. He says that not only is information the essential unit of the universe but also that it is energy and has mass. To support this claim, he unifies special relativity with the Landauer Principle, named after Rolf Landauer. In 1961, Landauer predicted that erasing even one bit of information would release a tiny amount of heat, which he calculated to be at least kT ln 2 (Boltzmann's constant times temperature times the natural logarithm of 2). Landauer argued that this proves information is more than just a mathematical quantity, connecting it to energy. Through experimental testing over the years, the Landauer Principle has held up.
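The numbers involved are concrete. A quick Python sketch of the Landauer limit, and of the per-bit mass it implies once E = mc² is applied, reproduces the scale Vopson works with (the function names here are mine, not Vopson's; the room-temperature values are simply kT ln 2 and that energy divided by c²):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
C = 2.99792458e8     # speed of light, m/s (exact)

def landauer_energy(temp_kelvin: float) -> float:
    """Minimum heat (J) released by erasing one bit: k_B * T * ln(2)."""
    return K_B * temp_kelvin * math.log(2)

def bit_mass(temp_kelvin: float) -> float:
    """Equivalent rest mass (kg) of one bit via m = E / c^2."""
    return landauer_energy(temp_kelvin) / C**2

# At room temperature (~300 K): about 2.87e-21 J and 3.19e-38 kg per bit.
```

That 10⁻³⁸ kg figure is why no existing balance can weigh a bit, and why the detection experiment discussed below the fold is so ambitious.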
Vopson says, "He [Landauer] first identified the link between thermodynamics and information by postulating that logical irreversibility of a computational process implies physical irreversibility." This indicates that information is physical, Vopson says, and demonstrates the link between information theory and thermodynamics.
In Vopson's theory, information, once created, has "finite and quantifiable mass." The principle so far applies only to digital systems, but it could very well apply to analogue and biological ones too, and even to quantum or relativistically moving systems. "Relativity and quantum mechanics are possible future directions of the mass-energy-information equivalence principle," he says.
In the paper published in the journal AIP Advances, Vopson outlines the mathematical basis for his hypothesis. "I am the first to propose the mechanism and the physics by which information acquires mass," he said, "as well as to formulate this powerful principle and to propose a possible experiment to test it."
The fifth state of matter
To measure the mass of digital information, you start with an empty data storage device and measure its total mass with a highly sensitive apparatus. Then you fill it with data and measure its mass again. Finally, you erase one file and evaluate it a third time. The trouble is that the "ultra-accurate mass measurement" device the paper describes doesn't exist yet. It would be an interferometer, something similar to LIGO, or perhaps an ultrasensitive weighing machine akin to a Kibble balance.
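A back-of-the-envelope sketch shows why such extreme sensitivity is needed. Assuming Vopson's per-bit mass of kT ln 2 / c² (the Landauer erasure energy converted to mass), even a completely full drive barely changes weight (the function name and the 1 TB example are mine, for illustration):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def storage_mass_change(bytes_stored: float, temp_kelvin: float = 300.0) -> float:
    """Hypothetical mass gain (kg) of a device holding `bytes_stored` of data,
    using a per-bit mass of k_B * T * ln(2) / c^2."""
    bits = bytes_stored * 8
    return bits * K_B * temp_kelvin * math.log(2) / C**2

# A full 1 TB drive at room temperature gains only ~2.6e-25 kg,
# far below the resolution of any existing balance.
```

Erasing a single file would change the mass by an even smaller fraction of that, which is why Vopson talks about a large, costly, LIGO-class facility rather than a workbench experiment.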
"Currently, I am in the process of applying for a small grant, with the main objective of designing such an experiment, followed by calculations to check if detection of these small mass changes is even possible," Vopson says. "Assuming the grant is successful and the estimates are positive, then a larger international consortium could be formed to undertake the construction of the instrument." He added, "This is not a workbench laboratory experiment, and it would most likely be a large and costly facility." If eventually proved correct, Vopson will have discovered the fifth form of matter.
So, what's the connection to dark matter? Vopson says, "M.P. Gough published an article in 2008 in which he worked out … the number of bits of information that the visible universe would contain to make up all the missing dark matter. It appears that my estimates of information bit content of the universe are very close to his estimates."
The I Ching serves as a foundation for many Eastern philosophies and Western mathematics.
- The I Ching is the basis for polymath Gottfried Wilhelm Leibniz's binary code and, subsequently, for our digital technology.
- Psychologist Carl Jung used the Book of Changes to explore notions of synchronicity or "meaningful coincidence."
- Alan Watts considered the I Ching to be a model that mapped the thinking processes of the human mind.
The I Ching or, as many Western audiences know it, the Book of Changes, is a book thousands of years old. Over the centuries it has served as an all-encompassing philosophical treatise on the universe, a guide to ethical living, a handbook for rulers, and an oracle for one's personal life and future. Two major branches of Chinese philosophy, Confucianism and Taoism, owe their creation to this foundational book.
It popped up here and there for Western scientists and philosophers to study; the first European commentary was written in the late 15th century. In the counterculture of the 1950s and '60s, the I Ching held a special place as a divinatory guide to living a better life. The book has spawned countless interpretations, commentaries, and dueling schools of thought. It is by far the most consulted book in China and East Asia.
All of this said, the exact origins of the I Ching are shrouded in myth and mystery. According to one mythological creation story, the Chinese hero Fu Xi stared into the skies and the world around him and discovered that everything could be arranged in eight trigrams: three stacked lines, each either broken or solid, reflecting yin and yang, the cosmic duality of the world and void.
There is historical record that, around 1050 BCE, King Wen of the Zhou dynasty expanded the trigrams into hexagrams (six lines), creating 64 different combinations. This is the Book of Changes we know today.
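The combinatorics behind those figures are exactly the binary counting Leibniz would later formalize: two line types in three positions give 2³ = 8 trigrams, and in six positions give 2⁶ = 64 hexagrams. A short Python sketch (the yin = 0, yang = 1 encoding is the conventional binary reading of the figures; the helper function is mine):

```python
from itertools import product

# Encode each line of a figure as broken (yin = 0) or solid (yang = 1).
trigrams = list(product((0, 1), repeat=3))    # 2**3 = 8 figures
hexagrams = list(product((0, 1), repeat=6))   # 2**6 = 64 figures

def hexagram_to_int(lines):
    """Read a hexagram's six yin/yang lines as a 6-bit binary number."""
    return int("".join(str(bit) for bit in lines), 2)

# All solid lines ("Qian," heaven) reads as 111111 = 63;
# all broken lines ("Kun," earth) reads as 000000 = 0.
```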
The I Ching and Carl Jung
German sinologist Richard Wilhelm's translation of the I Ching stands as the definitive work for anyone interested in learning about the ancient text. In a foreword to the book, Carl Jung, psychiatrist and founder of analytical psychology, expressed his fascination with the divinatory aspect of this mysterious book:
"For more than 30 years I have interested myself in this oracle technique, or method of exploring the unconscious, for it has seemed to me of uncommon significance. I was already fairly familiar with the I Ching when I first met Wilhelm in the early 1920s; he confirmed for me then what I already knew, and taught me many things more."
Jung used the oracle with his patients during therapy sessions, and it produced a great many meaningful and relevant answers to their questions. Coming from a scientific background in which demonstrable causality is gospel, Jung was very curious why this ancient book seemed apt to a seemingly infinite range of circumstances.
"... A certain curious principle that I have termed synchronicity, a concept that formulates a point of view diametrically opposed to that of causality. Since the latter is a merely statistical truth and not absolute, it is a sort of working hypothesis of how events evolve one out of another, whereas synchronicity takes the coincidence of events in space and time as meaning something more than mere chance."
Even today, such talk of synchronicity gets eye rolls from the materialist and positivist crowd as just a bunch of New Age hogwash. "Western scholars have tended to dispose of it as a collection of 'magic spells,'" wrote Jung in the foreword.
Jung believed that the traditional Chinese mind, as he saw its work laid out in the I Ching, is preoccupied with the chance aspect of natural events; what we call coincidence is the main concern of the I Ching. Jung would go on to propose that psyche and matter are one and the same and that, through synchronicity, the inner psyche and the outside world are intrinsically connected in a way unknown to scientists still tied to the supposedly irrefutable axiom of causality.
Jung explains that the ancient Chinese school of thought was more modern than we suspected.
"The ancient Chinese mind contemplates the cosmos in a way comparable to that of the modern physicist, who cannot deny that his model of the world is a decidedly psychophysical structure. The microphysical event includes the observer just as much as the reality underlying the I Ching comprises subjective, i.e., psychic conditions in the totality of the momentary situation."
The Book of Changes and Alan Watts
The essence of Alan Watts' philosophies and arguments largely pertains to remedying the apparent separation of dualities and realizing that in its stead there is always an interdependence of opposites. Within the limited dualities of our language, this seems to be a universal truth: this implies that, and that implies this. Order and chaos, self and other, and so on.
Alan Watts saw that the implicit idea of yin and yang was a fundamental way of viewing reality. It is within this infinite mix of yin and yang in which the multiplicity of reality arises.
In fact, all information whatsoever can be translated into terms of yin and yang.
"The Book of Changes is thought to be the oldest of the great Chinese classics and to date from perhaps as early as 1300 BCE. The book may also go back to the earliest phases of human thought because the I Ching is really the ground plan in the way in which not only the Chinese think, it's almost a mapping of all the thinking processes of man."
Watts was aware of the fact that the system of arithmetic which is used by digital computers came from the I Ching. Here he refers to binary code:
"We have a binary system of arithmetic zero and one in varying arrangements. Digital computers use a number system which consists only of the figures zero and one, out of which you can construct any number and this was invented by Leibniz who got it from the Book of Changes."
The I Ching predates binary code by roughly 3,000 years, or even more if the earliest estimates of the book's creation are true.
In the late 1600s and early 1700s, Gottfried Wilhelm Leibniz was looking for a better arithmetic than the decimal system. He developed binary arithmetic and, upon studying the I Ching, recognized its hexagrams as an ancient anticipation of his system. To say that the Book of Changes influenced binary code is an understatement.
The title of his paper was: "Explanation of the binary arithmetic, which uses only the characters 1 and 0, with some remarks on its usefulness, and on the light it throws on the ancient Chinese figures of Fu Xi."
Leibniz would surely be shocked at what has succeeded his invention. Everything we compute, represent digitally, and experience on a screen is, at its core, a complex string of binary signals.
Watts found this a profound insight into the validity of the I Ching and its ancient wisdom:
"There's a sudden unexpected link between the most sophisticated mathematical machinery and a book originating at least in 1300 BC."
The Book of Changes' influence on modern society
There is no final authority on the cosmic truths that the I Ching reveals. But it does offer us an ancient and renewed paradigmatic way of viewing the world.
Alan Watts leaves us with a fitting remark on its place in the world:
"This book is somehow always with us, but this then is a way of helping your own multivariable brain arrive at decisions cooperating with your own mind because then again after you've tossed your 64 sided coin, the oracle that you read and explains each particular hexagram in the Book of Changes is a sort of Rorschach blot, it is a very laconic remarks to which everybody reads just exactly what they want to read."
Whether you're utilizing it as a tool to dig into your psyche, consulting it for advice like the kings of yore, or following it as a personal philosophy, the I Ching still has much to tell us.
Research shows that the way math is taught in schools, and how it's conceptualized as a subject, is severely impairing American students' ability to learn and understand the material.
- Americans consistently score in the middle or bottom tier in math and science compared to their international peers.
- Students have a fundamental misunderstanding of what math is and what it can do. By viewing it as a language, students and teachers can begin to conceptualize it in easier and more practical ways.
- A lot of mistakes come from worrying too much about rote memorization and speedy problem-solving, and from students carrying large gaps in a subject that relies on learning concepts sequentially.
It comes as no surprise to most people that Americans perform worse in math and sciences than many of their international peers on the world stage. The numbers don't lie: A recent national survey from the Organization for Economic Cooperation and Development found that 82% of adults couldn't determine the cost of carpeting when given its dimensions and price per square yard.
Unlike the more difficult and comprehensive exams used to test students' mathematical comprehension, this survey measured basic numeracy skills. The United States ranked 22nd.
For a country that has boasted, or at least hosted, some of the smartest minds and most competitive research labs, companies, and universities in the world, there is a strange disconnect between our overall mathematical ability and our professional output. There is no doubt that a lot of Americans are bad at math and even suffer crippling math anxiety from a very young age.
But why? It has to do with a few reasons: how math is presented as a subject, how it's taught, and what's expected from American students.
Why the U.S. needs to change its collective view of mathematics
Mathematics has been taught a certain way for decades in U.S. schools. Maybe it's time for that to change.
One of the first things that comes to mind when many people think about math is rote memorization, impracticality, and the old slacker adage, "When am I ever going to need to use this?" The quadratic formula, sines, and cosines have gotten a bad rap, taking a verbal beating from countless high schoolers for probably more than a century.
The vast majority of people who haven't had to use an equation since their senior year or cram session in college just don't see the value in math. That's because they fundamentally misunderstand what mathematics is.
Neil deGrasse Tyson put it succinctly when he said, "Math is the language of the universe. So the more equations you know, the more you can converse with the cosmos."
Now, that's part of the equation, but not all of it. Math, in a sense, is a way to describe and manipulate the world within a logic- and reason-based system, using a specialized written language.
It is the language of numbers, quantity, and space, and it's used in applications for engineering, physics, and so on. It's doubtful that math is presented this way to children or students at an early age. But that's just one part of the problem with how we approach math.
Why it's easy to fall behind in math
Professor Po-Shen Loh at Carnegie Mellon believes that everyone is a math person; all they lack is proper instruction. In an interview with Big Think, he went on to say that math is a language that builds upon itself, and not understanding the foundations of math is like not understanding the roots and structure of a language.
Essentially, if a student doesn't catch on in the first years of instruction, it's going to be very difficult for them to reverse course and excel later on down the line. He believes it is essential to catch this early and address it before a student's issues with math reach the point where they feel "they're just not good at math."
Professor Loh goes on to say that "Mathematics is the principles of reasoning. There are ways to show you how these basic building blocks of reasoning can be used to deduce surprising and difficult things."
One major reason that mathematics is difficult to understand is that it is a network of prerequisites: every concept is chained into sequences of dependencies.
If you miss an important concept early on (say, how to chart a simple algebraic equation on a line graph), you'll have no idea how to chart more complicated equations later.
Loh goes on to say that this is much more prevalent in mathematics than history, for example. If you didn't fully understand the War of 1812, it's not going to impact how you learn about the Civil War—aside from the occasional historical patterns you may or may not recognize, of course.
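Loh's "network of prerequisites" is, in computer-science terms, a dependency graph, and a valid learning path is a topological ordering of it. A minimal Python sketch with an invented, purely illustrative set of topics:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# A toy prerequisite graph: each concept maps to the concepts it depends on.
# The topics and edges are illustrative only, not a real curriculum.
prerequisites = {
    "counting": set(),
    "arithmetic": {"counting"},
    "fractions": {"arithmetic"},
    "linear equations": {"arithmetic"},
    "graphing lines": {"linear equations", "fractions"},
    "quadratics": {"linear equations"},
}

# A valid study order always places every prerequisite before what needs it.
order = list(TopologicalSorter(prerequisites).static_order())
```

Any valid order puts "counting" first and "graphing lines" after both of its prerequisites; a student who skipped "fractions" hits a wall at exactly that point, which is the sequential failure mode Loh describes.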
The way to address this is to provide a learning environment for everyone that moves at their own pace, to make sure to fill in the gaps, and to catch those lapses in understanding before they get out of control.
And if you're already in too deep, say, as a college grad or just an adult who wants to learn… well, it's time to start from square one.
A faulty learning and teaching methodology
A few years back, the Programme for International Student Assessment (PISA) dug a little deeper into how math is taught. A 2012 assessment questioned how students approach the subject. Their responses were categorized into three learning styles: some students relied mostly on memorization, others tried to relate new concepts to ones they'd already learned, and some used a self-monitoring approach in which they evaluated their understanding and focused on concepts they had yet to learn.
Unsurprisingly, it turned out that the memorizers were the least likely to achieve high scores and understanding, and the United States ranked in the top three for this learning method. A more in-depth look showed that memorizers were about half a year behind students who used either relational or self-monitoring strategies.
Research, along with what is surely plenty of anecdotal evidence, shows that most math classrooms in the United States equate comprehension and skill with speed. Students racing through their times tables against the clock to see how fast they can write down memorized lines are not learning, and they are not comprehending.
Studies have shown that children manipulate math facts with their working memory, a faculty of the brain that goes offline when they experience stress.
Now put together 45-minute timed tests in a condensed school year or semester combined with math anxiety, faulty instruction and expectations, poor learning methods, potential lapses in the fundamentals, and the problems start to pile up. As a result, the part of the brain responsible for mathematical thinking literally shuts off, and you start to see why Americans are so bad at math.
Leading mathematician Laurent Schwartz wrote in his autobiography that he was a slow thinker in math and even believed that he was stupid. That is until he realized that "What is important is to deeply understand things and their relations to each other. This is where intelligence lies. The fact of being quick or slow isn't really relevant."
The problem has been diagnosed, and a few pieces of the solution have been put together, but something is still missing.
Why new methods of teaching math aren’t working
We've tried different methods of teaching math over the years, but have any of them worked?
Many potentially great minds have probably been turned off by the fast-paced timed tests and wonky teaching methods presented through the years. The language of math needs to be presented in a way that shows how it connects to the world and demonstrates its great capacity for understanding and manipulating reality.
If more people could tap into this infinite matrix of power, they'd be able to engage in the wondrous world of math and unlock unknown potentials. It's not for a lack of trying that we've failed; it comes down to instruction yet again.
Despite being today's newest fad and the subject of ire from many on both sides of the political spectrum, Common Core is the latest proposed cure for our math woes. Yet we still suffer from what math professor and author John Allen Paulos calls innumeracy: a mathematical illiteracy akin to not being able to read or write.
What is needed is a fundamental shift in how we view mathematics as a subject so we can learn to imagine how it can benefit and help us in different fields. In addition, we need to make sure that it's taught in a way that no student skips past the fundamentals. Instructors and teachers at all levels must make a systemic change if we're to see any progress. Could this change be Common Core or a different teaching philosophy? We'll find out in the years to come.
One prominent mathematician asks: was Einstein such a smartypants after all?
Einstein's theory of relativity revolutionized our view of the universe, positing a space-time continuum undergirding all reality. Equally impactful has been quantum mechanics, which describes the behavior of subatomic particles in ways that differ from observable matter. Both theories have been verified by empirical observation and scientific experiments. String theory, and a select number of other theories that purport to explain the universe in one all-encompassing equation, remain completely divorced from the physical world. Surely theories about the universe must relate directly to the matter in it?! Did Einstein get it wrong, or has groupthink led us down the wrong path for the last 40 years? Eric Weinstein posits that perhaps Einstein's work shouldn't be as lauded as it is, in part because Einstein himself said it was a work in progress (or, in his words, "a mansion with a wing made out of marble and a wing made out of cheap wood"). What does this mean for you? Well, to most of the Joe Schmoes of this world, not much. But if you're deep into theoretical physics and super advanced mathematics like Eric Weinstein, you'll probably be hooting and hollering at the screen going "OH SNAP!" and "NO HE DI'NT!" like you're watching an NFL game. String theory... kids love it!
Mathematicians are working to combat partisan gerrymandering.
The political strife that defines today's America derives its energy from the feeling among many that their voices are not being heard. By and large, Americans do not trust Congress and often vote to send a message, hoping to get their opinions represented. The reality is that the political parties do all they can to stay in power, with achieving fairness and democracy not their primary goals.
Gerrymandering is the practice of drawing the borders of voting districts to favor specific candidates or political parties. It can make a difference in the number of representatives of each party that a state sends to Washington. In essence, using these strategies can allow one party to keep winning the majority of districts (and representatives) without having the most votes.
Jonathan Mattingly, a mathematician from Duke University in North Carolina, has been working for the past several years to figure out mathematical solutions to the problem. He would like to take the job of drawing voting district lines away from self-serving politicians.
As part of that goal, Mattingly created an algorithm that produces random iterations of the state’s election maps to show the impact of gerrymandering. This is not just a hypothetical exercise. The mathematician says that partisan gerrymandering is having a serious effect on our democracy.
“Even if gerrymandering affected just 5 seats out of 435, that’s often enough to sway crucial votes,” he said in an interview with the journal Nature, referring to the number of representatives in Congress.
Two of the most common methods of gerrymandering are packing and cracking. With packing, legislators draw the map so that the opposing party's voters are packed into the fewest districts possible. Cracking means dividing the other party's voters across several districts, making it harder for them to elect a representative. Both tactics help the party in power stay in power.
Here’s a useful graphic from The Washington Post on how gerrymandering works:
Mattingly’s state of North Carolina has been ground zero in this fight. While both parties used to receive a generally equal number of representatives (either six or seven), Republican redistricting several years ago packed most of the Democrats into three districts. The 2015-2016 North Carolina delegation to Washington included just 3 Democrats and 10 Republicans, while the statewide vote was split close to 50-50 between the two parties.
Recently, the Supreme Court weighed in that two districts in North Carolina were drawn along racial lines and were, as such, unconstitutional.
While the Supreme Court intervened in that case, the highest court in the land doesn't generally address gerrymandering as long as districts abide by four criteria: they must be compact, contiguous, contain roughly equal numbers of people, and give minority groups a chance to elect their own representative. The difficulty of objectively proving whether and how a district is gerrymandered has been one of the main obstacles to stopping the practice.
Mattingly set out to create mathematical tools that could show the courts, time and again, whether a district's borders have been drawn by politics rather than fairness. What Mattingly and his student Christy Graves realized is that gerrymandering produces telltale statistical signs: the opposition party usually wins landslides in the packed districts and loses narrowly in the cracked ones. Using data analysis, Mattingly and his team created an index that shows the extent of gerrymandering in a district.
It is important to note that Mattingly is not alone in this quest. Other mathematicians have also been working to create better methods for evaluating gerrymandering. The political statistician Wendy Tam Cho from the University of Illinois Urbana-Champaign has also designed district map-drawing algorithms that satisfy state law requirements without relying on partisan voting information.
Nicholas Stephanopoulos, a law professor at the University of Chicago, created the "efficiency gap," a measure of how each state's wasted votes can reveal signs of gerrymandering. If a party racks up landslide victories or losses, with margins far more extreme than it needed to win, that could be a sign of political shenanigans.
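The efficiency gap is simple to compute: in each district, count each party's "wasted" votes (all of a loser's votes, plus a winner's votes beyond the bare majority) and take the net difference as a share of all votes cast. A minimal two-party sketch (a toy example of mine, not Stephanopoulos's actual code):

```python
def wasted_votes(votes_a: int, votes_b: int):
    """Wasted votes (party A, party B) in one two-party district:
    every losing vote, plus winning votes beyond the bare majority."""
    threshold = (votes_a + votes_b) // 2 + 1
    if votes_a > votes_b:
        return votes_a - threshold, votes_b
    return votes_a, votes_b - threshold

def efficiency_gap(districts) -> float:
    """Net wasted votes across districts as a share of all votes cast.
    Positive: party A wastes more (the map favors B); negative: favors A."""
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        wa, wb = wasted_votes(a, b)
        wasted_a += wa
        wasted_b += wb
        total += a + b
    return (wasted_a - wasted_b) / total

# Toy map: party A packed into one landslide district, cracked in two others.
toy_map = [(90, 10), (45, 55), (45, 55)]
```

In this toy map, party A wins 60% of the overall vote but only one of three seats, and the large positive gap (0.37) flags the imbalance.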
Despite these various science- and math-based ideas for combating gerrymandering, politicians have not embraced them, perhaps unsurprisingly, as they do not want to lose this weapon from their arsenal. But there are signs that the courts are admitting more mathematical analysis when gerrymandering is alleged. In Whitford v. Gill, a Wisconsin case that may end up before the Supreme Court, the lower court used Stephanopoulos's efficiency-gap analysis to inform its decision.
The upcoming 2020 census is the next big event in this fight. The new numbers are likely to create much redistricting around the country. While Republicans have been shown to use gerrymandering to their advantage, the Democrats also engage in the practice. Mattingly's analysis showed they used the tactic in Maryland, where they control the legislature. For the sake of American democracy, devising objective mathematical approaches that ensure all voices are being heard equally seems like a no-brainer.