Information itself may be what ends the human race
"We are literally changing the planet bit by bit, and it is an invisible crisis."
- IBM estimates that humans produce 2.5 quintillion digital data bytes daily.
- We'll one day reach a point where the number of bits we store outnumbers the total number of atoms on Earth.
- In the most severe scenario, it takes just 130 years for all the power generated on Earth to be sucked up by digital data creation and storage.
Although many places in the world are currently preoccupied by the COVID-19 pandemic, most infectious disease experts believe that a pandemic is unlikely to end the human race. The Black Death of the 14th century was probably the worst case in history. It eliminated one-third of Europe's population, which, ironically, led to higher wages (due to a lack of workers), a distrust of authorities who could not protect the people against the disease, and a reinvestment in humanity, all of which helped spark the Renaissance.
If anything is to end the human race, it'll be climate change. The iconic Doomsday Clock was moved ahead from two minutes to 100 seconds to midnight in January of this year. The clock has been moved ahead each year for the last four years. What's more, this is the closest it has been to midnight since its inception in 1947.
We've got about a decade to turn things around before the damage becomes irreversible. The situation is so disheartening that at least one group of scientists speculates the reason we don't see a universe replete with alien civilizations is that it's hard for species to survive the climate change that advances in technology inevitably cause.
Will data do us in?
If we do get lucky enough to survive and steer clear of any other likely apocalyptic scenario, say a thermonuclear war, the eruption of a supervolcano, or an enormous asteroid slamming into the Earth, we'll have about five billion years until the sun runs out of fuel. But between now and the death of our sun, there's another issue scientists weren't even aware of until now. Information itself could thwart humankind. The problem isn't the data per se but the storing of it. As societies increasingly rely on digital information and there's more and more of it, we'll one day reach a point where the number of bits being stored will outnumber the atoms that make up our planet. That's according to theoretical physicist and Senior Lecturer Melvin Vopson at the University of Portsmouth in the UK. A peer-reviewed paper on his theory, called "The Information Catastrophe," was recently published in the journal AIP Advances.
"Currently, we produce ∼1021 digital bits of information annually on Earth," Vopson begins. This is based on an IBM estimate that humans produce 2.5 quintillion digital data bytes daily. With an assumed 20 percent growth rate, the number of bits we produce will outnumber the entirety of atoms on the planet in around 350 years. In a press release, Vopson said, "We are literally changing the planet bit by bit, and it is an invisible crisis."
There are a lot of variables to consider: the number of bits produced each year, data storage capacity, energy production, and the size of the bit compared to the atom (mass distribution). There are human-centered factors too, such as population growth and the rate of access to information technology in developing countries. "If we assume more realistic growth rates of 5%, 20%, and 50%," the paper states, "the total number of bits created will equal the total number of atoms on Earth after ∼1,200 years, ∼340 years, and ∼150 years, respectively."
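A minimal sketch of that compounding, assuming a starting output of about 7.3 × 10²¹ bits per year and roughly 1.33 × 10⁵⁰ atoms on Earth (illustrative stand-ins rather than the paper's exact inputs), lands in the same ballpark:

```python
# Years until the cumulative number of bits created matches the number of
# atoms on Earth, assuming bit production grows by a fixed rate each year.
def years_until_bits_match_atoms(growth, start_bits=7.3e21, atoms=1.33e50):
    total, annual, years = 0.0, start_bits, 0
    while total < atoms:
        total += annual
        annual *= 1 + growth
        years += 1
    return years

for rate in (0.05, 0.20, 0.50):
    print(f"{rate:.0%} growth -> ~{years_until_bits_match_atoms(rate)} years")
# Roughly ~1,270 / ~350 / ~160 years with these inputs -- close to the paper's
# ~1,200 / ~340 / ~150, with the difference down to the assumed starting values.
```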
It could be worse than predicted
In the most severe case, the 150-year scenario, it would take approximately 130 years until all the power generated on Earth is sucked up by digital data creation and storage. In this version, by 2245, digital information's mass would equal half that of the Earth. IBM states that 90 percent of the digital information we have today was produced in the last ten years alone. "The growth of digital information seems truly unstoppable," Vopson said.
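The 2245 date can be sketched the same way, under the 50 percent growth assumption, using Vopson's room-temperature estimate of roughly 3.19 × 10⁻³⁸ kg per stored bit (from his mass-energy-information principle, discussed below) and Earth's mass of about 5.97 × 10²⁴ kg; the starting annual output is again an assumed stand-in:

```python
# Sketch: the year when stored digital information would weigh half as much as
# Earth, assuming 50% annual growth in bit production from ~2020 onward.
MASS_PER_BIT = 3.19e-38          # kg per bit, Vopson's room-temperature estimate
EARTH_MASS = 5.97e24             # kg

total_bits, annual_bits, year = 0.0, 7.3e21, 2020
while total_bits * MASS_PER_BIT < EARTH_MASS / 2:
    total_bits += annual_bits
    annual_bits *= 1.5
    year += 1

print(year)                      # lands within a year or two of 2245
```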
What's more, he believes his rates are conservative. He told me via email: "If we look only at the magnetic data storage density, it doubled every year for over 50 years." Not only might the generation of data increase at a faster clip, but the estimate also uses the thermodynamic energy limit for bit creation. This is the ideal case, the maximum possible efficiency, which we are miles away from, meaning the issue may arrive far sooner.
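To get a feel for the energy side, here is a sketch that assumes the Landauer limit of k·T·ln 2 joules per bit at about 300 K, a total planetary power supply of roughly 18.5 TW, and the 50 percent growth scenario; all three are stand-in values, so the exact crossover year shifts with the inputs:

```python
import math

# Minimum energy to create/erase one bit at the Landauer limit (ideal case).
k_B, T = 1.380649e-23, 300.0               # Boltzmann constant (J/K), temperature (K)
energy_per_bit = k_B * T * math.log(2)     # ~2.9e-21 J

world_energy_per_year = 18.5e12 * 3.156e7  # ~18.5 TW expressed in joules per year

annual_bits, years = 7.3e21, 0
while annual_bits * energy_per_bit < world_energy_per_year:
    annual_bits *= 1.5                     # 50% annual growth scenario
    years += 1

print(years)   # on the order of a century with these inputs, the same
               # ballpark as the ~130-year figure quoted above
```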
Dr. Vopson did offer one solution: using "non-material media" to store information. He does not hold out hope for this, however. "I am more optimistic about the energy aspects as we will most likely master better ways of extracting energy from fusion (and) solar PVs to close to 100% efficiency." Quantum computing wouldn't be the answer either, as quantum bits, or qubits (bits in quantum superposition states), don't store data. Instead, storage happens using digital bits and classical computing.
Besides this theory, Vopson is the progenitor of the mass-energy-information equivalence principle, which states that information is an essential building block of the universe and that it has mass. In this theory, mass, energy, and information are all interconnected. Dark matter, then, doesn't exist. Instead, the "missing" matter in the universe is the mass that information itself contains.
The homogeneity of the news media can now be quantified
New research reveals the extent to which groupthink bias is increasingly being built into the content we consume.
- When ownership of news sources is concentrated into the hands of just a handful of corporations, the kind of reporting that audiences get to see is limited and all the more likely to be slanted by corporate interests.
- Newsroom employment has declined dramatically over the past decade, and this has only been exacerbated by the COVID-19 pandemic.
- The findings of a new University of Illinois study suggest that Washington journalists operate in insular microbubbles that are vulnerable to consensus seeking. If the reporters on the Hill are feeding America copycat news information, we are all at risk of succumbing to groupthink.
Distrust of the media is a growing phenomenon in the U.S., with many Americans feeling that mainstream news media presents biased views and an alarming number saying they're reluctant to believe what is being reported.
It's easy to make the argument that media consolidation is to blame for this trend.
When ownership of news sources is concentrated into the hands of just a handful of corporations, the kind of reporting that audiences get to see is limited and all the more likely to be slanted by corporate interests.
Combine that with the contraction of traditional local print news in the face of infotainment TV and clickbait-influenced web publishing, and journalists are becoming increasingly homogeneous in their views and more susceptible to groupthink.
Deregulation and the rise of new media
Up until the 1980s, the federal government worked to prevent media consolidation in partnership with the FCC. But under Reagan, many of the existing regulations were shelved, giving corporations greater leeway in acquiring local news outlets.
The deregulatory trend persisted, arguably culminating with Clinton's 1996 Telecommunications Act. A watershed moment for news media homogeneity, the law essentially permitted corporations to amass large numbers of local newspapers and news stations, granting hegemons access to almost every household in America.
Traditional news outlets have been suffering for years with the rise of cable networks and the advent of web publishing. With free content constantly available online, many outlets have given up the ghost and shut down print and broadcast operations. Newsroom employment has declined dramatically over the past decade, and this has only been exacerbated by the COVID-19 pandemic.

The outlets that are left standing are now under corporate ownership and are reliant on social media platforms for distribution. Trading hard-hitting news for clickbait has a direct impact on both the format of the news and the kind of information that is disseminated.
One effect of the contraction of the news industry is that journalists are networking with fewer peers and sources. In one recently published study, "Sharing Knowledge and 'Microbubbles': Epistemic Communities and Insularity in US Political Journalism," researchers from the University of Illinois explore the extent to which groupthink bias is increasingly being built into the content we consume.
The study even goes so far as to measure news media homogeneity using data from journalists' interactions on social media. Analyzing 680,021 tweets posted by 2,292 accounts belonging to credentialed journalists, the researchers found that Washington's political reporters sort neatly into nine distinct clusters, evidence of the phenomenon they call "beltway insularity."
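The paper's exact pipeline isn't reproduced here, but the underlying idea, grouping journalists by who interacts with whom on Twitter, can be sketched with an ordinary community-detection pass over an interaction graph. The names and edges below are invented, and greedy modularity is just one standard method, not necessarily the one the researchers used:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical interaction pairs: (journalist who mentions, journalist mentioned).
mentions = [
    ("reporter_a", "reporter_b"), ("reporter_b", "reporter_c"),
    ("reporter_a", "reporter_c"),                      # one tight pocket
    ("pundit_x", "pundit_y"), ("pundit_y", "pundit_z"),
    ("pundit_x", "pundit_z"),                          # a second pocket
    ("reporter_c", "pundit_x"),                        # a single weak bridge
]

G = nx.Graph()
G.add_edges_from(mentions)

# Modularity-based community detection: densely connected pockets of interaction
# fall out as separate groups -- the "microbubbles" in the study's vocabulary.
for i, group in enumerate(greedy_modularity_communities(G), start=1):
    print(f"cluster {i}: {sorted(group)}")
```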
The findings of the study suggest that Washington journalists operate in insular microbubbles that are vulnerable to consensus seeking. If the reporters on the Hill are feeding America copycat news information, we are all at risk of succumbing to groupthink.
Critics have two main concerns with this consensus-reinforcing phenomenon: a less diverse output of storylines and takes on those storylines, and the ease with which unsubstantiated or shoddily substantiated reporting can get picked up by mainstream news outlets.
Record distrust in the media industry
There's never been a time in American history when the sources of information were so doubted. Even after Watergate, trust in the media stood at 74 percent. At last count, Gallup found that just 20 percent of Americans have confidence in print journalism, only two percentage points more than TV news received in the same poll.
There is a growing concern that news media is biased, that reporters don't just report but curate and editorialize, and that the money behind the news has an impact on what is reported and how. This suspicion is fodder for conspiracy theorists who vilify the mainstream media and offer "alternative facts" in place of what's reported. Playing on people's fears, alternative outlets online are picking up steam and spreading misinformation (and deliberate disinformation).
For example, although many leading news outlets – including The Washington Post, The Independent, The New York Times and even Fox News – independently debunked the "Pizzagate" conspiracy as soon as it began to spread in 2016, media coverage of the story has steadily risen throughout the past year.
Recent examples of conspiracy theories about COVID-19 achieving widespread popularity attest to this trend. Other examples include climate change denial, the QAnon "deep state" conspiracy, and others. Where public trust in news media is lacking, fake news fills the void.
Fewer journalists means fewer voices
One factor in Americans' diminishing trust in the news is that there are fewer journalists, especially local journalists, that viewers can turn to as distinct voices. Lack of local coverage and the rise of homogeneous, sensationalist journalism are perpetuating distrust and driving many Americans to look for news elsewhere – and leaving them susceptible to manipulation.
As mentioned earlier, print media has been hit hard, and broadcast journalism is also feeling the pain. Amid widespread newsroom layoffs and closures, fewer journalists means exposure to fewer perspectives. The result is less original reporting, more repurposing of others' stories, and less fact-checking, all of which contribute to the spread of misinformation.
Lack of local news has far reaching effects on democracy. One study from King's College London found that communities without local community news outlets have less public engagement and greater distrust of public institutions.
"We can all have our own social media account, but when local papers are depleted or in some cases simply don't exist, people lose a communal voice," Martin Moore, the author of the study, remarked. "They feel angry, not listened to and more likely to believe malicious rumour."
Mainstream media and fake news
Ironically, while the erosion of mainstream media is contributing to the rise of misinformation and alternative news, when outlets attempt to expose fake news, it often backfires, propelling its dissemination. Plenty of news consumers first encounter conspiracies and disinformation through news coverage itself, yet rather than that coverage building trust, 72 percent of Americans believe that traditional outlets are the ones with an agenda.
And who can blame them? The parroting of identical headlines across consolidated newsrooms doesn't help instill confidence. Take, for example, the widely circulated compilation of "local news" talking heads repeating the same script word for word.
All of these reporters are part of the Sinclair Broadcast Group. It's hard to deny the dangers of corporate consolidation of news media when confronted with damning clips like that, and Sinclair is out for even more control. An attempted acquisition in 2017 would have put Sinclair stations in 72 percent of households with a television, but the deal collapsed when Tribune walked away.
This is a huge amount of influence for one company, or person, to have. In an election year, this is even more pertinent.
Social media algorithms and information bubbles
Even as more Americans distrust mainstream news, a majority now get their news on social media. This wouldn't be a problem per se, but the way online news is delivered to consumers perpetuates echo chambers and information bubbles.
Social media platforms deliberately surface content that confirms users' views and echoes previously viewed or shared content. The algorithms amplify biases and screen out dissenting opinions. Before you know it, other voices are blocked from your feed, leaving you in an echo chamber. This doesn't just apply to news, but also to targeted ads and campaigns designed for microcommunities with shared attributes.
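A toy illustration of that feedback loop (invented data and scoring, not any platform's actual ranking system): items similar to what a user already engaged with float to the top, and the dissenting item quietly sinks.

```python
# Toy feed ranker: score each item by how strongly it matches topics the user
# has already engaged with, so past engagement keeps reinforcing itself.
engaged_topics = {"candidate_a": 5, "policy_x": 3}    # hypothetical engagement counts

items = [
    {"title": "Candidate A surges in polls", "topics": ["candidate_a"]},
    {"title": "Policy X explained",          "topics": ["policy_x"]},
    {"title": "The case for Candidate B",    "topics": ["candidate_b"]},  # dissenting view
]

def score(item):
    return sum(engaged_topics.get(topic, 0) for topic in item["topics"])

for item in sorted(items, key=score, reverse=True):
    print(score(item), item["title"])   # the dissenting item scores 0 and drops to the bottom
```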
It has never been easier to convince so many people to believe stories that aren't necessarily true: lack of trust, consolidation of news outlets, the contraction of journalism, and the pervasiveness of web news are creating isolated information bubbles that many of us now find ourselves stuck in. People naturally want to read news that confirms their beliefs.
When infotainment is commoditized and served up for quick and easy consumption, critical thinking takes a back seat.
Finding the facts on your own
With quantified evidence of journalistic groupthink and information bubbles among those who consume political information, is there hope for open dialogue and a variety of perspectives?
Ultimately, yes. However, this won't likely be coming from the news media. Choosing not to be misled and seeking out a variety of opinions and perspectives is something that each individual will likely have to do on their own, even if it means questioning one's fundamental beliefs. This entails verifying the information you read, actively engaging with people outside of your comfortable echo chamber, and even changing your mind when confronted with hard evidence.
Finding the facts on your own can be tough, but if we can't rely on the news to give us the news, there's no other choice.
There is no dark matter. Instead, information has mass, physicist says
Is information the fifth form of matter?
- Researchers have been trying for over 60 years to detect dark matter.
- There are many theories about it, but none are supported by evidence.
- The mass-energy-information equivalence principle combines several theories to offer an alternative to dark matter.
The “discovery” of dark matter
We can tell how much matter is in the universe by the motions of the stars. In the 1920s, physicists attempting to do so discovered a discrepancy and concluded that there must be more matter in the universe than is detectable. How can this be?
In 1933, Swiss astronomer Fritz Zwicky, while observing the motion of galaxies in the Coma Cluster, began wondering what kept them together. There wasn't enough mass to keep the galaxies from flying apart. Zwicky proposed that some kind of dark matter provided cohesion. But since he had no evidence, his theory was quickly dismissed.
Then, in 1968, astronomer Vera Rubin made a similar discovery. She was studying the Andromeda Galaxy at Kitt Peak Observatory in the mountains of southern Arizona when she came across something that puzzled her. Rubin was examining Andromeda's rotation curve, the speed at which stars orbit as a function of their distance from the center, and realized that the stars on the outer edges moved just as fast as those near the interior, at odds with what Newtonian gravity predicts from the visible mass alone. This meant there was more matter in the galaxy than was detectable. Her punch card readouts are today considered the first evidence of the existence of dark matter.
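To see why a flat rotation curve points to unseen mass, you can run the simple Newtonian relation M(r) = v²r/G with illustrative numbers, a flat 220 km/s curve sampled at a few radii (values chosen only to show the trend):

```python
# If the orbital speed v stays flat as radius r grows, the enclosed mass
# M(r) = v^2 * r / G must keep growing, even where the visible starlight thins out.
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19            # meters per kiloparsec
SOLAR_MASS = 1.989e30     # kg

v = 220e3                 # m/s, an illustrative flat rotation speed
for r_kpc in (5, 15, 30):
    enclosed = v**2 * (r_kpc * KPC) / G
    print(f"r = {r_kpc:>2} kpc -> enclosed mass ~ {enclosed / SOLAR_MASS:.1e} solar masses")
# The implied mass grows linearly with radius, far beyond what the visible stars supply.
```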
Many other galaxies were studied throughout the '70s. In each case, the same phenomenon was observed. Today, dark matter is thought to comprise up to 27% of the universe. "Normal" or baryonic matter makes up just 5%. That's the stuff we can detect. Dark energy, which we can't detect either, makes up 68%.
Dark energy is what drives the accelerating expansion of the universe. Dark matter, on the other hand, affects how "normal" matter clumps together. It stabilizes galaxy clusters. It also affects the shape of galaxies, their rotation curves, and how stars move within them. Dark matter even affects how galaxies influence one another.
Leading theories on dark matter
NASA writes: 'This graphic represents a slice of the spider-web-like structure of the universe, called the "cosmic web." These great filaments are made largely of dark matter located in the space between galaxies.'
Credit: NASA, ESA, and E. Hallman (University of Colorado, Boulder)
Since the '70s, astronomers and physicists have searched for dark matter without ever directly detecting it. One theory is that it's all tied up in space-bound objects called MACHOs (Massive Compact Halo Objects). These include black holes, supermassive black holes, brown dwarfs, and neutron stars.
Another theory is that dark matter is made up of a type of non-baryonic matter called WIMPs (Weakly Interacting Massive Particles). Baryonic matter is the kind made up of baryons, such as protons and neutrons and everything composed of them, which is anything with an atomic nucleus. Electrons, neutrinos, muons, and tau particles aren't baryons, however, but belong to a class of particles called leptons. Even though the (hypothetical) WIMPs would have ten to a hundred times the mass of a proton, their interactions with normal matter would be weak, making them hard to detect.
Then there are those aforementioned neutrinos. Did you know that giant streams of them pass from the Sun through the Earth each day, without us ever noticing? They're the focus of another theory, which holds that dark matter is made up of neutrinos that interact with normal matter only through gravity. Other candidates include two theoretical particles, the neutral axion and the uncharged photino.
Now, one theoretical physicist posits an even more radical notion. What if dark matter didn't exist at all? Dr. Melvin Vopson of the University of Portsmouth, in the UK, has a hypothesis he calls the mass-energy-information equivalence. It states that information is the fundamental building block of the universe, and it has mass. This accounts for the missing mass within galaxies, thus eliminating the hypothesis of dark matter entirely.
Information theory
To be clear, the idea that information is an essential building block of the universe isn't new. Classical Information Theory was first posited by Claude Elwood Shannon, the "father of the digital age," in the mid-20th century. The mathematician and engineer, well known in scientific circles but not so much outside of them, had a stroke of genius back in 1940. He realized that Boolean algebra coincided perfectly with telephone switching circuits. Soon, he proved that mathematics could be employed to design electrical systems.
Shannon was hired at Bell Labs to figure out how to transfer information over a system of wires. He wrote the bible on using mathematics to set up communication systems, thereby laying the foundation for the digital age. Shannon was also the first to define one unit of information as a bit.
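Shannon's switching insight is easy to demonstrate with a toy example (mine, not his original circuits): switches wired in series behave like Boolean AND, switches in parallel like OR, and a bit is simply the answer to one yes/no question.

```python
from itertools import product
import math

# Two switches in series pass current only if both are closed (Boolean AND);
# two switches in parallel pass current if either is closed (Boolean OR).
def series(a, b):
    return a and b

def parallel(a, b):
    return a or b

for a, b in product([False, True], repeat=2):
    print(f"{a!s:>5} {b!s:>5}  series={series(a, b)!s:>5}  parallel={parallel(a, b)!s:>5}")

# And the bit as a unit: n yes/no answers distinguish 2**n equally likely messages.
print(math.log2(8), "bits are needed to pick one of 8 equally likely messages")
```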
There was perhaps no greater proponent of information theory than another unsung paragon of science, John Archibald Wheeler. Wheeler was part of the Manhattan Project, worked out the "S-Matrix" with Niels Bohr, and assisted Einstein in his pursuit of a unified theory of physics. In his later years, he proclaimed, "Everything is information." Then he went about exploring connections between quantum mechanics and information theory.
He also coined the phrase "it from bit," the idea that every particle in the universe emanates from the information locked inside it. At the Santa Fe Institute in 1989, Wheeler announced that everything, from particles to forces to the fabric of spacetime itself, "… derives its function, its meaning, its very existence entirely … from the apparatus-elicited answers to yes-or-no questions, binary choices, bits."
Part Einstein, part Landauer
Vopson takes this notion one step further. He says that not only is information the essential unit of the universe but also that it is energy and has mass. To support this claim, he brings together special relativity and the Landauer Principle. The latter is named after Rolf Landauer, who in 1961 predicted that erasing even one bit of information would release a tiny amount of heat, a figure he calculated. Landauer said this proves information is more than just a mathematical quantity, connecting information to energy. Through experimental testing over the years, the Landauer Principle has held up.
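The figure Landauer arrived at is tiny but nonzero. Evaluated at room temperature (a standard textbook calculation, not something specific to Vopson's paper), it comes to roughly 3 × 10⁻²¹ joules per bit:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0             # roughly room temperature, K

heat_per_bit = k_B * T * math.log(2)
print(f"{heat_per_bit:.2e} J released per erased bit")   # ~2.9e-21 joules
```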
Vopson says, "He [Landauer] first identified the link between thermodynamics and information by postulating that logical irreversibility of a computational process implies physical irreversibility." This indicates that information is physical, Vopson says, and demonstrates the link between information theory and thermodynamics.
In Vopson's theory, information, once created, has "finite and quantifiable mass." The principle so far applies only to digital systems, but it could very well apply to analogue and biological ones too, and even to quantum or relativistically moving systems. "Relativity and quantum mechanics are possible future directions of the mass-energy-information equivalence principle," he says.
In the paper published in the journal AIP Advances, Vopson outlines the mathematical basis for his hypothesis. "I am the first to propose the mechanism and the physics by which information acquires mass," he said, "as well as to formulate this powerful principle and to propose a possible experiment to test it."
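Dividing that Landauer energy by c² gives the per-bit mass at the heart of Vopson's principle. The formula is his; the rounding and room-temperature assumption below are mine:

```python
import math

k_B, T, c = 1.380649e-23, 300.0, 2.998e8   # J/K, K, and the speed of light in m/s

mass_per_bit = k_B * T * math.log(2) / c**2
print(f"{mass_per_bit:.2e} kg per stored bit")   # ~3.2e-38 kg at room temperature
```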
The fifth state of matter
To measure the mass of digital information, you start with an empty data storage device. Next, you measure its total mass with a highly sensitive measuring apparatus. Then you fill the device and determine its new mass. Finally, you erase one file and weigh it again. The trouble is, the "ultra-accurate mass measurement" device the paper describes doesn't exist yet. It would be an interferometer, something similar to LIGO, or perhaps an ultrasensitive weighing machine akin to a Kibble balance.
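That per-bit figure also shows why the measurement is so demanding. As a sketch, the information written to a hypothetical, completely full 1 TB drive would weigh roughly:

```python
mass_per_bit = 3.19e-38          # kg per bit at room temperature (from above)
bits_in_one_terabyte = 1e12 * 8  # one terabyte of stored data

print(f"{mass_per_bit * bits_in_one_terabyte:.1e} kg")   # ~2.6e-25 kg, far below the
                                                         # sensitivity of any current balance
```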
"Currently, I am in the process of applying for a small grant, with the main objective of designing such an experiment, followed by calculations to check if detection of these small mass changes is even possible," Vopson says. "Assuming the grant is successful and the estimates are positive, then a larger international consortium could be formed to undertake the construction of the instrument." He added, "This is not a workbench laboratory experiment, and it would most likely be a large and costly facility." If eventually proved correct, Vopson will have discovered the fifth form of matter.
So, what's the connection to dark matter? Vopson says, "M.P. Gough published an article in 2008 in which he worked out … the number of bits of information that the visible universe would contain to make up all the missing dark matter. It appears that my estimates of information bit content of the universe are very close to his estimates."
Google 2.0: Why MIT scientists are building a new search engine
The truth is a messy business, but an information revolution is coming. Danny Hillis and Peter Hopkins discuss knowledge, fake news and disruption at NeueHouse in Manhattan.
- In 2005, Danny Hillis co-founded Freebase, an open-source knowledge database that was acquired by Google in 2010. Freebase formed the foundation of Google's famous Knowledge Graph, which enhances its search engine results and powers Google Assistant and Google Home.
- Hillis is now building The Underlay, a new knowledge database and future search engine app that is meant to serve the common good rather than private enterprise. He calls it his "penance for having sold the other one to Google."
- Powerful collections of machine-readable knowledge are becoming exceedingly important, but most are privatized and serve commercial goals.
- Decentralizing knowledge and making information provenance transparent will be a revolution in the so-called "post-truth age." The Underlay is being developed at MIT by Danny Hillis, SJ Klein, and Travis Rich.
Fact vs. Fiction: How Facts Are Made, and Who Decides What's True
What information can we trust? Truth isn't black and white, so here are three requirements every fact should meet.
The chances are good that you've used Wikipedia to define or discover something in the last week, if not 24 hours. It's currently the 5th most-visited website in the world. The English-language Wikipedia averages 800 new articles per day, but 1,000 articles are deleted per day, the site's own statistics page reports. That fluctuation is probably partly the result of mischievous users, but it is also an important demonstration of Wikipedia's quest for knowledge in motion.

"As the world's consensus changes about what is reliable, verifiable information, the information for us will change too," says Katherine Maher, executive director of the Wikimedia Foundation. Maher is careful to delineate between truth and knowledge. Wikipedia isn't a jury for truth; it's a repository for information that must be three things: neutral, verifiable, and determined with consensus.

So how do we know what information to trust, in an age that is flooded with access, data, and breaking news? Through explaining how Wikipedia editors work and the painstaking detail and debate that goes into building an article, Maher offers a guide to separating fiction from fact, which can be applied more broadly to help us assess the quality of information in other forums.