The supervolcano that can wipe out the U.S. and kill billions may be overdue for an eruption
An extinction events expert sounds a dire warning.
- The supervolcano in Yellowstone National Park could cause an "ultra-catastrophe," warns an extinction events writer.
- The full eruption of the volcano last happened 640,000 years ago.
- The blast could kill billions and make the United States uninhabitable.
As if there weren't enough cataclysms to worry about, a recently published book has brought another terrible possibility back into the public spotlight. Among other calamities, Bryan Walsh's "End Times" highlights the eruption of the Yellowstone supervolcano in the United States as an event that could wipe out much of human life on this planet.
Thankfully, the Yellowstone volcano in Wyoming doesn't erupt very often. The last time it did so was 640,000 years ago. Certainly, no one knows when it will blow again, but by some measures it could happen at any point.
When it does happen, the eruption would propel something like 240 cubic miles of rock, dust and ash into the sky, as it did during the last major explosion. It would also create flows of magma that would bury everything within a 40-mile radius. Parts of Colorado, Wyoming and Utah would end up under as much as three feet of ash, and parts of the Midwest would be darkened beneath at least a few inches of it. As if that weren't bad enough, the emission of dust and toxic gases such as sulfur dioxide into the atmosphere would create an acidic veil around the planet that reflects sunlight. This would lead to a dramatic cooling of the global climate lasting at least several years.
This would also devastate crops and destroy the power grid. Without sunlight, warmth and food, the loss of life would be enormous. How bad? As Bryan Walsh writes in his op-ed in The New York Times, a 2015 report for the European Science Foundation on extreme geohazards called what might happen "the greatest catastrophe since the dawn of civilization."
Dr. Jerzy Żaba, a geologist from the University of Silesia in Katowice, Poland, was even more specific, estimating in an interview that around 5 billion people would die from hunger "due to the effects of climate change." If Yellowstone blows, he suggests, the only way to survive would be to flee North America. Imagine American refugees trying to get into South America or Europe.
Bryan Walsh calls this scenario, without much exaggeration, an "existential risk" not just to the U.S., but the world at large. Walsh writes that "existential risk experts largely agree that supervolcanoes — of which there are 20 scattered around the planet — are the natural threat that poses the highest probability of human extinction."
Yellowstone National Park, Wyoming, United States. May 2016.
Credit: Russell Pearson/Getty Images
The big question, of course, is when will the Yellowstone supervolcano erupt? The seismically active area in Yellowstone National Park experiences about 3,000 small earthquakes a year, but the catastrophic blasts are infrequent. Besides the so-called "Lava Creek eruption" 640,000 years ago, the volcano fully exploded approximately 1.3 million and 2.1 million years ago, putting roughly 660,000 to 800,000 years between eruptions.
Of course, no one knows for sure when it will happen next, which is why Walsh believes more resources need to be dedicated to the study of volcanoes. Currently, the FAA spends about $7 billion every year on aviation safety, while volcano programs, which monitor hazards that could cause far greater casualties, get a paltry $22 million in study expenditures.
A team of archaeologists has discovered 3,200-year-old cheese after analyzing artifacts found in an ancient Egyptian tomb. It could be the oldest known cheese sample in the world.
The tomb that held the cheese lies in the desert sands south of Cairo. It was first discovered in the 19th century by treasure hunters, who eventually lost the knowledge of its location, leaving the Saharan sands to once again conceal the tomb.
“Since 1885 the tomb has been covered in sand and no-one knew about it,” Professor Ola el-Aguizy of Cairo University told the BBC. “It is important because this tomb was the lost tomb.”
In 2010, a team of archaeologists rediscovered the tomb, which belonged to Ptahmes, a mayor and military chief of staff of the Egyptian city of Memphis in the 13th century B.C. In the tomb, the team found a jar containing a “solidified whitish mass,” among other artifacts.
“The archaeologists suspected [the mass] was food, according to the conservation method and the position of the finding inside the tomb, but we discovered it was cheese after the first tests,” Enrico Greco, the lead author of the paper and a research assistant at Peking University in Beijing, told The New York Times.
To find out what the substance was, the team had to develop a novel way to analyze the proteins and identify the peptide markers in the samples. They first dissolved parts of the substance and then used mass spectrometry and chromatography to analyze its proteins.
Despite more than 3,000 years spent in the desert, the researchers were able to identify hundreds of peptides (chains of amino acids) in the sample. They found some that were associated with milk from goats, sheep and, interestingly, the African buffalo, a species not usually kept as a domestic animal in modern Africa, as Gizmodo reports.
Those results suggested that the substance was cheese, specifically one that was probably similar in consistency to chèvre but with a “really, really acidy” taste, as Dr. Paul Kindstedt, a professor at the University of Vermont who studies the chemistry and history of cheese, told The New York Times.
“It would be high in moisture; it would be spreadable,” he said. “It would not last long; it would spoil very quickly.”
The researchers also found traces of the bacterium Brucella melitensis, which causes brucellosis, a debilitating disease that can cause endocarditis, arthritis, chronic fatigue, malaise, muscle pain and other conditions. It’s a disease usually contracted by consuming raw dairy products.
“The most common way to be infected [with Brucella melitensis] is by eating or drinking unpasteurized/raw dairy products. When sheep, goats, cows, or camels are infected, their milk becomes contaminated with the bacteria,” the U.S. Centers for Disease Control and Prevention wrote on its website. “If the milk from infected animals is not pasteurized, the infection will be transmitted to people who consume the milk and/or cheese products.”
Dr. Kindstedt said one reason the study is significant is for its novel use of proteomic analysis, which is the systematic identification and quantification of the complete complement of proteins (the proteome) of a biological system.
“As I say to my students every year when I get to Egypt, someone has to go ahead and analyze these residues with modern capabilities,” he told The New York Times. “This is a logical next step and I think you’re going to see a lot more of this.”
'The Great Pyramid of Chee-za'. An artist's interpretation of a very ripe, slightly deadly Egyptian tomb cheese. (Credit: Creative commons/Big Think)
However, Dr. Kindstedt did offer a bit of caution on the conclusions the researchers drew from the findings.
“The authors of this new study did some nice work,” he told Gizmodo in a statement. “But in my view, on multiple grounds (I suspect in their zeal to be “the first”), they inferred considerably beyond what their data is capable of supporting within reasonable certainty, and almost certainly they are not the first to have found solid cheese residues in Egyptian tombs, just the first to apply proteomic analyses (which is worthy achievement on its own).”
New research in quantum mechanics raises an intriguing question: Can we get around the Heisenberg uncertainty principle?
- New experiments with vibrating drums push the boundaries of quantum mechanics.
- Two teams of physicists create quantum entanglement in larger systems.
- Critics question whether the study gets around the famous Heisenberg uncertainty principle.
Recently published research pushes the boundaries of key concepts in quantum mechanics. Studies from two different teams used tiny drums to show that quantum entanglement, an effect generally linked to subatomic particles, can also be applied to much larger macroscopic systems. One of the teams also claims to have found a way to evade the Heisenberg uncertainty principle.
One question that the scientists were hoping to answer pertained to whether larger systems can exhibit quantum entanglement in the same way as microscopic ones. Quantum mechanics proposes that two objects can become "entangled," whereby the properties of one object, such as position or velocity, can become connected to those of the other.
An experiment performed at the U.S. National Institute of Standards and Technology in Boulder, Colorado, led by physicist Shlomi Kotler and his colleagues, showed that a pair of vibrating aluminum membranes, each about 10 micrometers long, can be made to vibrate in sync, in such a way that they can be described as quantum entangled. Kotler's team amplified the signal from their devices to "see" the entanglement much more clearly. Measuring the membranes' positions and velocities returned the same numbers, indicating that they were indeed entangled.
Tiny aluminum membranes used by Kotler's team. Credit: Florent Lecoq and Shlomi Kotler/NIST
Evading the Heisenberg uncertainty principle?
Another experiment with quantum drums — each one-fifth the width of a human hair — by a team led by Prof. Mika Sillanpää at Aalto University in Finland, attempted to find what happens in the area between quantum and non-quantum behavior. Like the other researchers, they also achieved quantum entanglement for larger objects, but they also made a fascinating inquiry into getting around the Heisenberg uncertainty principle.
The team's theoretical model was developed by Dr. Matt Woolley of the University of New South Wales. Photons in the microwave frequency were employed to create a synchronized vibrating pattern as well as to gauge the positions of the drums. The scientists managed to make the drums vibrate in opposite phases to each other, achieving "collective quantum motion."
The study's lead author, Dr. Laure Mercier de Lepinay, said: "In this situation, the quantum uncertainty of the drums' motion is canceled if the two drums are treated as one quantum-mechanical entity."
This effect allowed the team to measure both the positions and the momentum of the virtual drumheads at the same time. "One of the drums responds to all the forces of the other drum in the opposing way, kind of with a negative mass," Sillanpää explained.
Theoretically, this should not be possible under the Heisenberg uncertainty principle, one of the most well-known tenets of quantum mechanics. Proposed in the 1920s by Werner Heisenberg, the principle generally says that when dealing with the quantum world, where particles also act like waves, there's an inherent uncertainty in measuring both the position and the momentum of a particle at the same time. The more precisely you measure one variable, the more uncertainty in the measurement of the other. In other words, it is not possible to simultaneously pinpoint the exact values of the particle's position and momentum.
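In symbols, the trade-off reads as follows (this is the standard textbook statement of the principle, not a formula quoted from the papers):

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

Here $\Delta x$ and $\Delta p$ are the uncertainties (standard deviations) in position and momentum, and $\hbar$ is the reduced Planck constant. Shrinking one factor of the product forces the other to grow.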
Big Think contributor astrophysicist Adam Frank, known for the 13.8 podcast, called this "a really fascinating paper as it shows that it's possible to make larger entangled systems which behave like a single quantum object. But because we're looking at a single quantum object, the measurement doesn't really seem to me to be 'getting around' the uncertainty principle, as we know that in entangled systems an observation of one part constrains the behavior of other parts."
Ethan Siegel, also an astrophysicist, commented, "The main achievement of this latest work is that they have created a macroscopic system where two components are successfully quantum mechanically entangled across large length scales and with large masses. But there is no fundamental evasion of the Heisenberg uncertainty principle here; each individual component is exactly as uncertain as the rules of quantum physics predicts. While it's important to explore the relationship between quantum entanglement and the different components of the systems, including what happens when you treat both components together as a single system, nothing that's been demonstrated in this research negates Heisenberg's most important contribution to physics."

The papers, published in the journal Science, could help create new generations of ultra-sensitive measuring devices and quantum computers.
As bad as rising economic inequality may sound, a new essay suggests that we live in a surprisingly egalitarian age.
- A new essay depicts 700 years of economic inequality in Europe.
- The only stretch of time more egalitarian than today was the period between 1350 and approximately 1700.
- Data suggest that, without intervention, inequality does not decrease on its own.
Economic inequality is a constant topic. No matter the cycle — boom or bust — somebody is making a lot of money, and the question of fairness is never far behind.
A recently published essay in the Journal of Economic Literature by Professor Guido Alfani adds an intriguing perspective to the discussion by showing the evolution of income inequality in Europe over the last several hundred years. As it turns out, we currently live in a comparatively egalitarian epoch.
Seven centuries of economic history
Figure 8 from Guido Alfani, Journal of Economic Literature, 2021.
This graph shows the amount of wealth controlled by the top ten percent in certain parts of Europe over the last seven hundred years. Archival documentation similar to — and often of a similar quality as — modern economic data allows researchers to get a glimpse of what economic conditions were like centuries ago. Sources like property tax records and documents listing the rental value of homes can be used to determine how much a person's estate was worth. (While these methods leave out those without property, the data is not particularly distorted.)
The first part of the line, shown in black, is based on work by Prof. Alfani and represents the average inequality level of the Sabaudian State in northern Italy, the Florentine State, the Kingdom of Naples, and the Republic of Venice. The latter part, in gray, is based on the work of French economist Thomas Piketty and represents an average of inequality in France, the United Kingdom, and Sweden over that period.
Despite the shift in location, the level of inequality and rate of increase are very similar between the two data sets.
Apocalyptic events cause decreases in inequality
Note that there are two substantial declines in inequality. Both are tied to truly apocalyptic events. The first is the Black Death, the common name for the bubonic plague pandemic of the 14th century, which killed between 30 and 50 percent of Europe's population. The second, at the dawn of the 20th century, was the result of World War I and the many major events in its aftermath.
The 20th century as a whole was a time of tremendous economic change, and the periods not featuring major wars are notable for having large experiments in distributive economic policies, particularly in the countries Piketty considers.
The slight stall in the rise of inequality during the 17th century is the result of the Thirty Years' War, a terrible religious conflict that ravaged Europe and left eight million people dead, and of major plagues that affected southern Europe. However, the recurrent outbreaks of plague after the Black Death no longer had much effect on inequality. This was due to a number of factors, not least the adaptation of European institutions to handle pandemics without such a shift in wealth.
In 2010, the last year covered by the essay, inequality levels were similar to those of 1340, with 66 percent of society's wealth held by the top ten percent. Inequality was also continuing to rise, and the trend has not reversed since. As Prof. Alfani explained in an email to Big Think:
"During the decade preceding the Covid pandemic, economic inequality has shown a slow tendency towards further inequality growth. The Great Recession that began in 2008 possibly contributed to slow down inequality growth, especially in Europe, but it did not stop it. However, the expectation is that Covid-19 will tend to increase inequality and poverty. This, because it tends to create a relatively greater economic damage to those having unstable occupations, or who need physical strength to work (think of the effects of the so-called "long-Covid," which can prove physically invalidating for a long time). Additionally, and thankfully, Covid is not lethal enough to force major leveling dynamics upon society."
Can only disasters change inequality?
That is the subject of some debate. While inequality can occur in any economy, even one that doesn't grow all that much, some things appear to make it more likely to rise or fall.
Thomas Piketty suggested that the cause of changes in inequality levels is the difference between the rate of return on capital and the overall growth rate of the economy. Since the return on capital is typically higher than the overall growth rate, those who have capital to invest tend to get richer faster than everybody else.
While this does explain a great deal of the graph after 1800, his model fails to explain why inequality fell after the Black Death. Indeed, since the plague destroyed human capital and left material goods alone, we would expect the ratio of wealth to income to increase and inequality to rise. His model can, however, explain the decline in inequality in the decades after the pandemic: it is possible that the abundance of capital lowered returns over a longer time span.
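Piketty's mechanism can be sketched with a toy compounding calculation (the rates below are illustrative assumptions, not figures from the essay): if capital earns 5 percent a year while the economy grows at 2 percent, the capital-to-income ratio climbs steadily.

```python
# Toy illustration of Piketty's r > g dynamic (hypothetical rates).
# Capital compounds at the rate of return r; total income grows at rate g.
# When r > g, the capital-to-income ratio, and with it inequality, tends to rise.

def capital_income_ratio(r: float, g: float, years: int) -> float:
    """Compound capital at r and income at g, then return capital/income."""
    capital, income = 1.0, 1.0
    for _ in range(years):
        capital *= 1 + r  # returns on capital are reinvested
        income *= 1 + g   # the wider economy grows more slowly
    return capital / income

# With r = 5% and g = 2%, capital outpaces income roughly fourfold in 50 years.
print(round(capital_income_ratio(r=0.05, g=0.02, years=50), 2))  # → 4.26
```

At equal rates (r = g) the ratio stays flat, which is why the gap between the two rates, not their absolute size, drives the divergence.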
The catastrophe theory put forth by Walter Scheidel suggests that the only force strong enough to wrest economic power from those who have it is a world-shattering event like the Black Death, the fall of the Roman Empire, or World War I. While each event changed the world in a different way, they all had a tremendous leveling effect on society.
But not even this explains everything in the above graph. Pandemics subsequent to the Black Death had little effect on inequality, and inequality continued to fall for decades after World War II ended. Prof. Alfani suggests that we remember the importance of human agency through institutional change. He attributes much of the post-WWII decline in inequality to "the redistributive policies and the development of the welfare states from the 1950s to the early 1970s."
What does this mean for us now?
As Professor Alfani put it in his email:
"[H]istory does not necessarily teach us whether we should consider the current trend toward growth in economic inequality as an undesirable outcome or a problem per se (although I personally believe that there is some ground to argue for that). Nor does it teach us that high inequality is destiny. What it does teach us, is that if we do not act, we have no reason whatsoever to expect that inequality will, one day, decline on its own. History also offers abundant evidence that past trends in inequality have been deeply influenced by our collective decisions, as they shaped the institutional framework across time. So, it is really up to us to decide whether we want to live in a more, or a less unequal society."