The Historic Roots, and Impacts, of Our Nuclear Fear
I remember going to bed one night when I was 11, seriously afraid I would not be alive in the morning. It was October 1962, and the frightening Cold War between the U.S. and the Soviet Union, constantly in the news but mostly abstract to me as a kid, had become terrifyingly real. I had watched a stern President Kennedy on TV revealing that Soviet missiles were being installed in Cuba and ordering a blockade of Soviet ships. There were pictures of the missile sites, and video of confrontations at sea. The world really was on the brink of nuclear war. I was viscerally afraid, and I wasn’t alone.
But that was long ago, and those fears were well tucked away…until I started reading Spencer Weart’s fascinating new book, The Rise of Nuclear Fear. It was remarkable how quickly, and powerfully, the feeling of those scary days came back to life. Even more, it was stunning to discover how broadly and profoundly our nuclear fears have shaped our modern world. If you want to understand our acute modern fear of cancer, the roots of environmentalism, what really put a man on the moon, even what turned our 1950s and ’60s faith in science and technology into the doubt and skepticism of today, you must read this wonderful, insightful, entertaining book.
The Rise of Nuclear Fear looks back at our emotional relationship not just with atomic weapons but with nuclear radiation generally, from its discovery by the Curies through Fukushima, a history of how radiation went from “Gee Whiz!” to “OH NO!”, entertainingly woven around radiation’s reflections in popular culture. At the turn of the 20th century, as the scientific and industrial revolutions were bringing so much change and opportunity, Weart tells us, radiation glowed with promise. You could buy radium toothpaste and mouthwash and skin cream. News reports promised that a small amount of uranium could propel a steamship across the ocean, or light cities. Radiation was even seen as the Philosopher’s Stone, the legendary material the alchemists had sought that would give man control over matter. A newspaper headline about radiation declared “Science on the road to revolutionize all existence. No limit to man’s power over nature.”
Radiation’s mysterious powers made it a mainstay of science fiction. Flash Gordon sabotaged the ‘atom furnaces’ that powered the gravity-defying rays of Ming the Merciless’s Sky City, after Ming bragged that “Radioactivity will make me Emperor of the Universe” (add evil laughter here). Superman’s home planet was destroyed by an atomic explosion. Gene Autry was killed by evil Queen Tika of the underground civilization of Murania, but was resurrected in the ‘radium reviving room’.
Radiation and weapons were readily paired in all sorts of popular fiction. Weart recounts one fascinating example of how this would affect world affairs decades later. In the 1940 film Murder in the Air, a U.S. agent guarded the secret of an atomic ray cannon that could shoot enemy planes out of the sky. The actor who played the agent was Ronald Reagan, who 40 years later as President would invest billions in just such folly with his Strategic Defense Initiative, a project that most scientists called a sci-fi pipe dream, without ever realizing where that dream apparently began.
Then, in August 1945, the atomic bombs destroyed Hiroshima and Nagasaki, and in a terrible flash our relationship with nuclear radiation exploded into the profound angst that has shaped so many aspects of world history and modern culture. These were far more than just bigger bombs, and Weart writes that they evoked a special sort of fear, of “cosmic power…hell fire…Doomsday itself.” “For all we know,” one broadcaster said, “we have created a Frankenstein.” The suffering of the survivors from the acute effects of exposure to high doses of radiation was quickly labeled “atomic bomb disease” and a “mysterious, horrible…atomic plague”. One widely read commentary said, “The fear of irrational death…has burst out of the subconscious and into the conscious, filling the mind with primordial apprehensions.”
Those apprehensions only grew more ominous when in 1954 the radioactive fallout from an atmospheric nuclear weapon test fell far beyond the predicted exclusion zone, contaminating a Japanese fishing vessel, the Daigo Fukuryu Maru, or The Lucky Dragon. Back in port, crewmen got sick. One of them died. Images of the men appeared in newspapers and magazines around the world. The Lucky Dragon incident put the word “fallout” into the popular lexicon, and nuclear weapons now meant not only apocalyptic warfare, but the insidious global spread of carcinogens in our air and drinking water.
This had a huge and unexpected impact, in ways that resonate deeply in society today. The fear of cancer exploded in the United States in the 1950s. Fear of nuclear weapons and fallout played a central role in that explosion, carving the dread of cancer deeper into our hearts and dramatically shaping our health choices, and health care policy, ever since.
Weart writes that President Eisenhower created the Atoms for Peace program in the mid-1950s to try to put the genie of growing nuclear fear back in the bottle, not so much to develop non-military uses of nuclear technology but as propaganda against our fear of nuclear weapons and fallout, dramatically accelerating the creation of a civilian nuclear energy program. A promotional program by the Atomic Energy Commission promised “the Era of Atomic Power is on the Way.” The White House commissioned Walt Disney to make the widely viewed film Our Friend the Atom.
Despite the fear of nuclear war and radioactive fallout, the propaganda worked, in a post-World War II society with a strong faith in the power and promise of science. Three quarters of the people in a U.S. national survey in 1956 supported nuclear power. But at the same time, the world was learning of the birth defects suffered by the children of the atomic bomb survivors who had been exposed in utero. Now, in addition to cancer, nuclear fallout carried another terrible risk: genetic damage.
Then, in the fall of 1957, the Soviet Union launched Sputnik, the satellite that amazed us but also frightened us with the reality that missiles could deliver nuclear holocaust literally in minutes. Heightened nuclear fear produced ‘the space race’, one outcome of which was man landing on the moon. Another result of Sputnik was to turn what had been a tiny group of liberal pacifists into the first truly global protest movement, the “Ban the Bomb” campaign championed by Bertrand Russell and Albert Einstein. Tens of thousands participated in huge anti-nuclear/anti-war rallies in England and elsewhere. Weart cites a number of studies that found that the liberal pacifist ‘Ban the Bomb’ movement of the 1950s laid the social, ideological, and cultural foundations for the protests against the war in Vietnam more than a decade later.
Fear of nuclear weapons and fallout also led directly to the creation of the modern environmental movement. Weart reports that Barry Commoner, an early environmental leader, said “I learned about the environment from the Atomic Energy Commission in 1953.” Commoner’s influential publication “Environment Magazine” actually began as “Nuclear Information”. Rachel Carson wrote that she had clung to faith that Nature was “beyond the tampering reach of man” until radioactive fallout killed that faith and led to her classic cri de coeur Silent Spring, in which she emphasized the dangers of industrial chemicals by likening them to radiation. In the chapter “One in Four”, devoted to cancer, she writes about a Swedish farmer who she claims was killed by pesticides, likening him to Aikichi Kuboyama, the crewman from the Lucky Dragon killed by radioactive fallout. “For each man,” Carson wrote, “a poison drifting out of the sky carried a death sentence. For one, it was radiation-poisoned ash; for the other, chemical dust.”
Silent Spring was published in September of 1962. The Cuban missile crisis took place less than a month later, terrifying us, but, in the end, easing our fears of nuclear holocaust. The defense strategy of MAD – Mutual Assured Destruction – actually worked. Neither President Kennedy nor Soviet Premier Khrushchev was mad enough to start a nuclear war. But by this point the fear of anything nuclear was so deep that as the apocalyptic threat of nuclear war receded, and as the 1963 atmospheric test ban eliminated the risk of fallout, the fear was transferred to a new nuclear bogeyman. Weart writes that the fear of nuclear weapons and fallout led directly to the opposition to nuclear power.
He cites several studies that found that, from the very beginning, this opposition was strongest among the more liberal environmentalist and pacifist parts of society. “People with a more egalitarian ideology, who thought that wealth and power should be widely distributed, were more anxious about environmental risks in general and nuclear power above all than people who believed in a more hierarchical social order.” From that opposition arose yet another of the profound, unpredictable effects of nuclear fear: a coal-based energy policy that has killed hundreds of thousands of people from air pollution and now contributes significantly to the threat to the very climate on which life on earth depends.
Weart’s book, a more concise and entertaining update of the one he published in 1988, moves quickly through Chernobyl and Fukushima. He devotes practically no attention to one key part of the nuclear fear story: the findings from studies of the atomic bomb survivors, which have shown that the actual biological risk from nuclear radiation is stunningly lower than most people realize. The cancer death rate among those survivors went up less than one percent, and no biological effects at all have been detected among those who received lower doses (below 110 millisieverts). No multi-generational genetic damage has been detected either. The fear of radiation, understandably deep because it was born in the face of terrifying existential danger, far exceeds the actual risk. This omission is interesting, because Weart does not hesitate to argue that excessive fear of nuclear radiation is irrational and impedes development of nuclear power as one way to deal with climate change.
But that omission also demonstrates that Weart has not written a pro-nuclear polemic. The Rise of Nuclear Fear is a fascinating, entertaining, insightful history which offers an important lesson that reaches far beyond the nuclear issue itself. By illuminating the roots of our nuclear fears, and describing the vast impacts those fears have had, Weart offers a dramatic illustration of the affective/emotional/instinctive nature of risk perception in general, and a sobering lesson about the powerful and unpredictable ways that fear shapes the course of events.
A team of archaeologists has discovered 3,200-year-old cheese after analyzing artifacts found in an ancient Egyptian tomb. It could be the oldest known cheese sample in the world.
The tomb that held the cheese lies in the desert sands south of Cairo. It was first discovered in the 19th century by treasure hunters, who eventually lost the knowledge of its location, leaving the Saharan sands to once again conceal the tomb.
“Since 1885 the tomb has been covered in sand and no-one knew about it,” Professor Ola el-Aguizy of Cairo University told the BBC. “It is important because this tomb was the lost tomb.”
In 2010, a team of archaeologists rediscovered the tomb, which belonged to Ptahmes, a mayor and military chief of staff of the Egyptian city of Memphis in the 13th century B.C. In the tomb, the team found a jar containing a “solidified whitish mass,” among other artifacts.
“The archaeologists suspected [the mass] was food, according to the conservation method and the position of the finding inside the tomb, but we discovered it was cheese after the first tests,” Enrico Greco, the lead author of the paper and a research assistant at Peking University in Beijing, told The New York Times.
To find out what the substance was, the team had to develop a novel way to analyze the proteins and identify the peptide markers in the samples. They first dissolved parts of the substance and then used mass spectrometry and chromatography to identify its proteins.
Despite more than 3,000 years spent in the desert, the researchers were able to identify hundreds of peptides (chains of amino acids) in the sample. They found some that were associated with milk from goats, sheep and, interestingly, the African buffalo, a species not usually kept as a domestic animal in modern Africa, as Gizmodo reports.
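Conceptually, the species call comes down to checking which species-specific marker peptides appear among the sequences recovered from the sample. Here is a minimal sketch of that matching step in Python; the marker sequences and sample peptides are invented placeholders, not the study's actual data:

```python
# Hypothetical sketch of marker-peptide matching. The sequences below
# are invented placeholders, not the peptides reported in the study.

# Marker peptides keyed by the group whose milk proteins contain them
SPECIES_MARKERS = {
    "cow/buffalo (Bovinae)": {"APKHKEMPFPK", "VLPVPQK"},
    "sheep/goat (Caprinae)": {"YLGYLEQLLR", "FFVAPFPEVF"},
}

def identify_species(recovered_peptides):
    """Return each group whose marker peptides appear in the sample."""
    sample = set(recovered_peptides)
    return {
        species: sorted(markers & sample)
        for species, markers in SPECIES_MARKERS.items()
        if markers & sample
    }

# Toy usage: a few "recovered" peptides from the residue
sample = ["VLPVPQK", "YLGYLEQLLR", "UNRELATEDPEPTIDE"]
print(identify_species(sample))
# {'cow/buffalo (Bovinae)': ['VLPVPQK'], 'sheep/goat (Caprinae)': ['YLGYLEQLLR']}
```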
Those peptide findings suggested that the substance was cheese, specifically one that was probably similar in consistency to chevre but with a “really, really acidy” taste, as Dr. Paul Kindstedt, a professor at the University of Vermont who studies the chemistry and history of cheese, told The New York Times.
“It would be high in moisture; it would be spreadable,” he said. “It would not last long; it would spoil very quickly.”
The researchers also found traces of the bacterium Brucella melitensis, which causes brucellosis, a debilitating disease that can cause endocarditis, arthritis, chronic fatigue, malaise, muscle pain and other conditions. It’s a disease usually contracted by consuming raw dairy products.
“The most common way to be infected [with Brucella melitensis] is by eating or drinking unpasteurized/raw dairy products. When sheep, goats, cows, or camels are infected, their milk becomes contaminated with the bacteria,” the U.S. Centers for Disease Control wrote on its website. “If the milk from infected animals is not pasteurized, the infection will be transmitted to people who consume the milk and/or cheese products.”
Dr. Kindstedt said one reason the study is significant is for its novel use of proteomic analysis, which is the systematic identification and quantification of the complete complement of proteins (the proteome) of a biological system.
“As I say to my students every year when I get to Egypt, someone has to go ahead and analyze these residues with modern capabilities,” he told The New York Times. “This is a logical next step and I think you’re going to see a lot more of this.”
However, Dr. Kindstedt did offer a bit of caution on the conclusions the researchers drew from the findings.
“The authors of this new study did some nice work,” he told Gizmodo in a statement. “But in my view, on multiple grounds (I suspect in their zeal to be “the first”), they inferred considerably beyond what their data is capable of supporting within reasonable certainty, and almost certainly they are not the first to have found solid cheese residues in Egyptian tombs, just the first to apply proteomic analyses (which is worthy achievement on its own).”
As bad as this sounds, a new essay suggests that we live in a surprisingly egalitarian age.
- A new essay depicts 700 years of economic inequality in Europe.
- The only stretch of time more egalitarian than today was the period from 1350 to approximately 1700.
- Data suggest that, without intervention, inequality does not decrease on its own.
Economic inequality is a constant topic. No matter the cycle — boom or bust — somebody is making a lot of money, and the question of fairness is never far behind.
A recently published essay in the Journal of Economic Literature by Professor Guido Alfani adds an intriguing perspective to the discussion by showing the evolution of income inequality in Europe over the last several hundred years. As it turns out, we currently live in a comparatively egalitarian epoch.
Seven centuries of economic history
Figure 8 from Guido Alfani, Journal of Economic Literature, 2021.
This graph shows the amount of wealth controlled by the top ten percent in certain parts of Europe over the last seven hundred years. Archival documentation similar to — and often of similar quality to — modern economic data allows researchers to get a glimpse of what economic conditions were like centuries ago. Sources like property tax records and documents listing the rental value of homes can be used to determine how much a person's estate was worth. (While these methods leave out those without property, the data is not particularly distorted.)
The first part of the line, shown in black, represents work by Prof. Alfani and shows the average inequality level of the Sabaudian State in Northern Italy, the Florentine State, the Kingdom of Naples, and the Republic of Venice. The latter part, in gray, is based on the work of French economist Thomas Piketty and represents an average of inequality in France, the United Kingdom, and Sweden during that time period.
Despite the shift in location, the level of inequality and rate of increase are very similar between the two data sets.
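Mechanically, the statistic being plotted is straightforward: rank estates by value and ask what fraction of the total wealth the richest tenth holds. A minimal sketch in Python, with invented numbers standing in for the archival records:

```python
# Minimal sketch: top-10% wealth share from a list of estate values.
# The numbers are invented stand-ins for archival tax records.

def top_decile_share(estates):
    """Fraction of total wealth held by the richest 10% of estates."""
    values = sorted(estates, reverse=True)
    top_n = max(1, len(values) // 10)  # richest tenth of households
    return sum(values[:top_n]) / sum(values)

# Toy example: ten households, one far richer than the rest
estates = [1200, 150, 90, 80, 70, 60, 50, 40, 30, 20]
print(f"{top_decile_share(estates):.0%}")  # -> 67%
```

The same calculation, run over thousands of real estate records century by century, is what produces the line in the figure.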
Apocalyptic events cause decreases in inequality
Note that there are two substantial declines in inequality. Both are tied to truly apocalyptic events. The first is the Black Death, the common name for the bubonic plague pandemic of the 14th century, which killed off anywhere between 30 and 50 percent of Europe's population. The second, at the dawn of the 20th century, was the result of World War I and the many major events in its aftermath.
The 20th century as a whole was a time of tremendous economic change, and the periods not featuring major wars are notable for having large experiments in distributive economic policies, particularly in the countries Piketty considers.
The slight stall in the rise of inequality during the 17th century is the result of the Thirty Years' War, a terrible religious conflict that ravaged Europe and left eight million people dead, and of major plagues that affected southern Europe. However, the recurrent outbreaks of the plague after the Black Death no longer had much effect on inequality. This was due to a number of factors, not the least of which was the adaptation of European institutions to handle pandemics without causing such a shift in wealth.
In 2010, the last year covered by the essay, inequality levels were similar to those of 1340, with 66 percent of the wealth of society held by the top ten percent. Inequality was also still rising, and that trend has not ended since. As Prof. Alfani explained in an email to BigThink:
"During the decade preceding the Covid pandemic, economic inequality has shown a slow tendency towards further inequality growth. The Great Recession that began in 2008 possibly contributed to slow down inequality growth, especially in Europe, but it did not stop it. However, the expectation is that Covid-19 will tend to increase inequality and poverty. This, because it tends to create a relatively greater economic damage to those having unstable occupations, or who need physical strength to work (think of the effects of the so-called "long-Covid," which can prove physically invalidating for a long time). Additionally, and thankfully, Covid is not lethal enough to force major leveling dynamics upon society."
Can only disasters change inequality?
That is the subject of some debate. While inequality can occur in any economy, even one that doesn't grow all that much, some things appear to make it more likely to rise or fall.
Thomas Piketty suggested that the cause of changes in inequality levels is the difference in the rate of return on capital and the overall growth rate of the economy. Since the return on capital is typically higher than the overall growth rate, this means that those who have capital to invest tend to get richer faster than everybody else.
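The mechanism is easy to see in a toy simulation: let capital compound at rate r while the rest of the economy grows at rate g < r, and the capital owners' share of total wealth drifts upward. The rates and starting shares below are illustrative assumptions, not historical estimates:

```python
# Toy simulation of the r > g mechanism. Rates and starting shares
# are illustrative assumptions, not historical data.

def capital_share(years, r=0.05, g=0.02, capital=30.0, other=70.0):
    """Wealth share of capital owners after `years` of compounding."""
    for _ in range(years):
        capital *= 1 + r   # capital owners earn the return on capital
        other *= 1 + g     # everyone else grows with the economy
    return capital / (capital + other)

for t in (0, 50, 100, 200):
    print(t, f"{capital_share(t):.0%}")
# 0 30%, 50 65%, 100 89%, 200 99%: the share climbs as long as r > g
```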
While this mechanism does explain a great deal of the graph after 1800, the model fails to explain why inequality fell after the Black Death. Indeed, since the plague destroyed human capital and left material goods alone, we would expect the ratio of wealth to income to increase and inequality to rise. The model can, however, provide explanations for the decline in inequality in the decades after the pandemic: it is possible that the abundance of capital could have lowered returns over a longer time span.
The catastrophe theory put forth by Walter Scheidel suggests that the only force strong enough to wrest economic power from those who have it is a world-shattering event like the Black Death, the fall of the Roman Empire, or World War I. While each event changed the world in a different way, they all had a tremendous leveling effect on society.
But not even this explains everything in the above graph. Pandemics subsequent to the Black Death had little effect on inequality, and inequality continued to fall for decades after World War II ended. Prof. Alfani suggests that we remember the importance of human agency through institutional change. He attributes much of the post-WWII decline in inequality to "the redistributive policies and the development of the welfare states from the 1950s to the early 1970s."
What does this mean for us now?
As Professor Alfani put it in his email:
"[H]istory does not necessarily teach us whether we should consider the current trend toward growth in economic inequality as an undesirable outcome or a problem per se (although I personally believe that there is some ground to argue for that). Nor does it teach us that high inequality is destiny. What it does teach us, is that if we do not act, we have no reason whatsoever to expect that inequality will, one day, decline on its own. History also offers abundant evidence that past trends in inequality have been deeply influenced by our collective decisions, as they shaped the institutional framework across time. So, it is really up to us to decide whether we want to live in a more, or a less unequal society."
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
- A new study suggests that tabs can cause people to be flustered as they try to keep track of every website.
- The reason is that tabs are unable to properly organize information.
- The researchers are plugging a browser extension that aims to fix the problem.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, tabs are now included in all major web browsers, and most users have had access to them for a little over a decade. They've remained pretty much the same since they came out, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people on their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly if it's information that is needed again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
However, people can become emotionally invested in the tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome, that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information they contained.
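The underlying shift is easy to picture as a data structure: instead of one flat list of tabs per window, pages hang off tasks and subtasks that can be active or parked. The study does not publish Skeema's internals, so the sketch below is hypothetical, with all names invented:

```python
# Hypothetical sketch of a task-based tab model. Class and field
# names are invented for illustration; this is not Skeema's code.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    urls: list[str] = field(default_factory=list)        # pages for this task
    subtasks: list["Task"] = field(default_factory=list)
    active: bool = False  # am I working on this right now?

    def open_count(self):
        """Total pages held by this task and all of its subtasks."""
        return len(self.urls) + sum(s.open_count() for s in self.subtasks)

# The "web project I need to work on" from the earlier quote becomes
# a parked task instead of a reproachful row of open tabs:
project = Task("web project", urls=["https://example.com/docs"])
trip = Task("trip planning", active=True, subtasks=[
    Task("flights", urls=["https://example.com/flights"]),
    Task("hotels", urls=["https://example.com/hotels"]),
])
print(trip.open_count())  # 2 pages, grouped by subtask rather than scattered
```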
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me while I close some of the 87 tabs I currently have open.