Just how cold was the Ice Age? New study finds the answer
Researchers figure out the average temperature of Earth's last ice age.
- A new study analyzes fossil data to find the average temperatures during the last Ice Age.
- This period, about 20,000 years ago, had an average global temperature of about 46 degrees Fahrenheit (7.8 C).
- The study has implications for understanding climate change.
How cold was the Ice Age? While one might imagine layers of ice covering everything around the world, that's not exactly what happened. In fact, researchers determined the average temperature of the Last Glacial Maximum, from about 20,000 years ago, to be about 46 degrees Fahrenheit (7.8 C).
This, of course, was the average global temperature – not how cold it actually got in some places. The Last Glacial Maximum (LGM) was a very chilly period, when glaciers covered about half of North and South America, as well as Europe and parts of Asia. Overall, the new paper found that the world's temperatures were about 11 degrees Fahrenheit (6 degrees Celsius) colder than today. For comparison, the average global temperature in the 20th century was 14 C (57 F).
The study's lead author, Jessica Tierney, associate professor at the University of Arizona Department of Geosciences, acknowledged that this may not sound like a big deal to some, but it was, in fact, monumental.
"In your own personal experience that might not sound like a big difference, but, in fact, it's a huge change," explained Tierney. "In North America and Europe, the most northern parts were covered in ice and were extremely cold. Even here in Arizona, there was big cooling. But the biggest cooling was in high latitudes, such as the Arctic, where it was about 14 C (25 F) colder than today."
This corresponds to climate change models, which show that high latitudes warm at a faster rate than low latitudes. According to projections, this process of "polar amplification" will keep driving warming in areas like the Arctic, which are especially sensitive to climate change.
Surface air temperatures during the last ice age.
Credit: Jessica Tierney, University of Arizona
Tierney's team calculated that each time the amount of atmospheric carbon doubles, global temperature should rise by 3.4 C (6.1 F). Carbon levels were about 180 parts per million during the Ice Age, rose to about 280 parts per million by the Industrial Revolution, and have now reached 415 parts per million.
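To see what that sensitivity estimate implies, here is a back-of-the-envelope sketch (not from the paper itself) of the standard assumption that warming scales with each doubling of atmospheric carbon. Note that the full glacial-to-modern difference of roughly 6 C also reflects ice-sheet and other feedbacks, so the CO2-only figure comes out smaller:

```python
import math

def warming(c0_ppm, c1_ppm, sensitivity_per_doubling=3.4):
    """Temperature change (in C) implied by a change in atmospheric CO2,
    assuming warming scales logarithmically: a fixed number of degrees
    per doubling of concentration (here, the study's 3.4 C estimate)."""
    return sensitivity_per_doubling * math.log2(c1_ppm / c0_ppm)

# From Ice Age levels (180 ppm) to today's (415 ppm):
delta = warming(180, 415)
print(round(delta, 1))  # about 4.1 C of CO2-driven warming
```

An exact doubling (180 ppm to 360 ppm) would give precisely 3.4 C, which is a quick sanity check on the formula.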
How did the scientists reach their conclusions? The team used models that connected data from ocean plankton fossils to sea-surface temperatures. A technique called data assimilation, used in weather forecasting, was then employed to link the fossil data with climate model simulations of the LGM.
"What happens in a weather office is they measure the temperature, pressure, humidity and use these measurements to update a forecasting model and predict the weather," Tierney shared. "Here, we use the Boulder, Colorado-based National Center for Atmospheric Research climate model to produce a hindcast of the LGM, and then we update this hindcast with the actual data to predict what the climate was like."
The findings will help climate scientists evaluate how today's rising atmospheric levels of carbon dioxide influence the average temperatures around the world.
Co-authors of the new study also include professor Christopher Poulsen from the University of Michigan and postdoctoral researcher Jiang Zhu, now with the National Center for Atmospheric Research.
"Six degrees of global average cooling is enormous. The world would have looked much different during the last glacial maximum," said Poulsen, adding "The northern portions of North America, including here in Ann Arbor, Michigan, were covered by kilometers of ice."
You can read their paper published in Nature.
The COVID-19 pandemic is making health disparities in the United States crystal clear. It is a clarion call for health care systems to double their efforts in vulnerable communities.
- The COVID-19 pandemic has exacerbated America's health disparities, widening the divide between the haves and have-nots.
- Studies show disparities in wealth, race, and online access have disproportionately harmed underserved U.S. communities during the pandemic.
- To begin curing this social ailment, health systems like Northwell Health are establishing relationships of trust in these communities so that the post-COVID world looks different than the pre-COVID one.
COVID-19 deepens U.S. health disparities<p>Communities on the pernicious side of America's health disparities have their unique histories, environments, and social structures. They are spread across the United States, but they all have one thing in common.</p><p>"There is one common divide in American communities, and that is poverty," said <a href="https://www.northwell.edu/about/leadership/debbie-salas-lopez" target="_blank">Debbie Salas-Lopez, MD, MPH</a>, senior vice president of community and population health at Northwell Health. "That is the undercurrent that manifests poor health, poor health outcomes, or poor health prognoses for future wellbeing."</p><p>Social determinants have far-reaching effects on health, and poor communities have unfavorable social determinants. To pick one of many examples, <a href="https://www.npr.org/2020/09/27/913612554/a-crisis-within-a-crisis-food-insecurity-and-covid-19" target="_blank" rel="noopener noreferrer">food insecurity</a> reduces access to quality food, leading to poor health and chronic medical conditions that become endemic across whole communities. The U.S. Centers for Disease Control and Prevention has identified some of these conditions, such as obesity and Type 2 diabetes, as increasing the risk of developing a severe case of coronavirus.</p><p>The pandemic didn't create poverty or food insecurity, but it exacerbated both, and the results have been catastrophic. A study published this summer in the <em><a href="https://link.springer.com/article/10.1007/s11606-020-05971-3" target="_blank">Journal of General Internal Medicine</a></em> suggested that "social factors such as income inequality may explain why some parts of the USA are hit harder by the COVID-19 pandemic than others."</p><p>That's not to say better-off families in the U.S. weren't harmed. 
A <a href="https://voxeu.org/article/poverty-inequality-and-covid-19-us" target="_blank" rel="noopener noreferrer">paper from the Centre for Economic Policy Research</a> noted that families in counties with a higher median income experienced adjustment costs associated with the pandemic—for example, lowering income-earning interactions to align with social distancing policies. However, the paper found that the costs of social distancing were much greater for poorer families, who cannot easily alter their living circumstances, which often include more individuals living in one home and a reliance on mass transit to reach work and grocery stores. They are also disproportionately represented in essential jobs, such as retail, transportation, and health care, where maintaining physical distance can be all but impossible.</p><p>The paper also cited a positive correlation between higher income inequality and higher rates of coronavirus infection. "Our interpretation is that poorer people are less able to protect themselves, which leads them to different choices—they face a steeper trade-off between their health and their economic welfare in the context of the threats posed by COVID-19," the authors wrote.</p><p>"There are so many pandemics that this pandemic has exacerbated," Dr. Salas-Lopez noted.</p><p>One example is the health-wealth gap. The mental stressors of maintaining a low socioeconomic status, especially in the face of extreme affluence, can have a physically degrading impact on health. 
<a href="https://www.scientificamerican.com/index.cfm/_api/render/file/?method=inline&fileID=123ECD96-EF81-46F6-983D2AE9A45FA354" target="_blank" rel="noopener noreferrer">Writing on this gap</a>, Robert Sapolsky, professor of biology and neurology at Stanford University, notes that socioeconomic stressors can increase blood pressure, reduce insulin response, increase chronic inflammation, and impair the prefrontal cortex and other brain functions through anxiety, depression, and cognitive load. </p><p>"Thus, from the macro level of entire body systems to the micro level of individual chromosomes, poverty finds a way to produce wear and tear," Sapolsky writes. "It is outrageous that if children are born into the wrong family, they will be predisposed toward poor health by the time they start to learn the alphabet."</p>Research on the economic and mental health fallout of COVID-19 is showing two things: That unemployment is hitting <a href="https://www.pewsocialtrends.org/2020/09/24/economic-fallout-from-covid-19-continues-to-hit-lower-income-americans-the-hardest/" target="_blank" rel="noopener noreferrer">low-income and young Americans</a> most during the pandemic, potentially widening the health-wealth gap further; and that the pandemic not only exacerbates mental health stressors, but is doing so at clinically relevant levels. As <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7413844/" target="_blank" rel="noopener noreferrer">the authors of one review</a> wrote, the pandemic's effects on mental health is itself an international public health priority.
Working to close the health gap
Northwell Health coronavirus testing center at Greater Springfield Community Church.
Credit: Northwell Health<p>Novel coronavirus may spread and infect indiscriminately, but pre-existing conditions, environmental stressors, and a lack of access to care and resources increase the risk of infection. These social determinants make the pandemic more dangerous, and erode communities' and families' abilities to heal from health crises that pre-date the pandemic.</p><p>How do we eliminate these divides? Dr. Salas-Lopez says the first step is recognition. "We have to open our eyes to see the suffering around us," she said. "Northwell has not shied away from that."</p><p>"We are steadfast in improving health outcomes for our vulnerable and underrepresented communities that have suffered because of the prevalence of chronic disease, a problem that led to the disproportionately higher death rate among African-Americans and Latinos during the COVID-19 pandemic," said Michael Dowling, Northwell's president and CEO. "We are committed to using every tool at our disposal—as a provider of health care, employer, purchaser and investor—to combat disparities and ensure the <a href="https://www.northwell.edu/education-and-resources/community-engagement/center-for-equity-of-care" target="_blank" rel="noopener noreferrer">equity of care</a> that everyone deserves." </p><p>With the need recognized, Dr. Salas-Lopez calls for health care systems to travel upstream and be proactive in those hard-hit communities. This requires health care systems to play a strong role, but not a unilateral one. They must build <a href="https://www.northwell.edu/news/insights/faith-based-leaders-are-the-key-to-improving-community-health" target="_blank" rel="noopener noreferrer">partnerships with leaders in those communities</a> and utilize those to ensure relationships last beyond the current crisis. </p><p>"We must meet with community leaders and talk to them to get their perspective on what they believe the community needs are and should be for the future. 
Together, we can co-create a plan to measurably improve [community] health and also to be ready for whatever comes next," she said.</p><p>Northwell has built relationships with local faith-based and community organizations in underserved communities of color. Those partnerships enabled Northwell to test more than 65,000 people across the metro New York region. The health system also offered education on coronavirus and precautions to curb its spread.</p><p>These initiatives began the process of building trust—trust that Northwell has counted on to return to these communities to administer flu vaccines to prepare for what experts fear may be a difficult flu season.</p><p>While Northwell has begun building bridges across the divides of the New York area, much will still need to be done to cure U.S. health care overall. There is hope that the COVID pandemic will awaken us to the deep disparities in the US.</p><p>"COVID has changed our world. We have to seize this opportunity, this pandemic, this crisis to do better," Dr. Salas-Lopez said. "Provide better care. Provide better health. Be better partners. Be better community citizens. And treat each other with respect and dignity.</p><p>"We need to find ways to unify this country because we're all human beings. We're all created equal, and we believe that health is one of those important rights."</p>
With just a few strategic tweaks, the Nazis could have won one of World War II's most decisive battles.
- The Battle of Britain is widely recognized as one of the most significant battles that occurred during World War II. It marked the first major victory of the Allied forces and shifted the tide of the war.
- Historians, however, have long debated the deciding factor in the British victory and German defeat.
- A new mathematical model took into account numerous alternative tactics that the Germans could have employed and found that just two tweaks stood between them and victory over Britain.
Two strategic blunders<p>Now, historians and mathematicians from York St. John University have collaborated to produce <a href="http://www-users.york.ac.uk/~nm15/bootstrapBoB%20AAMS.docx" target="_blank">a statistical model (docx download)</a> capable of calculating what the likely outcomes of the Battle of Britain would have been had the circumstances been different. </p><p>Would the German war effort have fared better had they not bombed Britain at all? What if Hitler had begun his bombing campaign earlier, even by just a few weeks? What if they had focused their targets on RAF airfields for the entire course of the battle? Using a statistical technique called weighted bootstrapping, the researchers studied these and other alternatives.</p><p>"The weighted bootstrap technique allowed us to model alternative campaigns in which the Luftwaffe prolongs or contracts the different phases of the battle and varies its targets," said co-author Dr. Jaime Wood in a <a href="https://www.york.ac.uk/news-and-events/news/2020/research/mathematicians-battle-britain-what-if-scenarios/" target="_blank">statement</a>. Based on the different strategic decisions that the German forces could have made, the researchers' model enabled them to predict the likelihood that the events of a given day of fighting would or would not occur.</p><p>"The Luftwaffe would only have been able to make the necessary bases in France available to launch an air attack on Britain in June at the earliest, so our alternative campaign brings forward the air campaign by three weeks," continued Wood. "We tested the impact of this and the other counterfactuals by varying the probabilities with which we choose individual days."</p><p>Ultimately, two strategic tweaks shifted the odds significantly towards the Germans' favor. 
Had the German forces started their campaign earlier in the year and had they consistently targeted RAF airfields, an Allied victory would have been extremely unlikely.</p><p>Say the odds of a British victory in the real-world Battle of Britain stood at 50-50 (there's no real way of knowing what the actual odds were, so we'll just have to select an arbitrary figure). If this were the case, changing the start date of the campaign and focusing only on airfields would have reduced British chances of victory to just 10 percent. Even if a British victory originally stood at 98 percent, these changes would have cut its chances down to just 34 percent.</p>
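The weighted-bootstrap idea behind these numbers can be sketched in a few lines: resample the battle's day-by-day results many times, with the weights skewed to emphasize some days over others, and count how often each side comes out ahead. The daily margins and weights below are invented for illustration; the study works from the real battle's daily records:

```python
import random

def bootstrap_outcome(daily_margins, weights, n_resamples=10000, seed=0):
    """Weighted bootstrap over a campaign's daily results.

    daily_margins: per-day net outcome (positive favors the defender).
    weights: relative probability of drawing each day; skewing these
    models alternative campaigns that stress different phases or targets.
    Returns the fraction of resampled campaigns the defender wins
    (total margin greater than zero)."""
    rng = random.Random(seed)
    n_days = len(daily_margins)
    wins = 0
    for _ in range(n_resamples):
        sample = rng.choices(daily_margins, weights=weights, k=n_days)
        if sum(sample) > 0:
            wins += 1
    return wins / n_resamples

# Five hypothetical days of fighting; uniform weights replay history,
# while up-weighting the defender's worst day models an alternative
# campaign that leans on the attacker's most effective tactic.
margins = [1, 1, -1, -2, 3]
p_historical = bootstrap_outcome(margins, [1, 1, 1, 1, 1])
p_alternative = bootstrap_outcome(margins, [1, 1, 1, 10, 1])
```

Here the alternative weighting sharply lowers the defender's win fraction, mirroring how the study's reweighted campaigns shifted the odds toward the Luftwaffe.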
A tool for understanding history<p>This technique, said co-author Niall Mackay, "demonstrates just how finely-balanced the outcomes of some of the biggest moments of history were. Even when we use the actual days' events of the battle, make a small change of timing or emphasis to the arrangement of those days and things might have turned out very differently."</p><p>The researchers also claimed that their technique could be applied to other uncertain historical events. "Weighted bootstrapping can provide a natural and intuitive tool for historians to investigate unrealized possibilities, informing historical controversies and debates," said Mackay.</p><p>Using this technique, researchers can evaluate other what-ifs and gain insight into how differently influential events could have turned out if only the slightest things had changed. For now, at least, we can all be thankful that Hitler underestimated Britain's grit.</p>
The next era in American history can look entirely different. It's up to us to choose.
- The timeline of America post-WWII can be divided into two eras, according to author and law professor Ganesh Sitaraman: the liberal era which ran through the 1970s, and the current neoliberal era which began in the early 1980s. The latter promised a "more free society," but what we got instead was more inequality, less opportunity, and greater market consolidation.
- "We've lived through a neoliberal era for the last 40 years, and that era is coming to an end," Sitaraman says, adding that the ideas and policies that defined the period are being challenged on various levels.
- What comes next depends on whether we take a proactive and democratic approach to shaping the economy, or simply react to and "deal with" market outcomes.
A new MIT report proposes how humans should prepare for the age of automation and artificial intelligence.
- A new report by MIT experts proposes what humans should do to prepare for the age of automation.
- The rise of intelligent machines is coming, but it's important to resolve human issues first.
- Improving economic inequality, skills training, and investment in innovation are necessary steps.