Nuclear power is not the answer in a time of climate change
It makes "no economic or energy sense."
In November 2018, the Woolsey Fire scorched nearly 100,000 acres of Los Angeles and Ventura counties, destroying forests, fields and more than 1,500 structures, and forcing the evacuation of nearly 300,000 people over 14 days.
It burned so viciously that it seared a scar into the land that's visible from space. Investigators determined that the Woolsey Fire began at the Santa Susana Field Laboratory, a nuclear research site contaminated by the 1959 partial meltdown of its failed Sodium Reactor Experiment, as well as by rocket tests and routine releases of radiation.
The State of California's Department of Toxic Substances Control (DTSC) reports that its air, ash and soil tests conducted on the property after the fire show no release of radiation beyond baseline for the contaminated site. But the DTSC report lacks sufficient information, according to the Bulletin of Atomic Scientists. It includes 'few actual measurements' of the smoke from the fire, and the data raises alarms. Research on Chernobyl in Ukraine following wildfires in 2015 shows clear release of radiation from the old nuclear power plant, calling into question the quality of DTSC's tests. What's more, scientists such as Nikolaos Evangeliou, who studies radiation releases from wildfires at the Norwegian Institute for Air Research, point out that the same hot, dry and windy conditions exacerbating the Woolsey Fire (all related to human-caused global warming) are a precursor to future climate-related radioactive releases.
With our climate-impacted world now highly prone to fires, extreme storms and sea-level rise, nuclear energy is touted as a possible replacement for the burning of fossil fuels for energy – the leading cause of climate change. Nuclear power can demonstrably reduce carbon dioxide emissions. Yet scientific evidence and recent catastrophes call into question whether nuclear power could function safely in our warming world. Wild weather, fires, rising sea levels, earthquakes and warming water temperatures all increase the risk of nuclear accidents, while the lack of safe, long-term storage for radioactive waste remains a persistent danger.
The Santa Susana Field Laboratory property has had a long history of contaminated soil and groundwater. Indeed, a 2006 advisory panel compiled a report suggesting that workers at the lab, as well as residents living nearby, had unusually high exposure to radiation and industrial chemicals that are linked to an increased incidence of some cancers. Discovery of the pollution prompted California's DTSC in 2010 to order a cleanup of the site by its current owner – Boeing – with assistance from the US Department of Energy and NASA. But the required cleanup has been hampered by Boeing's legal fight to perform a less rigorous cleaning.
Like the Santa Susana Field Lab, Chernobyl remains largely unremediated since its meltdown in 1986. With each passing year, dead plant material accumulates and temperatures rise, making it especially prone to fires in the era of climate change. Radiation releases from contaminated soils and forests can be carried thousands of kilometres away to human population centres, according to Evangeliou.
Kate Brown, a historian at the Massachusetts Institute of Technology and the author of Manual for Survival: A Chernobyl Guide to the Future (2019), and Tim Mousseau, an evolutionary biologist at the University of South Carolina, also have grave concerns about forest fires. 'Records show that there have been fires in the Chernobyl zone that raised the radiation levels by seven to 10 times since 1990,' Brown says. Further north, melting glaciers contain 'radioactive fallout from global nuclear testing and nuclear accidents at levels 10 times higher than elsewhere'. As ice melts, radioactive runoff flows into the ocean, is absorbed into the atmosphere, and falls as acid rain. 'With fires and melting ice, we are basically paying back a debt of radioactive debris incurred during the frenzied production of nuclear byproducts during the 20th century,' Brown concludes.
Flooding is another symptom of our warming world that could lead to nuclear disaster. Many nuclear plants are built on coastlines where seawater is easily used as a coolant. Sea-level rise, shoreline erosion, coastal storms and heat waves – all potentially catastrophic phenomena associated with climate change – are expected to get more frequent as the Earth continues to warm, threatening greater damage to coastal nuclear power plants. 'Mere absence of greenhouse gas emissions is not sufficient to assess nuclear power as a mitigation for climate change,' conclude Natalie Kopytko and John Perkins in their paper 'Climate Change, Nuclear Power, and the Adaptation-Mitigation Dilemma' (2011) in Energy Policy.
Proponents of nuclear power say that the reactors' relative reliability and capacity make them a much clearer choice than other non-fossil-fuel sources of energy, such as wind and solar, which are sometimes brought offline by fluctuations in natural resource availability. Yet no one denies that older nuclear plants, with ageing infrastructure often operating past its expected lifetime, are extremely inefficient and run a higher risk of disaster.
'The primary source of nuclear power going forward will be the current nuclear fleet of old plants,' said Joseph Lassiter, an energy expert and nuclear proponent who is retired from Harvard University. But 'even where public support exists for [building new] nuclear plants, it remains to be seen if these new-build nuclear plants will make a significant contribution to fossil-emissions reductions given the cost and schedule overruns that have plagued the industry.'
Lassiter and several other energy experts advocate for the new, Generation IV nuclear power plants that are supposedly designed to deliver high levels of nuclear power at the lowest cost and with the lowest safety risks. But other experts say that the benefits even here remain unclear. The biggest critique of the Generation IV nuclear reactors is that they are in the design phase, and we don't have time to wait for their implementation. Climate abatement action is needed immediately.
'New nuclear power seemingly represents an opportunity for solving global warming, air pollution, and energy security,' says Mark Jacobson, director of Stanford University's Atmosphere and Energy Programme. But it makes no economic or energy sense. 'Every dollar spent on nuclear results in one-fifth the energy one would gain with wind or solar [at the same cost], and nuclear energy takes five to 17 years longer before it becomes available. As such, it is impossible for nuclear to help with climate goals of reducing 80 per cent of emissions by 2030. Also, while we're waiting around for nuclear, coal, gas and oil are being burned and polluting the air. In addition, nuclear has energy security risks other technologies don't have: weapons proliferation, meltdown, waste and uranium-worker lung-cancer risks.'
Around the world, 31 countries have nuclear power plants currently online, according to the International Atomic Energy Agency. By contrast, four countries have made moves to phase out nuclear power following the 2011 Fukushima disaster, and 15 countries have remained opposed and have no functioning nuclear plants.
With almost all countries' carbon dioxide emissions increasing – and China, India and the US leading the pack – the small Scandinavian country of Denmark is an outlier. Its carbon dioxide emissions are decreasing despite it not producing any nuclear power. Denmark does import some nuclear power produced by its neighbours Sweden and Germany, but in February, the country's most Left-leaning political party, Enhedslisten, published a new climate plan that outlines a path for the country to start relying on its own 100 per cent renewable, non-nuclear energy for power and heat production by 2030. The plan would require investments in renewables such as solar and wind, a smart grid and electric vehicles that double as mobile batteries and can recharge the grid during peak hours.
Gregory Jaczko, former chairman of the US Nuclear Regulatory Commission and the author of Confessions of a Rogue Nuclear Regulator (2019), believes the technology is no longer a viable method for dealing with climate change: 'It is dangerous, costly and unreliable, and abandoning it will not bring on a climate crisis.'
This article was originally published at Aeon and has been republished under Creative Commons.