I usually write optimistic posts. This is going to be a scary one. I apologize in advance.
While I was in Columbus last month, I mentioned the furnace-like heat. Well, it wasn't just me. In what's becoming an annual tradition, heat records are being broken all across the country. And it's not just hot, it's dry: more than 80% of the continental U.S. is experiencing abnormally dry or drought conditions, with no relief in sight. Corn and other crops are being decimated by the drought, the worst in over 50 years, which has led Agriculture Secretary Tom Vilsack to announce that he's praying for rain.
Nor is it just crops that are suffering. Colorado is burning in the worst wildfire season in its history, with hot, dry weather fanning massive blazes that are incinerating rural communities and prompting mass evacuations. Fire experts warn that this could be the future for Colorado: wildfires raging unchecked all summer, every summer, until they're finally quenched by the autumn rains. And one more: it's so hot that nuclear power plants in the Midwest are faced with shutdowns, because the water they usually draw from lakes and rivers to cool their cores is too warm.
The usual caution of climate scientists is that climate is a large-scale trend, whereas weather is a local phenomenon subject to all the random ups and downs of chance. One hot summer doesn't prove the pattern, just as one cold winter doesn't disprove it. Nevertheless, there comes a point where the anomalies begin to pile up; where the accumulating weight of evidence forces the objective observer toward a particular conclusion. The Earth is changing, it's changing because of us, and we're starting to feel the brunt of it.
I don't think there's a good term in popular use to describe what's happening to our planet. "Climate change" is too sterile, too antiseptic. "Global warming" is misleading: it's technically accurate but gives the impression that we can expect one uniform blanket of change, when the reality is that different parts of the world will be affected in different ways.
I have a suggestion. What we're bringing back, by our heedless burning of hydrocarbons, is the kind of climate the planet last saw millions of years ago, when the dinosaurs reigned; when the carbon that's now buried in coal banks and oil fields was aboveground, cloaking the Earth in sweltering jungle. How about we call it "dinosaur weather"?
We environmentalists have always prided ourselves on being clear-eyed realists, so we need to face facts: it's too late to avert this. We're beginning the shift to renewable, carbon-neutral power sources, but too much carbon is already in the air, and fossil fuels still have far too much momentum. This planet is going to change, in ways we can't fully foresee. The only question is how much damage we're going to cause - how much worse it's going to get before we stop digging up and burning the buried forests of the dinosaurs.
To get a glimpse of one possible answer, consider an engrossing site called the World Dream Bank, whose author imagines whole planets and then illustrates them in meticulous, plausible detail. Some are totally alien; some are strange variations on our own planet, hypothetical Earths from parallel universes. (There's also lots of weird pseudoscience on other sections of the site, which I obviously don't endorse.)
One of these parallel worlds is called Dubia (and you'll have to click through and read to see the explanation of why it's called that). The concept is simple: it's what Earth would look like with carbon-dioxide levels of 700 parts per million, about double what we currently have. This leads to massive climatic changes, which lead in turn to geographical changes: with no more polar ice caps, the sea level rises by over 100 meters, and the world is inundated.
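The Dubia page doesn't show its arithmetic, but the rough scale of warming from a doubling of CO2 can be sketched with the standard simplified forcing formula. This is a back-of-envelope illustration only: the 5.35 W/m² coefficient (Myhre et al., 1998) and the 0.8 K per W/m² sensitivity are textbook approximations, not figures taken from the post.

```python
import math

# Simplified radiative forcing from a change in CO2 concentration:
#   dF = 5.35 * ln(C / C0)  [W/m^2]
# Equilibrium warming is then roughly dT = lambda * dF.
C0 = 280.0   # approximate pre-industrial CO2, ppm
C = 700.0    # Dubia's assumed concentration, ppm

forcing = 5.35 * math.log(C / C0)   # W/m^2
sensitivity = 0.8                   # K per (W/m^2), a mid-range estimate
warming = sensitivity * forcing     # K of equilibrium warming

print(f"forcing ~ {forcing:.1f} W/m^2, warming ~ {warming:.1f} K")
```

On these assumptions, 700 ppm yields roughly 5 W/m² of forcing and about 4 K of global-mean warming, which is the general ballpark the Dubia scenario imagines, before accounting for slower feedbacks like ice-sheet collapse that drive the sea-level rise.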
Scary as that sounds, the overall impression you get if you read all the way through is that Dubia isn't a desert wasteland or a barren inferno. It's a lush, perfectly livable, even hospitable world. The problem is that its livable regions aren't where our planet's livable regions are. And to get there from here would entail disruptions on an unimaginable scale: the migrations of billions of people, cities and nations drowned or abandoned to desert, agriculture completely uprooted and restarted from scratch in the new fertile areas. There would be mass extinctions, famines, wars, and probably much worse. The human species would survive; whether civilization would make it through, and in what form, is a different question.
Here's what we'd have to give up, to get from Earth to Dubia. Here's what would be underwater in Europe:
Scandinavia, Spain, Brittany and Normandy are now islands, and the northern plain is gone, from Belgium to Murmansk. So are Athens, Venice, London, Brussels, Amsterdam, Hamburg, Copenhagen, Helsinki, St. Petersburg.
And in southern Asia:
India's heartland, the Ganges Plain, is gone. The sea's crept up the Brahmaputra valley, too, nearly to the Chinese border. The new breadbaskets are the green Deccan and what's left of the Indus Valley, where the rains have increased. But even the Thar Desert and Pakistan's dry mountains are grasslands, while the Punjab, straddling the Indo-Paki border, is downright lush. The Rann of Kutch is now a great sound; the coastal hills are an island-chain stretching all the way to Bombay, now a modest island town. Calcutta has, of course, been obliterated.
The nation of Bangladesh is gone too.
So is half Burma. And Thailand. And southern Cambodia and Vietnam... Singapore is long abandoned--just another rusty reef.
In America, Florida is of course gone. Louisiana is gone. Alabama and Mississippi are partially swallowed by a new inland sea. And in the Northeast:
New England's now an island, cut off by the St. Lawrence and the narrow Hudson Straits. I won't dwell on the view from the Hudson Palisades, looking out at the great rust-red towers rising from the sea--it's such a cliché, repeatable all the way from Toronto to Boston to Washington. Instead let's admire Niagara Falls pouring into the sea. No, no, I exaggerate--it's still a good five miles from the beaches of Ontario Sound.
Is this humanity's future? In all likelihood, we'll never know personally. We've caught the first rumblings of it, but we're not going to live to see this new world. It will be our distant descendants who'll have to live with the full measure of what we've done to our planet. But just because we'll escape the worst consequences doesn't make our selfishness any less deplorable.
We may yet avoid the worst of this. Perhaps our economy will reach a tipping point and decarbonize faster than anyone expected; perhaps we'll invent some kind of geoengineering technology to suck all the carbon out of the atmosphere and restore our climate to its former state. But it would be foolishness to count on something like this happening, and I'm increasingly pessimistic. I still believe that great things lie ahead for humanity; that our future will be more peaceful, more free and more enlightened than our present. But this will be another giant hurdle we'll have to surmount to reach that future state, and like many of the others, it's one we created for ourselves.
Researchers hope the technology will further our understanding of the brain, but lawmakers may not be ready for the ethical challenges.
- Researchers at the Yale School of Medicine successfully restored some functions to pig brains that had been dead for hours.
- They hope the technology will advance our understanding of the brain, potentially developing new treatments for debilitating diseases and disorders.
- The research raises many ethical questions and puts to the test our current understanding of death.
The image of an undead brain coming back to life is the stuff of science fiction. Not just any science fiction, but B-grade sci-fi specifically. What instantly springs to mind are the black-and-white horrors of films like Fiend Without a Face. Bad acting. Plastic monstrosities. Visible strings. And a spinal cord that, for some reason, is also a tentacle?
But like any good science fiction, it's only a matter of time before some manner of it seeps into our reality. This week's Nature published the findings of researchers who managed to restore function to pigs' brains that were clinically dead. At least, what we once thought of as dead.
What's dead may never die, it seems
The researchers did not hail from House Greyjoy — "What is dead may never die" — but came largely from the Yale School of Medicine. They connected 32 pig brains to a system called BrainEx. BrainEx is an artificial perfusion system — that is, a system that takes over the functions normally regulated by the organ. Think of it as a dialysis machine for the mind. The pigs had been killed four hours earlier at a U.S. Department of Agriculture slaughterhouse, and their brains had been completely removed from their skulls.
BrainEx pumped an experimental solution into the brain that essentially mimicked blood flow. It brought oxygen and nutrients to the tissues, giving brain cells the resources to resume many normal functions. The cells began consuming and metabolizing sugars. The brains' immune systems kicked in. Neuron samples could carry an electrical signal. Some brain cells even responded to drugs.
The researchers have managed to keep some brains going for up to 36 hours, and they currently do not know whether BrainEx could have sustained the brains longer. "It is conceivable we are just preventing the inevitable, and the brain won't be able to recover," said Nenad Sestan, the Yale neuroscientist who led the research.
As a control, other brains received either a fake solution or no solution at all. None revived brain activity, and all deteriorated as expected.
The researchers hope the technology can enhance our ability to study the brain and its cellular functions. One of the main avenues for such studies would be brain disorders and diseases. This could point the way to new treatments for the likes of brain injuries, Alzheimer's, Huntington's, and other neurodegenerative conditions.
"This is an extraordinary and very promising breakthrough for neuroscience. It immediately offers a much better model for studying the human brain, which is extraordinarily important, given the vast amount of human suffering from diseases of the mind [and] brain," Nita Farahany, the bioethicists at the Duke University School of Law who wrote the study's commentary, told National Geographic.
An ethical gray matter
Before anyone gets an Island of Dr. Moreau vibe, it's worth noting that the brains did not exhibit neural activity anywhere near the level associated with consciousness.
The BrainEx solution contained chemicals that prevented neurons from firing. To be extra cautious, the researchers also monitored the brains for any such activity and were prepared to administer an anesthetic if they saw signs of consciousness.
Even so, the research signals a massive debate to come regarding medical ethics and our definition of death.
Most countries define death, clinically speaking, as the irreversible loss of brain or circulatory function. This definition was already at odds with some folk- and value-centric understandings, but where do we go if it becomes possible to reverse clinical death with artificial perfusion?
"This is wild," Jonathan Moreno, a bioethicist at the University of Pennsylvania, told the New York Times. "If ever there was an issue that merited big public deliberation on the ethics of science and medicine, this is one."
One possible consequence involves organ donations. Some European countries require emergency responders to use a process that preserves organs when they cannot resuscitate a person. They continue to pump blood throughout the body, but use a "thoracic aortic occlusion balloon" to prevent that blood from reaching the brain.
The system is already controversial because it raises concerns about what caused the patient's death. But what happens when brain death becomes readily reversible? Stuart Younger, a bioethicist at Case Western Reserve University, told Nature that if BrainEx were to become widely available, it could shrink the pool of eligible donors.
"There's a potential conflict here between the interests of potential donors — who might not even be donors — and people who are waiting for organs," he said.
It will be a while before such experiments go anywhere near human subjects. A more immediate ethical question relates to how such experiments harm animal subjects.
Ethical review boards evaluate research protocols and can reject any that cause undue pain, suffering, or distress. Since dead animals feel no pain and suffer no trauma, they are typically approved as subjects. But how do such boards make a judgment about the suffering of a "cellularly active" brain? The distress of a partially alive brain?
The dilemma is unprecedented.
Setting new boundaries
Another science fiction story that comes to mind when discussing this story is, of course, Frankenstein. As Farahany told National Geographic: "It is definitely has [sic] a good science-fiction element to it, and it is restoring cellular function where we previously thought impossible. But to have Frankenstein, you need some degree of consciousness, some 'there' there. [The researchers] did not recover any form of consciousness in this study, and it is still unclear if we ever could. But we are one step closer to that possibility."
She's right. The researchers undertook their research for the betterment of humanity, and we may one day reap some unimaginable medical benefits from it. The ethical questions, however, remain as unsettling as the stories they remind us of.