Will America’s disregard for science be the end of its reign?
Confirmation bias is baked into the DNA of America, but it may soon be the nation's undoing.
MICHAEL SHERMER: Because of the internet, especially, this whole idea of what we now call fake news, alternative facts has gotten bigger and bigger.
KURT ANDERSEN: You look at this history and it's like, "Oh, we should've seen this coming."
We were softened up as a people to believe what we want to believe.
NEIL DEGRASSE TYSON: This is irresponsible. Plus, it means you don't know how science works.
MARGARET ATWOOD: People do not want to give up their cherished beliefs, especially cherished beliefs that they find comforting.
ANDERSEN: We have this new infrastructure, that I think is new, that I think is a new condition. In 1860, southerners didn't say, "Oh, no, there are no slaves. No, no, no, there's no slavery."
BILL NYE: The United States used to be the world leader in technology, but when you have this group of leaders, elected officials, who are anti-science, you're setting the US back and then ultimately setting the world back.
KURT ANDERSEN: Americans have always been magical thinkers and passionate believers in the untrue. We were started by the Puritans in New England who wanted to create, and did create, a Christian utopia and theocracy as they waited for the imminent second coming of Christ and the end of days. And in the South by a bunch of people who were convinced, absolutely convinced, that this place they'd never been was full of gold just to be plucked from the dirt in Virginia. And they stayed there looking and hoping for gold for 20 years before they finally, finally faced the facts and the evidence and decided that they weren't going to get rich overnight there.
So that was the beginning. And then we've had centuries of 'buyer beware' charlatanism to an extreme degree and medical quackery to an extreme degree, and increasingly exotic, extravagant, implausible religions over and over again, from Mormonism to Christian Science to Scientology in the last century. And we've had this anti-establishment attitude, "I'm not going to trust the experts. I'm not going to trust the elite," in our character from the beginning. Now, all those things came together and were supercharged in the 1960s, when you were entitled to your own truth and your own reality. Then, a generation later, the internet came along, giving each of those realities, no matter how false or magical or nutty it is, its own kind of media infrastructure.
We've had entertainment, again, for our whole last couple hundred years, but especially in the last 50 years, permeating all the rest of life, including presidential politics, from John F. Kennedy through Ronald Reagan to Bill Clinton. So, the thing was set up for Donald Trump to exploit all these various American threads and astonishingly become president. But then you look at this history and it's like, "Oh, we should've seen this coming."
TYSON: The power of journalism: A mistake becomes truth. Print journalism takes what I said and turns it into an article, so it has to pass through the journalist, get processed, and then it becomes some written content on a page. In one hundred percent of those experiences, the journalist got something fundamentally wrong about the subject matter. And just as an interesting point about the power of journalists, I had people read the article and say, "Neil, you must know better than that. That's not how this works." They assumed the journalist was correct about reporting what I said, not that I was correct and that the journalist was wrong. This is an interesting power that journalists have over whether you think what they're writing is true or not. That was decades ago. In recent years, what I think has happened is that there are more journalists who are science fluent writing about science than was the case 20 years ago. So now I don't have to worry about the journalist missing something fundamental about what I'm trying to describe. And reporting has been much more accurate in recent years, I'm happy to report. However, there's something that has not been fixed in journalism yet: the urge to get the story first, the science story, the breaking news about a discovery. The urge to get it first means they're reporting on something that's not yet verified by other scientific experiments. If it's not yet verified, it's not there yet. And you're more likely to write about a story that is most extraordinary. And the more extraordinary the single scientific result is, the less likely it is that it's going to be true. So you need some restraint there or some way to buffer the account. I don't want you to not talk about it, but say, "This is not yet verified. It's not yet this, it's not yet that. And it's been criticized by these other people anyway."
So be more open about how wrong the thing you're reporting on could be, because otherwise you're doing a disservice to the public. And that disservice is that people out there say, "Scientists don't know anything." But what gives you that idea? "Well, one week cholesterol is good for you and the next week it's bad for you. They don't know what they're doing!" That's on the frontier. On the frontier, science is flip-flopping all the time. Yes, if you're going to report from the frontier, it looks like scientists are clueless about everything. You take a few steps behind the line, where experiments have verified and re-verified results, that's the stuff for the textbooks. That's the stuff that is objectively true. That's the stuff you should be paying attention to. That's the stuff where you should be thinking about laws and legislation related to it. If you speak to journalists, they say, "We need a fair and balanced article. So if you say this, we will go to someone else with the opposite view and that way it's fair and balanced." Where do you draw the line? You realize Earth goes around the Sun, right? "Oh yeah. Of course." If someone says the Sun goes around the Earth, are you going to give them equal time? "Well, of course not, because that's just ridiculous." Fine. Now, how about how much column space you're giving to climate change? "Well, there are scientists who say it's real and scientists who say it's not. So we're giving them equal time, equal space." Are they equal in the literature? No. Are they equal in impact? No. Are they equal in any way? No. Except in your journalistic philosophy, you want to give more column space to something that is shown to be false by the consensus of observation and experiment that's out there. And you think you're honoring your journalistic credo, but you're not—not on that level. It's like saying the Sun goes around the Earth, as far as I'm concerned. That's patently absurd to you.
So you've got to know where you draw that line, because with matters of science, it's not simply, 'What's the opposite opinion I can get on it?' Look to see how much scientific agreement has descended upon that statement. And if there's not much agreement, then fine, talk about the whole frontier. There's plenty of that. Just go to any scientific conference. You want to get multiple views on something? That's where you'd get it. But the moment something enters the canon of objective knowledge and objective truths, that's the kind of emergent truth that we have with climate change. Humans warming the planet. That's the kind of agreement we have in scientific research. Oh, you think it's some other way. You want it to be... That's odd. If you went to your doctor with some ailment and the doctor said, "You can take this pill, which three percent of all research says will cure you, or you can take this pill, which 97 percent of all research says will cure you," which one are you going to walk away from the doctor's office with? The 97 percent pill, of course. Yet, you walk out of there and say, "Oh, I believe the three percent who say we're not warming the planet." This is irresponsible. Plus, it means you don't know how science works.
SHERMER: Because of the internet, especially, this whole idea of what we now call fake news, alternative facts, has gotten bigger and bigger, and it just gets unfolded in real time, online, within minutes and hours. And we have to jump on it fast. What the skeptical movement has developed is a set of tools for particular claims that are on the margins of science, like creationism, intelligent design theory, the anti-vaccination movement, the Holocaust revisionists, all these conspiracy theories, all these alternative medicines. And there are hundreds and hundreds of these claims that are all connected to different sciences, but the scientists in those particular fields are too busy working on their research to bother with these claims, because the claims really aren't about those fields. They're just hooked to them. They're about something else. Back in the '80s, when I first saw some professional scientists debate Duane Gish, the young-Earth creationist, they did not fare well. And I saw some Holocaust historians debating or confronting Holocaust so-called revisionists or deniers. They did not fare well, because they didn't know the special arguments being made by these fringe people, arguments that have nothing to do with the science, really. They have an agenda and they're using these little tweak questions to get at the mainstream and try to debunk it for their own ideological reasons. So, for example, Holocaust revisionists make this big deal about why the door on the gas chamber at Mauthausen doesn't lock. I mean, if it doesn't lock, how are you gassing people, if you can't lock the door? So they must not have gassed people in there. So if they didn't gas people at Mauthausen, they probably didn't gas people at any of the death camps, and if they didn't gas people at any of the death camps, then there must not have been a Holocaust. What? Wait a minute, what? All from this door that doesn't lock?
Well, I eventually went and found out that that wasn't the original door; that took me a couple of years. But that's the kind of specialty thing that skeptics do that mainstream scientists, scholars, and historians don't have time to do.
ANDERSEN: The idea of America from the beginning was that you could come here, reinvent yourself, be anybody you want, live any way you wanted, believe anything you wanted. For the first few hundred years, like everywhere else in the world, celebrity and fame were a result of some kind of accomplishment or achievement, sometimes not a great accomplishment or achievement, but you did something in the world to earn renown. America really was the key place that invented modern celebrity culture, which, beginning a century ago, was more and more not necessarily about having won a war or led people or written a great book or painted a great painting, but about being famous. Fame for its own sake. We created that. We created Hollywood, we created the whole culture industry, and that then became what I call the fantasy-industrial complex where, certainly in the last few decades, more than ever, more than anybody thought possible before, fame for its own sake, fame itself—however you got it—was a primary goal for people. And again, as with so many of the things I talk about in 'Fantasyland', not uniquely in America, but more here than anywhere. And then you get reality television, which has been this unholy hybrid of the fictional and the real for, now, a generation, where that blur between 'What's real and what's not?' is pumped into our media stream, willy-nilly. There are now more reality shows on television than there were shows on television 20 years ago.
ATWOOD: If you look at the history of what happened to Darwin when he published, what would you call that? Yes, he was hugely attacked at the time. And it's often the case that people do not want to give up their cherished beliefs, especially cherished beliefs that they find comforting. So, it's no good for Richard Dawkins to say, 'Let us stand on the bold, bare promontory of truth and acknowledge the basic nothingness of ourselves.' People don't find that cozy. So they will go around the block not to do that. And that's very understandable and human. And religious thinking, the idea that there's somebody bigger than you out there who might be helpful to you if certain rules are observed, goes back so far that we probably have an epigene or something, or a cluster of epigenes, for it. And you see it a lot in small children. There is a monster under the bed and you can't tell them there isn't—they don't find that reassuring. What you can tell them is, "Yes, there is a monster under that bed. But as long as I put this cabbage right in this spot, it can't come out."
ANDERSEN: Like all humans, Americans suffer from what's called confirmation bias, which is, "Oh, I believe this. I will look for facts or pseudo-facts or fictions that confirm my preexisting beliefs." Americans long before psychologists invented that phrase, confirmation bias, had that tendency. Again, at the very beginning: 'I've never been to the new world. Nobody I know has been in the new world. I've never really read any firsthand accounts of the new world, but I'm going to give up my life and go there because it's going to be awesome and perfect. And I'm going to get rich overnight and/or create a Christian utopia.' So we began that way and that has kept up. I just want to believe what I want to believe. And don't let your lying eyes tell you anything different.
ATWOOD: When science is telling you something that you really find very inconvenient, you resist it, and that has been the history of global warming and the changes that we are certainly already seeing around us. First, there was denial. 'It cannot be happening.' Now, there's grudging admission as things flood and droughts kick in and food supplies drop, and the sea level rises, and the glaciers melt, big time. I have seen that, been there. You can't deny that it's happening, but you then have to pretend that it's nothing to do with us, so we don't have to change our behavior. That's the thinking around that.
WADE CROWFOOT: If we ignore that science and put our head in the sand and think it's all about vegetation management, we're not going to succeed together protecting Californians.
DONALD TRUMP: Okay, it'll start getting cooler. You just watch.
CROWFOOT: I wish science agreed with you.
TRUMP: Well, I don't think science knows, actually.
ATWOOD: And that can get very entrenched until people see that by trying to solve the problem, jobs can be created and money can be made. And that will be the real tipping point in public consciousness in this country. Other countries are already there.
ANDERSEN: Believing whatever nutty thing you want to believe, or pretending you are whatever you are, or having even kooky conspiracy theories or speaking in tongues, whatever it is, fine—if it's private. The problem is when that, as it has in the last couple of decades especially, leaches into the public sphere and the policy sphere and like, "Nah, there's no global warming. We don't have to worry about the seas rising," or "Nah, scientists say that vaccines are safe but I think they cause autism, so I'm not going to vaccinate my children." And so on and so on and so on. That's when the rubber hits the road—will hit the road—and people will start saying, "Wait a minute." Not until then, not until there's a consequence and not until there's a price to pay.
NYE: By having a population of people who don't really understand germs and how serious they are, the germ gets spread really readily. There is a faction of our leaders, elected officials, who continually cuts the budget for the Centers for Disease Control, which to me reflects an ignorance of how serious germs can be. In my opinion, we should be supporting that research full bore—but at the same time, don't curtail research in other germs, which is going on at the Centers for Disease Control, for example, all the time. That's not where you save your money, Congress. But if you don't believe in the seriousness of it, and you have a mistrust of scientists, if you have a mistrust of engineers, you're not going to help us out with that, are you? So it's a very serious concern of mine. I mean, the United States used to be the world leader in technology, but when you have this group of leaders, elected officials, who are anti-science, you're setting the US back and then ultimately setting the world back.
SHERMER: Let's address the college campus issue these days. I really think this goes back to the 1980s. I noticed it first when I was in graduate school the second time when I got a PhD in the history of science. My first round was in the '70s in experimental psychology, graduate school, and I didn't notice any of this campus stuff. In the late '80s, when I was in my doctoral program, because history deals a lot with literature, the kind of post-modernist deconstruction of what texts mean was really taking off. And so I initially thought, "What is this? But, okay I'll give it a shot. I'll keep an open mind here and just try to follow the reasoning." And I can kind of see where they were going. So what is the true meaning of Jane Austen's novel here, or Shakespeare's play there, or this novelist or that author? And I can see that there may not be one meaning; maybe the author meant it as provoking you to think about certain deep issues and you have to find your own meaning in the text. Okay, I can understand that, but then it kind of started to spill over into history and I was studying the history of science. And I like to think of science as progressing towards some better understanding of reality that I believe is really there. And it's not that science is perfect and we're going to get to a perfect understanding of reality—I know that's not going to happen. But it's not the same as literature. It's not the same as art and music. It's different than that. If Darwin hadn't discovered evolution, somebody else would have—in fact, somebody did: Alfred Russel Wallace discovered natural selection as the mechanism of evolution. And if Newton hadn't discovered the calculus somebody else would've. Well, they did: Leibniz. And so on. These are things that are out there to be discovered, and I see that differently than art and music and literature, which is constructing ideas out of your mind.
So, I don't think that the post-modern deconstruction of the text applies completely to history. And you can see immediately why it fails, because this is what led to, in the '90s, the whole Holocaust denial movement, the so-called revisionists. They call themselves revisionists, and the argument was: 'All history is text. It's just written by the winners, and the winners write themselves as the good guys and the losers as the bad guys. And this is all unfair. And, look, maybe the winners here have unfairly critiqued Hitler and the Nazis,' and so on. Yeah. But what about that Holocaust thing? It looks pretty bad. 'Yeah. Yeah. Well, maybe it didn't happen the way we have been led to believe it happened because, again, the history of the Holocaust is written by the winners.' You can see immediately why this kind of textual analysis can cascade into complete moral relativism and insane ideas like Holocaust denial. That's when I thought, okay, this is wrong. This has gone too far. And in the mid-'90s, after we founded the Skeptics Society and Skeptic magazine in '92, this was one of the earliest things we started going after, because it was around '95 or so that the so-called 'science wars' took off. The claim was that science is just another way of knowing the world, no different and no better than any other way of knowing the world. Wait, wait, wait, time out. What was that part about, we're just like everybody else? Science has its flaws, but it's not just like art or music. It's different.
So then, by the 2000s, I think this really trickled down into all of the social sciences: anthropology, biology, evolutionary biology, and just attack, attack, attack to the point where any particular viewpoint that an oppressed minority finds offensive—or anybody finds offensive—can be considered a kind of hate speech or a kind of violence. And you can sort of see the reasoning from back in the 1980s all the way through to today. You can see how they get there, but we should have drawn that line and stopped—well, a bunch of us tried to stop it back in the '90s. And well, it had a momentum of its own.
ANDERSEN: What has been enabled in the last 30 years, first through deregulated talk radio where you didn't have to be fair and balanced anymore, then national cable television—Fox News comes to mind—and then, of course, the internet as well, is that not just politically different points of view but alternate factual realities could be portrayed and depicted. We've been in that state now for 20 years or more. Again, we were softened up as a people to believe what we want to believe, but we have this new infrastructure that I think is new, that I think is a new condition. So, there's a history of, "Oh, I believe this," or "I believe this." Or "Slavery is good." "No, slavery is bad." Those are disagreements, but in 1860 southerners didn't say, "Oh, no, there are no slaves. No, no, no, there's no slavery." That's the condition we have now, that is the Kellyanne-Conway-Donald-Trump situation—and the Republican Party situation before Donald Trump ever came along—where we say, "No, no, there's no climate change." Or, "Oh, this factual truth is not true." That's the new thing. And this new media infrastructure is a new condition. Now it may not be the end of things as a result, but we don't know yet. We're only 20 years into it. And maybe we'll learn new protocols for what to believe and what not to, and we'll grow up and be able to accommodate ourselves to this new media situation. But I'm worried that we won't, and I'm worried that a significant fraction of us—for now, mostly on the right, but there's no reason it should be limited to the right—will be in their bubble and their silo and with their own reality and not be able to be retrieved into the reality-based world.
- From America's inception, there has always been a rebellious, anti-establishment mentality. That way of thinking has become more reckless now that the entire world is interconnected and there are added layers of verification (or repudiation) of facts.
- As the great minds in this video can attest, there are systems and mechanisms in place to discern between opinion and truth. By making conscious efforts to undermine and ignore those systems at every turn (climate change, conspiracy theories, coronavirus, politics, etc.), America has compromised its position of power and effectively stunted its own growth.
- A part of the problem, according to writer and radio host Kurt Andersen, is a new media infrastructure that allows for false opinions to persist and spread to others. Is it the beginning of the end of the American empire?
Brain cells snap strands of DNA in many more places and cell types than researchers previously thought.
The urgency to remember a dangerous experience requires the brain to make a series of potentially dangerous moves: Neurons and other brain cells snap open their DNA in numerous locations — more than previously realized, according to a new study — to provide quick access to genetic instructions for the mechanisms of memory storage.
The extent of these DNA double-strand breaks (DSBs) in multiple key brain regions is surprising and concerning, says study senior author Li-Huei Tsai, Picower Professor of Neuroscience at MIT and director of The Picower Institute for Learning and Memory, because while the breaks are routinely repaired, that process may become more flawed and fragile with age. Tsai's lab has shown that lingering DSBs are associated with neurodegeneration and cognitive decline and that repair mechanisms can falter.
"We wanted to understand exactly how widespread and extensive this natural activity is in the brain upon memory formation because that can give us insight into how genomic instability could undermine brain health down the road," says Tsai, who is also a professor in the Department of Brain and Cognitive Sciences and a leader of MIT's Aging Brain Initiative. "Clearly, memory formation is an urgent priority for healthy brain function, but these new results showing that several types of brain cells break their DNA in so many places to quickly express genes is still striking."
In 2015, Tsai's lab provided the first demonstration that neuronal activity caused DSBs and that they induced rapid gene expression. But those findings, mostly made in lab preparations of neurons, did not capture the full extent of the activity in the context of memory formation in a behaving animal, and did not investigate what happened in cells other than neurons.
In the new study published July 1 in PLOS ONE, lead author and former graduate student Ryan Stott and co-author and former research technician Oleg Kritsky sought to investigate the full landscape of DSB activity in learning and memory. To do so, they gave mice little electrical zaps to the feet when they entered a box, to condition a fear memory of that context. They then used several methods to assess DSBs and gene expression in the brains of the mice over the next half-hour, particularly among a variety of cell types in the prefrontal cortex and hippocampus, two regions essential for the formation and storage of conditioned fear memories. They also made measurements in the brains of mice that did not experience the foot shock to establish a baseline of activity for comparison.
The creation of a fear memory doubled the number of DSBs among neurons in the hippocampus and the prefrontal cortex, affecting more than 300 genes in each region. Among 206 affected genes common to both regions, the researchers then looked at what those genes do. Many were associated with the function of the connections neurons make with each other, called synapses. This makes sense because learning arises when neurons change their connections (a phenomenon called "synaptic plasticity") and memories are formed when groups of neurons connect together into ensembles called engrams.
"Many genes essential for neuronal function and memory formation, and significantly more of them than expected based on previous observations in cultured neurons … are potentially hotspots of DSB formation," the authors wrote in the study.
In another analysis, the researchers confirmed through measurements of RNA that the increase in DSBs indeed correlated closely with increased transcription and expression of affected genes, including ones affecting synapse function, as quickly as 10-30 minutes after the foot shock exposure.
"Overall, we find transcriptional changes are more strongly associated with [DSBs] in the brain than anticipated," they wrote. "Previously we observed 20 gene-associated [DSB] loci following stimulation of cultured neurons, while in the hippocampus and prefrontal cortex we see more than 100-150 gene associated [DSB] loci that are transcriptionally induced."
Snapping with stress
In the analysis of gene expression, the neuroscientists looked at not only neurons but also non-neuronal brain cells, or glia, and found that they also showed changes in expression of hundreds of genes after fear conditioning. Glia called astrocytes are known to be involved in fear learning, for instance, and they showed significant DSB and gene expression changes after fear conditioning.
Among the most important functions of genes associated with fear conditioning-related DSBs in glia was the response to hormones. The researchers therefore looked to see which hormones might be particularly involved and discovered that it was glucocorticoids, which are secreted in response to stress. Sure enough, the study data showed that in glia, many of the DSBs that occurred following fear conditioning occurred at genomic sites related to glucocorticoid receptors. Further tests revealed that directly stimulating those hormone receptors could trigger the same DSBs that fear conditioning did and that blocking the receptors could prevent transcription of key genes after fear conditioning.
Tsai says the finding that glia are so deeply involved in establishing memories from fear conditioning is an important surprise of the new study.
"The ability of glia to mount a robust transcriptional response to glucocorticoids suggests that glia may have a much larger role to play in the response to stress and its impact on the brain during learning than previously appreciated," she and her co-authors wrote.
Damage and danger?
More research will have to be done to prove that the DSBs required for forming and storing fear memories are a threat to later brain health, but the new study only adds to evidence that it may be the case, the authors say.
"Overall we have identified sites of DSBs at genes important for neuronal and glial functions, suggesting that impaired DNA repair of these recurrent DNA breaks which are generated as part of brain activity could result in genomic instability that contribute to aging and disease in the brain," they wrote.
The National Institutes of Health, The Glenn Foundation for Medical Research, and the JPB Foundation provided funding for the research.
Research shows that those who spend more time speaking tend to emerge as the leaders of groups, regardless of their intelligence.
- A new study proposes the "babble hypothesis" of becoming a group leader.
- Researchers show that intelligence is not the most important factor in leadership.
- Those who talk the most tend to emerge as group leaders.
If you want to become a leader, start yammering. It doesn't even necessarily matter what you say. New research shows that groups without a leader can find one if somebody starts talking a lot.
This phenomenon, described by the "babble hypothesis" of leadership, depends neither on group member intelligence nor personality. Leaders emerge based on the quantity of speaking, not quality.
Researcher Neil G. MacLaren, lead author of the study published in The Leadership Quarterly, believes his team's work may improve how groups are organized and how individuals within them are trained and evaluated.
"It turns out that early attempts to assess leadership quality were found to be highly confounded with a simple quantity: the amount of time that group members spoke during a discussion," shared MacLaren, who is a research fellow at Binghamton University.
While we tend to think of leaders as people who share important ideas, leadership may boil down to whoever "babbles" the most. Understanding the connection between how much people speak and how they become perceived as leaders is key to growing our knowledge of group dynamics.
The power of babble
The research involved 256 college students, divided into 33 groups of four to ten people each. They were asked to collaborate on either a military computer simulation game (BCT Commander) or a business-oriented game (CleanStart). The players had ten minutes to plan how they would carry out a task and 60 minutes to accomplish it as a group. One person in the group was randomly designated as the "operator," whose job was to control the user interface of the game.
To determine who became the leader of each group, the researchers asked the participants both before and after the game to nominate one to five people for this distinction. The scientists found that those who talked more were also more likely to be nominated. This remained true after controlling for a number of variables, such as previous knowledge of the game, various personality traits, or intelligence.
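"Controlling for" other variables here means fitting speaking time and the covariates jointly in one regression and checking whether speaking time keeps its predictive weight. The sketch below illustrates that idea with simulated data and hypothetical variable names (`speaking`, `intelligence`, `familiarity`); it is not the study's actual analysis or dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # simulated participants

# Hypothetical predictors, standardized for simplicity (not the study's data).
speaking = rng.normal(0, 1, n)      # how much each person talked
intelligence = rng.normal(0, 1, n)  # covariate the study controlled for
familiarity = rng.normal(0, 1, n)   # prior knowledge of the game

# Simulate leader nominations driven mainly by speaking time,
# with a small intelligence effect and random noise.
nominations = 1.5 * speaking + 0.2 * intelligence + rng.normal(0, 1, n)

# "Controlling for" covariates = estimating all effects in one joint fit.
X = np.column_stack([np.ones(n), speaking, intelligence, familiarity])
coef, *_ = np.linalg.lstsq(X, nominations, rcond=None)

# coef[1] is the speaking-time effect adjusted for the covariates;
# it stays large because speaking, not intelligence, drives nominations here.
print(f"adjusted speaking-time coefficient: {coef[1]:.2f}")
```

If speaking time were merely a proxy for intelligence, its adjusted coefficient would shrink toward zero once intelligence entered the model; the study's point is that it did not.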
In an interview with PsyPost, MacLaren shared that "the evidence does seem consistent that people who speak more are more likely to be viewed as leaders."
Another finding was that gender bias seemed to have a strong effect on who was considered a leader. "In our data, men receive on average an extra vote just for being a man," explained MacLaren. "The effect is more extreme for the individual with the most votes."
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 global pulses of intraplate magmatism
The dates provided the scientists with a new timetable of Earth's geologic history.
Tick, tick, boom
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
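The idea behind this kind of analysis can be illustrated with a toy circular-statistics periodogram: map each event date onto a phase circle for a trial period and see how tightly the phases bunch up. This is a simplified stand-in for the study's actual methods, and the event dates below are synthetic, generated to cluster every 27.5 million years:

```python
import numpy as np

def periodicity_score(event_times, period):
    # Rayleigh-style statistic: project each event onto a phase circle
    # of the trial period; tightly clustered phases give a score near 1,
    # randomly scattered phases a score near 0.
    phases = 2 * np.pi * np.asarray(event_times) / period
    return np.abs(np.exp(1j * phases).mean())

# Synthetic stand-in for the dated events: pulses every 27.5 Myr over
# the last ~260 Myr, with a few Myr of scatter around each pulse.
rng = np.random.default_rng(0)
pulse_centers = np.arange(7.0, 260.0, 27.5)        # ages in Ma
events = np.concatenate([c + rng.normal(0, 2.0, 9) for c in pulse_centers])

trial_periods = np.arange(10.0, 60.0, 0.5)
scores = [periodicity_score(events, p) for p in trial_periods]
best = trial_periods[int(np.argmax(scores))]
print(f"best-fitting period: {best:.1f} Myr")
```

Sweeping the trial period and taking the peak recovers the planted 27.5-Myr beat; on real data, the authors additionally had to establish that such a peak is statistically unlikely to arise from randomly scattered dates.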
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
The great theoretical physicist Steven Weinberg passed away on July 23. This is our tribute.
- The recent passing of the great theoretical physicist Steven Weinberg brought back memories of how his book got me into the study of cosmology.
- Going back in time, toward the cosmic infancy, is a spectacular effort that combines experimental and theoretical ingenuity. Modern cosmology is an experimental science.
- The cosmic story is, ultimately, our own. Our roots reach down to the earliest moments after creation.
When I was a junior in college, my electromagnetism professor had an awesome idea. Apart from the usual homework and exams, we were to give a seminar to the class on a topic of our choosing. The idea was to gauge which area of physics we would be interested in following professionally.
Professor Gilson Carneiro knew I was interested in cosmology and suggested a book by Nobel Prize Laureate Steven Weinberg: The First Three Minutes: A Modern View of the Origin of the Universe. I still have my original copy in Portuguese, from 1979, that emanates a musty tropical smell, sitting on my bookshelf side-by-side with the American version, a Bantam edition from 1979.
Inspired by Steven Weinberg
Books can change lives. They can illuminate the path ahead. In my case, there is no question that Weinberg's book blew my teenage mind. I decided, then and there, that I would become a cosmologist working on the physics of the early universe. The first three minutes of cosmic existence — what could be more exciting for a young physicist than trying to uncover the mystery of creation itself and the origin of the universe, matter, and stars? Weinberg quickly became my modern physics hero, the one I wanted to emulate professionally. Sadly, he passed away July 23rd, leaving a huge void for a generation of physicists.
What excited my young imagination was that science could actually make sense of the very early universe, meaning that theories could be validated and ideas could be tested against real data. Cosmology, as a science, only really took off after Einstein published his paper on the shape of the universe in 1917, two years after his groundbreaking paper on the theory of general relativity, the one explaining how we can interpret gravity as the curvature of spacetime. Matter doesn't "bend" time, but it affects how quickly it flows. (See last week's essay on what happens when you fall into a black hole).
The Big Bang Theory
For most of the 20th century, cosmology lived in the realm of theoretical speculation. One model proposed that the universe started from a small, hot, dense plasma billions of years ago and has been expanding ever since — the Big Bang model; another suggested that the cosmos stands still and that the changes astronomers see are mostly local — the steady state model.
Competing models are essential to science, but so is data to help us discriminate among them. In the mid-1960s, a decisive discovery changed the game forever. Arno Penzias and Robert Wilson accidentally discovered the cosmic microwave background radiation (CMB), a fossil from the early universe predicted to exist by George Gamow, Ralph Alpher, and Robert Herman in their Big Bang model. (Alpher and Herman published a lovely account of the history here.) The CMB is a bath of microwave photons that permeates the whole of space, a remnant from the epoch when the first hydrogen atoms were forged, some 400,000 years after the bang.
The existence of the CMB was the smoking gun confirming the Big Bang model. From that moment on, a series of spectacular observatories and detectors, both on land and in space, have extracted huge amounts of information from the properties of the CMB, a bit like paleontologists who excavate the remains of dinosaurs and dig for more bones to get details of a past long gone.
How far back can we go?
Confirming the general outline of the Big Bang model changed our cosmic view. The universe, like you and me, has a history, a past waiting to be explored. How far back in time could we dig? Was there some ultimate wall we cannot pass?
Because matter gets hot as it gets squeezed, going back in time meant looking at matter and radiation at higher and higher temperatures. There is a simple relation that connects the age of the universe and its temperature, measured in terms of the temperature of photons (the particles of visible light and other forms of invisible radiation). The fun thing is that matter breaks down as the temperature increases. So, going back in time means looking at matter at more and more primitive states of organization. After the CMB formed 400,000 years after the bang, there were hydrogen atoms. Before, there weren't. The universe was filled with a primordial soup of particles: protons, neutrons, electrons, photons, and neutrinos, the ghostly particles that cross planets and people unscathed. Also, there were very light atomic nuclei, such as deuterium and tritium (both heavier cousins of hydrogen), helium, and lithium.
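The "simple relation" mentioned above can be made concrete. In the radiation-dominated era, the standard order-of-magnitude form is that temperature falls as one over the square root of time, with T roughly 10^10 K one second after the bang. The constants below are textbook approximations, not figures from this article:

```python
import math

# Back-of-the-envelope relation for the radiation-dominated universe:
# T (in MeV) ~ 1 / sqrt(t in seconds). Precise prefactors depend on
# the number of relativistic particle species, which we ignore here.
MEV_TO_KELVIN = 1.16e10  # 1 MeV of thermal energy ~ 1.16 x 10^10 K

def temperature_kelvin(t_seconds):
    return (1.0 / math.sqrt(t_seconds)) * MEV_TO_KELVIN

for t, label in [(0.01, "start of nucleosynthesis"),
                 (1.0, "one second"),
                 (180.0, "three minutes")]:
    print(f"t = {t:g} s ({label}): T ~ {temperature_kelvin(t):.1e} K")
```

Running this shows why Weinberg's title is apt: between a hundredth of a second and three minutes, the temperature drops from around 10^11 K to under 10^9 K, right through the window where nuclear binding can first survive the heat.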
So, to study the universe after 400,000 years, we need to use atomic physics, at least until large clumps of matter aggregate due to gravity and start to collapse to form the first stars, a few million years later. What about earlier on? The cosmic history is broken down into chunks of time, each the realm of different kinds of physics. Before atoms form, all the way back to about a second after the Big Bang, it's nuclear physics time. That's why Weinberg brilliantly titled his book The First Three Minutes. It is during the interval between one-hundredth of a second and three minutes that the light atomic nuclei (made of protons and neutrons) formed, a process called, with poetic flair, primordial nucleosynthesis. Protons collided with neutrons and, sometimes, stuck together due to the attractive strong nuclear force. Why did only a few light nuclei form then? Because the expansion of the universe made it hard for the particles to find each other.
What about the nuclei of heavier elements, like carbon, oxygen, calcium, gold? The answer is beautiful: all the elements of the periodic table after lithium were made and continue to be made in stars, the true cosmic alchemists. Hydrogen eventually becomes people if you wait long enough. At least in this universe.
In this article, we got all the way up to nucleosynthesis, the forging of the first atomic nuclei when the universe was minutes old. What about earlier on? How close to the beginning, to t = 0, can science get? Stay tuned, and we will continue next week.
To Steven Weinberg, with gratitude, for all that you taught us about the universe.