Will America’s disregard for science be the end of its reign?
Confirmation bias is baked into the DNA of America, but it may soon be the nation's undoing.
MICHAEL SHERMER: Because of the internet, especially, this whole idea of what we now call fake news, alternative facts, has gotten bigger and bigger.
KURT ANDERSEN: You look at this history and it's like, "Oh, we should've seen this coming."
We were softened up as a people to believe what we want to believe.
NEIL DEGRASSE TYSON: This is irresponsible. Plus, it means you don't know how science works.
MARGARET ATWOOD: People do not want to give up their cherished beliefs, especially cherished beliefs that they find comforting.
ANDERSEN: We have this new infrastructure, that I think is new, that I think is a new condition. In 1860, southerners didn't say, "Oh, no, there are no slaves. No, no, no, there's no slavery."
BILL NYE: The United States used to be the world leader in technology, but when you have this group of leaders, elected officials, who are anti-science, you're setting the US back and then ultimately setting the world back.
KURT ANDERSEN: Americans have always been magical thinkers and passionate believers in the untrue. We were started by the Puritans in New England who wanted to create, and did create, a Christian utopia and theocracy as they waited for the imminent second coming of Christ and the end of days. And in the South by a bunch of people who were convinced, absolutely convinced, that this place they'd never been was full of gold just to be plucked from the dirt in Virginia. And they stayed there looking and hoping for gold for 20 years before they finally, finally faced the facts and the evidence and decided that they weren't going to get rich overnight there.
So that was the beginning. And then we've had centuries of 'buyer beware' charlatanism to an extreme degree, and medical quackery to an extreme degree, and increasingly exotic, extravagant, implausible religions over and over again, from Mormonism to Christian Science to Scientology in the last century. And we've had this anti-establishment, "I'm not going to trust the experts. I'm not going to trust the elite," in our character from the beginning. Now, all those things came together and were supercharged in the 1960s, when you were entitled to your own truth and your own reality. Then, a generation later, the internet came along, giving each of those realities, no matter how false or magical or nutty they are, its own kind of media infrastructure.
We've had entertainment, again, for our whole last couple hundred years, but especially in the last 50 years, permeating all the rest of life, including presidential politics, from John F. Kennedy through Ronald Reagan to Bill Clinton. So the thing was set up for Donald Trump to exploit all these various American threads and astonishingly become president. But then you look at this history and it's like, "Oh, we should've seen this coming."
TYSON: The power of journalism: a mistake becomes truth. Print journalism takes what I said and turns it into an article, so it has to pass through the journalist, get processed, and then it becomes written content on a page. In one hundred percent of those experiences, the journalist got something fundamentally wrong about the subject matter. And just as an interesting point about the power of journalists: I had people read the article and say, "Neil, you must know better than that. That's not how this works." They assumed the journalist was correct in reporting what I said, not that I was correct and the journalist was wrong. This is an interesting power that journalists have over whether you think what they're writing is true or not. That was decades ago. In recent years, what I think has happened is that there are more journalists who are science fluent writing about science than was the case 20 years ago. So now I don't have to worry about the journalist missing something fundamental about what I'm trying to describe. And reporting has been much more accurate in recent years, I'm happy to report.

However, there's something that has not been fixed in journalism yet: the urge to get the story first, the science story, the breaking news about a discovery. The urge to get it first means they're reporting on something that's not yet verified by other scientific experiments. If it's not yet verified, it's not there yet. And you're more likely to write about a story that is most extraordinary. And the more extraordinary the single scientific result is, the less likely it is to be true. So you need some restraint there, or some way to buffer the account. I don't want you to not talk about it, but say, "This is not yet verified. It's not yet this, it's not yet that. And it's been criticized by these other people anyway." Be more open about how wrong the thing you're reporting on could be, because otherwise you're doing a disservice to the public. And that disservice is that people out there say, "Scientists don't know anything." But what gives you that idea? "Well, one week cholesterol is good for you and the next week it's bad for you. They don't know what they're doing!" That's on the frontier. On the frontier, science is flip-flopping all the time. Yes, if you're going to report from the frontier, it looks like scientists are clueless about everything. But take a few steps behind the line, where experiments have verified and re-verified results: that's the stuff for the textbooks. That's the stuff that is objectively true. That's the stuff you should be paying attention to. That's the stuff you should be thinking about when you write laws and legislation.

If you speak to journalists, they say, "We need a fair and balanced article. So if you say this, we will go to someone else with the opposite view, and that way it's fair and balanced." Where do you draw the line? You realize Earth goes around the Sun, right? "Oh yeah. Of course." If someone says the Sun goes around the Earth, are you going to give them equal time? "Well, of course not, because that's just ridiculous." Fine. Now, how about how much column space you're giving to climate change? "Well, there are scientists who say it's real, and there are scientists who say it's not. So we're giving them equal time, equal space." Are they equal in the literature? No. Are they equal in impact? No. Are they equal in any way? No.
Except in your journalistic philosophy, where you want to give more column space to something that has been shown to be false by the consensus of observation and experiment. You think you're honoring your journalistic credo, but you're not, not on that level. It's like saying the Sun goes around the Earth, as far as I'm concerned; that's patently absurd to you. So you've got to know where to draw that line, because with matters of science it's not simply, "What's the opposite opinion I can get on this?" Look to see how much scientific agreement has descended upon that statement. And if there's not much agreement, then fine, talk about the whole frontier. There's plenty of that. Just go to any scientific conference. You want to get multiple views on something? That's where you'd get them. But the moment something enters the canon of objective knowledge and objective truths, that's the kind of emergent truth that we have with climate change: humans are warming the planet. That's the kind of agreement we have in scientific research. Oh, you think it's some other way? You want it to be... That's odd. If you went to your doctor with some ailment and the doctor said, "You can take this pill, which three percent of all research says will cure you, or you can take this pill, which 97 percent of all research says will cure you," which one are you going to walk away from the doctor's office with? The 97 percent pill, of course. Yet you walk out of there and say, "Oh, I believe the three percent who say we're not warming the planet." This is irresponsible. Plus, it means you don't know how science works.
SHERMER: Because of the internet, especially, this whole idea of what we now call fake news, alternative facts, has gotten bigger and bigger, and it unfolds in real time, online, within minutes and hours. And we have to jump on it fast. What the skeptical movement has developed is a set of tools for dealing with particular claims that are on the margins of science, like creationism, intelligent design theory, the anti-vaccination movement, the Holocaust revisionists, all these conspiracy theories, all these alternative medicines. There are hundreds and hundreds of these claims that are all connected to different sciences, but the scientists in those particular fields are too busy with their research to bother with these claims, because the claims really aren't about those fields. They're just hooked to them. They're about something else. Back in the '80s, when I first saw some professional scientists debate Duane Gish, the young-Earth creationist, they did not fare well. And I saw some Holocaust historians debating or confronting Holocaust so-called revisionists, or deniers. They did not fare well, because they didn't know the special arguments being made by these fringe people, arguments that have nothing to do with the science, really. They have an agenda, and they're using these little tweak questions to get at the mainstream and try to debunk it for their own ideological reasons. So, for example, Holocaust revisionists make a big deal about why the door on the gas chamber at Mauthausen doesn't lock. I mean, if it doesn't lock, how are you gassing people if you can't lock the door? So they must not have gassed people in there. And if they didn't gas people at Mauthausen, they probably didn't gas people at any of the death camps, and if they didn't gas people at any of the death camps, then there must not have been a Holocaust. What? Wait a minute, what? All from this door that doesn't lock? Well, I eventually went and found out that that wasn't the original door; that took me a couple of years. But that's the kind of specialty thing that skeptics do that mainstream scientists, scholars, and historians don't have time to do.
ANDERSEN: The idea of America from the beginning was that you could come here, reinvent yourself, be anybody you want, live any way you wanted, believe anything you wanted. For the first few hundred years, like everywhere else in the world, celebrity and fame were a result of some kind of accomplishment or achievement, sometimes not a great accomplishment or achievement, but you did something in the world to earn renown. America really was the key place that invented the modern celebrity culture, which, beginning a century ago, was more and more not necessarily about having won a war or led people or written a great book or painted a great painting, but about being famous. Fame for its own sake. We created that. We created Hollywood, we created the whole culture industry, and that then became what I call the fantasy-industrial complex, where, certainly in the last few decades, more than ever, more than anybody thought possible before, fame for its own sake, fame itself—however you got it—was a primary goal for people. And again, as with so many of the things I talk about in 'Fantasyland,' this is not unique to America, but it happens more here than anywhere. And then you get reality television, which has been this unholy hybrid of the fictional and the real for the last, now, generation, where that blur between 'What's real and what's not?' is pumped into our media stream willy-nilly. There are now more reality shows on television than there were shows on television 20 years ago.
ATWOOD: Look at the history of what happened to Darwin when he published. What would you call that? He was hugely attacked at the time. And it's often a case of people not wanting to give up their cherished beliefs, especially cherished beliefs that they find comforting. So it's no good for Richard Dawkins to say, 'Let us stand on the bold, bare promontory of truth and acknowledge the basic nothingness of ourselves.' People don't find that cozy. So they will go around the block not to do that. And that's very understandable and human. And religious thinking, the idea that there's somebody bigger than you out there who might be helpful to you if certain rules are observed, goes back so far that we probably have an epigene or something, or a cluster of epigenes, for it. And you see it a lot in small children. There is a monster under the bed, and you can't tell them there isn't—they don't find that reassuring. What you can tell them is, "Yes, there is a monster under that bed. But as long as I put this cabbage right in this spot, it can't come out."
ANDERSEN: Like all humans, Americans suffer from what's called confirmation bias, which is, "Oh, I believe this. I will look for facts or pseudo-facts or fictions that confirm my preexisting beliefs." Americans long before psychologists invented that phrase, confirmation bias, had that tendency. Again, at the very beginning: 'I've never been to the new world. Nobody I know has been in the new world. I've never really read any firsthand accounts of the new world, but I'm going to give up my life and go there because it's going to be awesome and perfect. And I'm going to get rich overnight and/or create a Christian utopia.' So we began that way and that has kept up. I just want to believe what I want to believe. And don't let your lying eyes tell you anything different.
ATWOOD: When science is telling you something that you really find very inconvenient, and that is the history of global warming and the changes that we are certainly already seeing around us. First of all, it was denial. 'It cannot be happening.' Now, there's grudging admission as things flood and droughts kick in and food supplies drop, and the sea level rises, and the glaciers melt, big time. I have seen that, been there. You can't deny that it's happening, but you then have to pretend that it's nothing to do with us. So we don't have to change our behavior. That's the thinking around that.
WADE CROWFOOT: If we ignore that science and put our head in the sand and think it's all about vegetation management, we're not going to succeed together protecting Californians.
DONALD TRUMP: Okay, it'll start getting cooler. You just watch.
CROWFOOT: I wish science agreed with you.
TRUMP: Well, I don't think science knows, actually.
ATWOOD: And that can get very entrenched until people see that by trying to solve the problem, jobs can be created and money can be made. And that will be the real tipping point in public consciousness in this country. Other countries are already there.
ANDERSEN: Believing whatever nutty thing you want to believe, or pretending you are whatever you are, or having even kooky conspiracy theories or speaking in tongues, whatever it is, fine—if it's private. The problem is when that, as it has in the last couple of decades especially, leaches into the public sphere and the policy sphere and like, "Nah, there's no global warming. We don't have to worry about the seas rising," or "Nah, scientists say that vaccines are safe but I think they cause autism, so I'm not going to vaccinate my children." And so on and so on and so on. That's when the rubber hits the road—will hit the road—and people will start saying, "Wait a minute." Not until then, not until there's a consequence and not until there's a price to pay.
NYE: By having a population of people who don't really understand germs and how serious they are, the germ gets spread really readily. There is a faction of our leaders, elected officials, who continually cut the budget for the Centers for Disease Control, which to me reflects an ignorance of how serious germs can be. In my opinion, we should be supporting that research full bore—but at the same time, don't curtail research into other germs, which is going on at the Centers for Disease Control, for example, all the time. That's not where you save your money, Congress. But if you don't believe in the seriousness of it, and you have a mistrust of scientists, if you have a mistrust of engineers, you're not going to help us out with that, are you? So it's a very serious concern of mine. I mean, the United States used to be the world leader in technology, but when you have this group of leaders, elected officials, who are anti-science, you're setting the US back and then ultimately setting the world back.
SHERMER: Let's address the college campus issue these days. I really think this goes back to the 1980s. I noticed it first when I was in graduate school the second time, when I got a PhD in the history of science. My first round was in the '70s, in experimental psychology graduate school, and I didn't notice any of this campus stuff. In the late '80s, when I was in my doctoral program, because history deals a lot with literature, the kind of post-modernist deconstruction of what texts mean was really taking off. And so I initially thought, "What is this? But okay, I'll give it a shot. I'll keep an open mind here and just try to follow the reasoning." And I could kind of see where they were going. So what is the true meaning of Jane Austen's novel here, or Shakespeare's play there, or this novelist or that author? And I can see that there may not be one meaning; maybe the author meant it to provoke you to think about certain deep issues, and you have to find your own meaning in the text. Okay, I can understand that. But then it kind of started to spill over into history, and I was studying the history of science. And I like to think of science as progressing toward some better understanding of a reality that I believe is really there. And it's not that science is perfect and we're going to get to a perfect understanding of reality—I know that's not going to happen. But it's not the same as literature. It's not the same as art and music. It's different than that. If Darwin hadn't discovered evolution, somebody else would have—in fact, somebody did: Alfred Russel Wallace discovered natural selection as the mechanism of evolution. And if Newton hadn't discovered the calculus, somebody else would've. Well, somebody did: Leibniz. And so on. These are things that are out there to be discovered, and I see that differently than art and music and literature, which is constructing ideas out of your mind.
So, I don't think that the post-modern deconstruction of the text applies completely to history. And you can see immediately why it fails, because this is what led to, in the '90s, the whole Holocaust denial movement, the so-called revisionists. They call themselves revisionists, and the argument was: 'All history is text. It's just written by the winners, and the winners write themselves as the good guys and the losers as the bad guys. And this is all unfair. And, look, maybe the winners here have unfairly critiqued Hitler and the Nazis,' and so on. Yeah. But what about that Holocaust thing? It looks pretty bad. 'Yeah, yeah. Well, maybe it didn't happen the way we have been led to believe it happened because, again, the history of the Holocaust was written by the winners.' You can see immediately how this kind of textual analysis can cascade into complete moral relativism and insane ideas like Holocaust denial. That's when I thought: okay, this is wrong. This has gone too far. And in the mid-'90s, after we founded the Skeptics Society and Skeptic magazine in '92, this was one of the earliest things we started going after, because it was around '95 or so that the so-called 'science wars' took off, with the claim that science is just another way of knowing the world, no different and no better than any other way of knowing the world. Wait, wait, wait, time out. What was that part about, we're just like everybody else? Science has its flaws, but it's not just like art or music. It's different.
So then, by the 2000s, I think this really trickled down into all of the social sciences and beyond: anthropology, biology, evolutionary biology. And it was just attack, attack, attack, to the point where any particular viewpoint that an oppressed minority finds offensive, or anybody finds offensive, can be considered a kind of hate speech or a kind of violence. And you can sort of see the reasoning from back in the 1980s all the way through to today. You can see how they get there, but we should have drawn that line and stopped it. Well, a bunch of us tried to stop it back in the '90s, but it had a momentum of its own.
ANDERSEN: What has been enabled in the last 30 years, first through deregulated talk radio, where you didn't have to be fair and balanced anymore, then national cable television—Fox News comes to mind—and then, of course, the internet as well, is that not just politically different points of view but alternate factual realities could be portrayed and depicted. We've been in that state now for 20 years or more. Again, we were softened up as a people to believe what we want to believe, but we have this new infrastructure that I think is new, that I think is a new condition. So, there's a history of, "Oh, I believe this," or "I believe this." Or "Slavery is good." "No, slavery is bad." Those are disagreements, but in 1860 southerners didn't say, "Oh, no, there are no slaves. No, no, no, there's no slavery." That's the condition we have now, that is the Kellyanne-Conway-Donald-Trump situation—and the Republican Party situation before Donald Trump ever came along—where we say, "No, no, there's no climate change." Or, "Oh, this factual truth is not true." That's the new thing. And this new media infrastructure is a new condition. Now, it may not be the end of things as a result, but we don't know yet. We're only 20 years into it. And maybe we'll learn new protocols of what to believe and what not to, and we'll grow up and be able to accommodate ourselves to this new media situation. But I'm worried that we won't, and I'm worried that a significant fraction of us—for now, mostly on the right, but there's no reason it should be limited to the right—will stay in their bubble and their silo, with their own reality, and not be able to be retrieved into the reality-based world.
- From America's inception, there has always been a rebellious, anti-establishment mentality. That way of thinking has become more reckless now that the entire world is interconnected and there are added layers of verification (or repudiation) of facts.
- As the great minds in this video can attest, there are systems and mechanisms in place to discern between opinion and truth. By making conscious efforts to undermine and ignore those systems at every turn (climate change, conspiracy theories, coronavirus, politics, etc.), America has compromised its position of power and effectively stunted its own growth.
- A part of the problem, according to writer and radio host Kurt Andersen, is a new media infrastructure that allows for false opinions to persist and spread to others. Is it the beginning of the end of the American empire?
Could muons point to new physics?
New data have set the particle physics community abuzz.
- The first question ever asked in Western philosophy, "What's the world made of?" continues to inspire high energy physicists.
- New experimental results probing the magnetic properties of the muon, a heavier cousin of the electron, seem to indicate that new particles of nature may exist, potentially shedding light on the mystery of dark matter.
- The results are a celebration of the human spirit and our insatiable curiosity to understand the world and our place in it.
If brute force doesn't work, then look into the peculiarities of nothingness. This may sound like a Zen koan, but it's actually the strategy that particle physicists are using to find physics beyond the Standard Model, the current registry of all known particles and their interactions. Instead of the usual colliding experiments that smash particles against one another, exciting new results indicate that new vistas into exotic kinds of matter may be glimpsed by carefully measuring the properties of the quantum vacuum. There's a lot to unpack here, so let's go piecemeal.
It is fitting that the first question asked in Western philosophy concerned the material composition of the world. Writing around 350 BCE, Aristotle credited Thales of Miletus (circa 600 BCE) with the honor of being the first Western philosopher when he asked the question, "What is the world made of?" What modern high-energy physicists do, albeit with very different methodology and equipment, is follow the same philosophical tradition of trying to answer this question, assuming that there are indivisible bricks of matter called elementary particles.
Deficits in the Standard Model
Jumping over thousands of years of spectacular discoveries, we now have a very neat understanding of the material composition of the world at the subatomic level: 12 particles of matter, plus the force-carrying particles and the Higgs boson. The 12 matter particles are divided into two groups: six leptons and six quarks. The quarks combine to make up all particles that interact via the strong nuclear force, such as protons and neutrons. The leptons include the familiar electron and its two heavier cousins, the muon and the tau. The muon is the star of the new experiments.
The Standard Model. Credit: Cush via Wikimedia Commons, licensed under CC0 1.0.
For all its glory, the Standard Model described above is incomplete. The goal of fundamental physics is to answer the most questions with the fewest assumptions. As it stands, the values of the masses of all particles are parameters that we measure in the laboratory, related to how strongly they interact with the Higgs. We don't know why some interact much more strongly than others (and, as a consequence, have larger masses), why there is a prevalence of matter over antimatter, or why the universe seems to be dominated by dark matter — a kind of matter we know nothing about, apart from the fact that it's not part of the recipe included in the Standard Model. We know dark matter has mass, since its gravitational effects are felt by familiar matter, the matter that makes up galaxies and stars. But we don't know what it is.
Physicists had hoped that the powerful Large Hadron Collider in Switzerland would shed light on the nature of dark matter, but nothing has come up there or in many direct searches, where detectors were mounted to collect dark matter that presumably would rain down from the skies and hit particles of ordinary matter.
Could muons fill in the gaps?
Enter the muons. The hope that these particles can help solve the shortcomings of the Standard Model has two parts to it. The first is that every particle with an electric charge, like the muon, can be pictured simplistically as a spinning sphere. Spinning spheres and disks of charge create a magnetic field aligned with the spin axis. Picture the muon as a tiny spinning top. If it's rotating counterclockwise, its magnetic field will point vertically up. (Grab a glass of water with your right hand and turn it counterclockwise. Your thumb will be pointing up, the direction of the magnetic field.) The spinning muons are placed into a doughnut-shaped tunnel and forced to go around and around. The tunnel has its own magnetic field that interacts with the tiny magnetic field of the muons. As the muons circle the doughnut, they wobble about, just as spinning tops wobble on the ground due to their interaction with Earth's gravity. The amount of wobbling depends on the magnetic properties of the muon, which, in turn, depend on what's going on around the muon in space.
Credit: Fabrice Coffrini / Getty Images
This is where the second idea comes in, the quantum vacuum. In physics, there is no empty space. The so-called vacuum is actually a bubbling soup of particles that appear and disappear in fractions of a second. Everything fluctuates, as encapsulated in Heisenberg's Uncertainty Principle. Energy fluctuates too, what we call zero-point energy. Since energy and mass are interconvertible (E=mc2, remember?), these tiny fluctuations of energy can be momentarily converted into particles that pop out and back into the busy nothingness of the quantum vacuum. Every particle of matter is cloaked with these particles emerging from vacuum fluctuations. Thus, a muon is not only a muon, but a muon dressed with these extra fleeting bits of stuff. That being the case, these extra particles affect a muon's magnetic field, and thus, its wobbling properties.
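To make the "g-2" in the experiment's name concrete, the physics above can be written down compactly. What follows is a brief sketch in standard textbook notation, which the article itself doesn't spell out:

```latex
% Magnetic moment of a muon with charge e, mass m, and spin vector S:
\[ \vec{\mu} \;=\; g\,\frac{e}{2m}\,\vec{S} \]
% Dirac's theory predicts the g-factor of a pointlike particle to be
% exactly 2. The fleeting particles of the quantum vacuum shift g
% slightly away from 2; that shift is the "anomalous magnetic moment":
\[ a_\mu \;\equiv\; \frac{g_\mu - 2}{2} \]
```

A nonzero anomaly is expected; the excitement is that the measured value of the anomaly appears to differ from what the Standard Model's inventory of vacuum particles predicts, hinting at extra contributions from particles we haven't cataloged.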
About 20 years ago, physicists at Brookhaven National Laboratory detected anomalies in the muon's magnetic properties, larger than what theory predicted. This would mean that the quantum vacuum produces particles not accounted for by the Standard Model: new physics! Fast-forward to 2017, when the experiment was repeated, at four times higher sensitivity, at the Fermi National Accelerator Laboratory, where yours truly was a postdoctoral fellow a while back. The first results of the Muon g-2 experiment were unveiled on April 7, 2021, and not only confirmed the existence of a magnetic moment anomaly but strengthened the case for it considerably.
To most people, the official results, published recently, don't seem so exciting: a "tension between theory and experiment of 4.2 standard deviations." The gold standard for a new discovery in particle physics is a 5-sigma deviation, or about one part in 3.5 million. (That is, if there were no real effect, a fluctuation this large would show up only about once in 3.5 million runs of the experiment.) Still, 4.2 sigma is enough for plenty of excitement in the particle physics community, given the remarkable precision of the experimental measurements.
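For readers curious where numbers like "one part in 3.5 million" come from, here is a minimal sketch in Python (assuming SciPy is available; this is an illustration of the statistics, not part of the published analysis) that converts a sigma level into the probability of a pure statistical fluke looking at least that extreme:

```python
from scipy.stats import norm

# One-sided tail probability of a standard normal distribution:
# the chance of a fluctuation at least this many sigmas above
# the mean, assuming there is no real effect at all.
for sigma in (4.2, 5.0):
    p = norm.sf(sigma)  # survival function, i.e. 1 - CDF
    print(f"{sigma} sigma: p = {p:.2e} (about 1 in {1 / p:,.0f})")

# Roughly: 4.2 sigma corresponds to about 1 in 75,000,
# while 5.0 sigma corresponds to about 1 in 3.5 million.
```

This is why 4.2 sigma generates excitement but not yet a discovery claim: the odds of a fluke are small, but still thousands of times larger than the 5-sigma threshold demands.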
A time for excitement?
Now, the results must be reanalyzed very carefully to make sure that (1) there are no hidden experimental errors, and (2) the theoretical calculations are not off. There will be a frenzy of calculations and papers in the coming months, all trying to make sense of the results, on both the experimental and theoretical fronts. And this is exactly how it should be. Science is a community-based effort, and the works of many compete with and complete one another.
Whatever happens, new science will be learned, even if less exciting than new particles. Or maybe, new particles have been there all along, blipping in and out of existence from the quantum vacuum, waiting to be pulled out of this busy nothingness by our tenacious efforts to find out what the world is made of.
- Benjamin Franklin wrote essays on a whole range of subjects, but one of his finest was on how to be a nice, likable person.
- Franklin lists a whole series of common errors people make while in the company of others, like over-talking or storytelling.
- His simple recipe for being good company is to be genuinely interested in others and to accept them for who they are.
Think of the nicest person you know. The person who would fit into any group configuration, who no one can dislike, or who makes a room warmer and happier just by being there.
What makes them this way? Why are they so amiable, likeable, or good-natured? What is it, you think, that makes a person good company?
There are really only two things that make someone likable.
This is the kind of advice that comes from one of history's most famously good-natured thinkers: Benjamin Franklin. His essay "On Conversation" is full of practical, surprisingly modern tips about how to be a nice person.
Franklin begins by arguing that there are really only two things that make someone likable. First, they have to be genuinely interested in what others say. Second, they have to be willing "to overlook or excuse Foibles." In other words, being good company means listening to people and ignoring their faults. Being witty, well-read, intelligent, or incredibly handsome can all make a good impression, but they're nothing without these two simple rules.
The sort of person nobody likes
From here, Franklin goes on to give a list of the common errors people tend to make while in company. These are the things people do that make us dislike them. We might even find, with a sinking feeling in our stomach, that we do some of these ourselves.
1) Talking too much and becoming a "chaos of noise and nonsense." These people invariably talk about themselves, but even if "they speak beautifully," it's still ultimately more a soliloquy than a real conversation. Franklin mentions how funny it can be to see these kinds of people come together. They "neither hear nor care what the other says; but both talk on at any rate, and never fail to part highly disgusted with each other."
2) Asking too many questions. Interrogators are those people who have an "impertinent Inquisitiveness… of ten thousand questions," and it can feel like you're caught between a psychoanalyst and a lawyer. In itself, this might not be a bad thing, but Franklin notes it's usually just from a sense of nosiness and gossip. The questions are only designed to "discover secrets…and expose the mistakes of others."
3) Storytelling. You know those people who always have a scripted story they tell at every single gathering? Utterly painful. They'll either be entirely oblivious to how little others care for their story, or they'll be aware and carry on regardless. Franklin notes, "Old Folks are most subject to this Error," which we might think is perhaps harsh, or comically honest, depending on our age.
4) Debating. Some people are always itching for a fight or debate. The "Wrangling and Disputing" types inevitably make everyone else feel like they need to watch what they say. If you give even the lightest or most modest opinion on something, "you throw them into Rage and Passion." For them, the conversation is a boxing match, and words are punches to be thrown.
5) Misjudging. Ribbing or mocking someone should be a careful business. We must never mock "Misfortunes, Defects, or Deformities of any kind", and should always be 100% sure we won't upset anyone. If there's any doubt about how a "joke" will be taken, don't say it. Offense is easily taken and hard to forget.
Not following Benjamin Franklin's advice. Credit: Ronald Martinez via Getty Images.
On practical philosophy
Franklin's essay is a trove of great advice, and this article only touches on the major themes. It really is worth your time to read it in its entirety. As you do, it's hard not to smile along or to think, "Yes! I've been in that situation." Though the world has changed dramatically in the 300 years since Franklin's essay, much is exactly the same. Basic etiquette doesn't change.
If there's only one thing to take away from Franklin's essay, it comes at the end, where he revises his simple recipe for being nice:
"Be ever ready to hear what others say… and do not censure others, nor expose their Failings, but kindly excuse or hide them"
So, all it takes to be good company is to listen and accept someone for who they are.
Philosophy doesn't always have to be about huge questions of truth, beauty, morality, art, or meaning. Sometimes it can teach us simply how to not be a jerk.
Weird science shows unseemly way beetles escape after being eaten
Certain water beetles can escape from frogs after being consumed.
- A Japanese scientist shows that some beetles can wiggle out of frogs' butts after being eaten whole.
- The research suggests the beetle can get out in as little as 7 minutes.
- Most of the beetles swallowed in the experiment survived with no complications after being excreted.
In what is perhaps one of the weirdest experiments ever, straight from the category of "why did anyone need to know this?", scientists have shown that the Regimbartia attenuata beetle can climb out of a frog's butt after being eaten.
The research was carried out by Kobe University ecologist Shinji Sugiura. He found that the majority of beetles swallowed by the black-spotted pond frogs (Pelophylax nigromaculatus) used in the experiment managed to escape within about six hours and were perfectly fine.
"Here, I report active escape of the aquatic beetle R. attenuata from the vents of five frog species via the digestive tract," writes Sugiura in a new paper, adding "although adult beetles were easily eaten by frogs, 90 percent of swallowed beetles were excreted within six hours after being eaten and, surprisingly, were still alive."
One bug even got out in as little as 7 minutes.
Sugiura also tried putting wax on the legs of some of the beetles, preventing them from moving. These beetles were not able to make it out alive, taking 38 to 150 hours to be digested.
Naturally, as anyone would upon encountering such a story, you're wondering where the video is. Thankfully, the scientist recorded the proceedings.
The Regimbartia attenuata beetle can be found in the tropics, where it is known as a pest in fish hatcheries. It's not the only kind of creature that can survive being swallowed. A recent study showed that snake eels are able to burrow out of the stomachs of fish using their sharp tails, only to become stuck, die, and be mummified in the gut cavity. Usually, such travelers through the digestive tract have particular adaptations that make it possible for them to withstand extreme pH and lack of oxygen. Scientists are calling the beetle's ability the first documented "active prey escape." The researcher thinks the beetle's trick lies in inducing the frog to open its so-called "vent," which is controlled by a sphincter muscle.
"Individuals were always excreted head first from the frog vent, suggesting that R. attenuata stimulates the hind gut, urging the frog to defecate," explains Sugiura.
For more information, check out the study published in Current Biology.
Our ancestors first developed humanlike brains 1.7 million years ago
A recent study analyzed the skulls of early Homo species to learn more about the evolution of primate brains.
For more than a century and a half, scientists have known that humans descended from the great apes. But it has proven difficult to precisely map out the branches of that evolutionary tree, especially in terms of determining when and where early Homo species first developed brains similar to those of modern humans.
There are clear differences between ape and human brains. Compared to apes, the Homo sapiens brain is larger, and its frontal lobe is organized such that we can engage in toolmaking, planning, and language. Other Homo species also enjoyed some of these cognitive innovations, from the Neanderthals to Homo floresiensis, the hobbit-like people who once inhabited Indonesia.
One reason it's been difficult to discern the details of this cognitive evolution from apes to Homo species is that brains don't fossilize, so scientists can't directly study early primate brains. But primate skulls offer clues.
Brains of yore
In a new study published in Science, an international team of researchers analyzed impressions left on the skulls of Homo species to better understand the evolution of primate brains. Using computed tomography on fossil skulls, the team generated images of what the brain structures of early Homo species probably looked like and then compared those structures to the brains of great apes and modern humans.
The results suggest that Homo species first developed humanlike brains approximately 1.7 to 1.5 million years ago in Africa. This cognitive evolution occurred at roughly the same time that Homo technology and culture were becoming more complex, with these species developing more sophisticated stone tools and making greater use of animal food resources.
The team hypothesized that "this pattern reflects interdependent processes of brain-culture coevolution, where cultural innovation triggered changes in cortical interconnectivity and ultimately in external frontal lobe topography."
The team also found that these structural changes occurred after Homo species migrated out of Africa for regions like modern-day Georgia and Southeast Asia, which is where the fossils in the study were discovered. In other words, Homo species still had ape-like brains when some groups first left Africa.
While the study sheds new light on the evolution of primate brains, the team said there's still much to learn about the history of early Homo species, particularly in terms of explaining the morphological diversity of Homo fossils discovered in Africa.
"Deciphering evolutionary process in early Homo remains a challenge that will be met only through the recovery of expanded fossil samples from well-controlled chronological contexts," the researchers wrote.
