When does an idea die? Plato and string theory clash with data
How long should one wait until an idea like string theory, seductive as it may be, is deemed unrealistic?
Marcelo Gleiser is a professor of natural philosophy, physics, and astronomy at Dartmouth College. He is a Fellow of the American Physical Society, a recipient of the Presidential Faculty Fellows Award from the White House and NSF, and was awarded the 2019 Templeton Prize. Gleiser has authored five books and is the co-founder of 13.8, where he writes about science and culture with physicist Adam Frank.
- How far should we defend an idea in the face of contrary evidence?
- Who decides when it's time to abandon an idea and deem it wrong?
- Science carries within it seeds from ancient Greece, including certain prejudices about how reality should or shouldn't be.
From the perspective of the West, it all started in ancient Greece, around 600 BCE. This was during the Axial Age, a somewhat controversial term coined by German philosopher Karl Jaspers to designate the remarkable intellectual and spiritual awakening that happened in different places across the globe roughly within the span of a few centuries. Apart from the Greek explosion of thought, this is the time of Siddhartha Gautama (aka the Buddha) in India, of Confucius and Lao Tzu in China, and of Zoroaster (or Zarathustra) in ancient Persia—religious leaders and thinkers who would reframe the meaning of faith and morality. In Greece, Thales of Miletus and Pythagoras of Samos pioneered pre-Socratic philosophy, (sort of) moving the focus of inquiry and explanation from the divine to the natural.
To be sure, the divine never quite left early Greek thinking, but with the onset of philosophy, trying to understand the workings of nature through logical reasoning—as opposed to supernatural reasoning—would become an option that didn't exist before. The history of science, from its early days to the present, could be told as an increasingly successful split between belief in a supernatural component to reality and a strictly materialistic cosmos. The very name of the Enlightenment, the 17th- and 18th-century Age of Reason, quite literally evokes 'seeing the light,' the light here clearly being the superiority of human logic over any kind of supernatural or nonscientific methodology for getting at the "truth" of things.
To what extent we can understand the workings of nature through logic alone is not something science can answer. It is here that the complication begins. Can the human mind, through the diligent application of scientific methodology and the use of ever-more-powerful instruments, reach a complete understanding of the natural world? Is there an "end to science"? This is the sensitive issue. If the split that started in pre-Socratic Greece were to be completed, nature in its entirety would be amenable to a logical description, with the complete collection of behaviors that science studies identified, classified, and described by means of perpetual natural laws. All that would be left for scientists and engineers to do would be the practical application of this knowledge: inventions and technologies that serve our needs in different ways.
This sort of vision—or hope, really—goes back at least to Plato who, in turn, owes much of this expectation to Pythagoras and Parmenides, the philosopher of Being. The dispute between the primacy of that which is timeless or unchangeable (Being) and that which is changeable and fluid (Becoming) is at least that old. Plato proposed that truth resided in the unchangeable, rational world of Perfect Forms, which precedes the tricky and deceptive reality of the senses. For example, the abstract Form Chair embodies all chairs, objects that can take many shapes in our sensorial reality while preserving their function (an object to sit on) and basic design (a sittable surface with some legs below it). According to Plato, the Forms hold the key to the essence of all things.
Plato used the allegory of the cave to explain that what humans see and experience is not the true reality.
Credit: Gothika via Wikimedia Commons CC 4.0
When scientists and mathematicians use the term Platonic worldview, that's what they mean in general: the unbounded capacity of reason to unlock the secrets of creation, one by one. Einstein, for one, was a believer, preaching the fundamental reasonableness of nature; no weird unexplainable stuff, like a god that plays dice—his tongue-in-cheek critique of the belief that the unpredictability of the quantum world was truly fundamental to nature and not just a shortcoming of our current understanding. Despite his strong belief in such underlying order, Einstein recognized the imperfection of human knowledge: "What I see of Nature is a magnificent structure that we can comprehend only very imperfectly, and that must fill a thinking person with a feeling of humility." (Quoted by Helen Dukas and Banesh Hoffmann in Albert Einstein, The Human Side: Glimpses from His Archives (1979), 39.)
Einstein embodies the tension between these two clashing worldviews, a tension that is still very much with us today: On the one hand, the Platonic ideology that the fundamental stuff of reality is logical and understandable to the human mind, and, on the other, the acknowledgment that our reasoning has limitations, that our tools have limitations and thus that to reach some sort of final or complete understanding of the material world is nothing but an impossible, semi-religious dream.
This kind of tension is palpable today when we see groups of scientists passionately arguing for or against the existence of the multiverse, the idea that our universe is just one among a huge number of universes; or for or against the final unification of the laws of physics.
Nature, of course, is always the final arbiter of any scientific dispute. Data decides, one way or another. That's the beauty and power at the core of science. The challenge, though, is to know when to let go of an idea. How long should one wait until an idea, seductive as it may be, is deemed unrealistic? This is where the debate gets interesting. Data to support more "out there" ideas such as the multiverse or extra symmetries of nature needed for unification models has refused to show up for decades, despite extensive searches with different instruments and techniques. On the other hand, we only find if we look. So, should we keep on defending these ideas? Who decides? Is it a community decision or should each person pursue their own way of thinking?
In 2019, I participated in an interesting live debate at the World Science Festival with physicists Michael Dine and Andrew Strominger, hosted by physicist Brian Greene. The theme was string theory, our best candidate for a final theory of how particles of matter interact. When I completed my PhD in 1986, string theory was the way. The only way. But by 2019, things had changed, and quite dramatically, due to the lack of supporting data. To my surprise, both Mike and Andy were quite open about the fact that the old certainty was gone. String theory has taught physicists many things, and perhaps that was its real use. The Platonic outlook was in peril.
The dispute remains alive, although with each experiment that fails to find supporting evidence for string theory, the dream grows harder to justify. Will it be a generational thing? As celebrated physicist Max Planck once quipped (I paraphrase), "Ideas don't die; physicists do." I hope not. But it is a conversation that should be held more in the open, as was the case at the World Science Festival. Dreams die hard. But they may die a little easier when we accept the fact that our grasp of reality is limited and doesn't always fit our expectations of what should or shouldn't be real.
- Benjamin Franklin wrote essays on a whole range of subjects, but one of his finest was on how to be a nice, likable person.
- Franklin lists a whole series of common errors people make while in the company of others, like over-talking or storytelling.
- His simple recipe for being good company is to be genuinely interested in others and to accept them for who they are.
Think of the nicest person you know. The person who would fit into any group, whom no one can dislike, and who makes a room warmer and happier just by being there.
What makes them this way? Why are they so amiable, likable, or good-natured? What is it, you think, that makes a person good company?
For answers, we can turn to one of history's most famously good-natured thinkers: Benjamin Franklin. His essay "On Conversation" is full of practical, surprisingly modern tips on how to be a nice person.
Franklin begins by arguing that there are really only two things that make someone likable. First, they have to be genuinely interested in what others say. Second, they have to be willing "to overlook or excuse Foibles." In other words, being good company means listening to people and ignoring their faults. Being witty, well-read, intelligent, or incredibly handsome can all make a good impression, but they're nothing without these two simple rules.
The sort of person nobody likes
From here, Franklin goes on to give a list of the common errors people tend to make while in company. These are the things people do that make us dislike them. We might even find, with a sinking feeling in our stomachs, that we do some of these ourselves.
1) Talking too much and becoming a "chaos of noise and nonsense." These people invariably talk about themselves, but even if "they speak beautifully," it's still ultimately more a soliloquy than a real conversation. Franklin mentions how funny it can be to see these kinds of people come together. They "neither hear nor care what the other says; but both talk on at any rate, and never fail to part highly disgusted with each other."
2) Asking too many questions. Interrogators are those people who have an "impertinent Inquisitiveness… of ten thousand questions," and it can feel like you're caught between a psychoanalyst and a lawyer. In itself, this might not be a bad thing, but Franklin notes that it usually stems from nosiness and a taste for gossip. The questions are only designed to "discover secrets…and expose the mistakes of others."
3) Storytelling. You know those people who always have a scripted story they tell at every single gathering? Utterly painful. They'll either be entirely oblivious to how little others care for their story, or they'll be aware and carry on regardless. Franklin notes, "Old Folks are most subject to this Error," which we might think is perhaps harsh, or comically honest, depending on our age.
4) Debating. Some people are always itching for a fight or debate. The "Wrangling and Disputing" types inevitably make everyone else feel like they need to watch what they say. If you give even the lightest or most modest opinion on something, "you throw them into Rage and Passion." For them, a conversation is a boxing match, and words are punches to be thrown.
5) Misjudging. Ribbing or mocking someone should be a careful business. We must never mock "Misfortunes, Defects, or Deformities of any kind," and we should always be 100% sure we won't upset anyone. If there's any doubt about how a "joke" will be taken, don't make it. Offense is easily taken and hard to forget.
Not following Benjamin Franklin's advice. Credit: Ronald Martinez via Getty Images
On practical philosophy
Franklin's essay is a trove of great advice, and this article only touches on the major themes. It really is worth your time to read it in its entirety. As you do, it's hard not to smile along or think, "Yes! I've been in that situation." Though the world has changed dramatically in the nearly 300 years since Franklin wrote his essay, much is exactly the same. Basic etiquette doesn't change.
If there's only one thing to take away from Franklin's essay, it comes at the end, where he revises his simple recipe for being nice:
"Be ever ready to hear what others say… and do not censure others, nor expose their Failings, but kindly excuse or hide them"
So, all it takes to be good company is to listen and accept someone for who they are.
Philosophy doesn't always have to be about huge questions of truth, beauty, morality, art, or meaning. Sometimes it can teach us simply how to not be a jerk.
Jonny Thomson teaches philosophy in Oxford. He runs a popular Instagram account called Mini Philosophy (@philosophyminis). His first book is Mini Philosophy: A Small Book of Big Ideas.
Our ancestors first developed humanlike brains 1.7 million years ago
A recent study analyzed the skulls of early Homo species to learn more about the evolution of primate brains.
For more than 150 years, scientists have known that humans and the great apes descended from a common ancestor. But it has proven difficult to precisely map out the branches of that evolutionary tree, especially in terms of determining when and where early Homo species first developed brains similar to those of modern humans.
There are clear differences between ape and human brains. Compared to apes, the Homo sapiens brain is larger, and its frontal lobe is organized such that we can engage in toolmaking, planning, and language. Other Homo species also enjoyed some of these cognitive innovations, from the Neanderthals to Homo floresiensis, the hobbit-like people who once inhabited Indonesia.
One reason it's been difficult to discern the details of this cognitive evolution from apes to Homo species is that brains don't fossilize, so scientists can't directly study early primate brains. But primate skulls offer clues.
Brains of yore
In a new study published in Science, an international team of researchers analyzed the impressions brains left on the inside of the skulls of Homo species to better understand the evolution of primate brains. Using computed tomography (CT) on fossil skulls, the team generated images of what the brain structures of early Homo species probably looked like, and then compared those structures to the brains of great apes and modern humans.
The results suggest that Homo species first developed humanlike brains approximately 1.7 to 1.5 million years ago in Africa. This cognitive evolution occurred at roughly the same time that Homo technology and culture were becoming more complex, with these species developing more sophisticated stone tools and making greater use of animal food resources.
The team hypothesized that "this pattern reflects interdependent processes of brain-culture coevolution, where cultural innovation triggered changes in cortical interconnectivity and ultimately in external frontal lobe topography."
The team also found that these structural changes occurred after Homo species migrated out of Africa to regions like modern-day Georgia and Southeast Asia, which is where the fossils in the study were discovered. In other words, Homo species still had ape-like brains when some groups first left Africa.
While the study sheds new light on the evolution of primate brains, the team said there's still much to learn about the history of early Homo species, particularly in terms of explaining the morphological diversity of Homo fossils discovered in Africa.
"Deciphering evolutionary process in early Homo remains a challenge that will be met only through the recovery of expanded fossil samples from well-controlled chronological contexts," the researchers wrote.
Say goodbye to wrinkles with the ultimate anti-aging device
This device from Lumina NRG uses advanced LED and microcurrent facial toning technology to remove signs of aging.
- When it comes to your daily skincare routine, the right tools can make all the difference.
- Beyond moisturizers, serums, and toners, it's important to invest in a device that can enhance the capabilities of your beauty products.
- The Microcurrent 3-in-1 Facial Toning Device is on sale for 63% off and helps serums and moisturizers penetrate deeper into your skin.
Have you been trying endless anti-aging beauty products and seeing poor results? If so, it's definitely time to re-evaluate your skincare routine. Beyond finding the proper products, you should pair them with a high-quality device like the Microcurrent 3-in-1 Facial Toning Device.
There are plenty of benefits to using this device. It relies on advanced LED and microcurrent facial toning technology to reduce signs of aging and leave you with fresher, younger-looking skin. The red light stimulates collagen and diminishes fine lines and large pores, while the blue light minimizes redness and targets acne-causing bacteria.
Furthermore, the heat therapy opens up your pores, allowing your serums, moisturizers, and more to penetrate deeper into your skin. All in all, it's a great alternative to the standard facelift. Not only is it safer, but it also provides instant results without the harmful side effects of surgery.
The road to glowing skin starts with the Microcurrent 3-in-1 Facial Toning Device. It usually sells for $299, but right now you can order it for a whopping 63% off, dropping the final price all the way down to an affordable $109.99. Considering all of its impressive features, this device will pay for itself soon enough. Wave goodbye to wrinkles and say hello to the new you.