Are Humans Getting Smarter or Less Intelligent?
We may pay a price for abstract thinking.
Observe the behavior of shoppers in a long supermarket line or drivers snarled in traffic, and you can quickly become disillusioned about humanity and its collective IQ. Reality TV and websites like People of Walmart reinforce the impression. Plenty of songs, popular and underground alike, even include the line "only stupid people are breeding." Apparently, many of us can relate.
And yet, we are better with technology today than at any time in the past. Never before have we been more productive, better educated, or more technologically savvy. I had a high school teacher who said that when Einstein was working out relativity, few people in the entire world were intelligent enough to understand it. Just a generation later, students were learning the theory in high school and understood it well, or at least well enough to pass the test.
So at different times and in different ways, we get competing impressions of whether humanity is collectively getting smarter or less intelligent. The trouble with personal experience, of course, is that it is myopic. So what do studies tell us? What's really going on here? As is often the case, things get thornier from here on.
Howard Gardner (right) of Harvard, the father of the multiple intelligences theory. Getty Images.
First, there is an argument even over what intelligence is. Harvard psychologist Howard Gardner, for instance, proposes multiple intelligences, a theory that has been a staple in educational circles for some years. The types include: verbal, logical-mathematical, visual-spatial, bodily-kinesthetic, musical, interpersonal (between people), intrapersonal (understanding your own feelings, thoughts, and beliefs), naturalist (understanding the outdoors), and existential (understanding the deeper questions of life).
Traditionally, vocabulary was used as a metric for intelligence, and research has shown that it is highly correlated with IQ. Yet, according to a 2006 study, Americans' vocabulary has been in swift decline since its peak in the 1940s. There is some controversy, however, as vocabulary tests have been shown to carry an inherent cultural bias.
If you take IQ as the most vital metric, note that it has been rising globally over time. But that doesn't tell the whole story. In fact, there is an interesting trend: IQ has been rising in developing countries, while the rise may be slowing in developed ones. In a 2015 King's College London study published in the journal Intelligence, psychologists set out to determine what shape the world's IQ was in. The study drew on data spanning more than six decades: in total, the IQ scores of 200,000 people from 48 different countries. The researchers found that global IQ has risen 20 points since 1950.
More abstract thinking is a sign of greater intelligence. Getty Images.
India and China saw the greatest gains, but developing countries in general have seen a rise, thanks to improved education and healthcare systems. The pattern is known as the Flynn effect, named after political scientist James Flynn, who predicted in 1982 that rising living conditions would improve a population's collective IQ. A number of studies support the Flynn effect. In the King's College London study, IQ grew more rapidly in the developing world, while the pace slowed in the US and other developed countries. Many developing nations may someday close the gap.
Another reason: the human brain continues to evolve toward ever more abstract thinking. Flynn cites a study of rural Russian peasants. The researchers told them, "All bears are white where there is always snow; in Novaya Zemlya there is always snow; what color are the bears there?" Most answered that since they had never been there, they could not know, or that they had only ever seen black bears.
Another example: if you asked someone in the 19th century what a rabbit and a dog had in common, they would have been unlikely to point out that both are mammals or that both are warm-blooded. Instead, they might have said something like: both are furry, or both are used by humans. People then relied more on their experience of the real world than on abstract, logical, or "scientific" reasoning. Flynn said this change in our faculties represented "nothing less than a liberation of the human mind."
Abstract reasoning helps us build impressive technology and understand how to use it. Getty Images.
Flynn wrote, “The scientific world-view, with its vocabulary, taxonomies, and detachment of logic and the hypothetical from concrete referents, has begun to permeate the minds of post-industrial people. This has paved the way for mass education on the university level and the emergence of an intellectual cadre without whom our present civilization would be inconceivable."
Will we ever reach a maximum in what humans can comprehend? Will environmental changes alter our mental landscape? What about those monumental changes about to be brought on by the second industrial revolution, the coming tidal wave of robots and AI? The answer to all of these is, no one knows.
One thought: older folks often complain that young people lack "common sense." When something is gained in nature or in life, something else is often lost as a result. Perhaps, as our thinking grows more abstract, we lose some of the practical side of our faculties. Even so, as each generation grows more dissimilar from those past, its newly updated faculties help it change the world in ever more dizzying, sophisticated, and delightful ways.
Geologists discover a rhythm to major geologic events.
- It appears that Earth has a geologic "pulse," with clusters of major events occurring every 27.5 million years.
- Working with the most accurate dating methods available, the authors of the study constructed a new history of the last 260 million years.
- Exactly why these cycles occur remains unknown, but there are some interesting theories.
Our hearts beat at a resting rate of 60 to 100 beats per minute. Lots of other things pulse, too. The colors we see and the pitches we hear, for example, are due to the different wave frequencies ("pulses") of light and sound waves.
Now, a study in the journal Geoscience Frontiers finds that Earth itself has a pulse, with one "beat" every 27.5 million years. That's the rate at which major geological events have been occurring as far back as geologists can tell.
A planetary calendar has 10 dates in red
Credit: Jagoush / Adobe Stock
According to lead author and geologist Michael Rampino of New York University's Department of Biology, "Many geologists believe that geological events are random over time. But our study provides statistical evidence for a common cycle, suggesting that these geologic events are correlated and not random."
The new study is not the first time that there's been a suggestion of a planetary geologic cycle, but it's only with recent refinements in radioisotopic dating techniques that there's evidence supporting the theory. The authors of the study collected the latest, best dating for 89 known geologic events over the last 260 million years:
- 29 sea level fluctuations
- 12 marine extinctions
- 9 land-based extinctions
- 10 periods of low ocean oxygenation
- 13 gigantic flood basalt volcanic eruptions
- 8 changes in the rate of seafloor spread
- 8 global pulses of intraplate magmatism
The dates gave the scientists a new timetable of Earth's geologic history.
Tick, tick, boom
Credit: New York University
Putting all the events together, the scientists performed a series of statistical analyses that revealed that events tend to cluster around 10 different dates, with peak activity occurring every 27.5 million years. Between the ten busy periods, the number of events dropped sharply, approaching zero.
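The paper's own analysis is more elaborate, but the core idea of hunting for a beat in a list of event ages can be sketched with a simple Rayleigh-style test: fold each age onto a candidate period and measure how tightly the resulting phases cluster. The catalog below is synthetic, built purely for illustration; none of the numbers come from the study itself.

```python
import numpy as np

def rayleigh_power(ages_myr, period_myr):
    """Mean resultant length of event phases folded at a candidate period.
    Near 1.0: events cluster tightly at that period; near 0: no clustering."""
    phases = 2 * np.pi * (np.asarray(ages_myr) % period_myr) / period_myr
    return float(np.hypot(np.cos(phases).mean(), np.sin(phases).mean()))

# Synthetic catalog: ~90 events scattered around a true 27.5-Myr beat
# over a 260-Myr window (roughly the study's event count and time span).
rng = np.random.default_rng(0)
beats = np.arange(0.0, 260.0, 27.5)          # 10 "busy" dates
ages = np.concatenate([beats + rng.normal(0.0, 2.0, beats.size)
                       for _ in range(9)])

periods = np.arange(10.0, 60.0, 0.5)         # candidate periods, in Myr
powers = [rayleigh_power(ages, p) for p in periods]
best = float(periods[int(np.argmax(powers))])
print(f"best-fitting period: {best} Myr")
```

Scanning the candidate periods recovers the 27.5-Myr beat baked into the synthetic data; on a genuinely random catalog, no candidate period stands out.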
Perhaps the most fascinating question that remains unanswered for now is exactly why this is happening. The authors of the study suggest two possibilities:
"The correlations and cyclicity seen in the geologic episodes may be entirely a function of global internal Earth dynamics affecting global tectonics and climate, but similar cycles in the Earth's orbit in the Solar System and in the Galaxy might be pacing these events. Whatever the origins of these cyclical episodes, their occurrences support the case for a largely periodic, coordinated, and intermittently catastrophic geologic record, which is quite different from the views held by most geologists."
Assuming the researchers' calculations are at least roughly correct — the authors note that different statistical formulas may result in further refinement of their conclusions — there's no need to worry that we're about to be thumped by another planetary heartbeat. The last occurred some seven million years ago, meaning the next won't happen for about another 20 million years.
Information may not seem like something physical, yet it has become a central concern for physicists. A wonderful new book explores the importance of the "dataome" for the physical, biological, and human worlds.
- The most important current topic in physics relates to a subject that hardly seems physical at all — information, which is central to thermodynamics and perhaps the universe itself.
- The "dataome" is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls.
- The dataome is vast and growing every day, sucking up an ever-increasing share of the energy humans produce.
Physics is a field that is supposed to study real stuff. By real, I mean things like matter and energy. Matter is, of course, the kind of stuff you can hold in your hand. Energy may seem a little more abstract, but its reality is pretty apparent, appearing in the form of motion or gravity or electromagnetic fields.
What has become apparent recently, however, is the importance to physics of something that seems somewhat less real: information. From black holes to quantum mechanics to understanding the physics of life, information has risen to become a principal concern of many physicists in many domains. This new centrality of information is why you really need to read astrophysicist Caleb Scharf's new book The Ascent of Information: Books, Bits, Machines, and Life's Unending Algorithms.
Scharf is currently the director of the Astrobiology Program at Columbia University. He is also the author of four other books as well as a regular contributor to Scientific American.
(Full disclosure: Scharf and I have been collaborators on a scientific project involving the Fermi Paradox, so I was a big fan before I read this new book. Of course, the reason why I collaborated with him is because I really like the way he thinks, and his creativity in tackling tough problems is on full display in The Ascent of Information.)
What is the dataome?
In his new book, Scharf is seeking a deeper understanding of what he calls the "dataome." This is the way human beings have been externalizing information about ourselves and the world since we first began making paintings on cave walls. The book opens with a compelling exploration of how Shakespeare's works, which began as scribbles on a page, have gone on to have lives of their own in the dataome. Through reprintings in different languages, recordings of performances, movie adaptations, comic books, and so on, Shakespeare's works are now a permanent part of the vast swirling ensemble of information that constitutes the human dataome.
But the dataome does not just live in our heads. Scharf takes us on a proper physicist's journey through the dataome, showing us how information can never be divorced from energy. Your brain needs the chemical energy from the food you ate this morning to read, process, and interpret these words. One of the most engaging parts of the book is when Scharf details just how much energy and real physical space our data-hungry world consumes as it adds to the dataome. For example, the Hohhot Data Center in the Inner Mongolia Autonomous Region of China is made of vast "farms" of data processing servers covering 245 acres of real estate. A single application like Bitcoin, Scharf tells us, draws 7.7 gigawatts of power, equivalent to the output of half a dozen nuclear reactors!
Information is everywhere
But the dataome is not just about energy. Entropy is central to the story as well. Scharf takes the reader through a beautifully crafted discussion of information and the science of thermodynamics. This is where the links between energy, entropy, the limits of useful work, and probability all become profoundly connected to the definition of information.
The second law of thermodynamics tells us that you cannot use all of a given amount of energy to do useful work. Some of that energy must be wasted by getting turned into heat. Entropy is the physicist's way of measuring that waste (which can also be thought of as disorder). Scharf takes the reader through the basic relations of thermodynamics and then shows how entropy became intimately linked with information. It was Claude Shannon's brilliant work in the 1940s that showed how information — bits — could be defined for communication and computation as an entropy associated with the redundancy of strings of symbols. That was the link tying the physical world of physics explicitly to the informational and computational world of the dataome.
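Shannon's measure is easy to make concrete. Here is a short sketch (standard textbook material, not code from the book) that computes the entropy of a string from its observed symbol frequencies: a maximally redundant string scores zero bits per symbol, while a string of equally likely symbols scores log2 of the alphabet size.

```python
from collections import Counter
from math import log2

def shannon_entropy_bits(text):
    """Entropy per symbol, in bits, from observed symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy_bits("aaaa"))  # 0.0 — fully redundant, zero information
print(shannon_entropy_bits("abab"))  # 1.0 — one bit per symbol
print(shannon_entropy_bits("abcd"))  # 2.0 — four equally likely symbols
```

The more redundant (predictable) the string, the fewer bits each symbol carries, which is exactly the link Shannon drew between compression, communication, and thermodynamic entropy.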
The best parts of the book are where Scharf unpacks how information makes its appearance in biology. From the data storage and processing that occurs with every strand of DNA, to the tangled pathways that define evolutionary dynamics, Scharf demonstrates how life is what happens to physics and chemistry when information matters. I found gems in these parts of the book that forced me to put the volume down and stare into space for a time to deal with their impact.
The physics of information
There are a lot of popular physics books out there about black holes and exoplanets and other cool stuff. But right now, I feel like the most important topic in physics relates to a subject that hardly seems physical at all. Information is a relatively new addition to the physics bestiary, which makes the subject all the more compelling. If you are looking for a good introduction to how that came to be, The Ascent of Information is a good place to start.
A new study tested to what extent dogs can sense human deception.
Is humanity's best friend catching on to our shenanigans? Researchers at the University of Vienna discovered that dogs can, in certain cases, tell when people are lying.
The scientists carried out a study with hundreds of dogs to determine to what extent dogs could spot deception. The team's new paper, published in Proceedings of the Royal Society B, outlined experiments that tested whether dogs, like humans, have some inner sense of how to assess truthfulness.
As the researchers wrote in their paper, "Among non-primates, dogs (Canis familiaris) constitute a particularly interesting case, as their social environment has been shared with humans for at least 14,000 years. For this reason, dogs have been considered as a model species for the comparative investigation of socio-cognitive abilities." The investigation focused specifically on understanding if dogs were "sensitive to some mental or psychological states of humans."
The experiments involved 260 dogs, each of which received advice from an unfamiliar human "communicator." The human indicated which of two bowls had a treat hidden inside by touching it and saying, "Look, this is very good!" If a dog took the person's advice, it got the treat.
Once the dogs' trust was established, the researchers complicated the experiment by letting the dogs watch another unfamiliar human transfer the treat from one bowl to the other. In some cases the original communicator was present to watch the switch; in others she was not.
The findings revealed that half of the dogs did not follow the advice of the communicator if that person was not present when the food was switched to a different bowl. The dogs had a sense that this human could not have known the true location of the treat. Furthermore, two-thirds of the dogs ignored the human's suggestion if she did see the food switch but pointed to the wrong bowl. The dogs figured out the human was lying to them.
Photos of experiments showing the dog, human communicator, and person hiding the treat. Credit: Lucrezia Lonardo et al / Proceedings of the Royal Society B.
"We thought dogs would behave like children under age five and apes, but now we speculate that perhaps dogs can understand when someone is being deceitful," co-author Ludwig Huber from the University of Vienna told New Scientist. "Maybe they think, 'This person has the same knowledge as me, and is nevertheless giving me the wrong [information].' It's possible they could see that as intentionally misleading, which is lying."
This is not the first time such experiments have been carried out. Previously, children under age five, macaques, and chimpanzees were tested in a similar way. It turned out that the children and the other animals were more likely than dogs to follow the advice of the liars. Notably, among the dogs, terriers behaved more like children and apes, more eagerly following the false suggestions.