Big Think Interview With Nicholas Carr
Nicholas Carr writes on the social, economic, and business implications of technology. He is the author of the 2008 Wall Street Journal bestseller "The Big Switch: Rewiring the World, from Edison to Google," which is "widely considered to be the most influential book so far on the cloud computing movement," according to the Christian Science Monitor. His earlier book, "Does IT Matter?," published in 2004, "lays out the simple truths of the economics of information technology in a lucid way, with cogent examples and clear analysis," said The New York Times. His new book is "The Shallows: What the Internet Is Doing to Our Brains."
Carr has also written for many periodicals, including The Atlantic Monthly, The New York Times Magazine, Wired, The Financial Times, Die Zeit, The Futurist, and Advertising Age, and has been a columnist for The Guardian and The Industry Standard. His much-discussed essay "Is Google Making Us Stupid?," which appeared as the cover story of the Atlantic Monthly's Ideas issue in the summer of 2008, has been collected in three popular anthologies. Carr has written a personal blog, Rough Type, since 2005. He is a member of the Encyclopaedia Britannica's editorial board of advisors and is on the steering board of the World Economic Forum's cloud computing project.
Carr holds a B.A. from Dartmouth College and an M.A., in English and American literature and language, from Harvard University.
Question: What are some technologies, prior to the Internet, that have radically reshaped the way our brains work?
Nicholas Carr: I think that if you look across the entire world of tools and technologies, what you see is that there are different categories. One category is what I call intellectual technologies. And these are the tools we use to think with, to find information, gather information, exchange information and so forth. And I think if you look back through the intellectual history of human beings you can trace the way that these intellectual technologies influence the way we think. And that’s true all the way back to, for instance, the arrival of the map, which actually predates history. We don’t know who invented the map, but somebody at some point had to invent it.
And before the map came along, people understood where they were and where they were going purely through their sensory perceptions, through what they saw, what they heard, and so forth. As soon as the map came along, we suddenly had a very different way to think about where we were in space. Pure visual and auditory perception was supplemented by an abstract picture, which is a radically different way to think about space. And of course, there were all sorts of practical uses of maps—and still are—for charting routes and establishing boundaries, but what happened at a deeper level is that the map trained us to think more abstractly in general. So it gave us, or helped give human beings, a more abstract mind, more attuned to the hidden patterns that lay behind what we saw and what we heard and what we felt, and so forth.
And I think you see a similar thing when the mechanical clock comes around. Now this is much later, in the 1300s or 1400s or so. Before the mechanical clock came along, people experienced time as a natural flow; to the extent that they measured it, they did so by watching the stars or the moon or the sun, things that emphasized the natural flow of time. As soon as you introduce the mechanical clock, you get a radically different view of time. Suddenly, it's not a flow; it's a series of discrete, precisely measurable units: seconds, minutes, hours, and so forth. And again, there are all sorts of practical uses of the tool, for scheduling a person's time, for coordinating work and other activities among a large number of people. But what we see again is that this new tool, this new intellectual technology, gave us, in general, a different way of thinking. A much more scientific way of thinking, one very much focused on measurement and on precise cause and effect across long chains.
So here again, we see an intellectual technology, that beyond its practical uses really changed in a kind of fundamental way, I think, the way people think. And it’s no coincidence, I think, that after the arrival of the mechanical clock we see an explosion in scientific thinking and scientific discovery.
At about the same time, a little after the arrival of the mechanical clock, we saw the introduction of the printing press and hence printed books, which replaced handwritten books. And I think that the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
I think what the book did, in addition to its practical uses, is it gave us a more attentive way of thinking. What the book does as a technology is shield us from distraction. The only thing going on is, you know, the progression of words and sentences across page after page, and so suddenly we see this immersive, very attentive kind of thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is that the brain adapts to these types of tools.
And so the ways of thinking that we learn from the tools we can then apply in other areas of our lives. So after the arrival of the printing press we become, in general, more attentive, more attuned to contemplative ways of thinking. And that's a very unnatural way of using our mind, you know, paying attention, filtering out distractions. So the book, I think, like the map before it, like the clock, created or helped create a revolution in our habits of mind and ultimately in the way we use our brains.
Question: Neurologically, how does our brain adapt itself to new technologies?
Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that instrument, in using that tool. And basically these are chemical – neural chemical changes. So you know, cells in our brain communicate by transmitting electrical signals between them and those electrical signals are actually activated by the exchange of chemicals, neurotransmitters in our synapses.
And so when you begin to use a tool, for instance, you have much stronger electrochemical signals being processed in those – through those synaptical connections.
And then the second, and even more interesting, adaptation is in actual physical changes, anatomical changes. You may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those particular pathways that are being used, those new pathways.
On the other hand, you know, the brain likes to be efficient, and so even as it's strengthening the pathways you're exercising, it's weakening the connections between the cells that supported old ways of thinking or working or behaving, or whatever, that you're not exercising so much.
So that adaptation – I mean, there are a whole lot of reasons to be very happy that our brains are able to adapt and adapt so readily, because we do strengthen and become more efficient at the things we do a lot, and we gain the changed ways of thinking that we might need. On the other hand, there is a cost. We begin to lose the faculties that we don't exercise. So adaptation has a very, very positive side, but also a potentially negative side, because ultimately our brain is qualitatively neutral. It doesn't care what it's strengthening or what it's weakening; it just responds to the way we're exercising our mind.
Question: What skills are we losing because of the Internet?
Nicholas Carr: The Internet, like all intellectual technologies, involves a trade-off. As we train our brains to use it, as we adapt to the environment of the internet, which is an environment of constant immersion in information and constant distractions, interruptions, juggling lots of messages, lots of bits of information – as we adapt to that information environment, so to speak, we gain certain skills, but we lose other ones. And if you look at the scientific evidence, it's pretty clear, particularly from studies of things like video games, that use of online media enhances some of our visual cognitive abilities: our ability to spot patterns in arrays of visual information, to keep track of lots of things going on at once on a screen. But along with that, what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
And the ability to pay attention not only underpins kind of ways of thinking that are pretty obvious, contemplativeness, reflection, introspection, all of those kind of solitary ways of thinking, but what we know from brain studies is that the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
Question: Is losing the capacity for solitary thought necessarily a bad thing?
Nicholas Carr: I think there is a reason that, you know, 100 years ago when Rodin sculpted his great figure of The Thinker, The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn't, you know, multi-tasking. And that's because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking. And that doesn't mean that I believe that all of us should sit in, you know, darkened rooms and think big thoughts without any stimuli coming at us all day. I think it's important to have a balance of those ways of thinking.
But what the web seems to be doing, and what a lot of the proponents of the web seem to be completely comfortable with, is pushing us all in the direction of skimming and scanning and multi-tasking, and it's not encouraging us or even giving us an opportunity to engage in more attentive ways of thinking. And so, to me, if we lose those abilities, we may end up finding that those were actually the most valuable ways of thinking available to us as human beings.
Question: How can we resist the Internet’s effect on our brains?
Nicholas Carr: I think the solution, so to speak, to this problem is pretty simple to state. I mean, if you want to change your brain, you change your habits, your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to screen out distractions. And that means retreating from digital media, from the web and from smartphones and texting and Facebook and tweeting and everything else.
And so that's a pretty obvious solution. What's hard is actually doing it. Because it's no longer just a matter of personal choice, of personal discipline, though obviously those things are always important. What we're seeing, and we see this over and over again in the history of technology, is that the technology – the technology of the web, the technology of digital media – gets entwined very, very deeply into social processes, into expectations. So more and more, for instance, in our work lives. You know, if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their BlackBerry or iPhone or their Droid or whatever, then it becomes very difficult to say, I'm not going to be as connected, because you feel like your career is going to take a hit. And that same expectation is now moving over into our social lives, particularly for young people.
If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there's kind of nothing worse than feeling socially isolated, knowing that your friends are, you know, having these conversations and you're not involved. So it's easy to state the solution, which is to, you know, become a little bit more disconnected. What's hard is actually doing that. And I think that all of us who try, including myself, find that it's really a struggle, because we're so used to craving constant streams of new information that it's kind of bewildering to be alone with our thoughts these days.
Question: How has the technology of reading evolved from papyrus to the iPad?
Nicholas Carr: One of the most important things to realize about reading is that it is a fairly new invention in human history—a few millennia old, arriving only after the invention of the alphabet. And for a long time, reading was really just a kind of adjunct to oral communication, because, you know, for most of human history you just conversed and exchanged information through speech.
And so one of the fascinating things about early writing on slates, on papyrus, even in early handwritten books, is that, for instance, there were no spaces between the words. People just wrote in continuous script. And that's because that's the way we hear speech. You know, when somebody's talking to us, they're not carefully putting pauses between words. It all flows together. The problem with that, though, is that it's very hard to read. A lot of your mental energy goes to figuring out where one word ends and the next begins. And as a result, in the early years all reading was done out loud; there was no such thing as silent reading, because you had to read out loud in order to figure out where one word was ending and the next beginning.
And it was only around the year 800 or 900 that we saw the introduction of word spaces. Suddenly reading became, in a sense, easier, and suddenly you had the arrival of silent reading, which changed the act of reading from just a transcription of speech to something that every individual did on their own. And suddenly you had this whole ideal of the silent, solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new, very attentive, very deep form of reading, which had been limited to just, you know, monasteries and universities, and, by making books much cheaper and much more available, spread that way of reading out to a much larger mass audience. And so we saw, for the last 500 years or so, that one of the central facts of culture was deep, solitary reading. The immersion of ourselves in books, in long articles, and so forth.
With the arrival – with the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information, and when there are links in the text where you have to think, even for just a fraction of a second, you know, do I click on this link or not – suddenly reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words. And as a result, I think we begin to lose the ability to read in the deepest, most interpretive ways, because we're not calming our mind and just focusing on the argument or the story.
Recorded November 10, 2010
Interviewed by Max Miller
New data have set the particle physics community abuzz.
- The first question ever asked in Western philosophy, "What's the world made of?" continues to inspire high energy physicists.
- New experimental results probing the magnetic properties of the muon, a heavier cousin of the electron, seem to indicate that new particles of nature may exist, potentially shedding light on the mystery of dark matter.
- The results are a celebration of the human spirit and our insatiable curiosity to understand the world and our place in it.
If brute force doesn't work, then look into the peculiarities of nothingness. This may sound like a Zen koan, but it's actually the strategy that particle physicists are using to find physics beyond the Standard Model, the current registry of all known particles and their interactions. Instead of the usual colliding experiments that smash particles against one another, exciting new results indicate that new vistas into exotic kinds of matter may be glimpsed by carefully measuring the properties of the quantum vacuum. There's a lot to unpack here, so let's go piecemeal.
It is fitting that the first question asked in Western philosophy concerned the material composition of the world. Writing around 350 BCE, Aristotle credited Thales of Miletus (circa 600 BCE) as the first Western philosopher for asking the question, "What is the world made of?" What modern high energy physicists do, albeit with very different methodology and equipment, is follow along the same philosophical tradition of trying to answer this question, assuming that there are indivisible bricks of matter called elementary particles.
Deficits in the Standard Model
Jumping thousands of years of spectacular discoveries, we now have a very neat understanding of the material composition of the world at the subatomic level: a total of 12 particles and the Higgs boson. The 12 particles of matter are divided into two groups, six leptons and six quarks. The six quarks combine to form all the particles that interact via the strong nuclear force, such as protons and neutrons. The leptons include the familiar electron and its two heavier cousins, the muon and the tau. The muon is the star of the new experiments.
For all its glory, the Standard Model described above is incomplete. The goal of fundamental physics is to answer the most questions with the least number of assumptions. As it stands, the values of the masses of all particles are parameters that we measure in the laboratory, related to how strongly they interact with the Higgs. We don't know why some interact much more strongly than others (and, as a consequence, have larger masses), why there is a prevalence of matter over antimatter, or why the universe seems to be dominated by dark matter — a kind of matter we know nothing about, apart from the fact that it's not part of the recipe included in the Standard Model. We know dark matter has mass since its gravitational effects are felt by familiar matter, the matter that makes up galaxies and stars. But we don't know what it is.
Physicists had hoped that the powerful Large Hadron Collider in Switzerland would shed light on the nature of dark matter, but nothing has come up there or in many direct searches, where detectors were mounted to collect dark matter that presumably would rain down from the skies and hit particles of ordinary matter.
Could muons fill in the gaps?
Enter the muons. The hope that these particles can help solve the shortcomings of the Standard Model has two parts to it. The first is that every particle with an electric charge, like the muon, can be pictured simplistically as a spinning sphere. Spinning spheres and disks of charge create a magnetic field that points along the spin axis. Picture the muon as a tiny spinning top. If it's rotating counterclockwise, its magnetic field will point vertically up. (Grab a glass of water with your right hand and turn it counterclockwise. Your thumb will be pointing up, the direction of the magnetic field.) The spinning muons are placed into a doughnut-shaped tunnel and forced to go around and around. The tunnel has its own magnetic field that interacts with the tiny magnetic field of the muons. As the muons circle the doughnut, they wobble about, just like spinning tops wobble on the ground due to their interaction with Earth's gravity. The amount of wobbling depends on the magnetic properties of the muon which, in turn, depend on what's going on in the space around the muon.
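The spinning-top picture can be made concrete with a few lines of code. This is only an illustrative sketch with unit charge and mass, not real muon parameters: classically, a uniformly charged spinning sphere has magnetic moment mu = (q / 2m) * L, so for a positive charge the moment, and hence the field along the axis, points in the same direction as the spin angular momentum, just as the right-hand rule suggests.

```python
def magnetic_moment(q, m, L):
    """Classical magnetic moment of a uniformly charged spinning sphere.

    mu = (q / 2m) * L, where L is the spin angular momentum vector.
    Illustrative unit values only, not real muon parameters.
    """
    factor = q / (2 * m)
    return tuple(factor * component for component in L)

# Counterclockwise rotation seen from above: angular momentum along +z.
spin_L = (0.0, 0.0, 1.0)

mu = magnetic_moment(q=1.0, m=1.0, L=spin_L)
print(mu)  # (0.0, 0.0, 0.5): points "up" along +z, per the right-hand rule
```

For a negative charge, q flips sign and the moment points opposite the spin, which is why the sign conventions matter in the real experiment.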
This is where the second idea comes in, the quantum vacuum. In physics, there is no empty space. The so-called vacuum is actually a bubbling soup of particles that appear and disappear in fractions of a second. Everything fluctuates, as encapsulated in Heisenberg's Uncertainty Principle. Energy fluctuates too, a phenomenon we call zero-point energy. Since energy and mass are interconvertible (E = mc², remember?), these tiny fluctuations of energy can be momentarily converted into particles that pop out of and back into the busy nothingness of the quantum vacuum. Every particle of matter is cloaked with these particles emerging from vacuum fluctuations. Thus, a muon is not only a muon, but a muon dressed with these extra fleeting bits of stuff. That being the case, these extra particles affect a muon's magnetic field, and thus, its wobbling properties.
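To get a feel for how fleeting these fluctuations are, one can combine E = mc² with the energy-time uncertainty relation, Δt ≈ ħ/ΔE. The sketch below is an order-of-magnitude estimate only; the constants are standard values, and the exact factors of two depend on convention.

```python
hbar = 1.054571817e-34            # reduced Planck constant, in J*s
electron_rest_energy = 8.187e-14  # m_e * c^2 in joules (about 0.511 MeV)

# Energy "borrowed" from the vacuum to create an electron-positron pair:
delta_E = 2 * electron_rest_energy

# Energy-time uncertainty: the pair can persist only for roughly hbar / delta_E.
lifetime = hbar / delta_E
print(f"virtual pair lifetime ~ {lifetime:.1e} s")  # on the order of 1e-22 s
```

A heavier virtual particle borrows more energy and so lives even more briefly, which is one reason these "dressed" contributions to the muon's wobble are so tiny and so hard to pin down.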
About 20 years ago, physicists at Brookhaven National Laboratory detected an anomaly in the muon's magnetic properties: the measured value was larger than what theory predicted. This would mean that the quantum vacuum produces particles not accounted for by the Standard Model: new physics! Fast forward to 2017, and the experiment was repeated, at four times higher sensitivity, at the Fermi National Accelerator Laboratory, where yours truly was a postdoctoral fellow a while back. The first results of the Muon g-2 experiment were unveiled on April 7, 2021, and not only confirmed the existence of a magnetic moment anomaly but substantially strengthened it.
To most people, the official results, published recently, don't seem so exciting: a "tension between theory and experiment of 4.2 standard deviations." The gold standard for claiming a new discovery in particle physics is a 5-sigma deviation, or about one part in 3.5 million. (That is, if the anomaly were a mere statistical fluke, you would expect to see one this large only once in 3.5 million runs of the experiment.) Still, 4.2 sigma is enough for plenty of excitement in the particle physics community, given the remarkable precision of the experimental measurements.
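The "one part in 3.5 million" figure follows directly from the Gaussian distribution. A quick sketch converting sigma levels to one-sided tail probabilities using only the standard library:

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a given significance level."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p5 = one_sided_p(5.0)    # the discovery threshold
p42 = one_sided_p(4.2)   # the Muon g-2 tension

print(f"5 sigma:   1 in {1 / p5:,.0f}")   # roughly 1 in 3.5 million
print(f"4.2 sigma: 1 in {1 / p42:,.0f}")  # roughly 1 in 75,000
```

So 4.2 sigma already means the odds of such a fluke are around one in 75,000, which is why the community is excited even though the formal discovery bar has not yet been cleared.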
A time for excitement?
Now, the results must be reanalyzed very carefully to make sure that (1) there are no hidden experimental errors; and (2) the theoretical calculations are not off. There will be a frenzy of calculations and papers in the coming months, all trying to make sense of the results, both on the experimental and theoretical fronts. And this is exactly how it should be. Science is a community-based effort, and the works of many compete with and complete one another.
Whatever happens, new science will be learned, even if less exciting than new particles. Or maybe, new particles have been there all along, blipping in and out of existence from the quantum vacuum, waiting to be pulled out of this busy nothingness by our tenacious efforts to find out what the world is made of.
- Benjamin Franklin wrote essays on a whole range of subjects, but one of his finest was on how to be a nice, likable person.
- Franklin lists a whole series of common errors people make while in the company of others, like over-talking or storytelling.
- His simple recipe for being good company is to be genuinely interested in others and to accept them for who they are.
Think of the nicest person you know. The person who would fit into any group configuration, who no one can dislike, or who makes a room warmer and happier just by being there.
What makes them this way? Why are they so amiable, likeable, or good-natured? What is it, you think, that makes a person good company?
This is the kind of advice that comes from one of history's most famously good-natured thinkers: Benjamin Franklin. His essay "On Conversation" is full of practical, surprisingly modern tips about how to be a nice person.
Franklin begins by arguing that there are really only two things that make someone likable. First, they have to be genuinely interested in what others say. Second, they have to be willing "to overlook or excuse Foibles." In other words, being good company means listening to people and ignoring their faults. Being witty, well-read, intelligent, or incredibly handsome can all make a good impression, but they're nothing without these two simple rules.
The sort of person nobody likes
From here, Franklin goes on to give a list of the common errors people tend to make while in company. These are the things people do that make us dislike them. We might even find, with a sinking feeling in our stomach, that we do some of these ourselves.
1) Talking too much and becoming a "chaos of noise and nonsense." These people invariably talk about themselves, but even if "they speak beautifully," it's still ultimately more a soliloquy than a real conversation. Franklin mentions how funny it can be to see these kinds of people come together. They "neither hear nor care what the other says; but both talk on at any rate, and never fail to part highly disgusted with each other."
2) Asking too many questions. Interrogators are those people who have an "impertinent Inquisitiveness… of ten thousand questions," and it can feel like you're caught between a psychoanalyst and a lawyer. In itself, this might not be a bad thing, but Franklin notes it's usually just from a sense of nosiness and gossip. The questions are only designed to "discover secrets…and expose the mistakes of others."
3) Storytelling. You know those people who always have a scripted story they tell at every single gathering? Utterly painful. They'll either be entirely oblivious to how little others care for their story, or they'll be aware and carry on regardless. Franklin notes, "Old Folks are most subject to this Error," which we might think is perhaps harsh, or comically honest, depending on our age.
4) Debating. Some people are always itching for a fight or debate. The "Wrangling and Disputing" types inevitably make everyone else feel like they need to watch what they say. If you give even the lightest or most modest opinion on something, "you throw them into Rage and Passion." For them, the conversation is a boxing fight, and words are punches to be thrown.
5) Misjudging. Ribbing or mocking someone should be a careful business. We must never mock "Misfortunes, Defects, or Deformities of any kind", and should always be 100% sure we won't upset anyone. If there's any doubt about how a "joke" will be taken, don't say it. Offense is easily taken and hard to forget.
On practical philosophy
Franklin's essay is a trove of great advice, and this article only touches on the major themes. It really is worth your time to read it in its entirety. As you do, it's hard not to smile along or to think, "Yes! I've been in that situation." Though the world has changed dramatically in the 300 years since Franklin's essay, much is exactly the same. Basic etiquette doesn't change.
If there's only one thing to take away from Franklin's essay, it comes at the end, where he revises his simple recipe for being nice:
"Be ever ready to hear what others say… and do not censure others, nor expose their Failings, but kindly excuse or hide them"
So, all it takes to be good company is to listen and accept someone for who they are.
Philosophy doesn't always have to be about huge questions of truth, beauty, morality, art, or meaning. Sometimes it can teach us simply how to not be a jerk.
Certain water beetles can escape from frogs after being consumed.
- A Japanese scientist shows that some beetles can wiggle out of frogs' butts after being eaten whole.
- The research suggests the beetle can get out in as little as 7 minutes.
- Most of the beetles swallowed in the experiment survived with no complications after being excreted.
In what is perhaps one of the weirdest experiments ever, one straight out of the category of "why did anyone need to know this?", scientists have proven that the Regimbartia attenuata beetle can climb out of a frog's butt after being eaten.
The research was carried out by Kobe University ecologist Shinji Sugiura. His team found that the majority of beetles swallowed by the black-spotted pond frogs (Pelophylax nigromaculatus) used in their experiment managed to escape within about six hours and were perfectly fine.
"Here, I report active escape of the aquatic beetle R. attenuata from the vents of five frog species via the digestive tract," writes Sugiura in a new paper, adding "although adult beetles were easily eaten by frogs, 90 percent of swallowed beetles were excreted within six hours after being eaten and, surprisingly, were still alive."
One bug even got out in as little as 7 minutes.
Sugiura also tried putting wax on the legs of some of the beetles, preventing them from moving. These ones were not able to make it out alive, taking from 38 to 150 hours to be digested.
Naturally, as anyone would upon encountering such a story, you're wondering where the video is. Thankfully, the scientists recorded the proceedings.
The Regimbartia attenuata beetle can be found in the tropics, especially as pests in fish hatcheries. It's not the only kind of creature that can survive being swallowed. A recent study showed that snake eels are able to burrow out of the stomachs of fish using their sharp tails, only to become stuck, die, and be mummified in the gut cavity. Scientists are calling the beetle's ability the first documented "active prey escape." Usually, such travelers through the digestive tract have particular adaptations that make it possible for them to withstand extreme pH and lack of oxygen. The researchers think the beetle's trick is in inducing the frog to open a so-called "vent" controlled by the sphincter muscle.
"Individuals were always excreted head first from the frog vent, suggesting that R. attenuata stimulates the hind gut, urging the frog to defecate," explains Sugiura.
For more information, check out the study published in Current Biology.
A recent study analyzed the skulls of early Homo species to learn more about the evolution of primate brains.
- Using computed tomography, a team of researchers generated images of what the brains of early Homo species likely looked like.
- The team then compared these images to the brains of great apes and modern humans.
- The results suggest that Homo species developed humanlike brains about 1.7 million years ago and that this cognitive evolution occurred at the same time early Homo culture and technology were becoming more complex.
For nearly two centuries, scientists have known that humans descended from the great apes. But it's proven difficult to precisely map out the branches of that evolutionary tree, especially in terms of determining when and where early Homo species first developed brains similar to modern humans.
There are clear differences between ape and human brains. Compared to apes, the Homo sapiens brain is larger, and its frontal lobe is organized such that we can engage in toolmaking, planning, and language. Other Homo species also enjoyed some of these cognitive innovations, from the Neanderthals to Homo floresiensis, the hobbit-like people who once inhabited Indonesia.
One reason it's been difficult to discern the details of this cognitive evolution from apes to Homo species is that brains don't fossilize, so scientists can't directly study early primate brains. But primate skulls offer clues.
Brains of yore
In a new study published in Science, an international team of researchers analyzed impressions left on the skulls of Homo species to better understand the evolution of primate brains. Using computed tomography on fossil skulls, the team generated images of what the brain structures of early Homo species probably looked like, and then compared those structures to the brains of great apes and modern humans.
The results suggest that Homo species first developed humanlike brains approximately 1.7 to 1.5 million years ago in Africa. This cognitive evolution occurred at roughly the same time that Homo technology and culture were becoming more complex, with these species developing more sophisticated stone tools and making greater use of animal food resources.
The team hypothesized that "this pattern reflects interdependent processes of brain-culture coevolution, where cultural innovation triggered changes in cortical interconnectivity and ultimately in external frontal lobe topography."
The team also found that these structural changes occurred after Homo species migrated out of Africa for regions like modern-day Georgia and Southeast Asia, which is where the fossils in the study were discovered. In other words, Homo species still had ape-like brains when some groups first left Africa.
While the study sheds new light on the evolution of primate brains, the team said there's still much to learn about the history of early Homo species, particularly in terms of explaining the morphological diversity of Homo fossils discovered in Africa.
"Deciphering evolutionary process in early Homo remains a challenge that will be met only through the recovery of expanded fossil samples from well-controlled chronological contexts," the researchers wrote.