Human brain cells don’t continue to grow into adulthood, according to a new study
Research with other species lends weight to these findings.
We used to think neurogenesis, the growth of new brain cells, occurred throughout one's lifetime. A surprising new study out of UC-San Francisco finds instead that no new neurons form in the hippocampus after childhood. That's staggering, as this area of the brain is associated with learning, memory, and emotion. In one of the largest studies of its kind to date, the scientists examined 59 human brain specimens of varying ages and found no new neuron formation past age 13.
Neuroscientists have been debating since the late 1920s whether or not neurogenesis occurs into adulthood. From the ‘80s on, the prevailing view was that neurogenesis takes place throughout our lives. That’s because lots of studies confirmed the phenomenon occurring past the juvenile stage in the brains of other species, including birds, mice, rats, and nonhuman primates. In rats, for instance, new neurons are constantly formed around the olfactory bulb, which is associated with the sense of smell.
Arturo Alvarez-Buylla, the study's senior author, told Medical News Today, "We find that if neurogenesis occurs in the adult hippocampus in humans, it is an extremely rare phenomenon, raising questions about its contribution to brain repair or normal brain function." Lead researcher Shawn Sorrells said they couldn't find any new neurons in the adult samples they examined. The study's findings were published in the journal Nature.
Model of the brain. By CNX OpenStax, Wikimedia Commons.
According to this research, the brain creates vast numbers of new neurons during the prenatal and neonatal stages. In early childhood, dramatic bursts of neurogenesis occur in the frontal lobe, the front part of the brain responsible for executive functions such as decision-making, learning, and planning. After that, the study finds, neurogenesis drops off and becomes exceptionally rare.
To conduct the study, researchers collected hippocampus samples from donors ranging in age from the fetal stage to 77, sourced from the US, China, and Spain. Some samples came from cadavers, while others were excess tissue removed during brain surgery to relieve severe epilepsy.
With each sample, the researchers examined a particular part of the hippocampus called the dentate gyrus, an area crucial to memory formation. They sliced up this region and applied antibodies that bind to newly formed neurons and other new cells. While such cells appeared often in samples from fetuses and young children, they were rare after age 1, and after age 13 the researchers found no new neuron formation at all.
The finding is bound to be controversial, as certain diets, exercise programs, and even antidepressants are promoted on the premise that they spur the growth of fresh neurons. Among neuroscientists, views range from the notion that the human brain generates many new neurons each day to the conjecture that neurogenesis is actually quite rare.
Neuronal formation isn't a simple process. It begins with neural stem cells, which give rise to progenitor cells before those become neurons. "It requires the birth of the cell," Alvarez-Buylla told CNN, "the migration or the movement of the cell to the right place, which is not an easy task in the very dense structure of the brain, and then that cell has to make space to grow and connect to other cells and then contribute in a functional way to that circuit." The brains of infants and young children are primed for this; whether adult brains are, researchers aren't sure.
Neurons in an adult mouse's hippocampus. Credit: Wikipedia.
Another study supporting this one found that whales and dolphins don't experience neurogenesis as adults. Yale neuroscientist Pasko Rakic has worked with monkeys and finds that, unlike rodents, they undergo little neurogenesis during adulthood. The reason may be that neural networks within adult primate brains are so complex that the growth of new cells could disrupt normal operations.
This study, although provocative, gives us a snapshot of only one region of the brain. More research will have to be done to confirm these results and to see whether the same is true in other parts. Many neuroscientists say that even if adults do experience neurogenesis, there's still no proof it rejuvenates the brain; making new connections might be what really matters. The study could also inform the search for a cure for Alzheimer's, as neurogenesis was one avenue researchers were exploring.
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has felt like the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be considered on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked as if the sun were always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." Even creepier, he observed: "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
A machine learning system lets visitors at a Kandinsky exhibition hear the artwork.
Have you ever heard colors?
As part of a new exhibition, the worlds of culture and technology collide, bringing sound to the colors of abstract art pioneer Wassily Kandinsky.
Kandinsky had synesthesia, a condition in which looking at colors and shapes can trigger the perception of associated sounds. With the help of machine learning, virtual visitors to the Sounds Like Kandinsky exhibition, a partnership project by Centre Pompidou in Paris and Google Arts & Culture, can have an aural experience of his art.
An eye for music
Kandinsky's synesthesia is thought to have heavily influenced his painting. Seeing yellow summoned up trumpets, evoking emotions like cheekiness; reds produced violins, portraying restlessness; and blues called up organs, representing heavenliness, according to the exhibition notes.
Virtual visitors are invited to take part in an experiment called Play a Kandinsky, which allows them to see and hear the world through the artist's eyes.
Kandinsky's synesthesia is thought to have heavily influenced his 1925 painting Yellow, Red, Blue. Image: Guillaume Piolle/Wikimedia Commons
In 1925, the artist's masterpiece, "Yellow, Red, Blue", broke new ground in the world of abstract art, guiding the viewer from left to right with shifting shapes and shades. Almost a century after it was painted, Google's interactive tool lets visitors click different parts of the artwork to journey through the artist's description of the colors, associated sounds and moods that inspired the work.
But Google's new toy is not the only tool developed to enhance the artistic experience.
Artist Neil Harbisson has developed an artificial way to emulate Kandinsky by turning colors into sounds. He has a rare form of color blindness and sees the world in greyscale. But a smart antenna attached to his head translates dominant colors into musical notes, creating a real-world soundtrack of what's in front of him. The invention could open up a new world for people who are color blind.
A new study suggests that private prisons hold prisoners longer, eroding much of the cost savings that private prisons are supposed to provide over public ones.
- Private prisons in Mississippi tend to hold prisoners 90 days longer than public ones.
- The extra days eat up half of the expected cost savings of a private prison.
- The study leaves several open questions, such as what effect these extra days have on recidivism rates.
The United States of America, land of the free, is home to 5 percent of the world's population but 25 percent of its prisoners. The cost of having so many people in the penal system adds up to $80 billion per year, more than three times the budget for NASA. This massive system exploded in size relatively recently, with the prison population increasing sixfold in the last four decades.
Ten percent of these prisoners are kept in private prisons, which are owned and operated for the sake of profit by contractors. In theory, these operations cost less than public prisons and jails, and states can save money by contracting them to incarcerate people. They have a long history in the United States and are used in many other countries as well.
However, despite the pervasiveness of private contractors in the American prison system, there is not much research into how well they live up to their promise to provide similar services at a lower cost to the state. The little research that is available often encounters difficulties in trying to compare the costs and benefits of facilities with vastly different operations and occasionally produces results suggesting there are few benefits to privatization.
A new study by Dr. Anita Mukherjee and published in the American Economic Journal: Economic Policy joins the debate with a robust consideration of the costs and benefits of private prisons. Its findings suggest that some private prisons keep people incarcerated longer and save less money than advertised.
The study focuses on prisons in Mississippi. Despite its comparatively high rate of incarceration, Mississippi's prison system is very similar to that of other states that also use private prisons. Demographically, its system is representative of the rest of the U.S. prison system, and its inmates are sentenced for similar amounts of time.
The state attempts to get the most out of its privatization efforts, as a 1994 law requires all contracts for private prisons in Mississippi to provide at least a 10 percent cost savings over public prisons while providing similar services. As a result, the state seeks to maximize its savings by sending prisoners to private institutions first if space is available.
While public and private prisons in Mississippi are quite similar, there are a few differences that allow for the possibility of cost savings by private operators — not the least of which is that the guards are paid 30 percent less and have fewer benefits than their publicly employed counterparts.
The results of privatization
The graph depicts the likelihood of release for public (dotted line) vs. private (solid line) prison inmates. At every level of time served, public prisoners were more likely to be released than private prisoners. Credit: Dr. Anita Mukherjee
The study relied on administrative records of the Mississippi prison system between 1996 and 2013. The data included information on prisoner demographics, the crimes committed, sentence lengths, time served, infractions while incarcerated, and prisoner relocation while in the system, including between public and private jails. For this study, the sample examined was limited to those serving between one and six years and those who served at least a quarter of their sentence. This created a primary sample of 26,563 bookings.
Analysis revealed that prisoners in private prisons were behind bars for four to seven percent longer than those in public prisons, which translates to roughly 85 to 90 extra days per prisoner. This is, in part, because those in private prison serve a greater portion of their sentences (73 percent) than those in public institutions (70 percent).
This in turn might be due to the much higher infraction rate in private prisons compared to public ones. While only 18 percent of prisoners in a public prison commit an infraction, such as disobeying a guard or possessing contraband, the number jumps to 46 percent in a private prison. Infractions can reduce the probability of early release or cause time to be added to a sentence.
It's unclear why there are so many more infractions in private prisons. Dr. Mukherjee suggests it could be the result of "harsher prison conditions in private prisons," better monitoring techniques, incentives to report more of them to the state before contract renewals, or even a lackadaisical attitude on the part of public prison employees.
What does all this cost Mississippi?
The extra time served eats 48 percent of the cost savings of keeping prisoners in a private facility. For example, it costs about $135,000 to house a prisoner in a private prison for three years and $150,000 in the public system. But longer stays in private prisons reduce the savings from $15,000 to only $7,800.
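The arithmetic in the example above can be checked directly. Here is a minimal sketch using the article's figures; the 48 percent share eaten by extra days served is the study's estimate:

```python
# Sanity check on the study's three-year cost example (figures from the article).
private_cost = 135_000  # housing one prisoner in a private prison for 3 years
public_cost = 150_000   # the same stay in the public system

nominal_savings = public_cost - private_cost  # $15,000 on paper
fraction_eaten = 0.48                         # share lost to extra days served

actual_savings = round(nominal_savings * (1 - fraction_eaten))
print(nominal_savings, actual_savings)  # 15000 7800
```

The remaining $7,800 is the figure the study compares against the harder-to-quantify social costs discussed below.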
As Dr. Mukherjee remarks, this figure captures only the financial cost. Some things are harder to measure:
"There are, of course, other costs that are difficult to quantify — e.g., the cost of injustice to society (if private prison inmates systematically serve more time), the inmate's individual value of freedom, and impacts of the additional incarceration on future employment. Abrams and Rohlfs (2011) estimates a prisoner's value of freedom for 90 days at about $1,100 using experimental variation in bail setting. Mueller-Smith (2017) estimates that 90 days of marginal incarceration costs about $15,000 in reduced wages and increased reliance on welfare. If these social costs were to exceed $7,800 in the example stated, private prisons would no longer offer a bargain in terms of welfare-adjusted cost savings."
It is possible that the extra time in jail provides benefits that counter these costs, such as a reduced recidivism rate, but this proved difficult to determine. Though it was not statistically significant, there was some evidence that the added time actually increased the rate of recidivism. If that's true, then private prisons could be counterproductive.