Adult-born neurons mature longer, have unique functions
Unraveling the mysteries of adult neurogenesis may have clinical applications.
- Neuroscientists don't know the degree to which adult human brains generate new neurons.
- A new study found that adult-born neurons in lab rats continued to grow and mature long after infant-born ones stopped.
- Understanding the process of neuron birth and death can help scientists understand the causes of neurological disorders.
Learning about the brain is a challenge. Neuroscientists must measure the operations of a byzantine tool with the very tool they're attempting to measure. That's a journey that's not so much winding as it is Möbius, so it's little wonder that history's greatest scientists and philosophers have yet to crack, say, the hard problem of consciousness.
Other problems are limited more by our inability to poke around in real-time. Take the question of adult neurogenesis. Neurogenesis is the brain's ability to generate new neurons. This process is intensely productive during embryonic development, and it continues after birth at a rate any parent with a toddler can appreciate daily.
For much of the 20th century, scientists believed neurogenesis didn't take place in the structured, sedate brains of human adults. They thought that after development we possessed all the neurons we'd ever have, and this led to a perception that aged minds had less plasticity.
Then studies began to accumulate evidence that the adult brain may not be as placid as thought. One such study, published in 2018 in Cell Stem Cell, autopsied the hippocampi of 28 adults and found human brains still churn out neural stem cells by the thousands well into our golden years.
"We found that older people have similar ability to make thousands of hippocampal new neurons from progenitor cells as younger people do," Maura Boldrini, the study's lead author, said in a release. "We also found equivalent volumes of the hippocampus (a brain structure used for emotion and cognition) across ages."
Other studies have clouded the consensus. A study published in Nature, one with a remarkably similar methodology to Boldrini's, found little evidence for young neurons in the dentate gyrus, a part of the hippocampus. Its authors concluded that neurogenesis likely ceased, or was extremely rare, in adults.
But a new study published in the Journal of Neuroscience may have discovered how adult brains can continue to mature and retain plasticity without producing bubbly, baby neurons at the same clip as their younger counterparts.
Getting better with age
Reconstructions of adult-born neurons from rats undergoing maturation. Left to right: 2 weeks old, 4 weeks, 6 weeks, and 24 weeks.
One challenge to understanding adult neurogenesis is that most studies examine new neurons within their typical six-week development window. During that time a neuron is born, travels to the region of the brain where it will work, and differentiates depending on that location. After that, the neuron is considered mature.
According to Jason Snyder, a researcher at the Djavad Mowafaghian Centre for Brain Health and one of the study's authors, the researchers wanted to look beyond this window. They wanted to know whether adult-born neurons could keep maturing and growing later in life, and whether they become distinct from neurons produced by the brains of newborns.
To test their hypothesis, the researchers injected a viral vector into lab rats' dentate gyri. The retrovirus was tagged with fluorescent reporters. After it inserted a copy of its genome into the dividing cells' DNA, subsequent generations would glow and allow the researchers to follow them.
They watched the rats' adult-born neurons for the typical six weeks, but then kept observing into the seventh. Amazingly, the seven-week-old neurons continued to exhibit growth markers, such as larger nuclei and thicker dendrites. The researchers continued their watch for 24 weeks and found the aged neurons were bigger and sported more connections than infant-born ones.
Based on the results, they think that adult-born neurons may continue to contribute to plasticity and regeneration throughout life, even if cell production winds down with age.
"Our study is exciting because it gives us a new framework for studying these cells," Snyder said. "Even if neurogenesis stops as we age, our study shows that it's still relevant because cells take so long to mature and keep growing for so long. This is really just a different way of looking at them."
Measuring adult neurogenesis is difficult, but it's not impossible. A big part of the solution is knowing what to measure and where. While this new study was performed on rats, and therefore may be a poor predictor of what we'll see in humans, it can direct future research by showing neuroscientists where to look and what to look for.
And unlike the hard problem of consciousness, unraveling the mysteries of adult neurogenesis may have clinical applications. A better understanding of the lifecycle of neurons may reveal how neurological disorders such as Parkinson's and Alzheimer's disease emerge. There's even research linking disorders such as depression and anxiety to neurogenesis activity.
This knowledge may lead to new treatments; even if it doesn't, it could deepen our understanding of how our lifestyles and environments support brain health and regeneration throughout human life.
"Deepfakes" and "cheap fakes" are becoming strikingly convincing — even ones generated on freely available apps.
- A writer named Magdalene Visaggio recently used FaceApp and Airbrush to generate convincing portraits of early U.S. presidents.
- "Deepfake" technology has improved drastically in recent years, and some countries are already experiencing how it can be weaponized for political purposes.
- It's currently unknown whether it'll be possible to develop technology that can quickly and accurately determine whether a given video is real or fake.
After former U.S. President William Henry Harrison delivered his inaugural speech on March 4, 1841, he posed for a daguerreotype, the first widely available photographic technology. It became the first photo taken of a sitting American president.
As for the eight presidents before Harrison, history can see them only through artistic renderings. (The exception is a handful of surviving daguerreotypes of John Quincy Adams, taken after he left office. In his diary, Adams described them as "hideous" and "too true to the original.")
But a recent project offers a glimpse of what early presidents might've looked like if photographed through modern cameras. Using FaceApp and Airbrush, Magdalene Visaggio, author of books such as "Eternity Girl" and "Kim & Kim," generated a collection of convincing portraits of the nation's first presidents, from George Washington to Ulysses S. Grant.
Modern Presidents: George Washington https://t.co/CURJQB0kap — Magdalene Visaggio (@Magdalene Visaggio)
What might be surprising is that Visaggio was able to generate the images without a background in graphic design, using freely available tools. She wrote on Twitter:
"A lot of people think I'm a digital artist or whatever, so let me clarify how I work. Everything you see here is done in Faceapp+Airbrush on my phone. On the outside, each takes between 15-30 mins. Washington was a pretty simple one-and-done replacement."
Ulysses S. Grant https://t.co/L1IGXLI3Vl — Magdalene Visaggio (@Magdalene Visaggio)
"Other than that? I am not a visual artist in any sense, just a hobbyist using AI tools [to] see what she can make. I'm actually a professional comics writer."
Did another pass at Lincoln. https://t.co/PdT4QVpMbn — Magdalene Visaggio (@Magdalene Visaggio)
Of course, Visaggio isn't the first person to create deepfakes (or "cheap fakes") of politicians.
In 2017, many people got their first glimpse of the technology through a video depicting former President Barack Obama warning: "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time." The video quickly reveals itself to be fake, with comedian Jordan Peele speaking for the computer-generated Obama.
While deepfakes haven't yet caused significant chaos in the U.S., incidents in other nations may offer clues of what's to come.
The future of deepfakes
In 2018, Gabon's president Ali Bongo had been out of the country for months receiving medical treatment. As his absence from public view stretched on, rumors began swirling about his condition. Some suggested Bongo might even be dead. In response, Bongo's administration released a video that seemed to show the president addressing the nation.
But the video is strange, appearing choppy and blurry in parts. After political opponents declared the video to be a deepfake, Gabon's military attempted an unsuccessful coup. What's striking about the story is that, to this day, experts in the field of deepfakes can't conclusively verify whether the video was real.
The uncertainty and confusion generated by deepfakes poses a "global problem," according to a 2020 report from The Brookings Institution. In 2018, the U.S. Department of Defense released some of the first tools able to successfully detect deepfake videos. The problem, however, is that deepfake technology keeps improving, meaning forensic approaches may forever be one step behind the most sophisticated forms of deepfakes.
As the 2020 report noted, even if the private sector or governments create technology to identify deepfakes, they will:
"...operate more slowly than the generation of these fakes, allowing false representations to dominate the media landscape for days or even weeks. "A lie can go halfway around the world before the truth can get its shoes on," warns David Doermann, the director of the Artificial Intelligence Institute at the University of Buffalo. And if defensive methods yield results short of certainty, as many will, technology companies will be hesitant to label the likely misrepresentations as fakes."
The author of 'How We Read Now' explains.
During the pandemic, many college professors abandoned assignments from printed textbooks and turned instead to digital texts or multimedia coursework.
As a professor of linguistics, I have been studying how electronic communication compares to traditional print when it comes to learning. Is comprehension the same whether a person reads a text onscreen or on paper? And are listening and viewing content as effective as reading the written word when covering the same material?
The answers to both questions are often "no," as I discuss in my book "How We Read Now," released in March 2021. The reasons relate to a variety of factors, including diminished concentration, an entertainment mindset and a tendency to multitask while consuming digital content.
Print versus digital reading
The benefits of print particularly shine through when experimenters move from posing simple tasks – like identifying the main idea in a reading passage – to ones that require mental abstraction – such as drawing inferences from a text. Print reading also improves the likelihood of recalling details – like "What was the color of the actor's hair?" – and remembering where in a story events occurred – "Did the accident happen before or after the political coup?"
Studies show that both grade school students and college students assume they'll get higher scores on a comprehension test if they have done the reading digitally. And yet, they actually score higher when they have read the material in print before being tested.
Educators need to be aware that the method used for standardized testing can affect results. Studies of Norwegian tenth graders and U.S. third through eighth graders report higher scores when standardized tests were administered using paper. In the U.S. study, the negative effects of digital testing were strongest among students with low reading achievement scores, English language learners and special education students.
My own research and that of colleagues approached the question differently. Rather than having students read and take a test, we asked how they perceived their overall learning when they used print or digital reading materials. Both high school and college students overwhelmingly judged reading on paper as better for concentration, learning and remembering than reading digitally.
The discrepancies between print and digital results are partly related to paper's physical properties. With paper, there is a literal laying on of hands, along with the visual geography of distinct pages. People often link their memory of what they've read to how far into the book it was or where it was on the page.
But equally important is mental perspective, and what reading researchers call a "shallowing hypothesis." According to this theory, people approach digital texts with a mindset suited to casual social media, and devote less mental effort than when they are reading print.
Podcasts and online video
Given increased use of flipped classrooms – where students listen to or view lecture content before coming to class – along with more publicly available podcasts and online video content, many school assignments that previously entailed reading have been replaced with listening or viewing. These substitutions have accelerated during the pandemic and the move to virtual learning.
Surveying U.S. and Norwegian university faculty in 2019, University of Stavanger Professor Anne Mangen and I found that 32% of U.S. faculty were now replacing texts with video materials, and 15% reported doing so with audio. The numbers were somewhat lower in Norway. But in both countries, 40% of respondents who had changed their course requirements over the past five to 10 years reported assigning less reading today.
A primary reason for the shift to audio and video is students refusing to do assigned reading. While the problem is hardly new, a 2015 study of more than 18,000 college seniors found only 21% usually completed all their assigned course reading.
Maximizing mental focus
Audio shows a similar pattern: researchers found comparable results when university students read an article versus listening to a podcast of the same text. A related study confirms that students do more mind-wandering when listening to audio than when reading.
Results with younger students are similar, but with a twist. A study in Cyprus concluded that the relationship between listening and reading skills flips as children become more fluent readers. While second graders had better comprehension with listening, eighth graders showed better comprehension when reading.
Research on learning from video versus text echoes what we see with audio. For example, researchers in Spain found that fourth through sixth graders who read texts showed far more mental integration of the material than those watching videos. The authors suspect that students "read" the videos more superficially because they associate video with entertainment, not learning.
The collective research shows that digital media have common features and user practices that can constrain learning. These include diminished concentration, an entertainment mindset, a propensity to multitask, lack of a fixed physical reference point, reduced use of annotation and less frequent reviewing of what has been read, heard or viewed.
Digital texts, audio and video all have educational roles, especially when providing resources not available in print. However, for maximizing learning where mental focus and reflection are called for, educators – and parents – shouldn't assume all media are the same, even when they contain identical words.
Humans may have evolved to be tribalistic. Is that a bad thing?
- From politics to everyday life, humans have a tendency to form social groups that are defined in part by how they differ from other groups.
- Neuroendocrinologist Robert Sapolsky, author Dan Shapiro, and others explore the ways that tribalism functions in society, and discuss how—as social creatures—humans have evolved for bias.
- But bias is not inherently bad. The key to seeing things differently, according to Beau Lotto, is to "embody the fact" that everything is grounded in assumptions, to identify those assumptions, and then to question them.
Ancient corridors below the French capital have served as its ossuary, playground, brewery, and perhaps soon, air conditioning.