Why is 18 the age of adulthood if the brain can take 30 years to mature?
Neuroscience research suggests it might be time to rethink our ideas about when exactly a child becomes an adult.
- Research suggests that most human brains take about 25 years to fully develop, though the pace of development varies between men and women, and from one individual to another.
- Although the human brain reaches its full size during adolescence, important developments within the prefrontal cortex and other regions continue well into one's 20s.
- The findings raise complex ethical questions about the way our criminal justice systems punish offenders in their late teens and early 20s.
At what age does someone become an adult? Many might say that the 18th birthday marks the transition from childhood to adulthood. After all, that's the age at which people can typically join the military and become fully independent in the eyes of the law.
But in light of research showing our brains develop gradually over the course of several decades, and at different paces among individuals, should we start rethinking how we categorize children and adults?
"There isn't a childhood and then an adulthood," Peter Jones, who works as part of the epiCentre group at Cambridge University, told the BBC. "People are on a pathway, they're on a trajectory."
The prefrontal cortex, cerebellum and reward systems
One key part of that trajectory is the development of the prefrontal cortex, a region of the brain central to social interactions that affects how we regulate emotions, control impulsive behavior, assess risk and make long-term plans. Also important are the brain's reward systems, which are especially excitable during adolescence. But these parts of the brain don't stop growing at age 18. In fact, research shows that it can take more than 25 years for them to reach maturity.
The cerebellum also affects our cognitive maturity. But unlike the prefrontal cortex, the development of the cerebellum appears to depend largely on environment, as Dr. Jay Giedd, chair of child psychiatry at Rady Children's Hospital-San Diego, told PBS:
"Identical twins' cerebellum are no more alike than non-identical twins. So we think this part of the brain is very susceptible to the environment. And interestingly, it's a part of the brain that changes most during the teen years. This part of the brain has not finished growing well into the early 20s, even. The cerebellum used to be thought to be involved in the coordination of our muscles. So if your cerebellum is working well, you were graceful, a good dancer, a good athlete.
But we now know it's also involved in coordination of our cognitive processes, our thinking processes. Just like one can be physically clumsy, one can be kind of mentally clumsy. And this ability to smooth out all the different intellectual processes to navigate the complicated social life of the teen and to get through these things smoothly and gracefully instead of lurching ... seems to be a function of the cerebellum."
The environment's effects on the cerebellum further complicate the question of when a child becomes an adult, considering the answer might depend on the kind of childhood an individual experienced.
Adulthood and the criminal justice system
These facets of cognitive development raise many philosophical questions, but perhaps none are as important as those related to how we punish criminals, especially young men, whose brains develop on average about two years later than women's.
"The preponderance of young men engaging in these deadly, evil, and stupid acts of violence may be a result of brains that have yet to fully developed," Howard Forman, an assistant professor of psychiatry at Albert Einstein College of Medicine, told Business Insider.
So, does that mean young criminals (say, 19- to 25-year-olds) should receive the same punishment as a 35-year-old who commits the same crime? Both criminals would still be guilty, but each might not necessarily deserve the same punishment, as Laurence Steinberg, a professor of psychology at Temple University, told Newsweek.
"It's not about guilt or innocence... The question is, 'How culpable are they, and how do we punish them?'"
After all, most countries have separate juvenile justice systems to deal with children who commit crimes. These separate systems are predicated on the idea that there ought to be a spectrum of culpability that accounts for a criminal's age. So, if we assume that the importance of age in the eyes of the justice system is based largely on cognitive differences between children and adults, then why shouldn't that culpability spectrum be modified to better match the science, which clearly shows that 18 is not the age at which the brain is fully matured?
Whatever the answer, society clearly needs some definition of adulthood to differentiate between children and adults and to function smoothly, as Jones suggested to the BBC.
"I guess systems like the education system, the health system and the legal system make it convenient for themselves by having definitions."
But that doesn't mean these definitions make sense outside of a legal context.
"What we're really saying is that to have a definition of when you move from childhood to adulthood looks increasingly absurd," he said. "It's a much more nuanced transition that takes place over three decades."
With just a few strategic tweaks, the Nazis could have won one of World War II's most decisive battles.
- The Battle of Britain is widely recognized as one of the most significant battles that occurred during World War II. It marked the first major victory of the Allied forces and shifted the tide of the war.
- Historians, however, have long debated the deciding factor in the British victory and German defeat.
- A new mathematical model took into account numerous alternative tactics that the Germans could have employed and found that just two tweaks stood between them and victory over Britain.
Two strategic blunders<p>Now, historians and mathematicians from York St. John University have collaborated to produce <a href="http://www-users.york.ac.uk/~nm15/bootstrapBoB%20AAMS.docx" target="_blank">a statistical model (docx download)</a> capable of calculating what the likely outcomes of the Battle of Britain would have been had the circumstances been different. </p><p>Would the German war effort have fared better had they not bombed Britain at all? What if Hitler had begun his bombing campaign earlier, even by just a few weeks? What if they had focused their targets on RAF airfields for the entire course of the battle? Using a statistical technique called weighted bootstrapping, the researchers studied these and other alternatives.</p><p>"The weighted bootstrap technique allowed us to model alternative campaigns in which the Luftwaffe prolongs or contracts the different phases of the battle and varies its targets," said co-author Dr. Jaime Wood in a <a href="https://www.york.ac.uk/news-and-events/news/2020/research/mathematicians-battle-britain-what-if-scenarios/" target="_blank">statement</a>. Based on the different strategic decisions that the German forces could have made, the researchers' model enabled them to predict the likelihood that the events of a given day of fighting would or would not occur.</p><p>"The Luftwaffe would only have been able to make the necessary bases in France available to launch an air attack on Britain in June at the earliest, so our alternative campaign brings forward the air campaign by three weeks," continued Wood. "We tested the impact of this and the other counterfactuals by varying the probabilities with which we choose individual days."</p><p>Ultimately, two strategic tweaks shifted the odds significantly towards the Germans' favor. Had the German forces started their campaign earlier in the year and had they consistently targeted RAF airfields, an Allied victory would have been extremely unlikely.</p><p>Say the odds of a British victory in the real-world Battle of Britain stood at 50-50 (there's no real way of knowing what the actual odds are, so we'll just have to select an arbitrary figure). If this were the case, changing the start date of the campaign and focusing only on airfields would have reduced British chances at victory to just 10 percent. Even if a British victory stood at 98 percent, these changes would have cut them down to just 34 percent.</p>
A tool for understanding history

This technique, said co-author Niall Mackay, "demonstrates just how finely-balanced the outcomes of some of the biggest moments of history were. Even when we use the actual days' events of the battle, make a small change of timing or emphasis to the arrangement of those days and things might have turned out very differently."

The researchers also claimed that their technique could be applied to other uncertain historical events. "Weighted bootstrapping can provide a natural and intuitive tool for historians to investigate unrealized possibilities, informing historical controversies and debates," said Mackay.

Using this technique, researchers can evaluate other what-ifs and gain insight into how differently influential events could have turned out if only the slightest things had changed. For now, at least, we can all be thankful that Hitler underestimated Britain's grit.