Like autism, ADHD lies on a spectrum, and some children should not be treated.
- ADHD is an extremely contentious disorder in terms of diagnosis and treatment.
- A research team examined 334 studies on ADHD published between 1979 and 2020.
- The team concluded that ADHD is being overdiagnosed and overtreated in children with milder symptoms.
Attention deficit hyperactivity disorder (ADHD) has long been a controversial topic. While the term "mental restlessness" dates back to 1798, the English pediatrician George Still described the disorder before the Royal College of Physicians of London in 1902. The condition is attributed to both nature and nurture, with a recent study suggesting the disorder is 75 percent genetic.
According to DSM-IV criteria, ADHD affects five to seven percent of children, but according to ICD-10 criteria, only one to two percent are affected. Global estimates suggest that nearly 85 million people suffer from ADHD, which, like autism, exists on a spectrum.
Treatment is perhaps the most contentious issue. While a holistic approach includes counseling, lifestyle changes, and medication, many children receive only the latter due to insurance requirements and other factors. Now a new systematic scoping review published in the journal JAMA Network Open, which examined 334 studies published between 1979 and 2020, has found that ADHD is being both overdiagnosed and overtreated in children and adolescents.
ADHD: An epidemic of overdiagnosis
Researchers from the University of Sydney and the Institute for Evidence-Based Healthcare in Australia initially retrieved 12,267 relevant studies before using a set of criteria that whittled the list down to 334. Only five of those studies critically investigated the costs and benefits of treating milder cases of ADHD, prompting the team to focus on the knowledge gaps surrounding side effects.
The team writes that public scrutiny has increased along with the rise in diagnoses. The numbers are startling: between 1997 and 2016, the number of children reported to be suffering from ADHD doubled. While the symptoms of ADHD include fidgeting, inattention, and impulsivity, Dr. Stephen Hinshaw compared the disorder to depression, as neither condition has "unequivocal biological markers." He continued, "It's probably not a true epidemic of ADHD. It might be an epidemic of diagnosing it."
The Australian researchers write that ambiguous or mild symptoms might contribute to diagnostic inflation and the subsequent rise in the prevalence of ADHD. They compare this to cancer, a field that has established protocols for evaluating overdiagnosis; ADHD remains understudied in this regard.
Overdiagnosis is harmful
This has contributed to an increase in potential harm, not just to children's health (such as the long-term pharmacological impact on developing brains) but also to parents' finances. As of 2018, ADHD treatment is a $16.4 billion global industry, with continued revenue growth predicted and ensured by future ADHD diagnoses.
The costs and benefits of ADHD treatment are mixed. The authors write:
"We found evidence of benefits for academic outcomes, injuries, hospital admissions, criminal behavior, and quality of life. In addition, harmful outcomes were evident for heart rate and cardiovascular events, growth and weight, risk for psychosis and tics, and stimulant misuse or poisoning."
For most of these studies, the benefits outweighed the risks in children suffering from more severe ADHD. But this is not true for children with milder symptoms.
Across the studies, the team noticed that four themes emerged. The first two were positive, and the last two were negative:
- For some people, an ADHD diagnosis was shown to create a sense of empowerment because a biological explanation provided a sense of legitimacy.
- Feelings of empowerment enabled help-seeking behavior.
- For others, a biomedical explanation led to disempowerment because it served as an excuse and provided a way to shirk responsibility.
- An ADHD diagnosis was linked to stigmatization and social isolation.
The unfortunate reality is that ADHD is a real condition that should be treated in some children. But for many, the harm of treatment outweighs the benefits.
- Lawrence Kohlberg's experiments gave children a series of moral dilemmas to test how they differed in their responses across various ages.
- He identified three separate stages of moral development from the egoist to the principled person.
- Some people do not progress through all the stages of moral development, which means they will remain "morally undeveloped."
Has your sense of right and wrong changed over the years? Are there things that you see as acceptable today that you'd never dream of doing when you were younger? If you spend time around children, do you notice how starkly different their sense of morality is? How black and white, or egocentric, or oddly rational it can be?
These were questions that Lawrence Kohlberg asked, and his "stages of moral development" framework dominates much of moral psychology today.
The Heinz Dilemma
Kohlberg was curious to see how and why children differed in their ethical judgements, so he gave roughly 60 children, across a variety of ages, a series of moral dilemmas. All of them were asked open-ended questions to explain their answers, in order to minimize the risk of leading them toward a particular response.
For instance, one of the better-known dilemmas involves an old man called Heinz, who needed an expensive drug for his dying wife. Heinz managed to raise only half the required money, which the pharmacist wouldn't accept. Unable to afford the drug, Heinz has only three options. What should he do?
(a) Not steal it because it's breaking the law.
(b) Steal it, and go to jail for breaking the law.
(c) Steal it, but be let off a prison sentence.
What option would you choose?
Stages of Moral Development
From the answers he got, Kohlberg identified three definite levels or stages of our moral development.
Pre-conventional stage. This is characterized by an ego-centric attitude that seeks pleasure and avoids pain. The primary motivation is to avoid punishment or claim a reward. In this stage of moral development, "good" is defined as whatever is beneficial to oneself. "Bad" is the opposite. For instance, a young child might share their food with a younger sibling not from kindness or some altruistic impulse but because they know that they'll be praised by their parents (or, perhaps, because otherwise their food would be taken away).
In the pre-conventional stage, there is no inherent sense of right and wrong, per se, but rather "good" is associated with reward and "bad" is associated with punishment. At this stage, children are sort of like puppies.
Conventional stage. This stage reflects a growing sense of social belonging and hence a higher regard for others. Approval and praise are seen as rewards, and behavior is calibrated to please others, obey the law, and promote the good of the family/tribe/nation. In the conventional stage, a person comes to see themselves as part of a community and to understand that their actions have consequences.
Consequently, this stage is much more rule-focused and comes along with a desire to be seen as good. Image, reputation, and prestige matter the most in motivating good behavior — we want to fit into our community.
Post-conventional stage. In this final stage, there is much more self-reflection and moral reasoning, which gives people the capacity to challenge authority. Committing to principles is considered more important than blindly obeying fixed laws. Importantly, a person comes to understand the difference between what is "legal" and what is "right." Ideas such as justice and fairness start to mature. Laws or rules are no longer equated to morality but might be seen as imperfect manifestations of larger principles.
A lot of moral philosophy is only possible in the post-conventional stage. Theories like utilitarianism or Immanuel Kant's duty-focused ethics ask us to consider what's right or wrong in itself, not just because we get a reward or look good to others. Aristotle perhaps sums it up best when he wrote, "I have gained this from philosophy: that I do without being commanded what others do only from fear of the law."
How morally developed are you?
Kohlberg identified these stages as a developmental progression from early infancy all the way to adulthood, and they map almost perfectly onto Jean Piaget's psychology of child development. For instance, the pre-conventional stage usually lasts from birth to roughly nine years old, the conventional occurs mainly during adolescence, and the post-conventional extends into adulthood.
What's important to note, though, is that this is not a fixed timetable to which all humans adhere. Kohlberg thought, for instance, that some people never progress or mature. It's quite possible for someone to have no actual moral compass at all (a trait sometimes associated with psychopathy).
More commonly, though, we all know people who are resolutely bound to the conventional stage, where they care only for their image or others' judgment. Those who do not develop beyond this stage are usually stubbornly, even aggressively, strict in following the rules or the law. Prepubescent children can be positively authoritarian when it comes to obeying the rules of a board game, for instance.
So, what's your answer to the Heinz dilemma? Where do you fall on Kohlberg's moral development scale? Is he right to view it as a progressive, hierarchical maturing, with "better" and "worse" stages? Or could it be that as we grow older, we grow more immoral?
Did the 20th century bring a breakthrough in how children are treated?
It took several thousand years for our culture to realize that a child is not an object. Learning how to treat children as humans continues to this day.
Double standards in people's approach to children were not unusual in the past. In ancient Greece, no one condemned parents for leaving a baby by the road or in the garbage. Usually, the child was torn apart by animals. Less often, a passer-by would take them in, not necessarily guided by mercy: after raising the orphan, the 'Good Samaritan' could sell the child at a slave market, recovering the money invested in their upkeep with interest. Such practices did not shock anyone, because in the world of ancient Greece a child had the status of private property, and the public and authorities were therefore indifferent to their fate.
The exception was Sparta, though this meant nothing good for minors. While in other poleis infanticide was left to parents, in Sparta it was managed by a council of elders of the phyle (tribe). In Life of Lycurgus, Plutarch described how each child was inspected by the elders forming the council: "If they found it stout and well made, they gave order for its rearing, and allotted to it one of the nine thousand shares of land above mentioned for its maintenance, but, if they found it puny and ill-shaped, ordered it to be taken to what was called the Apothetae, a sort of chasm under Taygetus; as thinking it neither for the good of the child itself, nor for the public interest, that it should be brought up, if it did not, from the very outset, appear made to be healthy and vigorous." The boys who passed the selection faced a rather short childhood: at the age of seven, they were taken to the barracks, where they were trained to be excellent soldiers until they came of age.
Greek standards for dealing with children were modified only slightly by the Romans. Until the second century BCE, citizens of the Eternal City followed the custom of placing each newborn baby on the ground right after delivery. If the father picked the baby up, the mother could care for it. If not, the newborn landed in the trash, where someone might take them away or wild dogs would consume them. It was not until the end of the republic that this custom came to be considered barbaric and gradually began to fade. However, the tradition requiring that a young man or woman remain under the absolute authority of their father still held. The head of the family could even kill his offspring with impunity, although he had to consult the rest of the family beforehand.
When the Greeks and Romans did decide to look after their offspring, they showed them love and attention. In wealthier homes, special emphasis was placed on education and upbringing, so that the descendant "would desire to become an exemplary citizen, who would be able to govern as well as obey orders in accordance with the laws of justice," as Plato explained in The Laws. According to the philosopher, children should be carefully looked after, and parents have a duty to care for their physical and mental development. Plato considered outdoor games, combined with reading fairy tales and poetry and listening to music, the best way to achieve this goal. Interestingly, Plato did not approve of corporal punishment as an educational measure.
The great Greek historian and philosopher Plutarch was of a similar opinion. He praised the Roman senator Cato the Elder for helping his wife bathe their son and for not shying away from changing the baby. When the boy grew up, the senator spent a lot of time with him, studied literary works with him, and taught him history, as well as horse riding and the use of weapons. Cato also condemned the beating of children, considering it unworthy of a Roman citizen. As prosperity grew, this revolutionary idea became increasingly popular in the republic. The educator Marcus Fabius Quintilianus (Quintilian), in his Institutes of Oratory, described corporal punishment as 'humiliating'.
Another consequence of the liberalization of customs in the first century CE was attention to girls' education and the gradual equalization of their rights with those of boys. However, only Christians condemned the practice of abandoning newborns. The new religion, which gained ever more followers in the Roman Empire from the third century onwards, commanded the faithful to care unconditionally for every being endowed with an immortal soul.
This new trend turned out to be so strong that it survived even the fall of the Empire and the conquest of its lands by the Germanic peoples. Unwanted children began to end up in shelters, eagerly opened by monasteries. Moral pressure and the opportunity to give a child to the monks made infanticide a marginal phenomenon. Legal provisions prohibiting parents from killing, mutilating and selling children began to emerge; in Poland, such practices were banned in 1347 by Casimir the Great in his Wiślica Statutes.
However, as Philippe Ariès notes in Centuries of Childhood: A Social History of Family Life: "Childhood was a period of transition which passed quickly, and which was just as quickly forgotten." Since few children survived into adulthood, parents usually did not develop deeper emotional ties with their offspring. During the Middle Ages, most European languages did not even have a word for 'child'.
Departure from violence
During the Middle Ages, a child became a young adult at the age of eight or nine. According to the canon law of the Catholic Church, a bride had to be at least 12 years old, and a groom 14. This greatly complicated the plans of the most powerful families. Immediately after a child's birth, the father, wanting to increase the family's resources and prestige, would begin looking for a daughter-in-law or son-in-law. While the families decided their fate, the children subject to the transaction had no say. When the King of Poland and Hungary, Louis the Hungarian, matched his daughter Jadwiga with Wilhelm Habsburg, she was only four years old. The husband chosen for her was four years older. To avoid conflicts with the church, the contract between the families was called an 'engagement for the future' (in Latin: sponsalia de futuro). The advantage of such arrangements was that, if political priorities changed, they were easier to break than a sacramental union. This was the case with Jadwiga's engagement: for the benefit of the Polish raison d'état, at the age of 13 she married Władysław II Jagiełło instead of Habsburg.
Interest in children as independent beings was revived in Europe with the rediscovery of antiquity. Thanks to the writings of ancient philosophers, the fashion for attending to children's upbringing and education returned. Initially, corporal punishment was the main tool of the educational process. Regular beating of pupils was considered so necessary that monastery schools developed the custom of a spring trip to a birch grove, where the students themselves collected a year's supply of sticks for their teacher.
A change in this way of thinking came with Ignatius of Loyola's Society of Jesus, founded in 1540. The Jesuits used violence only in extraordinary situations, and corporal punishment could be administered only by a servant, never a teacher. The pan-European network of free schools for young people built by the order enjoyed an excellent reputation. "They were the best teachers of all," the English philosopher Francis Bacon admitted reluctantly. The order's successes made empiricists aware of the importance of non-violent education. One of the greatest philosophers of the 17th century, John Locke, urged parents to stimulate children to learn and behave well, using praise above all other measures.
Jean-Jacques Rousseau went even further, criticizing all the then-prevailing patterns of treating children. According to the fashion of the day, the noble and the rich did not look after their children themselves, because that was what the plebs did. The newborn was fed by a wet-nurse and then passed on to grandparents or poor relatives who were paid a salary. The child would return home at the age of five or older, suddenly losing the only loved ones they had known. From then on, their upbringing and education were supervised by their strict biological mother; they saw their father only sporadically. Instead of love, they received daily lessons in showing respect and obedience. Rousseau condemned all of this. "His accusations and demands shook public opinion; women read them with tears in their eyes. And just as it was once fashionable among the upper classes to pass the baby on to a wet-nurse, after Émile it became fashionable for the mother to breastfeed her child," wrote Stanisław Kot in Historia wychowania [The History of Education]. Still, a fashion with no backing in law could not, by itself, change the reality of children's fate in society.
Shelter and factory
"In many villages and towns, newborn babies were kept for twelve to fifteen days, until there were enough of them. Then they were transported, often in a state of extreme exhaustion, to the shelter," writes Marian Surdacki in Dzieci porzucone w społeczeństwach dawnej Europy i Polski [Children Abandoned in the Societies of Old Europe and Poland]. While the Old Continent elites discovered the humanity of children, less affluent residents began reproducing entirely different ancient patterns on a massive scale. In the 18th century, abandoning unwanted children again became the norm. They usually went to care facilities maintained by local communes. In London, shelters took in around 15,000 children each year. Few managed to survive into adulthood. Across Europe, the number of abandoned children in the 18th century is estimated at around 10 million. Moral condemnation by the Catholic and Protestant churches did not do much.
Paradoxically, the industrial revolution turned out to be more effective, although initially it seemed to have the opposite effect. In Great Britain, peasants migrating to the cities routinely rid themselves of bothersome progeny. London shelters were under siege, and around 120,000 homeless, abandoned children wandered the streets of the metropolis. Although most did not survive a year, those who did required food and clothes. The financing of shelters placed a heavy burden on municipal budgets. "To the parish authorities, encumbered with great masses of unwanted children, the new cotton mills in Lancashire, Derby, and Notts were a godsend," write Barbara and John Lawrence Hammond in The Town Labourer.
At the beginning of the 19th century, English shelters became a source of cheap labour for the emerging factories. Orphans had to earn a living to receive shelter and food. Soon, their peers from poor families met the same fate. "In the manufacturing districts it is common for parents to send their children of both sexes at seven or eight years of age, in winter as well as summer, at six o'clock in the morning, sometimes of course in the dark, and occasionally amidst frost and snow, to enter the manufactories, which are often heated to a high temperature, and contain an atmosphere far from being the most favourable to human life," wrote Robert Owen in 1813. This extraordinary manager of the New Lanark spinning mill built a workers' estate complete with a kindergarten. It offered care, but also taught the children of workers how to read and write.
However, Owen remained a notable exception. Following his appeal, in 1816 the British parliament set up a special commission, which soon established that as many as 20% of workers in the textile industry were under 13 years old. There were also spinning mills where children constituted 70% of the labour force. As a standard, they worked 12 hours a day, and their only day of rest was Sunday. Their supervisors maintained discipline with truncheons. Such a daily existence, combined with the tuberculosis epidemic, gave the young workers little chance of a long life. For many years, the protests of Owen and his supporters changed hardly anything. "Industry as such is seeking new, less skilled but cheaper, workers. Small children are most welcome," noted the French socialist Eugène Buret two decades later.
Among the documents available in the British National Archives is the report of a government factory inspector from August 1859. He briefly described the case of Martha Appleton, a 13-year-old worker from a Wigan spinning mill. The girl fainted on the job, overcome, the inspector suspected, by fatigue from the unhealthy, inhumane conditions. Her hand became caught in an unguarded machine, and all the fingers on that hand were severed. Since her job required both hands to be fast and efficient, the factory owner decided the next day that such a 'defective' child would be useless, and Martha was dismissed.
Where a single man once worked, one now finds several children or women doing similar jobs for poor salaries, warned Eugène Buret. This state of affairs began to alarm an increasing number of people. The activities of the German educator Friedrich Fröbel had a significant impact on this: he visited many cities and gave lectures on returning children to their childhoods, encouraging adults to provide children with care and free education. Fröbel's ideas contrasted dramatically with press reports about the terrible conditions endured by children in factories.
The Prussian government reacted first, and as early as 1839 banned the employment of minors. In France, a similar ban came into force two years later. In Britain, however, Prime Minister Robert Peel had to fight the parliament before peers agreed to adopt the Factory Act in 1844. The new legislation banned children below 13 from working in factories for more than six hours per day. Simultaneously, employers were required to provide child workers with education in factory schools. Soon, European states discovered that their strength was determined by citizens able to work efficiently and fight effectively on the battlefields. Children mutilated at work were completely unfit for military service. At the end of the 19th century, underage workers finally disappeared from European factories.
In defence of the child
"Mamma has been in the habit of whipping and beating me almost every day. She used to whip me with a twisted whip – a rawhide. The whip always left a black and blue mark on my body," 10-year-old Mary Ellen Wilson told a New York court in April 1874. Social activist Etty Wheeler stood in defence of the girl battered by her guardians (her biological parents were dead). When her requests for intervention were repeatedly refused by the police, the courts, and even the mayor of New York, the woman turned to the American Society for the Prevention of Cruelty to Animals (ASPCA) for help. Its president Henry Bergh first agreed with Miss Wheeler that the child was not her guardians' property. Using his experience fighting for animal rights, he began a press and legal battle for little Wilson. The girl's testimony published in the press shocked the public. The court took the child from her guardians, and sentenced her sadistic stepmother to a year of hard labour. Mary Ellen Wilson came under the care of Etty Wheeler. In 1877, her story inspired animal rights activists to establish American Humane, an NGO fighting for the protection of every harmed creature, including children.
In Europe, this idea found more and more supporters. The bourgeoisie, even more than the aristocracy, rarely used corporal punishment, which met with ever greater condemnation, note Philippe Ariès and Georges Duby in A History of Private Life: From the Fires of Revolution to the Great War. At the same time, the custom of entrusting the care of offspring to strangers fell into oblivion. Towards the end of the 19th century, 'good mothers' began to look after their own babies.
In 1900, Ellen Key's bestselling book The Century of the Child was published. The Swedish teacher urged parents to provide their offspring with love and a sense of security, and to limit themselves to patiently observing how nature takes its course. However, her idealism collided with another pioneering work, that of Karl Marx and Friedrich Engels, who postulated that we ought to "replace home education by social". The indoctrination of children was to be handled by schools and youth organizations, whose aim was to prepare young people to fight the conservative generation of their parents for a new world.
Did the 20th century bring a breakthrough in how children are treated? In 1924, the League of Nations adopted the Declaration of the Rights of the Child. Its preamble stated that "mankind owes to the child the best that it has to give." This is an important postulate, but sadly one that has still not been realized in many places around the world.
Translated from the Polish by Joanna Figiel
Healthy people need healthy microbiomes from an early age.
- 30 million children worldwide suffer from moderate acute malnutrition.
- Lifelong problems from undernourishment include increased risks of diabetes and heart problems.
- New research shows that targeting the microbiome could help malnourished children grow up healthy.
According to the United Nations Food and Agriculture Organization, an estimated 815 million people — nearly 11 percent of the global population — suffer from chronic undernourishment. While the vast majority of them live in poor countries, some 11 million live in more developed nations. As the pandemic rages on and the effects of climate change mount, those numbers will only increase without intervention.
A new study, published in The New England Journal of Medicine by an international team of researchers, investigates one potential solution that could address the 30 million children suffering from moderate acute malnutrition: target the microbiome.
Childhood undernourishment results in a variety of crippling lifelong effects: wasting and stunting (impaired growth and development), immune and metabolic dysfunction, and central nervous system problems top the list. With the pandemic predicted to increase childhood deaths from wasting by 20 percent, the team stresses the urgency of addressing this chronic problem.
Feeding the microbiota
For this randomized, controlled study, researchers recruited 118 children between the ages of 12 and 18 months. They split the recruits into two groups: 59 children were given an experimental diet (which they called microbiota-directed complementary food prototype, or MDCF-2), and the other 59 were given a control diet (a ready-to-use supplementary food, or RUSF). All of the children lived in Mirpur, an impoverished region of Dhaka, Bangladesh.
Supplementation was given for three months, followed by one month of monitoring. Over the course of the project, the team measured a total of 4,977 proteins and 209 bacterial taxa in fecal samples. Because the researchers had previously observed that malnourished children have less mature microbiome profiles than healthy children, the goal was to feed and encourage the growth of the bacterial community associated with normal childhood development.
During the first month, mothers brought their children to a regional healthcare center to feed them two daily servings of either MDCF-2 or RUSF. During the second month, one of those two feedings happened at home. By the third month, the children were fed at home. After three months, the children returned to their normal feeding routines and were tested one month later.
The group given MDCF-2 saw improvements in two of four key measurements: weight-for-length and weight-for-age. The researchers also found an important improvement in markers of bodily inflammation. The authors wrote:
"By the end of MDCF-2 supplementation, children in the upper quartile had the largest increases in mediators of bone growth and CNS development and the largest decreases in effectors of inflammation. Together, these results provide evidence that mediators of bone growth, neurodevelopment, and inflammation distinguished the effects of the MDCF-2 nutritional intervention from that of RUSF."
Undernourishment often results in metabolic reprogramming that predisposes children to develop cardiovascular issues, diabetes, and hypertension later in life. This is, in part, why researchers are seeking early interventions focused on creating healthy microbial communities before such metabolic changes occur.
The answer seems to be a series of evolutionary trade-offs that help protect organs in women, according to a recent study.
- Human childbirth is a relatively painful and dangerous process, due largely to the "obstetrical dilemma."
- The obstetrical dilemma describes how human infants have big heads, but their mothers have relatively small birth canals and pelvic floors.
- The new study found that having a smaller pelvic floor helps maintain the integrity of women's organs, even though it makes childbirth difficult.
Within the animal kingdom, human childbirth is a relatively painful and complicated process.
Unlike other primates that are able to give birth unassisted, human mothers usually need help from their family or community to deliver a baby. Even with help, mothers and infants face a small chance of death during childbirth, especially in regions with limited access to healthcare, like Sub-Saharan Africa.
One reason human childbirth is dangerous is the "obstetrical dilemma." This dilemma describes what seems to be an anatomical contradiction: Human infants have large heads, but their mothers have small birth canals.
To pass through the birth canal, human infants have to perform a series of twists and turns, a process called rotational birth. (Human infants also have fontanels, commonly referred to as soft spots, which help them better squeeze through the birth canal.)
This raises the question: Why hasn't evolution made childbirth easier for humans?
A study recently published in Proceedings of the National Academy of Sciences proposes that human childbirth is difficult because of evolutionary trade-offs that ultimately help protect organs in the body.
The main trade-off for women centers on the pelvic floor, which is a group of muscles that stretches from the pubic bone to the tailbone. These muscles help stabilize the spine, support the womb, and control bladder and bowel functions. The pelvic floor also stretches during childbirth, allowing the baby to pass more easily through the birth canal.
Some researchers have proposed that a larger pelvic floor would make childbirth easier for women. But others have countered that a larger pelvic floor would actually be more vulnerable to deformation and could lead to disorders, including incontinence and organs dropping from their normal positions (known as prolapse).
Known as the "pelvic floor hypothesis," this idea has been difficult to test. In the current study, a team of researchers from the University of Texas and University of Vienna used computer models to test how increasing the size and thickness of the pelvic floor might affect women, both in general and during childbirth.
To test the models, the team used finite element analysis. This method, more commonly used in engineering projects, uses mathematics to test how structures would likely react to real-world forces, such as vibration, heat, fluid flow, and pressure. When the team tested a wide range of pelvic floor sizes and thicknesses, the results suggested that the pelvic floor hypothesis is correct.
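To get a feel for what such an analysis involves, here is a minimal sketch in Python, assuming NumPy is available. The one-dimensional setup and all the numbers are invented for illustration; the study's actual pelvic floor models are far more complex. But the underlying recipe is the same: divide the structure into small elements, assemble their stiffnesses into one system of equations, apply forces, and solve for how much the structure deforms.

```python
# A deliberately tiny, hypothetical finite element model: a 1D elastic bar,
# fixed at one end and pulled at the other. It is NOT the study's 3D pelvic
# floor model; it only illustrates the basic FEA recipe of assembling a
# stiffness matrix, applying a load, and solving for the deformation.
import numpy as np

n_elements = 10            # how finely we chop the structure into elements
length = 1.0               # bar length in meters (illustrative value)
EA = 2.0e7                 # stiffness: Young's modulus x cross-section (N)
force = 1000.0             # pulling force at the free end (N)

h = length / n_elements
k_local = (EA / h) * np.array([[1.0, -1.0],
                               [-1.0, 1.0]])  # stiffness of one element

# Assemble the global stiffness matrix from overlapping element contributions.
n_nodes = n_elements + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elements):
    K[e:e + 2, e:e + 2] += k_local

# Boundary condition: node 0 is clamped, so remove its row and column.
K_free = K[1:, 1:]
loads = np.zeros(n_nodes - 1)
loads[-1] = force          # apply the load at the free end

# Solve K u = f for the nodal displacements.
u = np.linalg.solve(K_free, loads)
print(f"Tip displacement: {u[-1]:.2e} m")  # analytic answer: F*L/EA = 5e-05 m
```

Scaling this same recipe up to thousands of three-dimensional elements with realistic tissue properties is what lets researchers predict how a larger or thicker pelvic floor would stretch under intra-abdominal pressure.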
"We found that thicker pelvic floors would require quite a bit higher intra-abdominal pressures than humans are capable of generating to stretch during childbirth," Nicole Grunstra, an affiliated researcher at the University of Vienna's Unit for Theoretical Biology in the Department of Evolutionary Biology, told UT News.
"Being unable to push the baby through a resistant pelvic floor would equally complicate childbirth, despite the extra space available in the birth canal, and so pelvic floor thickness appears to be another evolutionary 'compromise,' in addition to the size of the birth canal."
The results highlight how evolution has struck a remarkable anatomical balance in humans.
"Although this dimension has made childbirth more difficult, we have evolved to a point where the pelvic floor and canal can balance supporting internal organs while also facilitating childbirth and making it as easy as possible," lead study author Krishna Kumar told UT News.