The birth of childhood: A brief history of the European child
Did the 20th century bring a breakthrough in how children are treated?
It took several thousand years for our culture to realize that a child is not an object. Learning how to treat children as humans continues to this day.
"Nature wants children to be children before they are men," wrote Jean-Jacques Rousseau in the book Emile, or On Education (1762). While Rousseau did not see children as humans, he appealed to parents to look after their offspring. "If we consider childhood itself, is there anything so weak and wretched as a child, anything so utterly at the mercy of those about it, so dependent on their pity, their care, and their affection?" he asked. At a time when children were regularly entrusted to others during adolescence or left in shelters, Rousseau's demands seemed revolutionary. They paved the way for the breakthrough discovery that indeed, a child is also a human being, capable of feelings, having their own needs and, above all, suffering. But the philosopher himself did not take these ideas to heart. Whenever his lover and later wife, Teresa Levasseur, gave birth to a child, Rousseau immediately gave the baby to an orphanage, where just one in a hundred newborns had a chance to live to adulthood.
Double standards in people's approach to children were not unusual in the past. In ancient Greece, no one condemned parents for leaving a baby by the road or in the garbage. Usually, the infant was torn apart by animals. Less often, a passer-by would take the child in – not necessarily guided by mercy. After raising the orphan, the 'Good Samaritan' could sell the child at a slave market, recovering the money invested in their upkeep with interest. This kind of practice shocked no one, because in the world of ancient Greece a child had the status of private property, and therefore the public and the authorities were indifferent to their fate.
The exception was Sparta, but this did not mean anything good for minors. While in other poleis infanticide was left to parents, in Sparta it was managed by a council of tribal elders. In his Life of Lycurgus, Plutarch described how each child was inspected by these elders: "If they found it stout and well made, they gave order for its rearing, and allotted to it one of the nine thousand shares of land above mentioned for its maintenance, but, if they found it puny and ill-shaped, ordered it to be taken to what was called the Apothetae, a sort of chasm under Taygetus; as thinking it neither for the good of the child itself, nor for the public interest, that it should be brought up, if it did not, from the very outset, appear made to be healthy and vigorous." The boys who passed the selection faced a rather short childhood – at the age of seven, they were taken to the barracks, where they were trained to be excellent soldiers until they came of age.
Greek standards for dealing with children were modified slightly by the Romans. Until the second century BCE, citizens of the Eternal City followed the custom of placing each newborn baby on the ground right after delivery. If the father picked the baby up, the mother could care for it. If not, the newborn landed in the trash, where a passer-by might take it away or wild dogs would consume it. It was not until the end of the republic that this custom came to be considered barbaric and gradually began to fade. However, the tradition that a young man or woman should remain under the absolute authority of their father still held. The head of the family could even kill his offspring with impunity, although he had to consult the rest of the family beforehand.
When the Greeks and Romans did decide to look after their offspring, they showed them love and attention. In wealthier homes, special emphasis was placed on education and upbringing, so that the descendant "would desire to become an exemplary citizen, who would be able to govern as well as obey orders in accordance with the laws of justice," as Plato explained in The Laws. According to the philosopher, children should be carefully looked after, and parents have a duty to care for their physical and mental development. Plato considered outdoor games, combined with reading fairy tales and poetry and listening to music, the best way to achieve this goal. Interestingly, Plato did not approve of corporal punishment as an educational measure.
The great Greek historian and philosopher Plutarch was of a similar opinion. He praised the Roman senator Cato the Elder for helping his wife to bathe their son, and for not shying away from changing the baby. When the boy grew up, the senator spent a lot of time with him, studied literary works with him, and taught him history, as well as horse riding and the use of weapons. Cato also condemned the beating of children, considering it unworthy of a Roman citizen. As prosperity grew, this once-revolutionary idea became increasingly popular in Rome. The educator Marcus Fabius Quintilianus (Quintilian), in his Institutio Oratoria, described corporal punishment as "humiliating".
Another consequence of the liberalization of customs in the first century CE was a new concern for girls' education and a gradual equalization of their rights with those of boys. However, only Christians condemned the practice of abandoning newborns. The new religion, garnering followers in the Roman Empire from the third century onwards, required the faithful to care unconditionally for every being endowed with an immortal soul.
This new trend turned out to be so strong that it survived even the fall of the Empire and the conquest of its lands by the Germanic peoples. Unwanted children began to end up in shelters, eagerly opened by monasteries. Moral pressure and the opportunity to give a child to the monks made infanticide a marginal phenomenon. Legal provisions prohibiting parents from killing, mutilating or selling children began to emerge. In Poland, these practices were banned in 1347 by Casimir the Great in his Wiślica Statutes.
However, as Philippe Ariès notes in Centuries of Childhood: A Social History of Family Life: "Childhood was a period of transition which passed quickly, and which was just as quickly forgotten." As few children survived into adulthood, parents usually did not develop deeper emotional ties with their offspring. During the Middle Ages, most European languages did not even have a distinct word for 'child'.
Departure from violence
During the Middle Ages, a child became a young adult at the age of eight or nine. According to the canon law of the Catholic Church, a bride had to be at least 12 years old, and a groom 14. These rules greatly complicated the plans of the most powerful families. Immediately after a child's birth, the father, wanting to increase the resources and prestige of the family, began looking for a daughter-in-law or son-in-law. While the families decided their fate, the children subject to the transaction had no say. When Louis I of Hungary, King of Poland and Hungary, matched his daughter Jadwiga with Wilhelm Habsburg, she was only four years old. The husband chosen for her was four years older. To avoid conflicts with the church, the contract between the families was called an 'engagement for the future' (in Latin: sponsalia de futuro). The advantage of such arrangements was that, if political priorities changed, they were easier to break than a sacramental union. This was the case with Jadwiga's engagement: for the benefit of the Polish raison d'état, at the age of 13 she married Władysław II Jagiełło instead of Wilhelm.
Interest in children as independent beings was revived in Europe with the rediscovery of antiquity. Thanks to the writings of ancient philosophers, the fashion for educating children returned. Initially, corporal punishment was the main tool of the educational process. Regular beating of pupils was considered so necessary that in monastery schools a custom arose of a spring trip to the birch grove, where the students themselves collected a year's supply of sticks for their teacher.
A change in this way of thinking came with Ignatius of Loyola's Society of Jesus, founded in 1540. The Jesuits used violence only in extraordinary situations, and corporal punishment could only be administered by a servant, never a teacher. The pan-European network of free schools for young people built by the order enjoyed an excellent reputation. "They were the best teachers of all," the English philosopher Francis Bacon admitted reluctantly. The successes of the order made empiricists aware of the importance of non-violent education. One of the greatest philosophers of the 17th century, John Locke, urged parents to stimulate children to learn and behave well, using praise above all other measures.
The aforementioned Rousseau went even further and criticized all the then-prevailing patterns of treating children. According to the fashion of the day, the noble and the rich did not look after their children themselves – that was what the plebs did. The newborn was fed by a wet-nurse, and then passed on to grandparents or poor relatives who were paid for the service. The child would return home at the age of five at the earliest, suddenly losing the only carers they had known. From then on, their upbringing and education were supervised by their strict mother, and they saw their father only sporadically. Instead of love, they received daily lessons in showing respect and obedience. Rousseau condemned all of this. "His accusations and demands shook public opinion, women read them with tears in their eyes. And just like it was once fashionable, among the upper classes, to pass on the baby to the wet-nurse, after Emile it became fashionable for the mother to breastfeed her child," wrote Stanisław Kot in Historia wychowania [The History of Education]. Still, a fashion unsupported by law could not change the reality of children's lives.
Shelter and factory
"In many villages and towns, newborn babies were kept for twelve to fifteen days, until there were enough of them. Then they were transported, often in a state of extreme exhaustion, to the shelter," writes Marian Surdacki in Dzieci porzucone w społeczeństwach dawnej Europy i Polski [Children Abandoned in the Societies of Old Europe and Poland]. While the Old Continent elites discovered the humanity of children, less affluent residents began reproducing entirely different ancient patterns on a massive scale. In the 18th century, abandoning unwanted children again became the norm. They usually went to care facilities maintained by local communes. In London, shelters took in around 15,000 children each year. Few managed to survive into adulthood. Across Europe, the number of abandoned children in the 18th century is estimated at around 10 million. Moral condemnation by the Catholic and Protestant churches did not do much.
Paradoxically, the industrial revolution turned out to be more effective, although initially it seemed to have the opposite effect. In Great Britain, peasants migrating to the cities routinely rid themselves of bothersome progeny. London shelters were under siege, and around 120,000 homeless, abandoned children wandered the streets of the metropolis. Although most did not survive a year, those who did required food and clothes. The financing of shelters placed a heavy burden on municipal budgets. "To the parish authorities, encumbered with great masses of unwanted children, the new cotton mills in Lancashire, Derby, and Notts were a godsend," write Barbara and John Lawrence Hammond in The Town Labourer.
At the beginning of the 19th century, English shelters became a source of cheap labour for the emerging factories. Orphans had to earn a living to receive shelter and food. Soon, their peers from poor families met the same fate. "In the manufacturing districts it is common for parents to send their children of both sexes at seven or eight years of age, in winter as well as summer, at six o'clock in the morning, sometimes of course in the dark, and occasionally amidst frost and snow, to enter the manufactories, which are often heated to a high temperature, and contain an atmosphere far from being the most favourable to human life," wrote Robert Owen in 1813. This extraordinary manager of the New Lanark spinning mill built a workers' estate complete with a kindergarten. It offered care, but also taught the children of workers how to read and write.
However, Owen remained a notable exception. Following his appeal, in 1816 the British parliament set up a special commission, which soon established that as many as 20% of workers in the textile industry were under 13 years old. There were also spinning mills where children constituted 70% of the labour force. As standard, they worked 12 hours a day, and their only day of rest was Sunday. Their supervisors maintained discipline with truncheons. Such a daily existence, combined with the tuberculosis epidemic, gave the young workers little chance of a long life. For many years, however, the protests of Owen and his supporters changed hardly anything. "Industry as such is seeking new, less skilled but cheaper, workers. Small children are most welcome," noted the French socialist Eugène Buret two decades later.
Among the documents available in the British National Archives is the report of a government factory inspector from August 1859. He briefly described the case of Martha Appleton, a 13-year-old worker from a Wigan spinning mill. Exhausted by the unhealthy, inhumane conditions, the girl fainted on the job; her hand was caught in an unguarded machine and all the fingers on that hand were severed. Since her job required both hands to be fast and efficient, the factory owner decided the next day that such a 'defective' child would be useless, and dismissed her.
Where a single man once worked, one now finds several children or women doing similar jobs for poor salaries, warned Eugène Buret. This state of affairs began to alarm an increasing number of people. The activities of the German educator Friedrich Fröbel had a significant impact on this: he visited many cities and gave lectures on returning children to their childhoods, encouraging adults to provide children with care and free education. Fröbel's ideas contrasted dramatically with press reports about the terrible conditions endured by children in factories.
The Prussian government reacted first, banning the employment of minors as early as 1839. In France, a similar ban came into force two years later. In Britain, however, Prime Minister Robert Peel had to fight parliament before it agreed to adopt the Factory Act in 1844. The new legislation banned children below 13 from working in factories for more than six hours per day. At the same time, employers were required to provide child workers with schooling in factory schools. Soon, European states discovered that their strength depended on citizens able to work efficiently and fight effectively on the battlefield. Children mutilated at work were completely unfit for military service. By the end of the 19th century, underage workers had finally disappeared from European factories.
In defence of the child
"Mamma has been in the habit of whipping and beating me almost every day. She used to whip me with a twisted whip – a rawhide. The whip always left a black and blue mark on my body," 10-year-old Mary Ellen Wilson told a New York court in April 1874. Social activist Etty Wheeler stood in defence of the girl battered by her guardians (her biological parents were dead). When her requests for intervention were repeatedly refused by the police, the courts, and even the mayor of New York, the woman turned to the American Society for the Prevention of Cruelty to Animals (ASPCA) for help. Its president Henry Bergh first agreed with Miss Wheeler that the child was not her guardians' property. Using his experience fighting for animal rights, he began a press and legal battle for little Wilson. The girl's testimony published in the press shocked the public. The court took the child from her guardians, and sentenced her sadistic stepmother to a year of hard labour. Mary Ellen Wilson came under the care of Etty Wheeler. In 1877, her story inspired animal rights activists to establish American Humane, an NGO fighting for the protection of every harmed creature, including children.
In Europe, this idea found more and more supporters. The bourgeoisie, even more than the aristocracy, rarely used corporal punishment, which was met with growing condemnation, note Philippe Ariès and Georges Duby in A History of Private Life: From the Fires of Revolution to the Great War. At the same time, the custom of entrusting the care of offspring to strangers fell out of use. Towards the end of the 19th century, 'good mothers' began to look after their own babies.
In 1900, Ellen Key's bestselling book The Century of the Child was published. The Swedish teacher urged parents to provide their offspring with love and a sense of security, and to limit themselves to patiently observing how nature takes its course. Her idealism, however, collided with the vision of Karl Marx and Friedrich Engels, who had postulated in The Communist Manifesto that we ought to "replace home education by social". The indoctrination of children was to be handled by schools and youth organizations, whose aim was to prepare young people to fight the conservative generation of their parents for a new world.
Did the 20th century bring a breakthrough in how children are treated? In 1924, the League of Nations adopted the Declaration of the Rights of the Child. Its preamble stated that "mankind owes to the child the best that it has to give". It is an important postulate, but sadly one still not realized in many places around the world.
Translated from the Polish by Joanna Figiel
We explore the history of blood types and how they are classified to find out what makes the Rh-null type important to science and dangerous for those who live with it.
- Fewer than 50 people worldwide have 'golden blood' — or Rh-null.
- Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system.
- It's also very dangerous to live with this blood type, as so few people have it.
Golden blood sounds like the latest in medical quackery. As in, get a golden blood transfusion to balance your tantric midichlorians and receive a free charcoal ice cream cleanse. Don't let the New-Agey moniker throw you. Golden blood is actually the nickname for Rh-null, the world's rarest blood type.
As Mosaic reports, the type is so rare that only about 43 people have been reported to have it worldwide, and until 1961, when it was first identified in an Aboriginal Australian woman, doctors assumed embryos with Rh-null blood would simply die in utero.
But what makes Rh-null so rare, and why is it so dangerous to live with? To answer that, we'll first have to explore why hematologists classify blood types the way they do.
A (brief) bloody history
Our ancestors understood little about blood. Even the most basic of blood knowledge — blood inside the body is good, blood outside is not ideal, too much blood outside is cause for concern — escaped humanity's grasp for an embarrassing number of centuries.
Absent this knowledge, our ancestors devised less-than-scientific theories as to what blood was, theories that varied wildly across time and culture. To pick just one: the physicians of Shakespeare's day believed blood to be one of four bodily fluids or "humors" (the others being black bile, yellow bile, and phlegm).
Handed down from ancient Greek physicians, humorism stated that these bodily fluids determined someone's personality. Blood was considered hot and moist, resulting in a sanguine temperament. The more blood people had in their systems, the more passionate, charismatic, and impulsive they would be. Teenagers were considered to have a natural abundance of blood, and men had more than women.
Humorism led to all sorts of poor medical advice. Most famously, Galen of Pergamon used it as the basis for his prescription of bloodletting. Sporting a "when in doubt, let it out" mentality, Galen declared blood the dominant humor and bloodletting an excellent way to balance the body. Blood's relation to heat also made it a go-to for fever reduction.
While bloodletting remained common until well into the 19th century, William Harvey's discovery of the circulation of blood in 1628 would put medicine on its path to modern hematology.
Soon after Harvey's discovery, the earliest blood transfusions were attempted, but it wasn't until 1665 that the first successful transfusion was performed by British physician Richard Lower. Lower's operation was between dogs, and his success prompted physicians like Jean-Baptiste Denis to try to transfuse blood from animals to humans, a process called xenotransfusion. The death of human patients ultimately led to the practice being outlawed.
The first successful human-to-human transfusion wouldn't be performed until 1818, when British obstetrician James Blundell managed it to treat postpartum hemorrhage. But even with a proven technique in place, in the following decades many blood-transfusion patients continued to die mysteriously.
Enter Austrian physician Karl Landsteiner. In 1901, he began his work to classify blood groups. Building on the work of Leonard Landois – the physiologist who showed that when the red blood cells of one animal are introduced to a different animal's, they clump together – Landsteiner suspected a similar reaction might occur in human-to-human transfusions, which would explain why transfusion success was so spotty. In 1909, he classified the A, B, AB, and O blood groups, and for his work he received the 1930 Nobel Prize in Physiology or Medicine.
What causes blood types?
It took us a while to grasp the intricacies of blood, but today, we know that this life-sustaining substance consists of:
- Red blood cells — cells that carry oxygen and remove carbon dioxide throughout the body;
- White blood cells — immune cells that protect the body against infection and foreign agents;
- Platelets — cells that help blood clot; and
- Plasma — a liquid that carries salts and enzymes.
Each component has a part to play in blood's function, but the red blood cells are responsible for our differing blood types. These cells have proteins* covering their surface called antigens, and the presence or absence of particular antigens determines blood type — type A blood has only A antigens, type B only B, type AB both, and type O neither. Red blood cells sport another antigen called the RhD protein. When it is present, a blood type is said to be positive; when it is absent, it is said to be negative. The typical combinations of A, B, and RhD antigens give us the eight common blood types (A+, A-, B+, B-, AB+, AB-, O+, and O-).
Blood antigen proteins play a variety of cellular roles, but recognizing foreign cells in the blood is the most important for this discussion.
Think of antigens as backstage passes to the bloodstream, while our immune system is the doorman. If the immune system recognizes an antigen, it lets the cell pass. If it does not recognize an antigen, it initiates the body's defense systems and destroys the invader. So, a very aggressive doorman.
While our immune systems are thorough, they are not too bright. If a person with type A blood receives a transfusion of type B blood, the immune system won't recognize the new substance as a life-saving necessity. Instead, it will consider the red blood cells invaders and attack. This is why so many people either grew ill or died during transfusions before Landsteiner's brilliant discovery.
This is also why people with O negative blood are considered "universal donors." Since their red blood cells lack A, B, and RhD antigens, immune systems have no way to recognize these cells as foreign, and so leave them well enough alone.
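To make the doorman logic concrete, here is a minimal sketch in Python (a toy model of our own, not medical software; the ANTIGENS table and can_receive function are hypothetical names). It treats compatibility as a simple subset test: donor red cells pass the doorman only if the recipient's own cells already carry every antigen the donor's cells do.

```python
# A toy model of red-cell compatibility based on the antigen "backstage pass" idea.
# Illustrative only: it covers just the A, B, and RhD antigens, while real
# cross-matching considers many more antigen systems (including Rh's 61).

ANTIGENS = {
    "A+": {"A", "RhD"},        "A-": {"A"},
    "B+": {"B", "RhD"},        "B-": {"B"},
    "AB+": {"A", "B", "RhD"},  "AB-": {"A", "B"},
    "O+": {"RhD"},             "O-": set(),
}

def can_receive(recipient: str, donor: str) -> bool:
    """The immune 'doorman' admits donor red cells only if every donor
    antigen is one the recipient's own cells already carry."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]  # subset test

# O- carries none of the three antigens, so every recipient accepts it:
assert all(can_receive(recipient, "O-") for recipient in ANTIGENS)
# AB+ carries all three, so an AB+ recipient can accept red cells from anyone:
assert all(can_receive("AB+", donor) for donor in ANTIGENS)
```

In this simplified model, O- passes every doorman and AB+ admits every donor, mirroring the "universal donor" logic above; real-world matching, of course, involves far more antigens than these three.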
How is Rh-null the rarest blood type?
Let's return to golden blood. In truth, the eight common blood types are an oversimplification of how blood types actually work. As Smithsonian.com points out, "[e]ach of these eight types can be subdivided into many distinct varieties," resulting in millions of different blood types, each classified on a multitude of antigen combinations.
Here is where things get tricky. The RhD protein mentioned previously refers to only one of 61 potential antigens in the Rh system. Blood is considered Rh-null if it lacks all 61 possible antigens in the Rh system. This not only makes it rare; it also means it can be accepted by anyone with a rare blood type within the Rh system.
This is why it is considered "golden blood." It is worth its weight in gold.
As Mosaic reports, golden blood is incredibly important to medicine, but also very dangerous to live with. If an Rh-null carrier needs a blood transfusion, they can find it difficult to locate a donor, and blood is notoriously difficult to transport internationally. Rh-null carriers are encouraged to donate blood as insurance for themselves, but with so few donors spread across the world and limits on how often they can donate, this places an altruistic burden on the select few who agree to donate for others.
Some bloody good questions about blood types
[Photo: A nurse takes blood samples from a pregnant woman at the North Hospital (Hôpital Nord) in Marseille, southern France. Credit: Bertrand Langlois / AFP]
There remain many mysteries regarding blood types. For example, we still don't know why humans evolved the A and B antigens. Some theories point to these antigens as a byproduct of the diseases various populations contracted throughout history. But we can't say for sure.
In this absence of knowledge, various myths and questions have grown around the concept of blood types in the popular consciousness. Here are some of the most common and their answers.
Do blood types affect personality?
Japan's blood type personality theory is a contemporary resurrection of humorism. The idea states that your blood type directly affects your personality, so type A blood carriers are kind and fastidious, while type B carriers are optimistic and do their own thing. However, a 2003 study sampling 180 men and 180 women found no relationship between blood type and personality.
The theory makes for a fun question on a Cosmopolitan quiz, but that's as accurate as it gets.
Should you alter your diet based on your blood type?
Remember Galen of Pergamon? In addition to bloodletting, he also prescribed certain foods for his patients, depending on which humors needed to be balanced. Wine, for example, was considered a hot and dry drink, so it would be prescribed to treat a cold. In other words, the belief that your diet should complement your blood type is yet another holdover of humorism.
Created by Peter J. D'Adamo, the Blood Type Diet argues that one's diet should match one's blood type. Type A carriers should eat a meat-free diet of whole grains, legumes, fruits, and vegetables; type B carriers should eat green vegetables, certain meats, and low-fat dairy; and so on.
However, a study from the University of Toronto analyzed the data from 1,455 participants and found no evidence to support the theory. While people can lose weight and become healthier on the diet, it probably has more to do with eating all those leafy greens than blood type.
Are there links between blood types and certain diseases?
There is evidence to suggest that different blood types may increase the risk of certain diseases. One analysis suggested that type O blood decreases the risk of having a stroke or heart attack, while AB blood appears to increase it. With that said, type O carriers have a greater chance of developing peptic ulcers and skin cancer.
None of this is to say that your blood type will foredoom your medical future. Many factors, such as diet and exercise, influence your health, likely to a greater extent than blood type does.
What is the most common blood type?
In the United States, the most common blood type is O+. Roughly one in three people sports this type of blood. Of the eight well-known blood types, the least common is AB-. Only one in 167 people in the U.S. have it.
Do animals have blood types?
They most certainly do, but they are not the same as ours. This difference is why those 17th-century patients who thought, "Animal blood, now that's the ticket!" ultimately had their tickets punched. In fact, blood types are distinct between species. Unhelpfully, scientists sometimes use the same nomenclature to describe these different types. Cats, for example, have A and B antigens, but these are not the same A and B antigens found in humans.
Interestingly, xenotransfusion is making a comeback. Scientists are working to genetically engineer the blood of pigs to potentially produce human-compatible blood.
Scientists are also looking into creating synthetic blood. If they succeed, they may be able to ease the current blood shortage, while also devising a way to create blood for rare blood type carriers. While this may make golden blood less golden, it would certainly make it easier to live with.

*While antigens are typically proteins, they can be other molecules as well, such as polysaccharides.
The author of 'How We Read Now' explains.
During the pandemic, many college professors abandoned assignments from printed textbooks and turned instead to digital texts or multimedia coursework.
As a professor of linguistics, I have been studying how electronic communication compares to traditional print when it comes to learning. Is comprehension the same whether a person reads a text onscreen or on paper? And are listening and viewing content as effective as reading the written word when covering the same material?
The answers to both questions are often "no," as I discuss in my book "How We Read Now," released in March 2021. The reasons relate to a variety of factors, including diminished concentration, an entertainment mindset and a tendency to multitask while consuming digital content.
Print versus digital reading
The benefits of print particularly shine through when experimenters move from posing simple tasks – like identifying the main idea in a reading passage – to ones that require mental abstraction – such as drawing inferences from a text. Print reading also improves the likelihood of recalling details – like "What was the color of the actor's hair?" – and remembering where in a story events occurred – "Did the accident happen before or after the political coup?"
Studies show that both grade school students and college students assume they'll get higher scores on a comprehension test if they have done the reading digitally. And yet, they actually score higher when they have read the material in print before being tested.
Educators need to be aware that the method used for standardized testing can affect results. Studies of Norwegian tenth graders and U.S. third through eighth graders report higher scores when standardized tests were administered using paper. In the U.S. study, the negative effects of digital testing were strongest among students with low reading achievement scores, English language learners and special education students.
My own research and that of colleagues approached the question differently. Rather than having students read and take a test, we asked how they perceived their overall learning when they used print or digital reading materials. Both high school and college students overwhelmingly judged reading on paper as better for concentration, learning and remembering than reading digitally.
The discrepancies between print and digital results are partly related to paper's physical properties. With paper, there is a literal laying on of hands, along with the visual geography of distinct pages. People often link their memory of what they've read to how far into the book it was or where it was on the page.
But equally important is mental perspective, and what reading researchers call a "shallowing hypothesis." According to this theory, people approach digital texts with a mindset suited to casual social media, and devote less mental effort than when they are reading print.
Podcasts and online video
Given increased use of flipped classrooms – where students listen to or view lecture content before coming to class – along with more publicly available podcasts and online video content, many school assignments that previously entailed reading have been replaced with listening or viewing. These substitutions have accelerated during the pandemic and move to virtual learning.
Surveying U.S. and Norwegian university faculty in 2019, University of Stavanger Professor Anne Mangen and I found that 32% of U.S. faculty were now replacing texts with video materials, and 15% reported doing so with audio. The numbers were somewhat lower in Norway. But in both countries, 40% of respondents who had changed their course requirements over the past five to 10 years reported assigning less reading today.
A primary reason for the shift to audio and video is students refusing to do assigned reading. While the problem is hardly new, a 2015 study of more than 18,000 college seniors found only 21% usually completed all their assigned course reading.
Maximizing mental focus
Researchers have found a similar pattern, favoring reading, with university students who read an article versus those who listened to a podcast of the same text. A related study confirms that students do more mind-wandering when listening to audio than when reading.
Results with younger students are similar, but with a twist. A study in Cyprus concluded that the relationship between listening and reading skills flips as children become more fluent readers. While second graders had better comprehension with listening, eighth graders showed better comprehension when reading.
Research on learning from video versus text echoes what we see with audio. For example, researchers in Spain found that fourth through sixth graders who read texts showed far more mental integration of the material than those watching videos. The authors suspect that students "read" the videos more superficially because they associate video with entertainment, not learning.
The collective research shows that digital media have common features and user practices that can constrain learning. These include diminished concentration, an entertainment mindset, a propensity to multitask, lack of a fixed physical reference point, reduced use of annotation and less frequent reviewing of what has been read, heard or viewed.
Digital texts, audio and video all have educational roles, especially when providing resources not available in print. However, for maximizing learning where mental focus and reflection are called for, educators – and parents – shouldn't assume all media are the same, even when they contain identical words.