- The First World War took place during a time of rapid technological innovation.
- New tools like radiography and antiseptics helped mitigate the hazardous conditions of trench warfare.
- The war not only changed the way medical scientists looked at the human body itself, but also hastened the rise of the welfare state.
The First World War took place during a time of rapid technological innovation and was, as its name suggests, a war of firsts. It was the first war in human history to be fought on land, in the air, and at sea. It was the first war to witness the deployment of tanks. It was the first war in which soldiers were armed with flamethrowers and attacked by chemical weapons like mustard gas.
Perhaps the most important inventions were those designed to save lives rather than take them. Advances in medicine meant people were suddenly able to survive injuries that would have killed them decades earlier. Their survival also introduced doctors to a new kind of injury, one that was invisible and psychological, and which stayed with the soldiers long after the fighting had ceased.
The First World War was fought primarily in trenches. These defensive ditches, writes professor and medical historian Charles Van Way, III, represented “a disaster for public health.” They were filled with dirt, blood, vermin, and human waste. Soldiers stationed in them had to be rotated out every two weeks so they could be deloused and provided with clean clothing.
Despite these horrible conditions, Van Way writes, “the casualty care system was still much better than in any previous war.” Ambulances driven by the likes of Ernest Hemingway, Ray Kroc, and Walt Disney picked up the wounded and rushed them toward aid stations or field hospitals. There, legions of well-equipped surgeons and nurses were prepared to handle any number of medical emergencies.
According to Van Way, soldiers were given tetanus toxoid to ward off the eponymous bacteria that entered wounds via trench dirt. They were also treated with antiseptics and anesthetics. Doctors even had access to radiography, despite the fact that this technology had been invented fewer than twenty years earlier. These new tools reduced the mortality rate of amputations from 25% during the Civil War to 5% in World War I.
Meanwhile, an unprecedented number of disfiguring injuries forced a change in the design and production of prosthetic limbs. Previously, prosthetics were hand-crafted and reserved for the elite. Now, metal prostheses, from arms and legs to even noses, were being mass produced. These advances, though spurred by unspeakable suffering, constitute a highlight in the history of public healthcare.
Thinking about the human body
On a slightly more abstract level, the First World War also changed the way medical professionals thought about the human body itself. Before 1914, the body was treated like a machine that was made up of separate components. Diseases were forces that disrupted the established relationships between those components and had to be removed in order for the body to recover.
The injuries of the First World War revealed that the human body was far more complex and unpredictable than any piece of machinery. Field doctors struggled to understand why one patient recovered while another succumbed. Psychological scars were particularly difficult to comprehend, as each individual seemed to respond to their trauma in a completely unique way.
New conditions like “wound shock” revealed the physiological limitations of the body. In wound shock, bodies respond to minor lacerations as if they were life-threatening; a normal patient might recover from a shot in the chest, while a wound shock patient might die from a leg injury.
As the New York University historians Todd Meyers and Stefanos Geroulanos state in an article written for Aeon magazine, medical scientists stopped looking at the human body as the sum of its parts. Instead, they now began to describe it “as an integral whole, and detailed how elaborately it collapses in on itself, system by system, when it is pierced by shrapnel or bullets.”
This system-wide approach was most applicable when it came to treating brain injuries. Rarely did patients afflicted with the same wounds display the same pathologies. Neurologists looked for answers in the structure of the brain itself, while psychoanalysts like Sigmund Freud and Carl Jung turned to the psyche — our sense of self as constructed through social, biological, and experiential forces.
The First World War helped dispel one more age-old assumption about the human body: the idea that the mind (also referred to as the “soul” in many Western European languages) was disconnected from the body and, as such, unaffected by physical suffering. As Virginia Woolf observed in her essay “On Being Ill,” making a point already illustrated by soldiers suffering from what we now call PTSD, “The creature within cannot separate off from the body like the sheath of a knife.”
The rise of the welfare state
The unique way that patients responded to mental and physical illnesses called for a customization of medical treatment. Away from the front lines, European and American hospitals stopped treating their sick as a homogeneous group and made efforts to approach each patient as an individual in need of a particular solution to an equally particular problem.
Systemic thinking soon spread from the medical into the political sphere, where it gave rise to a concept we now know as the welfare state. Welfare states, defined as governments that protect the health and wellbeing of their citizens, are founded on the belief that a society is like a human body, and that socioeconomic classes are connected to each other the same way that organs are.
Just as the injuries of World War I required holistic treatment, so too does the welfare state demand the participation of every citizen. To progressive lawmakers, poverty was not merely the product of poor people’s bad decisions, but a problem created in part by the negligence of the rich and powerful; thus poverty, like disease or racism, became a social ill that disadvantaged society as a whole.
“Many prominent scientists in the 1920s,” Meyers and Geroulanos continue, “became socialists.” They add, “Many expressed support for Soviet medicine because they believed that individualism required social welfare.”
Support for the welfare state, born from revolutions in medical care, went on to influence medical care itself. This is most notable in the field of psychiatry, with both the Soviet psychologist Alexander Luria and Jung arguing that “integration of the individual personality” into the wider context of society was the key to solving the various mental problems that plagued the individual.
Though the welfare state failed to live up to the lofty ideals of its principal architects, there can be no denying that public healthcare in the Western world is better off today than it was 100 years ago. Upon closer inspection, it turns out that this positive development is heavily indebted to the unprecedented level of destruction witnessed during World War I.