A new study suggests that reports of the impending infertility of the human male are greatly exaggerated.
- A new review of a famous study on declining sperm counts finds several flaws.
- The old report makes unfounded assumptions, has faulty data, and tends toward panic.
- The new report does not rule out that sperm counts are going down, only that this could be quite normal.
Several years ago, a meta-analysis of studies on human fertility came out warning us about the declining sperm counts of Western men. It was widely shared, and its findings made the covers of popular magazines. Those findings were alarming indeed: a nearly 60 percent decline in sperm count per milliliter since 1973, with no end in sight. It was only a matter of time, the authors argued, until men were firing blanks, literally.
Well… never mind.
It turns out that the impending demise of humanity was greatly exaggerated. As the predicted infertility wave crashed upon us, there was neither a great rush of men to fertility clinics nor a sudden dearth of new babies. The only discussions about population decline focus on urbanization and the fact that people choose not to have kids rather than not being able to have them.
Now, a new analysis of the 2017 study says that lower sperm counts are nothing to be surprised by. Published in Human Fertility, the new analysis points to flaws in the original paper's data and interpretation, and its authors propose a more careful reanalysis.
Counting tiny things is difficult
The original 2017 report analyzed 185 studies on 43,000 men and their reproductive health. Its findings were clear: "a significant decline in sperm counts… between 1973 and 2011, driven by a 50-60 percent decline among men unselected by fertility from North America, Europe, Australia and New Zealand."
However, the new analysis points out flaws in the data. As many as a third of the men in the studies were of unknown age, an important factor in reproductive health. In 45 percent of cases, the year of sample collection was unknown, a big detail to miss in a study measuring change over time. And the quality controls and conditions for sample collection and analysis varied widely from study to study, which likely influenced the measured sperm counts.
Another study from 2013 also points out that the methods for determining sperm count were only standardized in the 1980s, which occurred after some of the data points were collected for the original study. It is entirely possible that the early studies gave inaccurately high sperm counts.
This is not to say that the 2017 paper is entirely useless; it had a much more rigorous methodology than previous studies on the subject, which also claimed to identify a decline in sperm counts. Its problems, however, did not end with the data.
Garbage in, garbage out
Predictable as always, the media went crazy. Discussions of the decline of masculinity took off, both in mainstream and less-than-reputable forums; concerns about the imagined feminizing traits of soy products continued to increase; and the authors of the original study were called upon to discuss the findings themselves in a number of articles.
However, as this new review points out, some of the findings of that meta-analysis are debatable at best. For example, the 2017 report suggests that "declining mean [sperm count] implies that an increasing proportion of men have sperm counts below any given threshold for sub-fertility or infertility," despite little empirical evidence that this is the case.
The WHO offers a large range for what it considers to be a healthy sperm count, from 15 to 250 million sperm per milliliter. The benefits to fertility above a count of 40 million are seen as minimal, and the original study found a mean sperm concentration of 47 million sperm per milliliter.
Healthy sperm, healthy man?
The claim that sperm count is evidence of larger health problems is also scrutinized in this new article. While it is true that many major health problems can impact reproductive health, there is little evidence that it is the "canary in the coal mine" for overall well-being. A number of studies suggest that any relation between lifestyle choices and this part of reproductive health is limited at best.
Lastly, the idea that environmental factors could be at play has been debunked by research published since 2017. While the original paper considered the possibility that pollutants, especially from plastics, could be at fault, it is now known that this kind of pollution is worse in the parts of the world where the original paper observed higher sperm counts (i.e., non-Western nations).
There never was a male fertility crisis
The authors of the new review do not deny that some measurements are showing lower sperm counts, but they do question the claim that this is catastrophic or part of a larger pathological issue. They propose a new interpretation of the data. Dubbed the "Sperm Count Biovariability hypothesis," it is summarized as:
"Sperm count varies within a wide range, much of which can be considered non-pathological and species-typical. Above a critical threshold, more is not necessarily an indicator of better health or higher probability of fertility relative to less. Sperm count varies across bodies, ecologies, and time periods. Knowledge about the relationship between individual and population sperm count and life-historical and ecological factors is critical to interpreting trends in average sperm counts and their relationships to human health and fertility."
Still, the authors do not rule out that sperm counts "could decline due to negative environmental exposures, or that this may carry implications for men's health and fertility."
However, they disagree that the decline in absolute sperm count is necessarily a bad sign for men's health and fertility. We aren't at a civilization-ending catastrophe just yet.
A year of disruptions to work has contributed to mass burnout.
- Junior members of the workforce, including Generation Z, are facing digital burnout.
- 41 percent of workers globally are thinking about handing in their notice, according to a new Microsoft survey.
- A hybrid blend of in-person and remote work could help maintain a sense of balance – but bosses need to do more.
More than half of 18- to 25-year-olds in the workforce are considering quitting their jobs. And they're not the only ones.
In a report called The Next Great Disruption Is Hybrid Work – Are We Ready?, Microsoft found that as well as 54% of Generation Z workers, 41% of the entire global workforce could be considering handing in their resignation.
Similarly, a UK and Ireland survey found that 38% of employees were planning to leave their jobs in the next six months to a year, while a US survey reported that 42% of employees would quit if their company didn't offer remote working options long term.
New work trends
Based on surveys with over 30,000 workers in 31 countries, the Microsoft report – which is the latest in the company's annual Work Trend Index series – pulled in data from applications including Teams, Outlook and Office 365, to gauge productivity and activity levels. It highlighted seven major trends, which show the world of work has been profoundly reshaped by the pandemic:
- Flexible work is here to stay
- Leaders are out of touch with employees and need a wake-up call
- High productivity is masking an exhausted workforce
- Gen Z is at risk and will need to be re-energized
- Shrinking networks are endangering innovation
- Authenticity will spur productivity and wellbeing
- Talent is everywhere in a hybrid world
"Over the past year, no area has undergone more rapid transformation than the way we work," Microsoft CEO Satya Nadella says in the report. "Employee expectations are changing, and we will need to define productivity much more broadly – inclusive of collaboration, learning and wellbeing to drive career advancement for every worker, including frontline and knowledge workers, as well as for new graduates and those who are in the workforce today. All this needs to be done with flexibility in, when, where and how people work."
Organizations have become more siloed
While the report highlights the opportunities created by increased flexible and remote working patterns, it warns that some people are experiencing digital exhaustion and that remote working could foster siloed thinking. With the shift to remote working, much of the spontaneous sharing of ideas that can take place within a workplace was lost. In its place are scheduled calls, regular catch-ups and virtual hangouts. The loss of in-person interaction means individual team members are more likely to only interact with their closest coworkers.
"At the onset of the pandemic, our analysis shows interactions with our close networks at work increased while interactions with our distant network diminished," the report says. "This suggests that as we shifted into lockdown, we clung to our immediate teams for support and let our broader network fall to the wayside. Simply put, companies became more siloed than they were pre-pandemic."
Burnout or drop out
One of the other consequences of the shift to remote work and the reliance on tech-based communication has been the phenomenon of digital burnout. And for those who have most recently joined the workforce, this has been a significant challenge.
The excitement of joining a new employer, maybe even securing a job for the first time, usually comes with meeting lots of new people, becoming familiar with a new environment and adapting to new situations. But for many, the pandemic turned that into a daily routine of working from home while isolated from co-workers.
"Our findings have shown that for Gen Z and people just starting in their careers, this has been a very disruptive time," says LinkedIn Senior Editor-at-Large, George Anders, quoted in the report. "It's very hard to find their footing since they're not experiencing the in-person onboarding, networking and training that they would have expected in a normal year."
But it is perhaps the data around quitting that is one of the starkest indications that change is now the new normal. Being able to work remotely has opened up new possibilities for many workers, the report found. If you no longer need to be physically present in an office, your employer could, theoretically, be located anywhere. Perhaps that's why the research found that "41% of employees are considering leaving their current employer this year".
In addition to that, 46% of the people surveyed for the Microsoft report said they might relocate their home because of the flexibility of remote working.
A hybrid future
In looking for ways to navigate their way through all this change, employers should hold fast to one word, the report says – hybrid. An inflexible, location-centred approach to work is likely to encourage those 41% of people to leave and find somewhere more to their tastes. Those who are thinking of going to live somewhere else, while maintaining their current job, might also find themselves thinking of quitting if their plans are scuppered.
But remote working is not a panacea for all workforce ills. "We can no longer rely solely on offices to collaborate, connect, and build social capital. But physical space will still be important," the report says. "We're social animals and we want to get together, bounce ideas off one another, and experience the energy of in-person events. Moving forward, office space needs to bridge the physical and digital worlds to meet the unique needs of every team – and even specific roles."
Bosses must meet challenges head on
Although the majority of business leaders have indicated they will incorporate elements of the hybrid working model, the report also found many are out of touch with workforce concerns more widely.
While many workers say they are struggling (Gen Z – 60%; new starters – 64%) and 54% of the general workforce feels overworked, business leaders are having a much better experience. Some 61% said they were 'thriving' – a stark contrast to employees further down the chain of command.
Jared Spataro, corporate vice president at Microsoft 365, writes in the report: "Those impromptu encounters at the office help keep leaders honest. With remote work, there are fewer chances to ask employees, 'Hey, how are you?' and then pick up on important cues as they respond. But the data is clear: our people are struggling. And we need to find new ways to help them."
An early feasibility study finds a potential new treatment for Alzheimer's disease.
For the past few years, Annabelle Singer and her collaborators have been using flickering lights and sound to treat mouse models of Alzheimer's disease, and they've seen some dramatic results.
Now they have results from the first human feasibility study of the flicker treatment, and they're promising.
"We looked at safety, tolerance, and adherence, and several different biological outcomes, and the results were excellent—better than we expected," says Singer, assistant professor in the biomedical engineering department at Georgia Institute of Technology and Emory University.
Singer shared preliminary results of the feasibility study in October at the American Neurological Association annual meeting. Now she is a corresponding author with Emory neurology researcher James Lah of a paper outlining their findings in the journal Alzheimer's & Dementia: Translational Research & Clinical Interventions.
The flicker treatment stimulates gamma waves, manipulating neural activity, recruiting the brain's immune system, and clearing pathogens—in short, waging a successful fight against a progressive disease that still has no cure.
Previous research already had shown that sensory areas in the human brain will entrain to flickering stimuli for seconds to hours. But this was the first time Singer and her team were able to test gamma sensory stimulation over an extended period of time.
The study included 10 patients with Alzheimer's-associated mild cognitive impairment. Each wore an experimental visor and headphones that delivered light and sound at 40 hertz for an hour a day; one group received the treatment for eight weeks, while the other received it for four weeks after a delayed start.
"We were able to tune the devices to a level of light and sound that was not only tolerable, but it also successfully provoked an underlying brain response," Lah says.
As they hoped and expected, Singer says, "there was widespread entrainment." That is, brain activity—in this case, gamma waves—synchronized to the external stimulation.
Gamma waves are associated with high-level cognitive functions, like perception and memory. Disruptions to these waves have been found in various neurological disorders, not just Alzheimer's.
The human feasibility study showed that the gamma flicker treatment was safe and tolerable. And perhaps most surprising, patients followed the full treatment schedule.
"Adherence was one of our major concerns," Singer says. "When we sent the device home with the participants, would they use it? Would they use it for a couple of days, and that would be it? We were pleasantly surprised that this wasn't the case."
Adherence rates hovered around 90%, with no severe adverse effects reported during the study or the 10-month open label extension (some patients even volunteered to continue being monitored and assessed after the study, though this data wasn't part of the published research).
Some participants reported mild discomfort that could have been flicker related—dizziness, ringing in the ears, and headaches. But overall, Singer says, the device's safety profile was excellent. She also reported some positive biological outcomes.
"We looked at default mode network connectivity, which is basically how different brain regions that are particularly active during wakeful rest and memory, interact with each other," Singer says.
"There are deficits in this network in Alzheimer's, but after eight weeks [of treatment], we found strengthening in that connectivity." This may indicate stronger interactions and therefore better communication between these regions.
In previous animal studies, 40 Hz flicker stimulated gamma waves in mice, significantly reducing some of Alzheimer's pathogenic hallmarks and recruiting microglia, the brain's primary immune cells, to the cause. But in the human study, there were no clear changes in the presence of amyloid beta or p-Tau.
However, as with the mouse studies, "we are getting immune engagement in humans," Singer says. The flicker treatment sparked the activity of cytokines, proteins used in cell signaling—a sign that flicker had engaged the brain's immune system.
"That is something we want to see, because microglia do things like clear out pathogens. Some people think that part of what's going wrong in Alzheimer's is a failure of this clearance mechanism," Singer says.
She and Lah have wondered whether a longer human trial would make a difference. Would there be reduced amyloid activity, for example?
"So far, this is very preliminary, and we're nowhere close to drawing conclusions about the clinical benefit of this treatment," Lah says. "But we now have some very good arguments for a larger, longer study with more people."
Funding for the study came from the National Institute of Neurological Disorders and Stroke at the National Institutes of Health, the Packard Foundation, the Friends and Alumni of Georgia Tech, the Lane Family, the Wright Family, and Cognito Therapeutics. Any findings, conclusions, and recommendations are those of the researchers and not necessarily of the sponsors.
Annabelle Singer owns shares in Cognito Therapeutics, which funded the human study at Emory Brain Health Center. Cognito aims to develop gamma stimulation-related products. These conflicts are managed by Georgia Tech's Office of Research Integrity Assurance.
Source: Georgia Tech
Original Study DOI: 10.1002/trc2.12178
Like Fox Mulder, people have a lot of strong opinions about UFOs.
- Extraordinary claims, such as that UFOs have visited our planet or that aliens exist, require extraordinary evidence.
- Personal testimonies are simply insufficient to conclude that UFOs and aliens are real.
- Good luck having a rational conversation about it with anyone on Twitter.
If you were hoping, based on the title, that I was going to describe the time I saw strange lights moving at inexplicable speeds across the sky, then I am about to disappoint you. This column is actually about my experience in the public spotlight talking publicly about the connection between UFOs and extraterrestrial life. It was quite a ride.
Extraordinary claims require extraordinary evidence
On May 30, 2021, I wrote an op-ed in the New York Times titled "I'm a Physicist Who Studies Aliens. U.F.O's Don't Impress Me." I don't get to write titles for the op-ed pieces that I write for the Times — or most other places for that matter — but, as provocative as it was, I think it captured the essence of my point. As a scientist involved in the search for life and "techno-signatures" on exoplanets, I think a lot about what constitutes a good data set for that search. In other words, what kind of data would allow me to make the extraordinary claim that my colleagues and I have detected life and a civilization on another world?
The answer had better be "some really damn good data." By that, I mean we would need to take measurements that gave us strong and unambiguous evidence for the conclusion that a particular signal comes from a technologically advanced civilization. My main point in the op-ed was that no matter how intriguing those navy UFO sightings may be — and they are interesting — they don't provide the extraordinary evidence that we need to conclude that aliens are visiting us. My arguments are in the op-ed if you want to see them. What I want to focus on here is what happened after that argument appeared in the press.
The UFO brigade
Within an hour or so, my email and Twitter feed began to light up. By the end of the day, I was getting more messages about the piece than almost anything I had ever written before. Some of the messages affirmed the argument I was making. The majority, however, wanted me to know how wrong I was. These fell into two categories.
Some people wanted me to know that UFOs — or as the government calls them, Unidentified Aerial Phenomena (UAPs) — didn't need to be connected to aliens for them to be of interest. However, I had made this exact point in my piece.
I have no problem with people wanting to have those navy sightings (and others) studied scientifically and openly. My colleagues on the NASA techno-signature grant made this point in an excellent Washington Post op-ed. I think the process of vetting those sightings would greatly help show the public exactly how science works. These days, we have a real problem with science denial, and anything that lets folks understand "what science knows and how it knows it" would be helpful.
But many folks (on Twitter and elsewhere) held that the connection between UFOs and aliens had already been made. I got floods of links to one video or website after another, the vast majority of which were people describing something they had seen in the sky. As I said in the op-ed, there really isn't much science you can do with personal testimony. One can't get accurate measurements of velocity or distance or mass or any of the other basic data that a physicist would need to tell if something really was moving in a way that's impossible for human technology.
Some folks reached out because they had seen a UFO themselves. I totally understand that these people would want someone to take their reports seriously. I would never tell them that they did not have their experiences. What I can say, however, is that there's nothing a scientist can do to transform the description of that experience into data that we would need to reach the extraordinary conclusion that they had seen evidence for extraterrestrial life.
The truth is out there
But a significant fraction of what I saw coming across Twitter and elsewhere was just pure vehemence. These folks were absolutely certain that UFOs were alien visitors. There was a fair amount of "the-government-knows-but-won't-tell-us" kind of narrative. Lots of these messages were pretty mean. I got the sense that, for these folks, no public investigation — no matter how open and transparent — would be satisfying unless it reached the conclusion that they already believed. This, of course, is the opposite of science.
So, it was an interesting week. My brief time in the UFO limelight (I did many interviews with outlets like CNN, the BBC, etc.) showed me a lot about how people view the question. Since I am so deeply involved with techno-signature science, I felt it was important to try to explain how the search for life in the universe works as a science.
But I don't really want to spend a whole lot more time in that limelight. It was kind of exhausting, in large part because of the vehemence of the true believers. I will follow whatever happens after the government's report comes out with interest. But my bet (and every researcher makes a bet when they choose their research topics) is that the data I need to know about life elsewhere in the universe will come from telescopes, not jet fighters.
- Robert Koch proved that microbes cause infectious diseases and famously identified the etiological agents of anthrax, tuberculosis, and cholera.
- Louis Pasteur proved that life does not spontaneously generate from non-living material, made a significant advance in chemistry, invented pasteurization, and revolutionized vaccines.
- Koch and Pasteur had a bitter rivalry over the invention of the anthrax vaccine.
The following is an excerpt from Viruses, Pandemics, and Immunity by Arup K. Chakraborty and Andrey S. Shaw. Reprinted with permission from the MIT Press. Copyright 2021.
Koch's Postulates, Anthrax, Tuberculosis, and Cholera
Robert Koch was born in Germany in 1843. His father was a mining engineer. He taught himself to read by the time he was five years old, and was a brilliant student from a young age. After a brief time studying natural sciences in college, he decided to pursue a career in medicine. Koch held positions as a physician in various capacities in Poland, Berlin, and other places, including service as a doctor during the Franco-Prussian War. Koch also developed a deep interest in basic scientific research. Today, we would consider him a clinician-scientist, someone who tries to understand clinical aspects of diseases using basic scientific principles.

Anthrax is a disease that affects both animals and humans, and was a problem in Koch's time. Koch showed that, for a wide variety of animals, he could transfer disease from one animal to another by transferring blood from the infected animal to the healthy animal. All animals thus infected exhibited the same disease symptoms, and had the same rod-shaped bacteria in their blood. This convinced Koch that this specific bacterium caused anthrax. Koch's work on anthrax was the first to associate a specific microbe with a particular disease.
It was known that healthy cattle got sick if they grazed on fields long after anthrax-infected cattle had grazed there. This was a puzzle because Koch had determined that anthrax bacteria in the blood of infected animals lost their infectivity after a few days. He decided that he would need to watch the bacteria over time and would need to develop methods to grow the bacteria in the lab. Koch developed methods to keep bacteria growing for days. This process is called "growing bacteria in culture" — "culture" refers to the medium in which the bacteria are grown. This method is now used millions of times every day around the world. When a doctor suspects that you have a bacterial infection, a small sample is collected from the suspected site of infection (e.g., a wound) and is sent to the pathology department. If the sample contains bacteria, they grow out in culture and can be identified. The doctor can use such a positive test result to prescribe the right treatment to kill the identified bacteria.
With the technique to culture bacteria in hand, using his careful observational skills, Koch noted that on occasion anthrax bacteria would convert into opaque spheres. He showed that these spheres could be dried and then reconstituted weeks later by immersing them into fluid. He suspected that the bacteria, if converted into the dry spheres, or spores, could remain dormant for years. Indeed, this is the case, and they can cause bacterial infection when ingested by uninfected cattle. Some readers will remember the anthrax scares in the United States right after the September 11, 2001, terrorist attacks when an individual placed anthrax spores into envelopes that were sent to members of the US Congress.
As Koch became more skilled in the identification of disease-causing bacteria, his methods became codified into rules known as "Koch's postulates":
- The microorganism must be present in every instance of the disease.
- The microorganism must be isolated from a human with the disease and grown in culture.
- The microorganism grown in culture must cause the same disease upon injection in an animal.
- Samples from the animal in which disease thus occurs must contain the same organism that was present in the original diseased human.
These principles were applied successfully to determine the causative agents of many of the infectious diseases known today. Knowing the identity of specific bacteria that cause a particular disease, scientists and drug companies can develop antibiotics that can kill the bacteria and cure disease. Before the discovery of antibiotics, a small skin cut could get infected and result in death. We live in a world that would be unrecognizable to a nineteenth-century inhabitant because many previously lethal infections and diseases are easily treatable today.
Koch's other significant discoveries were the bacteria that cause tuberculosis and cholera. Tuberculosis (TB) is a disease that has long plagued the world. It was often called consumption, because it made the person look pale and thin as the disease progressed. In opera, it is the disease from which both Mimi in La Bohème and Violetta in La Traviata suffer, reflecting a nineteenth-century association of romantic tragedy with this disease. TB caused enormous numbers of deaths in the nineteenth century. Since it is a contagious disease, it flourished partly because of the increased population density in growing cities during the industrial revolution. Throughout the nineteenth century, about one out of 100 people living in New York City died of tuberculosis, roughly the same percentage as the number of reported COVID-19 deaths in the city and ten times more than die of influenza in an average year.
Until Koch showed that it was an infectious disease caused by bacteria, many thought that TB was an inherited disease. In 1882, using his postulates, Koch identified the causative organism and called it Mycobacterium tuberculosis. This discovery led to a better understanding of the disease and the development of TB-specific antibiotics, which along with better sanitation resulted in a significant decline in infections and deaths. However, TB is still widespread and remains a scourge in many parts of the world. In 2018 TB killed 1.5 million people globally. An especially worrisome development has been the recent emergence of antibiotic-resistant forms of M. tuberculosis. A vaccine that is used around the world to protect against TB infection has only limited efficacy.
Cholera is a waterborne disease that causes severe diarrhea and vomiting. Cholera outbreaks still cause havoc in the developing world today. The most recent outbreak of cholera was in Sudan in 2019. Another recent cholera epidemic was in Haiti in 2010 following a devastating earthquake. There are indications that, sadly, peacekeepers from the United Nations who came to provide aid may have inadvertently brought the disease to Haiti.
Koch received worldwide fame for his identification of the organism that causes cholera. However, the causative bacterium was, in fact, first described by an Italian physician, Filippo Pacini (1812–1883), many years earlier. During the period from the late 1810s to the early 1860s, there were worldwide cholera pandemics that started in India in the state of Bengal. Pacini was a doctor in Florence, Italy, when the pandemic spread into that city. Using a microscope to examine tissues collected during autopsies of those who had succumbed to cholera, Pacini discovered the bacterium, Vibrio cholerae, that causes the disease. Remarkably, few, including Koch, knew of his discovery, perhaps partly because the germ theory of disease was not widely accepted when Pacini described his observations. Better sanitation has made cholera a disease that is nonexistent in the developed world.
Koch, who passed away in 1910, received many significant recognitions for his work, including the 1905 Nobel Prize in Physiology or Medicine. We now turn to the work of his bitter rival, Louis Pasteur.
Pasteur, Rabies, and a New Paradigm for Vaccination
Pasteur was born in 1822 in France. His father was a tanner. Pasteur did not distinguish himself academically as a youngster. After earning a bachelor's degree in philosophy in 1840, he was drawn to the study of science and mathematics. As is true today, in Pasteur's time only the very best students in France were admitted to the École Normale Supérieure. Pasteur was ranked very poorly the first time he took the admission test, but he was ultimately admitted in 1843. This hiccup at an early stage of his scientific career did not prevent Pasteur from going on to make transformative discoveries.
When he was a professor at the University of Strasbourg, in France, Pasteur made a very important fundamental discovery which involved the mathematical concept of chirality. Two similar objects that have non-superimposable mirror images are chiral. The simplest example is our right and left hands — look at images of your hands in a mirror and you will see what we mean. While studying crystals of salts of certain acids, Pasteur demonstrated that molecules can also be chiral, either "right-handed" or "left-handed." He developed a way to detect the handedness of such so-called optical isomers. A good example of handedness is sugar. Sugar is a chiral molecule that is right-handed, and sugar substitutes can be composed of its left-handed optical isomer. The molecule in our body that metabolizes sugar does not act on its left-handed isomer, and thus we do not metabolize it. But our taste buds cannot tell the difference between the right- and left-handed molecules, and so such sugar-substitutes would taste the same to us — a free lunch, so to speak.
Pasteur's next big achievement was inventing a process which was later named pasteurization. One of Pasteur's students was the son of a wine merchant, and he got Pasteur interested in thinking about how to prevent wine from spoiling. It was commonly believed at the time that wine spoiled because it spontaneously decomposed into constituents that tasted like vinegar. Pasteur showed that this was not true and that a microbe called yeast was required to carry out these chemical transformations. Pasteur also showed that contamination of wine with various other microbes causes it to spoil. He invented a process to prevent this, which exploited the fact that microbes die at high temperatures. The wine was heated to about 120–140°F, and then sealed and cooled. Although this pasteurization process was invented to prevent wine from spoiling, it is rarely used for this purpose today. Rather, pasteurization is used all over the world to prevent milk from spoiling.
Pasteur also played a significant role in laying to rest the popular idea that many living organisms were spontaneously generated from nonliving matter. As old bread begins to grow mold and maggots suddenly appear in old meat, it wasn't illogical to believe that these changes occurred spontaneously. Evidence against this so-called spontaneous generation theory had already been presented many times by other scientists, but Koch's postulates and an elegant and definitive experiment that Pasteur did in 1859 finally proved to be its death knell. Pasteur stored boiled (pasteurized) water in two curved, swan-necked flasks. Boiling the water ensured that there were no microbes in it when the experiment was started. The construction of the swan-neck flask was such that microbes in the air would get stuck to the walls of the tube and not reach the water if the flask was vertically positioned. Pasteur positioned one flask vertically, and the other was tilted. As time passed, the water in the vertical flask did not show any signs of a developing biofilm (you must have seen such disgusting biofilms when you leave food in the refrigerator too long and microbes grow on it). A biofilm developed in the water in the tilted flask because microbes in the air could reach the water. This demonstration was the end of the spontaneous generation theory.
Most scientists can only dream of making contributions as important as Pasteur's discovery of optical isomers, his invention of pasteurization, and his experiment ending the debate on the spontaneous generation of microbes. But his contributions to vaccination had such a major impact on humankind that the achievements described above have been completely overshadowed.
Pasteur's paradigm-shifting advance in vaccine development was the result of a serendipitous observation he made while studying chicken cholera. On one occasion, after chickens were injected with the bacteria that cause this disease, they did not fall ill. On further investigation, Pasteur discovered that the batch of chicken cholera he had injected had spoiled. Rather than buy new chickens, he reinjected the first set of chickens with the properly cultured bacteria. To his surprise, the chickens did not fall ill. Pasteur is often credited with the famous remark, "In the field of observation, chance favors the prepared mind." Pasteur's mind was apparently prepared, as he immediately understood that he had stumbled onto an important finding. He realized that you could protect animals from infection with a live disease-causing microbe by vaccinating them with a weakened form of the same microbe.
This was a paradigm shift compared to previous methods. Variolation involved administering the real pathogen. Jenner's use of cowpox involved finding a pathogen that was harmless to humans but related to the one that caused human disease. Pasteur's new method did not involve hunting for a related harmless pathogen or risking the life of the patient by administering the real pathogen. Rather, a weakened or attenuated form of the pathogen could be used. It is worth remarking here that variolation involved powdering material from smallpox scabs and waiting a few days before administering it. These procedures were probably inadvertent ways to attenuate the virulence of the pathogen. But it was Pasteur who in the period between 1879 and 1880 formalized the procedure of using an attenuated pathogen to protect people from infectious diseases, and established a method that continues to be used today. Pasteur labeled his new method of protecting against various infectious diseases "vaccination," in honor of Jenner's use of vaccinia (cowpox) to protect against smallpox. Pasteur used his method to vaccinate birds to prevent cholera and vaccinate sheep to prevent anthrax.
Pasteur then developed a vaccine to protect against rabies. Rabies is an infection of the brain caused by the bite of an infected dog or, more often today, a bat. People infected with rabies exhibit symptoms like paralysis and fear of water. This fear of water is why the disease is sometimes called hydrophobia. Almost everyone afflicted with the disease died. Pasteur was a chemist and not a physician, but having successfully developed two animal vaccines, he was keen to use his skills to cure a human disease or protect people from it. We know today that rabies is caused by a virus, but the concept of a virus was not known at that time. Therefore, Pasteur could neither follow Koch's postulates to identify the causative agent of the disease, nor grow the microbe in culture using methods that worked for bacteria. It was known, however, that the infectious agent was present in saliva. Pasteur is claimed to have been fearless, having used his mouth to suck on a glass tube to draw saliva from a rabid dog.
Using a method developed by his close collaborator, Emile Roux, Pasteur then attenuated the infectious agent. Pasteur and Roux administered the attenuated infectious agent and showed that multiple doses of this vaccine could protect dogs from rabies infection. Pasteur was anxious to try his vaccine in humans. He knew that the onset of symptoms usually lagged the dog bite by about a month. His idea was to vaccinate people soon after the dog bite, and hope that the protective mechanism (about which they knew nothing) would kick in quickly enough to cure them. The first two patients on whom this procedure was tried were in the late stages of the disease, however, and both died before they could receive the second dose of the vaccine. But Pasteur persevered.
In 1885, Joseph Meister, a 9-year-old boy living in Alsace, was bitten multiple times by a rabid dog that was subsequently shot by the police. His doctor learned that Pasteur had developed a vaccine to treat rabies. In an attempt to evade what was a certain death sentence, he brought Joseph and his family to Paris the next day to seek Pasteur's help. Emile Roux refused to use the vaccine on Joseph as he worried that it was not ready for humans and was too dangerous to try on a child who did not yet have any symptoms of the disease. Pasteur found another physician to administer the treatment and it worked — the boy was cured. Subsequently, others would undergo the same procedure with similar success, and Pasteur became a hero. Years later, Meister, who was devoted to Pasteur, would serve as a caretaker at the Pasteur Institute.
Throughout this period, Pasteur worked on an anthrax vaccine even though Koch, who discovered the bacterium that causes anthrax, was also working on a vaccine. This led to terrible arguments between the two acclaimed scientists. Koch and his students wrote that Pasteur did not even know how to make pure cultures of bacteria. Pasteur fought back. These arguments took on an even more vicious tone during the Franco-Prussian War. In 1868, Pasteur had been awarded an honorary degree by the faculty of Bonn in Germany. He returned it during the war with an angry accompanying note. Thus began a division between German and French immunologists that would continue for decades, to the detriment of scientific advances. Pasteur ultimately achieved success in a public experiment in 1881 when he successfully vaccinated several sheep and cows, and a goat, to protect them from anthrax. He then declared it to be a great French victory. Ironically, an anthrax vaccine had earlier been developed by Jean Joseph Henri Toussaint (1847–1890) in France. Pasteur used the same method as Toussaint, but claimed that his approach was different.
When Pasteur died, he left his laboratory notebooks to his oldest male child, and his will stipulated that these notebooks should never leave the family and were to be passed on from generation to generation by male inheritors. In 1964, Pasteur's last surviving direct male descendant donated his laboratory notebooks to the Bibliothèque Nationale in Paris. Scholars studying these notebooks found that Pasteur often cut corners in his work, sometimes did not describe exactly how experiments were done, and did not always publicly report results transparently. This straddling of ethical boundaries or, worse, fraud is severely punished by the modern scientific community. Indeed, as it should be, because the scientific edifice is built on the trust that scientists have described their studies honestly. Mistakes can happen, of course, but deceit is not allowed.
Pasteur's straddling of ethical boundaries notwithstanding, he made groundbreaking advances that had a transformative effect. Vaccines designed using Pasteur's methods have saved more lives than any other medical procedure. Vaccines that protect children from diseases are a major contributor to the dramatic reduction in childhood mortality. Today, we crave a vaccine against the ongoing COVID-19 pandemic, and hopefully, we will have one soon. Pasteur's work is the foundation for this hope.
For his achievements, Pasteur received many honors and awards. Many streets around the world are named after him, and the Pasteur Institute in Paris is a famed medical research laboratory that Pasteur himself founded. He died in 1895, when he was 72, and his body is interred in the first floor of the original building of the Pasteur Institute. Visitors are welcome to see his tomb and the apartment where Pasteur lived at the end of his life. Pasteur did not receive a Nobel Prize because the first of these was awarded in 1901.
A cartogram makes it easy to compare regional and national GDPs at a glance.
- On these maps, each hexagon represents one-thousandth of the world's economy.
- That makes it easy to compare the GDP of regions and nations across the globe.
- There are versions for nominal GDP and GDP adjusted for purchasing power.
Shanghai's skyline at night. According to the GDP (PPP) map, China is the world's largest economy. But that oft-cited statistic says more about the problems of PPP as a yardstick than about the economic prominence of China per se. Credit: Adi Constantin, CC0 1.0
If you want to rank the regions and countries of the world, area and population are but crude predictors of their importance. A better yardstick is GDP, or gross domestic product, defined as the economic value produced in a given region or country over a year.
Who's hot and who's not
And these two maps are possibly the best instruments to show who's hot and who's not, economically speaking. They are in fact cartograms, meaning they abandon geographic accuracy in order to represent the values of another dataset, in this case GDP: the larger a region or country is shown relative to its actual size, the greater its GDP, and vice versa.
So far, so familiar. What's unique about these maps is how this is done. Both are composed of hexagons, exactly 1,000 each. And each of those hexagons represents 0.1 percent of global GDP. That makes it fascinatingly easy to assess and compare the economic weight of various regions and countries throughout the world.
Did we say easy? Scratch that. GDP comes in two main flavors: nominal and PPP-adjusted, with each map showing one.
Nominal GDP does not take into account differences in standard of living. It simply converts local GDP values into U.S. dollars based on foreign exchange rates. GDP adjusted for purchasing power parity (PPP) does take living standards into account: $100 buys more stuff in poor countries than it does in rich countries. If you get more bang for your buck in country A than in country B, the PPP adjustment will boost country A's GDP relative to country B's.
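To make the distinction concrete, here is a minimal Python sketch using made-up numbers rather than actual IMF data: the same local-currency GDP converts into very different dollar figures depending on whether you divide by the market exchange rate or by a PPP conversion factor.

```python
# Illustrative sketch only: hypothetical currency and GDP values, not real IMF data.
local_gdp = 120_000_000_000_000   # GDP in local currency units (hypothetical)

market_rate = 7.0   # local currency units per U.S. dollar on currency markets (hypothetical)
ppp_rate = 4.0      # local units needed to buy what $1 buys in the U.S. (hypothetical)

nominal_gdp_usd = local_gdp / market_rate   # crude economic size in dollars
ppp_gdp_usd = local_gdp / ppp_rate          # adjusted for local price levels

print(f"Nominal GDP: ${nominal_gdp_usd / 1e12:.1f} trillion")
print(f"PPP GDP:     ${ppp_gdp_usd / 1e12:.1f} trillion")
# Because local prices are lower than U.S. prices (ppp_rate < market_rate),
# the PPP figure comes out larger than the nominal one.
```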
Nominal GDP is a good way of comparing the crude economic size of various countries and regions, while GDP (PPP) is an attempt to measure the relative living standards between countries and regions. But this is also just an approximation, since it does not measure the distribution of personal income. For that, we have the Gini index, which measures the relative (in)equality of income distribution.
In other words, PPP factors in the high cost of living in mature markets as an economic disadvantage, while giving slightly more room to low-cost economies elsewhere. Think of it as the Peters projection of GDP models.
Who's number one: the U.S. or China?
The economy of the world, divided into a thousand hexagons. Credit: BerryBlue_BlueBerry, reproduced with kind permission
The difference is important, though, since the versions produce significantly different outcomes. The most salient one: on the nominal GDP map, the United States remains the world's largest economy. But on the PPP-adjusted GDP map, China takes the top spot. However, it is wrong to assume on this basis that China is the world's biggest economy.
As this article explains in some detail, PPP-adjusted GDP is not a good yardstick for comparing the size of economies – nominal GDP is the obvious measure for that. GDP (PPP) is an attempt to compare living standards; but even in that respect, it has its limitations. For example, $100 might buy you more in country B, but you might not be able to buy the stuff you can get in country A.
Both maps, shown below, are based on data from the IMF published in the first quarter of 2021. For the sake of brevity, we will have a closer look at the nominal GDP map and leave comparisons with the PPP map to you.
For the nominal map, global GDP is just over U.S. $93.86 trillion. That means each of the hexagons represents about U.S. $93.86 billion.
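As a rough illustration of the arithmetic behind the map, here is a small Python sketch; the country figures are hypothetical placeholders, not the IMF data the cartogram actually uses. Each economy's hexagon count is simply its GDP divided by the value of one hexagon, rounded to a whole number.

```python
# Minimal sketch of the hexagon arithmetic. Country GDPs below are hypothetical
# placeholders, not the actual IMF figures behind the map.
world_total = 93.86                  # world nominal GDP in trillions of USD, per the article
hexagon_value = world_total / 1000   # one hexagon = 0.1% of world GDP (~$93.86 billion)

nominal_gdp_trillions = {            # illustrative values only
    "United States": 22.7,
    "China": 16.6,
    "Japan": 5.4,
    "Germany": 4.3,
}

for country, gdp in nominal_gdp_trillions.items():
    hexagons = round(gdp / hexagon_value)   # thousandths of world GDP
    share_pct = hexagons / 10               # each hexagon is 0.1 percentage point
    print(f"{country}: {hexagons} hexagons (~{share_pct:.1f}% of world GDP)")
```

One caveat with this naive approach: rounding every economy independently can leave the grand total slightly off 1,000, so a real cartogram would have to reconcile the leftover hexagons somehow.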
The worldwide overview clearly shows which three regions are the world's economic powerhouses. Despite the rise of East Asia (265 hexagons), North America (282) is still number one, with Europe (250) placing a close third. Added up, that's just three hexagons shy of 80 percent of the world's GDP. The remaining one-fifth of the world's economy is spread — rather thinly, by necessity — across Southeast Asia & Oceania (56), South Asia (41), the Middle East (38), South America (32), Africa (27), and North & Central Asia (9).
California über alles
California's economy is bigger than that of all of South America or Africa. Credit: BerryBlue_BlueBerry, reproduced with kind permission
Thanks to the hexagons, the maps get more interesting the closer you zoom in on them.
In North America, the United States (242) overshadows Canada (20) and Mexico (13); and within the U.S., California (37) outperforms not just all other states, but also most other countries — and a few continents — worldwide. To be fair, Texas (21), New York (20), Florida (13), and Illinois (10) also do better than many individual nations.
Interestingly, states that look the same on a "regular" map are way out of each other's leagues on this one. Missouri has four hexagons but Nebraska only one; Alabama has three but Mississippi only one.
The granularity of the map goes beyond the state level, showing (in red) the economic heft of certain Metropolitan Statistical Areas (MSAs), within or across state lines. The New York City-Newark-Jersey City one is 20 hexagons, that is, 2 percent of the world's GDP. The Greater Toronto Area is five hexagons, a quarter of all of Canada. And Greater Mexico City is three hexagons. That's the same as the entire state of Oregon.
By comparison, South America (32) and Africa (27) are small fry on the GDP world map. But each little pond has its own big fish. In the former, it's Brazil (16), in particular, the state of São Paulo (5), which on its own is bigger than any other country in South America. In Africa, there is one regional leader each in the north, center, and south: Egypt (4), Nigeria (5), and South Africa (3), respectively.
Economically, Italy is bigger than Russia
Europe's "Big Five" represent three-fifths of the continent's GDP. The Asian part of the former Soviet Union is an economic afterthought.Credit: BerryBlue_BlueBerry, reproduced with kind permission
Europe is bewilderingly diverse, so it helps to focus on the "Big Five" economies: Germany (46), UK (33), France (31), Italy (22), and Spain (16). They comprise three-fifths of Europe's GDP.
Each of these five has one or more regional economic engines. In Germany, it's the state of North Rhine-Westphalia, and in France, it's Île de France (both 10). In the UK, it's obviously London (8), in Italy Lombardy (5), and in Spain, it's a photo-finish between Madrid and Catalonia (both 3).
Interesting about Europe's economies are the small countries that punch well above their geographic and/or demographic weight, such as the Netherlands (11) and Switzerland (9).
Slide across to Eastern Europe and things get pretty mono-hexagonal. Poland (7) stands out positively and Russia (18) negatively. The former superpower, spread out over two continents, has an economy smaller than Italy's. Three individual German states have a GDP larger than that of the Moscow Metropolitan Area (5), the seat and bulk of Russia's economic power.
China, the biggest fish in a big pond
Australia and South Korea's GDPs are about equal, and each is about a third of Japan's. But even put together, these three add up to barely half of China's economic weight. Credit: BerryBlue_BlueBerry, reproduced with kind permission
In the 1980s, the United States was wary of Japan's rise to global prominence. But as this map shows, that fear was misguided — or rather, slightly misdirected. It's China (177) that now dominates the region economically, putting even the land of the Rising Sun (57) in the shade. South Korea (19) and Taiwan (8) look a lot larger than on a "regular" map, but it's clear who rules the roost here.
Interestingly, China's hubs are mainly but not exclusively coastal. Yes, there's Guangdong (19), Jiangsu (18), and Shandong (13), plus a few other provinces with access to the sea. But the inland provinces of Henan (10), Sichuan (9), and Hubei (8) are economically as important as any mid-sized European country. Tibet (1) and Xinjiang (2), huge on the "regular" map, are almost invisible here.
In the ASEAN countries (36), Thailand (6), Singapore (4), and the Indonesian island of Java (7) stand out. Economically, Oceania is virtually synonymous with Australia (17) — sorry, New Zealand (3).
As for South Asia and the Middle East, India (32) is clearly the dominant player, outperforming near neighbors Bangladesh (4) and Pakistan (3), as well as more distant ones like Saudi Arabia (9), Turkey (8), and Iran (7). But that's cold comfort for a country that sees itself as a challenger to China's dominance.
The PPP-adjusted GDP world map looks slightly different from the nominal GDP one. China is the #1 country and East Asia the #1 region. Credit: BerryBlue_BlueBerry, reproduced with kind permission
Strange Maps #1089
Got a strange map? Let me know at firstname.lastname@example.org.