Humans once worked just 3 hours a day. Now we're always working, but why?
As human beings we all must do some work for basic survival—but how much? Is there a “minimum daily requirement” of work? A number of diverse sources—studies ranging from hunter-gatherer cultures to modern history—would place this figure at about three hours
a day during an adult lifetime.
Marshall Sahlins, author of Stone Age Economics, discovered that before Western influence changed daily life, !Kung men, who live in the Kalahari, hunted from two to two and a half days a week, with an average workweek of fifteen hours. Women gathered for about the same period of time each week. In fact, one day’s work supplied a woman’s family with vegetables for the next three days. Throughout the year both men and women worked for a couple of days, then took a couple off to rest and play games, gossip, plan rituals, and visit. . . . It would appear that the workweek in the old days beats today’s banker’s hours by quite a bit.
This suggests that three hours a day is all that we must spend working for survival. One can imagine that in preindustrial times this pattern would make sense. Life was more whole back then, when “work” blended into family time, religious celebrations, and play. Then came the “labor-saving” Industrial Revolution and the compartmentalization of life into “work” and “nonwork”—with work taking an ever-bigger bite out of the average person’s day.
In the nineteenth century the “common man,” with justified aversion to such long hours on the job, began to fight for a shorter workweek. Champions for the workers claimed that fewer hours on the job would decrease fatigue and increase productivity. Indeed, they said, fewer hours were the natural expression of the maturing Industrial Revolution. People would pursue learning. An educated and engaged citizenry would support our democracy.
But all that came to a halt during the Depression. The workweek, having fallen dramatically from sixty hours at the turn of the century to thirty-five hours during the Depression, became locked in at forty hours for many and has crept up to fifty or even sixty hours a week in recent years. Why?
During the Depression, free time became equated with unemployment. In an effort to boost the economy and reduce unemployment, the New Deal established the forty-hour week and the government as the employer of last resort. Workers were educated to consider employment, not free time, to be their right as citizens (life, liberty, and the pursuit of the paycheck?). Benjamin Kline Hunnicutt, in Work Without End, illuminates the doctrine of “full employment”:

Since the Depression, few Americans have thought of work reduction as a natural, continuous, and positive result of economic growth and increased productivity. Instead, additional leisure has been seen as a drain on the economy, a liability on wages, and the abandonment of economic progress.
The myths of “growth is good” and “full employment” established themselves as key values. These dovetailed nicely with the gospel of “full consumption,” which preached that leisure is a commodity to be consumed rather than free time to be enjoyed. For the past half century, full employment has meant more consumers with more “disposable income.” This means increased profits, which means business expansion, which means more jobs, which means more consumers with more disposable income. Consumption keeps the wheels of “progress” moving.
So we see that our concept (as a society) of leisure has changed radically. From being considered a desirable and civilizing component of day-to-day life, it has become something to be feared, a reminder of unemployment during the years of the Depression. As the value of leisure has dropped, the value of work has risen. The push for full employment, along with the growth of advertising, has created a populace increasingly oriented toward work and toward earning more money in order to consume more resources.
To counter all this, a free-time movement has sprung up in the early twenty-first century. A campaign called Take Back Your Time, initiated by filmmaker John de Graaf, advocates for shorter work hours and longer vacations for overworked Americans. Even with all the studies
saying that reduced hours and sufficient leisure actually increase worker productivity, time advocates are swimming upstream against a cultural assumption that the eight-hour workday is next to godliness.
The emerging Slow Food movement also challenges our workaholic lifestyle. This movement suggests that eating is far more than wolfing down fast food alone at your computer, fueling the body for the next leg of the rat race; rather, it’s a time of conviviality, pleasure, and
conversation. In short, it’s civilizing.
Work Takes On New Meaning
In addition, according to Hunnicutt, during the last half century we’ve begun to lose the fabric of family, culture, and community that give meaning to life outside the workplace. The traditional rituals, the socializing, and the simple pleasure of one another’s company
all provided structure for nonwork time, affording people a sense of purpose and belonging. Without this experience of being part of a people and a place, leisure leads more often to loneliness and boredom. Because life outside the workplace has lost vitality and meaning, work
has ceased being a means to an end and become an end in itself.
Meaning, justification, purpose, and even salvation were now sought in work, without a necessary reference to any traditional philosophic or theological structure. Men and women were answering the old religious questions in new ways, and the answers were more and more in terms of work, career, occupation, and professions.
Arlie Hochschild, in her 2001 book, The Time Bind, says that families now have three jobs—work, home, and repair of relationships damaged by ever more time at the office. Even corporations with “family-friendly” policies subtly reward people who spend more time at work (whether they are more productive or not). Some offices are even getting more comfortable, while homes are more hectic, inducing a guilty desire to spend more time working because it’s more restful!
The final piece of the puzzle snaps into place when we look at the shift in the religious attitude toward work that came with the rise of the Protestant ethic. Before that time, work was profane and religion was sacred. Afterward, work was seen as the arena where you worked
out your salvation—and the evidence of a successful religious life was a successful financial life.
So here we are in the twenty-first century. Our paid employment has taken on myriad roles. Our jobs now serve the function that traditionally belonged to religion: They are the place where we seek answers to the perennial questions “Who am I?” and “Why am I here?” and “What’s it all for?” They also serve the function of families, giving answers to the questions “Who are my people?” and “Where do I belong?”
Our jobs are called upon to provide the exhilaration of romance and the depths of love. It’s as though we believed that there is a Job Charming out there—like the Prince Charming in fairy tales—that will fill our needs and inspire us to greatness. We’ve come to believe that, through this job, we would somehow have it all: status, meaning, adventure, travel, luxury, respect, power, tough challenges, and fantastic rewards. All we need is to find Mr. or Ms. Right—Mr. or Ms. Right Job. Indeed, in terms of sheer hours, we may be more wedded to our jobs than to our partners. The vows for better or worse, richer or poorer, in sickness and in health—and often till death do us part—may be better applied to our jobs than to our wives or husbands. Perhaps what keeps some of us stuck in the home-freeway-office loop is this very Job Charming illusion. We’re like the princess who keeps kissing toads, hoping one day to find herself hugging a handsome prince. Our jobs are our toads.
Young people today are swimming against an even stronger current. Our phones and laptops keep us on call to our employers and side hustles (second and third jobs that fit into the cracks of the main one) 24-7. When your primary job isn’t enough, it’s hard to patch together enough hustles to pay off student loans and graduate from living in your parents’ basement. The fact that they’ve dubbed their multiple jobs as hustles indicates how much energy it takes to fledge and flourish. They know full well they are in a brave new world of endless hustle—brave as in it takes courage to move against the undertow. The old conveyor belt of job as identity as career as security and pension is now utterly shredded. Does this liberate young people from the Job(s) Charming syndrome? No. If they are always hustling, they are always “on the job.” Even dating can become networking for the next job opportunity.
From YOUR MONEY OR YOUR LIFE by Vicki Robin and Joe Dominguez, published by Penguin Books, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2008, 2018 by Vicki Robin.
"Deepfakes" and "cheap fakes" are becoming strikingly convincing — even ones generated on freely available apps.
- A writer named Magdalene Visaggio recently used FaceApp and Airbrush to generate convincing portraits of early U.S. presidents.
- "Deepfake" technology has improved drastically in recent years, and some countries are already experiencing how it can be weaponized for political purposes.
- It's currently unknown whether it'll be possible to develop technology that can quickly and accurately determine whether a given video is real or fake.
After U.S. President William Henry Harrison delivered his inaugural address on March 4, 1841, he posed for a daguerreotype, the first widely available photographic technology. It became the first photo taken of a sitting American president.
As for the eight presidents before Harrison, history can see them only through artistic renderings. (The exception is a handful of surviving daguerreotypes of John Quincy Adams, taken after he left office. In his diary, Adams described them as "hideous" and "too true to the original.")
But a recent project offers a glimpse of what early presidents might've looked like if photographed through modern cameras. Using FaceApp and Airbrush, Magdalene Visaggio, author of books such as "Eternity Girl" and "Kim & Kim," generated a collection of convincing portraits of the nation's first presidents, from George Washington to Ulysses S. Grant.
[Embedded tweet from Magdalene Visaggio: "Modern Presidents George Washington"]
What might be surprising is that Visaggio was able to generate the images without a background in graphic design, using freely available tools. She wrote on Twitter:
"A lot of people think I'm a digital artist or whatever, so let me clarify how I work. Everything you see here is done in Faceapp+Airbrush on my phone. On the outside, each takes between 15-30 mins. Washington was a pretty simple one-and-done replacement."
[Embedded tweet from Magdalene Visaggio: "Ulysses S Grant"]
"Other than that? I am not a visual artist in any sense, just a hobbyist using AI tools to see what she can make. I'm actually a professional comics writer."
[Embedded tweet from Magdalene Visaggio: "Did another pass at Lincoln."]
Of course, Visaggio isn't the first person to create deepfakes (or "cheap fakes") of politicians.
In 2017, many people got their first glimpse of the technology through a video depicting former President Barack Obama warning: "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time." The video quickly reveals itself to be fake, with comedian Jordan Peele speaking for the computer-generated Obama.
While deepfakes haven't yet caused significant chaos in the U.S., incidents in other nations may offer clues of what's to come.
The future of deepfakes
In 2018, Gabon's president Ali Bongo had been out of the country for months receiving medical treatment. As he remained absent from public view, rumors began swirling about his condition; some suggested Bongo might even be dead. In response, Bongo's administration released a video that seemed to show the president addressing the nation.
But the video was strange, appearing choppy and blurry in parts. After political opponents declared it a deepfake, Gabon's military attempted an unsuccessful coup. What's striking about the story is that, to this day, experts in the field of deepfakes can't conclusively verify whether the video was real.
The uncertainty and confusion generated by deepfakes poses a "global problem," according to a 2020 report from The Brookings Institution. In 2018, the U.S. Department of Defense released some of the first tools able to successfully detect deepfake videos. The problem, however, is that deepfake technology keeps improving, meaning forensic approaches may forever be one step behind the most sophisticated forms of deepfakes.
As the 2020 report noted, even if the private sector or governments create technology to identify deepfakes, they will:
"...operate more slowly than the generation of these fakes, allowing false representations to dominate the media landscape for days or even weeks. "A lie can go halfway around the world before the truth can get its shoes on," warns David Doermann, the director of the Artificial Intelligence Institute at the University of Buffalo. And if defensive methods yield results short of certainty, as many will, technology companies will be hesitant to label the likely misrepresentations as fakes."
Ancient corridors below the French capital have served as its ossuary, playground, brewery, and perhaps soon, air conditioning.
- People have been digging up limestone and gypsum from below Paris since Roman times.
- They left behind a vast network of corridors and galleries, since reused for many purposes — most famously, the Catacombs.
- Soon, the ancient labyrinth may find a new lease of life, providing a sustainable form of air conditioning.
Ancient mining areas below Paris for limestone (red) and gypsum (green). Credit: Émile Gérards (1859–1920) / Public domain
"If you're brave enough to try, you might be able to catch a train from UnLondon to Parisn't, or No York, or Helsunki, or Lost Angeles, or Sans Francisco, or Hong Gone, or Romeless."
China Miéville's fantasy novel Un Lun Dun is set in an eerie mirror version of London. In it, he hints that other cities have similar doubles. On the list that he offhandedly rattles off, Paris stands out. Because the City of Light really does have a twisted sister. Below Paris Overground is Paris Underground, the City of Darkness.
Most people will have heard of the Catacombs of Paris: subterranean charnel houses for the bones of around six million dead Parisians. They are one of the French capital's most famous tourist attractions – and undoubtedly its grisliest.
But they constitute only a small fragment of what the locals themselves call les carrières de Paris ("the quarries of Paris"), a collection of tunnels and galleries up to 300 km (185 miles) long, most of which are off-limits to the public, yet eagerly explored by so-called cataphiles.
The Grand Réseau Sud ("Great Southern Network") spans around 200 km beneath the 5th, 6th, 14th, and 15th arrondissements (administrative districts), all south of the river Seine. Smaller networks run beneath the 12th, 13th, and 16th arrondissements. How did they get there?
Paris stone and plaster of Paris
It all starts with geology. Sediments left behind by ancient seas created large deposits of limestone in the south of the city, mostly south of the Seine; and gypsum in the north, particularly in the hills of Montmartre and Ménilmontant. Highly sought after as building materials, both have been mined since Roman times.
The limestone is also known as Lutetian limestone (Lutetia is the Latin name for ancient Paris) or simply "Paris stone." It has been used for many famous Paris landmarks, including the Louvre and the grand buildings erected during Georges-Eugène Haussmann's large-scale remodeling of the city in the mid-19th century. The stone's warm, yellowish color provides visual unity and a bright elegance to the city.
The fine-powdered gypsum of northern Paris, used for making quick-setting plaster, was so famed for its quality that "plaster of Paris" is still used as a term of distinction. However, as gypsum is very soluble in water, the underground cavities left by its extraction were extremely vulnerable to collapse.
Like living on top of a rotting tooth: subsidence starts far below the surface, but it can destroy your house. Credit: Delavanne Avocats
In previous centuries, a road would occasionally open up to swallow a carriage, or even a whole house would disappear down a sinkhole. In 1778, a catastrophic subsidence in Ménilmontant killed seven. That's why the Montmartre gypsum quarries were dynamited rather than just left as they were. The remaining gypsum caves were to be filled up with concrete.
The official body governing Paris down below is the Inspection Générale des Carrières (IGC), founded in the late 1770s by King Louis XVI. The IGC was tasked with mapping and, where needed, propping up the current and ancient (and sometimes forgotten) mining corridors and galleries hiding beneath Paris.
A delightful hiding place
Also around that time, the dead of Paris were getting in the way of the living. At the end of the 18th century, their final destination consisted of about 200 small cemeteries, scattered throughout the city — all bursting at the seams, so to speak. There was no room to bury the newly dead, and the previously departed were fouling up both the water and air around their respective churchyards.
Something radical had to happen. And it did. From 1785 until 1814, the smaller cemeteries were emptied of their bones, which were transported with full funerary pomp to their final resting place in the ancient limestone quarries at Tombe-Issoire. Three large and modern cemeteries were opened to receive the remains of subsequent generations of Parisians: Montparnasse, Père-Lachaise, and Passy.
The six million dead Parisians in the Catacombs, from all corners of the capital and across many centuries, together form the world's largest necropolis — their now anonymized skulls and bones methodically stacked, occasionally into whimsical patterns. The Catacombs are fashioned into a memorial to the brevity of life. The message above the entrance reads: Arrête! C'est ici l'empire de la Mort. ("Halt! This is the empire of Death.")
That has not stopped the Catacombs, accessible via a side door to a classicist building on the Avenue du Colonel Henri Rol-Tanguy, from making just about every Top 20 list of things to see in Paris.
An underground economy
However, while the Catacombs certainly are the most famous part of the centuries-old network beneath Paris, and in non-pandemic times draw thousands of tourists each day, they constitute just 1.7 km (1 mile) of the 300-km (185-mile) tunneling total.
Subterranean Paris wasn't just used for mining and storing dead people. In the 17th century, Carthusian monks converted the ancient quarries under their monastery into distilleries for the green or yellow liqueur that still carries their name, chartreuse.
Because the mines generally keep a constant cool temperature of around 15° C (60° F), they were also ideal for brewing beer, as happened on a large scale from the end of the 17th century until well into the 20th century. Several caves were dug especially for establishing breweries, and not just because of the ambient temperature: going underground allowed brewers to remain close to their customers without having to pay a premium for real estate up top.
Overview of the Paris Catacombs. Credit: Inspection Générale des Carrières, 1857 / Public domain.
At the end of the 19th century, the underground breweries of the 14th arrondissement alone produced more than a million hectoliters (22 million gallons) per year. One of the most famous of Paris' underground breweries, Dumesnil, stayed in operation until the late 1960s.
In that decade, the network of corridors and galleries south of the Seine, long since abandoned by miners, became the unofficial playground for the young people of Paris. They explored the fantastical world beneath their feet, in some cases via entry points located in their very schools. Fascinated, these cataphiles ("catacomb lovers") read up on old books, explored the subterranean labyrinth, and drew up schematics that were passed around among fellow initiates as reverently as treasure maps.
As Robert Macfarlane writes in Underland, Paris-beneath-their-feet became "a place where people might slip into different identities, assume new ways of being and relating, become fluid and wild in ways that are constrained on the surface."
Some larger caves turned into notorious party zones: a 7-meter-tall gallery below the Val-de-Grâce hospital is widely known as "Salle Z." Over the last few decades, various other locations in subterranean Paris have hosted jazz and rock concerts and rave parties — like no other city, Paris really has an "underground music scene."
Hokusai's Great Wave as the backdrop to the "beach" under Paris. Credit: Reddit
Cataphiles vs. cataphobes
With popularity came increased reports of nuisance and crime — the tunnels provided easy access to telephone cables, which were stolen for the resale value of their copper.
The general public's "discovery" of the underground network led the city of Paris to officially interdict all access by non-authorized persons. That decree dates back to 1955, but the "underground police" have an understanding with seasoned cataphiles. Their main targets are so-called tourists, who by their lack of knowledge expose themselves to risk of injuries or worse, and degrade their surroundings, often leaving loads of litter in their wake.
The understanding does not extend to the IGC. Unlike in the 19th century, when weak cavities were shored up by purpose-built pillars, the policy now is to inject concrete to fill up endangered spaces — thus progressively blocking off parts of the network. That procedure has also been used to seal off the Catacombs from the rest of the network, to prevent "infiltration" of the site by cataphiles.
Many subterranean streets have their own names, signs and all. This is the Rue des Bourguignons (Street of the Burgundians) below the Champs des Capucins (Capuchin Field), neither of which exists on the surface. Credit: Jean-François Gornet via Wikimedia
The cataphiles, however, are fighting back. In a game of cat and mouse with the authorities, they are reopening blocked passages and creating chatières ("cat flaps") through which they can squeeze into chambers no longer accessible via other underground corridors.
Catacomb climate control
Alone against the unstoppable tide of concrete, the amateurs of Underground Paris would be helpless. But the fight against climate change may turn the subterranean labyrinths from a liability into an asset — and the City of Paris into an ally.
The UN's 2015 Climate Plan — concluded in Paris, by the way — requires the world to reduce greenhouse gas emissions by 75 percent by 2050. And Paris itself wants to be Europe's greenest city by 2030. More sustainable climate control of our living spaces would be a great help toward both targets. A lot of energy is spent heating houses in winter and cooling them in summer.
This is where the constant temperature of the Parisian tunnels comes in. It's not just good for brewing beer; it's a source of geothermal energy, says Fieldwork, an architectural firm based in Paris. That steady underground temperature can be used to moderate the climate indoors, helping to cool houses in summer and warm them in winter.
One catch for the cataphiles: it also works when the underground cavities are filled up with concrete. So perhaps one day, Paris Underground, fully filled up with concrete, will completely fall off the map, reducing the city's formerly real doppelgänger into an air conditioning unit.
Cool in summer, warm in winter: Paris Underground could become Paris A/C.Credit: Fieldwork
Strange Maps #1083
Got a strange map? Let me know at firstname.lastname@example.org.
Meconium contains a wealth of information.
- A new study finds that the contents of an infant's first stool, known as meconium, can predict with a high degree of accuracy whether the child will develop allergies.
- A metabolically diverse meconium, which indicates the initial food source for the gut microbiota, is associated with fewer allergies.
- The research hints at possible early interventions to prevent or treat allergies just after birth.
The prevalence of allergies arising in childhood has increased over the last 50 years, with 30 percent of the human population now having some kind of atopic disease such as eczema, food allergies, or asthma. The cause of this increase is still subject to debate, though it has been associated with a number of factors, including changes to the gut microbiomes of infants.
A new study by Canadian researchers published in Cell Reports Medicine may shed further light on how these allergies develop in children by examining the contents of their first diaper.
The things you do for science
The research team examined the first stool of 100 infants from the CHILD Cohort Study. An infant's first stool is a thick, green, horrid-looking substance called meconium, made up of various things the infant ingests during the second half of gestation. It provides not only a snapshot of what the infant was exposed to during that time but also a preview of the food sources available to the first gut bacteria that colonize the baby's digestive tract.
The meconium was examined and found to contain a wide variety of components: amino acids, lipids, carbohydrates, and myriad other substances.
A graph of the comparative, summed abundance of different metabolites in each metabolic pathway, after scaling to the median abundance of each metabolite. Blue indicates children without atopy; yellow, those with an atopic condition. Credit: Petersen et al.
The authors fed these data, along with the identities of the bacteria present and the baby's overall health, into an algorithm designed to predict which infants would go on to develop allergies within one year. The algorithm got it right 76 percent of the time.
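The study's actual classifier is far more elaborate, drawing on hundreds of measured metabolites plus bacterial and clinical data, but the underlying idea, scoring each newborn's meconium by its metabolic richness and flagging low scores as at-risk, can be sketched in a few lines. Everything below (the metabolite names, abundance values, detection limit, and threshold) is invented for illustration and is not taken from the paper.

```python
# Toy illustration: flag allergy risk from the metabolic "richness"
# of a newborn's meconium. All numbers are invented; the real study
# trained a machine-learning model on ~100 infants.

def richness(metabolite_levels, detection_limit=0.1):
    """Count distinct metabolites present above a detection limit."""
    return sum(1 for level in metabolite_levels.values() if level > detection_limit)

def predict_atopy(metabolite_levels, threshold=4):
    """Flag an infant as at-risk when meconium richness falls below a threshold."""
    return richness(metabolite_levels) < threshold

# Hypothetical meconium profiles (metabolite -> relative abundance).
infants = {
    "A": {"alanine": 0.9, "palmitate": 0.7, "glucose": 0.8, "urate": 0.5, "choline": 0.6},
    "B": {"alanine": 0.4, "palmitate": 0.05, "glucose": 0.3, "urate": 0.02, "choline": 0.01},
}

for name, profile in infants.items():
    label = "at risk" if predict_atopy(profile) else "low risk"
    print(f"Infant {name}: richness={richness(profile)} -> {label}")
```

A real model would learn its cutoff from labeled outcome data and weight individual metabolites rather than merely counting them; this sketch only shows why a metabolically "richer" first stool maps to a lower predicted risk.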
A way to prevent childhood allergies?
Infants whose meconium offered a less diverse metabolic niche for the initial microbes settling in the gut were at the highest risk of developing allergies a year later. Many of these metabolites were associated with the presence or absence of different bacterial groups in the child's digestive system, which play an increasingly appreciated role in our overall health and development. The findings were summarized by senior co-author Dr. Brett Finlay:
"Our analysis revealed that newborns who developed allergic sensitization by one year of age had significantly less 'rich' meconium at birth, compared to those who didn't develop allergic sensitization."
The findings could be used to help understand how allergies form and even how to prevent them. Co-author Dr. Stuart Turvey commented on this possibility:
"We know that children with allergies are at the highest risk of also developing asthma. Now we have an opportunity to identify at-risk infants who could benefit from early interventions before they even begin to show signs and symptoms of allergies or asthma later in life."
A model for early childhood allergies
Petersen et al.
As shown above, the authors constructed a model of how they believe metabolites and bacterial diversity help prevent allergies. Increased diversity of metabolic products in the meconium encourages the growth of "healthy" families of bacteria, such as Peptostreptococcaceae, which in turn promote the development of a healthy and diverse gut microbiome. Ultimately, such diversity decreases the likelihood that a child will develop allergies.