Social Media's Dark Side: How Connectivity Uprooted Our Self-Worth
We used to use technology. Now technology uses us. Silicon Valley ethicist Tristan Harris explains how the attention economy hijacked our self-worth for profit.
Tristan Harris is a design thinker, philosopher and entrepreneur.
Called the “closest thing Silicon Valley has to a conscience” by The Atlantic magazine, Tristan Harris was a Design Ethicist at Google and is now a leader in Time Well Spent, a movement to align technology with our humanity. Time Well Spent aims to heighten consumer awareness about how technology shapes our minds, empower consumers with better ways to use technology and change business incentives and design practices to align with humanity’s best interest.
Tristan is an avid researcher of what influences human behavior, beliefs and interpersonal dynamics, drawing on insights from sleight of hand magic and hypnosis to cults and behavioral economics. Currently he is developing a framework for ethical influence, especially as it relates to the moral responsibility of technology companies.
His work has been featured on PBS NewsHour, The Atlantic Magazine, ReCode, TED, 1843 Economist Magazine, Wired, NYTimes, Der Spiegel, NY Review of Books, Rue89 and more.
Previously, Tristan was CEO of Apture, which Google acquired in 2011. Apture enabled millions of users to get instant, on-the-fly explanations across a publisher network of a billion page views per month.
Tristan holds several patents from his work at Apple, Wikia, Apture and Google. He graduated from Stanford University with a degree in Computer Science, focused on Human Computer Interaction, while dabbling in behavioral economics, social psychology, behavior change and habit formation in Professor BJ Fogg’s Stanford Persuasive Technology lab. He was rated #16 in Inc Magazine’s Top 30 Entrepreneurs Under 30 in 2009.
You can read his most popular essay: How Technology Hijacks People’s Minds – from a Magician and Google’s Design Ethicist.
Tristan Harris: Well, there's a really common misconception that technology is neutral and it's up to us to just choose how to use it.
And so we're sitting there and we're scrolling and we find ourselves in this kind of wormhole and then we say, “Oh man, like, I should really have more self-control." And that's partially true, but what we forget when we talk about it that way is that there's a thousand engineers on the other side of the screen whose job it was to get my finger to do that the next time. And there's this whole playbook of techniques that they use to get us to keep using the software more.
Was design always this manipulative? It wasn't always this way. In fact, back in the 1970s and the early '80s at Xerox PARC when Steve Jobs first went over and saw the graphical user interface, the way people talked about computers and what computers were supposed to be was a “bicycle for our minds” that, here we are, you take a human being and they have a certain set of capacities and capabilities, and then you give them a bicycle and they can go to all these new distances, they're empowered to go to these brand-new places and to do these new things, to have these new capacities.
And that's always been the philosophy of people who make technology: how do we create bicycles for our minds to do and empower us to feel and access more?
Now, when the first iPhone was introduced it was also the philosophy of the technology; how do we empower people to do something more? And in those days it wasn't manipulative because there was no competition for attention. Photoshop wasn't trying to maximize how much attention it took from you—it didn't measure its success that way.
And the Internet overall had been, in the very beginning, not designed to maximize attention, it was just putting things out there, putting things out there, creating these message boards.
It wasn't designed with this whole persuasive psychology that emerged later. What happened is that the attention economy and this race for attention got more and more competitive, and the more competitive it got to get people's attention on, let's say a news website, the more they need to add these design principles, these more manipulative design tactics as ways of holding onto your attention.
And so YouTube goes from being a more neutral, honest tool of just, “Here's a video,” to, “Oh, do you want to see these other videos? And do you want to auto-play the next video? And here's some notifications…”
These products start to look and feel more like media that's about maximizing consumption and less like bicycles for our minds.
And I think that's such a subtle and important thing to recognize, is that more and more of technology is really not on our team to help us spend our time the way we want to, it's more on the team of maximizing how much time we spend on the screen.
Right now, the only way to succeed in the app store is by proving you're really good at getting people's attention. And so by making attention the currency of success it forces all of these good-hearted, good-intentioned people who make apps or make media sites to do things they don't want to do just because they have to get attention. You have meditation apps that have to send you all these notifications and add streaks and do all this game-y stuff just to get you to use it the most, as opposed to having the phone itself recognize that there are parts of your life where meditation might want to be at the top of the list for you, because you're defining that, and so that meditation app could win for the right reasons, not for the wrong reasons.
So we always worry about new technologies when they first appear. When people started bringing out the newspapers on the subway people worried, “Oh my god, people are going to stop talking to each other on the subway!” And then TV showed up and people worried, “Oh my god, people are going to spend all this time at home!” And the radio—we always worry, and then somehow it seems to turn out okay. Back in the 1970s people were at home on the telephone and calling each other all the time, we thought, “Oh the kids these days, what are they doing to their minds?” So now it's tempting to say, “Well now the kids these days they're on Snapchat, and therefore we survived all those other technology transitions, nothing really bad happened, so maybe it's all okay, this is just kids being kids in a new way.” And I want to talk about why that's not true and why this is different.
What's different is that, let's say—let's take that telephone example: in the 1970s if someone picked up a telephone to go call their friend and they spent time gossiping on the telephone, we could do that, that's fine. But the telephone wasn't updated every day with new manipulative design principles to be better and better at seducing you into calling your friends.
So what's different with Snapchat is that there are a thousand engineers every single day who work on the product to actually find new ways not to just get you to use it but to kind of tap you both on the shoulder and try to get you into a reciprocity relationship where you owe someone else a response. In fact they have a feature in Snapchat called Snapstreaks. And Snapstreaks count the number of days in a row you've sent a message back-and-forth with someone. So if I have a best friend and we've been chatting every day for a hundred days I see a little fireball and the number 100 next to it. And what that does is, they've just created something now I don't want to lose because I have a streak going, I've got a hundred days and now if I don't send them a message tomorrow I'm going to lose the whole thing.
And when you're a kid actually this has a really big impact on you. I'm not making this up. There's a woman named Emily Weinstein at Harvard who studied the effects of Snapchat and Snapstreaks on kids by accident—it emerged in her interviews.
And she found out that kids, when they go on vacation they have to give their password to up to five other friends to actually have them send messages while they're gone on vacation because they're so worried about losing their Snapstreaks. In fact a lot of kids, they wake up and they see these 30 contacts with different streaks and they have to just take pictures of floors and ceilings just to kind of get through all these Snapstreaks so they don't lose any of them.
So we have to ask, when Snapchat is designing a feature like Snapstreaks, are they doing that because they most want to help kids and empower them to live their lives, or are they doing that because it's really good at getting their attention?
And when we're parents and we see kids using it this way we have to recognize there's something very different going on here. This was not true in the 1970s when we had a neutral telephone that we would choose to use when we wanted to use it. There are now a thousand people behind our new telephone, which is to say, Snapchat, that's being designed and updated every day to be more compelling at addicting and holding onto people's attention. And it's not good for us.
I think we can all feel it. We become more and more on the treadmill to the number of likes or feedback we get, basically, from social media and start tying our own approval, our own self-worth to how much attention we get from other people. I mean, even for me I notice that if I post something it does affect me whether or not I have a lot of likes or few likes.
And it's hard, if you really think about it, to get to the sense of, “My self-worth is completely independent of that.” That's a subtle thing to hold, and developmentally, children are more vulnerable to their self-worth being externalized like this.
And the problem is that there's always been, as well, ways of externalizing our self-worth in terms of how much money we have in our bank account or in terms of how many friends we have, but now the externalization of our self-worth is controlled by a handful of companies whose goals are different than our goals.
They're not evil companies, they just have the different goal of maximizing attention, and our goals are not that. But the problem is that their goals become our goals. This is what's actually so dangerous, is that their goals of engaging us the most by having us care about likes become our goals. We actually wake up in the morning as sovereign human beings and we start caring about the number of likes we got, as if that's our goal in life. That becomes our goal. And it's as if we've been infected, it's as if they've drilled a hole in the back of our head and now they've injected the virus and now we walk around searching for feedback using social media. And they won, if that happens.
And again, it's not because they're evil but they're in a different game, they're trying to maximize attention. But we have to ask a much deeper question, which is: what do we want in our lives and what is our self-worth actually tied to? And maybe it's being virtuous or being a good friend or caring about what matters or living by our values.
There's a whole bunch of things that we can define for ourselves, and I think the less people have defined their own values, the more vulnerable they are to someone else coming in and giving them their values. And to fix that we're going to need to talk about different kinds of metrics, different ways of measuring success and monetizing success. So instead of making more money the more time we get, we instead make more money the more we've helped you in your life.
In the 1970s, at the dawn of personal computers, people like Steve Jobs and the scientists at Xerox PARC talked about computers as "bicycles for our mind". Sure, someone was going to make big money selling these hardware units, but the intention was at heart quite pure; computers would give our minds wheels to go farther than ever before. Our capabilities would be augmented by technology, and we would become smarter and more capable. That ethos has not really stuck, and today we find ourselves in a Pavlovian relationship with push notifications, incapacitated by the multi-directional pull on our attention spans.
We've made it through every new technological wave—newspapers, radio, TV, laptops, cell phones—without the social decay that was widely prophesied, but there's something different about smartphones loaded with apps living in the palm of our hand, says tech ethicist Tristan Harris. It would be a mistake not to recognize how, this time, it really is different. Companies today are not more evil than they were in the 1970s; what's changed is the environment they operate in: the attention economy, where the currency is your eyeballs on their product, for as long as possible—precious exposure that can be sold to advertisers. Unlike the neutral technology we once used, and could walk away from, today's technology uses us. Behind every app—Facebook, Twitter, Snapchat—are 1,000 software designers working every day to update and find new psychological levers to keep you hooked to this product. The most powerful development has been that of 'likes', public feedback that externalized our self-worth onto a score card (this has reached new heights with Snapchat's streaks, which research by Emily Weinstein at Harvard has shown put extreme stress on kids and adolescents). "These products start to look and feel more like media that's about maximizing consumption and less like bicycles for our minds," says Harris. Is it too late to do something about the attention economy? To find out more about Tristan Harris, head to tristanharris.com.
Humans may have evolved to be tribalistic. Is that a bad thing?
- From politics to every day life, humans have a tendency to form social groups that are defined in part by how they differ from other groups.
- Neuroendocrinologist Robert Sapolsky, author Dan Shapiro, and others explore the ways that tribalism functions in society, and discuss how—as social creatures—humans have evolved for bias.
- But bias is not inherently bad. The key to seeing things differently, according to Beau Lotto, is to "embody the fact" that everything is grounded in assumptions, to identify those assumptions, and then to question them.
Ancient corridors below the French capital have served as its ossuary, playground, brewery, and perhaps soon, air conditioning.
- People have been digging up limestone and gypsum from below Paris since Roman times.
- They left behind a vast network of corridors and galleries, since reused for many purposes — most famously, the Catacombs.
- Soon, the ancient labyrinth may find a new lease of life, providing a sustainable form of air conditioning.
Ancient mining areas below Paris for limestone (red) and gypsum (green). Credit: Émile Gérards (1859–1920) / Public domain
"If you're brave enough to try, you might be able to catch a train from UnLondon to Parisn't, or No York, or Helsunki, or Lost Angeles, or Sans Francisco, or Hong Gone, or Romeless."
China Miéville's fantasy novel Un Lun Dun is set in an eerie mirror version of London. In it, he hints that other cities have similar doubles. On the list that he offhandedly rattles off, Paris stands out. Because the City of Light really does have a twisted sister. Below Paris Overground is Paris Underground, the City of Darkness.
Most people will have heard of the Catacombs of Paris: subterranean charnel houses for the bones of around six million dead Parisians. They are one of the French capital's most famous tourist attractions – and undoubtedly its grisliest.
But they constitute only a small fragment of what the locals themselves call les carrières de Paris ("the mines of Paris"), a collection of tunnels and galleries up to 300 km (185 miles) long, most of which are off-limits to the public, yet eagerly explored by so-called cataphiles.
The Grand Réseau Sud ("Great Southern Network") takes up around 200 km beneath the 5th, 6th, 14th, and 15th arrondissements (administrative districts), all south of the river Seine. Smaller networks run beneath the 12th, 13th, and 16th arrondissements. How did they get there?
Paris stone and plaster of Paris
It all starts with geology. Sediments left behind by ancient seas created large deposits of limestone in the south of the city, mostly south of the Seine; and gypsum in the north, particularly in the hills of Montmartre and Ménilmontant. Highly sought after as building materials, both have been mined since Roman times.
The limestone is also known as Lutetian limestone (Lutetia is the Latin name for ancient Paris) or simply "Paris stone." It has been used for many famous Paris landmarks, including the Louvre and the grand buildings erected during Georges-Eugène Haussmann's large-scale remodelling of the city in the mid-19th century. The stone's warm, yellowish color provides visual unity and a bright elegance to the city.
The fine-powdered gypsum of northern Paris, used for making quick-setting plaster, was so famed for its quality that "plaster of Paris" is still used as a term of distinction. However, as gypsum is very soluble in water, the underground cavities left by its extraction were extremely vulnerable to collapse.
Like living on top of a rotting tooth: subsidence starts far below the surface, but it can destroy your house. Credit: Delavanne Avocats
In previous centuries, a road would occasionally open up to swallow a chariot, or even a whole house would disappear down a sinkhole. In 1778, a catastrophic subsidence in Ménilmontant killed seven. That's why the Montmartre gypsum quarries were dynamited rather than just left as they were. The remaining gypsum caves were to be filled up with concrete.
The official body governing Paris down below is the Inspection Générale des Carrières (IGC), founded in the late 1770s by King Louis XVI. The IGC was tasked with mapping and, where needed, propping up the current and ancient (and sometimes forgotten) mining corridors and galleries hiding beneath Paris.
A delightful hiding place
Also around that time, the dead of Paris were getting in the way of the living. At the end of the 18th century, their final destination consisted of about 200 small cemeteries, scattered throughout the city — all bursting at the seams, so to speak. There was no room to bury the newly dead, and the previously departed were fouling up both the water and air around their respective churchyards.
Something radical had to happen. And it did. From 1785 until 1814, the smaller cemeteries were emptied of their bones, which were transported with full funerary pomp to their final resting place in the ancient limestone quarries at Tombe-Issoire. Three large and modern cemeteries were opened to receive the remains of subsequent generations of Parisians: Montparnasse, Père-Lachaise, and Passy.
The six million dead Parisians in the Catacombs, from all corners of the capital and across many centuries, together form the world's largest necropolis — their now anonymized skulls and bones methodically stacked, occasionally into whimsical patterns. The Catacombs are fashioned into a memorial to the brevity of life. The message above the entrance reads: Arrête! C'est ici l'empire de la Mort. ("Halt! This is the empire of Death.")
That has not stopped the Catacombs, accessible via a side door to a classicist building on the Avenue du Colonel Henri Rol-Tanguy, from making just about every Top 20 list of things to see in Paris.
An underground economy
However, while the Catacombs certainly are the most famous part of the centuries-old network beneath Paris, and in non-pandemic times draw thousands of tourists each day, they constitute just 1.7 km (1 mile) of the 300-km (185-mile) tunneling total.
Subterranean Paris wasn't just used for mining and storing dead people. In the 17th century, Carthusian monks converted the ancient quarries under their monastery into distilleries for the green or yellow liqueur that still carries their name, chartreuse.
Because the mines generally keep a constant cool temperature of around 15° C (60° F), they were also ideal for brewing beer, as happened on a large scale from the end of the 17th century until well into the 20th century. Several caves were dug especially for establishing breweries, and not just because of the ambient temperature: going underground allowed brewers to remain close to their customers without having to pay a premium for real estate up top.
Overview of the Paris Catacombs. Credit: Inspection Générale des Carrières, 1857 / Public domain.
At the end of the 19th century, the underground breweries of the 14th arrondissement alone produced more than a million hectoliters (22 million gallons) per year. One of the most famous of Paris' underground breweries, Dumesnil, stayed in operation until the late 1960s.
In that decade, the network of corridors and galleries south of the Seine, long since abandoned by miners, became the unofficial playground for the young people of Paris. They explored the fantastical world beneath their feet, in some cases via entry points located in their very schools. Fascinated, these cataphiles ("catacomb lovers") read up on old books, explored the subterranean labyrinth, and drew up schematics that were passed around among fellow initiates as reverently as treasure maps.
As Robert Macfarlane writes in Underland, Paris-beneath-their-feet became "a place where people might slip into different identities, assume new ways of being and relating, become fluid and wild in ways that are constrained on the surface."
Some larger caves turned into notorious party zones: a 7-meter-tall gallery below the Val-de-Grâce hospital is widely known as "Salle Z." Over the last few decades, various other locations in subterranean Paris have hosted jazz and rock concerts and rave parties — like no other city, Paris really has an "underground music scene."
Hokusai's Great Wave as the backdrop to the "beach" under Paris. Credit: Reddit
Cataphiles vs. cataphobes
With popularity came increased reports of nuisance and crime — the tunnels provided easy access to telephone cables, which were stolen for the resale value of their copper.
The general public's "discovery" of the underground network led the city of Paris to officially interdict all access by non-authorized persons. That decree dates back to 1955, but the "underground police" have an understanding with seasoned cataphiles. Their main targets are so-called tourists, who by their lack of knowledge expose themselves to risk of injuries or worse, and degrade their surroundings, often leaving loads of litter in their wake.
The understanding does not extend to the IGC. Unlike in the 19th century, when weak cavities were shored up by purpose-built pillars, the policy now is to inject concrete to fill up endangered spaces — thus progressively blocking off parts of the network. That procedure has also been used to separate the Catacombs to prevent "infiltration" of the site by cataphiles.
Many subterranean streets have their own names, signs and all. This is the Rue des Bourguignons (Street of the Burgundians) below the Champs des Capucins (Capuchin Field), neither of which exists on the surface. Credit: Jean-François Gornet via Wikimedia and licensed under
The cataphiles, however, are fighting back. In a game of cat and mouse with the authorities, they are reopening blocked passages and creating chatières ("cat flaps") through which they can squeeze into chambers no longer accessible via other underground corridors.
Catacomb climate control
Alone against the unstoppable tide of concrete, the amateurs of Underground Paris would be helpless. But the fight against climate change may turn the subterranean labyrinths from a liability into an asset — and the City of Paris into an ally.
The UN's 2015 Climate Plan — concluded in Paris, by the way — requires the world to reduce greenhouse gas emissions by 75 percent by 2050. And Paris itself wants to be Europe's greenest city by 2030. More sustainable climate control of our living spaces would be a great help toward both targets. A lot of energy is spent heating houses in winter and cooling them in summer.
This is where the constant temperature of the Parisian tunnels comes in. It's not just good for brewing beer; it's a source of geothermal energy, says Fieldwork, an architectural firm based in Paris. It can be used to temper temperatures, helping to cool houses in summer and warming them in winter.
One catch for the cataphiles: it also works when the underground cavities are filled up with concrete. So perhaps one day, Paris Underground, fully filled up with concrete, will completely fall off the map, reducing the city's formerly real doppelgänger into an air conditioning unit.
Cool in summer, warm in winter: Paris Underground could become Paris A/C. Credit: Fieldwork
Strange Maps #1083
Got a strange map? Let me know at firstname.lastname@example.org.
"Deepfakes" and "cheap fakes" are becoming strikingly convincing — even ones generated on freely available apps.
- A writer named Magdalene Visaggio recently used FaceApp and Airbrush to generate convincing portraits of early U.S. presidents.
- "Deepfake" technology has improved drastically in recent years, and some countries are already experiencing how it can be weaponized for political purposes.
- It's currently unknown whether it'll be possible to develop technology that can quickly and accurately determine whether a given video is real or fake.
After former U.S. President William Henry Harrison delivered his inaugural speech on March 4, 1841, he posed for a daguerreotype, the first widely available photographic technology. It became the first photo taken of a sitting American president.
As for the eight presidents before Harrison, history can see them only through artistic renderings. (The exception is a handful of surviving daguerreotypes of John Quincy Adams, taken after he left office. In his diary, Adams described them as "hideous" and "too true to the original.")
But a recent project offers a glimpse of what early presidents might've looked like if photographed through modern cameras. Using FaceApp and Airbrush, Magdalene Visaggio, author of books such as "Eternity Girl" and "Kim & Kim," generated a collection of convincing portraits of the nation's first presidents, from George Washington to Ulysses S. Grant.
Tweet: "Modern Presidents: George Washington" (Magdalene Visaggio) https://t.co/CURJQB0kap
What might be surprising is that Visaggio was able to generate the images without a background in graphic design, using freely available tools. She wrote on Twitter:
"A lot of people think I'm a digital artist or whatever, so let me clarify how I work. Everything you see here is done in Faceapp+Airbrush on my phone. On the outside, each takes between 15-30 mins. Washington was a pretty simple one-and-done replacement."
Tweet: "Ulysses S Grant" (Magdalene Visaggio) https://t.co/L1IGXLI3Vl
"Other than that? I am not a visual artist in any sense, just a hobbyist using AI tools to see what she can make. I'm actually a professional comics writer."
Tweet: "Did another pass at Lincoln." (Magdalene Visaggio) https://t.co/PdT4QVpMbn
Of course, Visaggio isn't the first person to create deepfakes (or "cheap fakes") of politicians.
In 2017, many people got their first glimpse of the technology through a video depicting former President Barack Obama warning: "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time." The video quickly reveals itself to be fake, with comedian Jordan Peele speaking for the computer-generated Obama.
While deepfakes haven't yet caused significant chaos in the U.S., incidents in other nations may offer clues of what's to come.
The future of deepfakes
In 2018, Gabon's president Ali Bongo had been out of the country for months receiving medical treatment. As his absence from public view stretched on, rumors began swirling about his condition. Some suggested Bongo might even be dead. In response, Bongo's administration released a video that seemed to show the president addressing the nation.
But the video is strange, appearing choppy and blurry in parts. After political opponents declared the video to be a deepfake, Gabon's military attempted an unsuccessful coup. What's striking about the story is that, to this day, experts in the field of deepfakes can't conclusively verify whether the video was real.
The uncertainty and confusion generated by deepfakes poses a "global problem," according to a 2020 report from The Brookings Institution. In 2018, the U.S. Department of Defense released some of the first tools able to successfully detect deepfake videos. The problem, however, is that deepfake technology keeps improving, meaning forensic approaches may forever be one step behind the most sophisticated forms of deepfakes.
As the 2020 report noted, even if the private sector or governments create technology to identify deepfakes, they will:
"...operate more slowly than the generation of these fakes, allowing false representations to dominate the media landscape for days or even weeks. "A lie can go halfway around the world before the truth can get its shoes on," warns David Doermann, the director of the Artificial Intelligence Institute at the University of Buffalo. And if defensive methods yield results short of certainty, as many will, technology companies will be hesitant to label the likely misrepresentations as fakes."
Meconium contains a wealth of information.
- A new study finds that the contents of an infant's first stool, known as meconium, can predict with a high degree of accuracy whether the child will develop allergies.
- A metabolically diverse meconium, which indicates the initial food source for the gut microbiota, is associated with fewer allergies.
- The research hints at possible early interventions to prevent or treat allergies just after birth.
The prevalence of allergies arising in childhood has increased over the last 50 years, with 30 percent of the human population now having some kind of atopic disease such as eczema, food allergies, or asthma. The cause of this increase is still subject to debate, though it has been associated with a number of factors, including changes to the gut microbiomes of infants.
A new study by Canadian researchers published in Cell Reports Medicine may shed further light on how these allergies develop in children by examining the contents of their first diaper.
The things you do for science
The research team examined the first stool of 100 infants from the CHILD Cohort Study. The first stool of an infant is a thick, green, horrid-looking substance called meconium. It consists of various things that the infant ingests during the second half of gestation. It therefore provides not only a snapshot of what the infant was exposed to during that time but also a preview of the food sources available to the initial gut bacteria that colonize the baby's digestive tract.
The content of the meconium was examined and found to contain such varied elements as amino acids, lipids, carbohydrates, and myriad other substances.
A graph of the comparative, summed abundance of the metabolites in each metabolic pathway, scaled to the median abundance of each metabolite. Blue shows children without atopy; yellow shows those with an atopic condition. Credit: Petersen et al.
The authors fed this data into an algorithm, along with the identities of the bacteria present and the baby's overall health, to predict which infants would go on to develop allergies within one year. The algorithm got it right 76 percent of the time.
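The study's actual model combined metabolomic, microbial, and clinical data in ways not detailed here. As a toy illustration of the underlying idea only (not the authors' method), one common way to quantify the "richness" of an abundance profile is the Shannon diversity index; a low-diversity profile could then be flagged as higher risk. The threshold and sample profiles below are invented for illustration:

```python
import math

def shannon_diversity(abundances):
    """Shannon diversity index of a metabolite abundance profile.
    Higher values mean a richer, more even mix of metabolites."""
    total = sum(abundances)
    probs = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in probs)

def predict_atopy_risk(abundances, threshold=1.5):
    """Toy rule: flag low-diversity meconium as 'at risk'.
    The threshold is arbitrary, chosen for illustration only."""
    return "at risk" if shannon_diversity(abundances) < threshold else "low risk"

# A rich, even profile vs. one dominated by a single metabolite
rich = [10, 9, 11, 10, 8, 12]    # evenly spread -> high diversity
poor = [95, 1, 1, 1, 1, 1]       # one dominant metabolite -> low diversity
print(predict_atopy_risk(rich))  # low risk
print(predict_atopy_risk(poor))  # at risk
```

The real model was a machine-learning classifier trained on many such features at once; the point of the sketch is simply that "metabolic richness" can be reduced to a number and used for prediction.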
A way to prevent childhood allergies?
Infants whose meconium offered a less diverse metabolic niche for the initial microbes settling in the gut were at the highest risk of developing allergies a year later. Many of these elements were associated with the presence or absence of different bacterial groups in the digestive system of the child, which play an increasingly appreciated role in our overall health and development. The findings were summarized by senior co-author Dr. Brett Finlay:
"Our analysis revealed that newborns who developed allergic sensitization by one year of age had significantly less 'rich' meconium at birth, compared to those who didn't develop allergic sensitization."
The findings could be used to help understand how allergies form and even how to prevent them. Co-author Dr. Stuart Turvey commented on this possibility:
"We know that children with allergies are at the highest risk of also developing asthma. Now we have an opportunity to identify at-risk infants who could benefit from early interventions before they even begin to show signs and symptoms of allergies or asthma later in life."
A model for early childhood allergies
Petersen et al.
As shown above, the authors constructed a model of how they believe metabolites and bacterial diversity help prevent allergies. Increased diversity of metabolic products in the meconium encourages the development of "healthy" families of bacteria, like Peptostreptococcaceae, which in turn promote the development of a healthy and diverse gut microbiome. Ultimately, such diversity decreases the likelihood that a child will develop allergies.