How dictators flourish through social media
What does the power of the online mob hold for tyranny and conformity?
- Disney CEO Bob Iger's recent critique of social media hate is indicative of a greater problem.
- The psychology of the crowd could be responsible for the hate and conformity seen online.
- Polymath Gustave Le Bon's crowd psychology theories could be more relevant today than ever.
Disney CEO Bob Iger recently critiqued the role of social media in his Humanitarian Award speech, likening it to a tool that Hitler would have loved and one that would-be dictators could exploit.
Iger joins a rising chorus of voices condemning and critiquing the ever-present role that social media plays in our lives and societies. His speech focused on the degeneration of our civic values, the loss of individualistic thought, and the hateful atmosphere he feels is built into the very structure of online social platforms. Here's what he had to say:
"Apathy is actually growing. In the last few years, we have been harshly reminded that hate takes many forms, sometimes disguising itself as more socially acceptable expression like fear or resentment or contempt. It is consuming our public discourse and shaping our country and culture into something that is wholly unrecognizable to those of us who still believe in civility, human rights and basic decency."
Iger's comments regarding social media hate touch upon a greater problem endemic to social networks, one that few even realize exists: online crowds possess an underlying, unconscious power that enables them to push forward ignoble agendas, whatever those may be.
Bob Iger’s comments on Hitler and social media hate
In his critique, Bob Iger referenced Hitler, a figure who always stirs up controversy, not to mention one who was quite familiar with seizing opportunities to manipulate the minds of the masses.
"Hitler would have loved social media. It's the most powerful marketing tool an extremist could ever hope for because by design social media reflects a narrow worldview filtering out anything that challenges our beliefs while constantly validating our convictions and amplifying our deepest fears."
Proponents of social media activism and other online sloganeering would argue in the medium's defense that a figure like Hitler couldn't gain traction in this new world, and that social networks encourage diversity of opinion, give voice to the marginalized, and expose people to differing worldviews. The harrowing fact of the matter is that it's much more complicated than that.
We're beginning to realize that social media is no panacea for encouraging logical thought, serious discourse, or an enlightened populace. Iger illustrates this point:
"It creates a false sense that everyone shares the same opinion. Social media allows evil to prey on troubled minds and lost souls, and we all know that social news feeds can contain more fiction than fact, propagating vile ideology that has no place in a civil society that values human life."
In other words, the same forces that have been responsible for destitution and horror for centuries now have all the more power to rise to the occasion through the channels of an all-encompassing medium.
Digital tyrants, anonymity and crowd psychology
Iger isn't the only one sounding the alarm. Jonathan A. Greenblatt, chief executive of the Anti-Defamation League, is on record saying:
"Social media companies have created, allowed and enabled extremists to move their message from the margins to the mainstream. In the past, they couldn't find audiences for their poison. Now, with a click or a post or a tweet, they can spread their ideas with a velocity we've never seen before."
Whether the originators of this hate and rabble-rousing know it or not, they're tapping into the age-old power of the crowd, a crowd that has now migrated from physical space into the digital domain.
In order to get a better understanding of this phenomenon, we must first look back to French polymath Gustave Le Bon and his crowd psychology theories.
With remarkable insight, Le Bon grasped the uncanny mindset that can take over an entire group of people and completely change the character of their collective, regardless of each individual's psychological constitution:
…the individual forming part of a crowd acquires, solely from numerical considerations, a sentiment of invincible power which allows him to yield to instincts which, had he been alone, he would perforce have kept under restraint.
This type of change is echoed in the way that many people interact with others online. Jonathan Albright, a research director at Columbia University's Tow Center for Digital Journalism, remarked that "Social media is emboldening people to cross the line and push the envelope on what they are willing to say to provoke and to incite. The problem is clearly expanding."
What we have now is a perpetual state of crowd mentality, paired with the seductive power of anonymity, that lets people spew whatever nonsense or bile comes to mind. Not only does this lead to the hate Iger was describing, but it also produces a mob mentality that gloms onto whatever cultural narrative is the flavor of the day.
Political philosopher Hannah Arendt described this phenomenon as "Mass opinion without a critical evaluation of the consequences of their actions."
The herding power of crowds online
Crowds being only capable of thinking in images are only to be impressed by images. It is only images that terrify or attract them and become motives of action. — Gustave Le Bon
In a study titled "The online crowd: A contradiction in terms? On the potentials of Gustave Le Bon's crowd psychology in an analysis of affective blogging," the author explores the implications Le Bon's theory has for the concept of the online crowd.
Author Carsten Stage stated that the crowd has now been transformed from an entity residing in a particular spatial location into one that is now "a series of more flexible, adaptable and mobile entities. The improvised crowds are imagined in the iconic image of social networks such as Facebook and Twitter […] allowing a temporary and transient public to be formed on and sometimes off-line."
While looking at instances of collective "flaming or rage, hyping, bullying, and mourning" on certain social media channels, the author found that the distinction "between a relatively controlled individual reflecting on the message of the media text and the uncontrolled (non-)person of the crowd seems difficult to uphold." In other words, the individual's and the crowd's thoughts become indistinguishable from one another.
Furthermore, crowd practices are now open to online media users at any time of day and in any place in the world. For example, whenever a new pulp scandal breaks or atrocities are uncovered in some never-ending Orwellian Ministry of Truth fashion, the digital crowd has an opportunity to interact with the feuilleton fodder of the day.
Stage considers this a virtual version of the self-perpetuating logic of the crowd, which was described by Elias Canetti in his book, Crowds and Power:
"Suddenly everywhere is black with people and more come streaming from all sides as though streets had only one direction. Most of them do not know what has happened and, if questioned, have no answer; but they hurry to be there where most other people are."
Revealing the dynamics inherent in crowd psychology and its effects on online social media interactions is just the first step in understanding the perils of social media when used as a tool for hate or thoughtless conformity.
Why mega-eruptions like the ones that covered North America in ash are the least of your worries.
- The supervolcano under Yellowstone produced three massive eruptions over the past few million years.
- Each eruption covered much of what is now the western United States in an ash layer several feet deep.
- The last eruption was 640,000 years ago, but that doesn't mean the next eruption is overdue.
The end of the world as we know it
Panoramic view of Yellowstone National Park
Image: Heinrich Berann for the National Park Service – public domain
Of the many freak ways to shuffle off this mortal coil – lightning strikes, shark bites, falling pianos – here's one you can safely scratch off your worry list: an eruption of the Yellowstone supervolcano.
As the map below shows, previous eruptions at Yellowstone were so massive that the ash fall covered most of what is now the western United States. A similar event today would not only claim countless lives directly, but also create enough subsidiary disruption to kill off global civilisation as we know it. A relatively recent eruption of the Toba supervolcano in Indonesia may have come close to killing off the human species (see further below).
However, just because a scenario is grim does not mean that it is likely (insert topical political joke here). In this case, the doom mongers claiming an eruption is 'overdue' are wrong. Yellowstone is not a library book or an oil change. Just because the previous mega-eruption happened long ago doesn't mean the next one is imminent.
Ash beds of North America
Ash beds deposited by major volcanic eruptions in North America.
Image: USGS – public domain
This map shows the location of the Yellowstone plateau and the ash beds deposited by its three most recent major eruptions, plus two other eruptions – one similarly massive, the other the most recent one in North America.
The Huckleberry Ridge eruption occurred 2.1 million years ago. It ejected 2,450 km3 (588 cubic miles) of material, making it the largest known eruption in Yellowstone's history and in fact the largest eruption in North America in the past few million years.
This is the oldest of the three most recent caldera-forming eruptions of the Yellowstone hotspot. It created the Island Park Caldera, which lies partially within Yellowstone National Park in Wyoming and extends westward into Idaho. Ash from this eruption covered an area from southern California to North Dakota, and from southern Idaho to northern Texas.
About 1.3 million years ago, the Mesa Falls eruption ejected 280 km3 (67 cubic miles) of material and created the Henry's Fork Caldera, located in Idaho, west of Yellowstone.
It was the smallest of the three major Yellowstone eruptions, both in terms of material ejected and area covered: 'only' most of present-day Wyoming, Colorado, Kansas and Nebraska, and about half of South Dakota.
The Lava Creek eruption was the most recent major eruption of Yellowstone: about 640,000 years ago. It was the second-largest eruption in North America in the past few million years, creating the Yellowstone Caldera.
It ejected only about 1,000 km3 (240 cubic miles) of material, i.e. less than half of the Huckleberry Ridge eruption. However, its debris is spread out over a significantly wider area: basically, Huckleberry Ridge plus larger slices of both Canada and Mexico, plus most of Texas, Louisiana, Arkansas, and Missouri.
The Long Valley eruption occurred about 760,000 years ago. Centered on southern California, it created the Long Valley Caldera and spewed out 580 km3 (139 cubic miles) of material, making it North America's third-largest eruption of the past few million years.
The material ejected by this eruption is known as the Bishop ash bed, and covers the central and western parts of the Lava Creek ash bed.
Mount St Helens
The eruption of Mount St Helens in 1980 was the deadliest and most destructive volcanic event in U.S. history: it created a mile-wide crater, killed 57 people and caused economic damage in the neighborhood of $1 billion.
Yet by Yellowstone standards, it was tiny: Mount St Helens only ejected 0.25 km3 (0.06 cubic miles) of material, most of the ash settling in a relatively narrow band across Washington State and Idaho. By comparison, the Lava Creek eruption left a large swathe of North America in up to two metres of debris.
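The scale difference is stark. Here is a back-of-the-envelope comparison using the ejecta volumes quoted in this article (the only outside figure is the standard conversion factor, 1 cubic mile ≈ 4.168 km³):

```python
# Ejecta volumes quoted in the article, in cubic kilometres.
volumes_km3 = {
    "Huckleberry Ridge (2.1 Ma)": 2450,
    "Lava Creek (640 ka)": 1000,
    "Long Valley (760 ka)": 580,
    "Mesa Falls (1.3 Ma)": 280,
    "Mount St Helens (1980)": 0.25,
}

KM3_PER_MI3 = 4.168  # 1 cubic mile is roughly 4.168 km³

st_helens = volumes_km3["Mount St Helens (1980)"]
for name, km3 in volumes_km3.items():
    print(f"{name}: {km3} km³ ≈ {km3 / KM3_PER_MI3:.0f} mi³, "
          f"{km3 / st_helens:,.0f}× Mount St Helens")
```

By this measure, Lava Creek alone ejected 4,000 times the material of Mount St Helens, which puts the two-metre ash blanket into perspective.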
The difference between quakes and volcanoes
The volume of dense rock equivalent (DRE) ejected by the Huckleberry Ridge event dwarfs all other North American eruptions. It is itself overshadowed by the DRE ejected at the most recent eruption at Toba (present-day Indonesia). This was one of the largest known eruptions ever and a relatively recent one: only 75,000 years ago. It is thought to have caused a global volcanic winter which lasted up to a decade and may be responsible for the bottleneck in human evolution: around that time, the total human population suddenly and drastically plummeted to between 1,000 and 10,000 breeding pairs.
Image: USGS – public domain
So, what are the chances of something that massive happening anytime soon? The aforementioned mongers of doom often claim that major eruptions occur at intervals of 600,000 years and point out that the last one was 640,000 years ago. Except that (a) the first interval was about 140,000 years longer than the second, (b) two intervals is not a lot to base a prediction on, and (c) those intervals don't really mean anything anyway. Not in the case of volcanic eruptions, at least.
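The intervals follow directly from the three eruption dates quoted above, and a quick check shows how little data there is to extrapolate from:

```python
# Eruption dates from the article, in years before present:
# Huckleberry Ridge, Mesa Falls, Lava Creek.
eruptions = [2_100_000, 1_300_000, 640_000]

# Gaps between successive eruptions.
intervals = [earlier - later for earlier, later in zip(eruptions, eruptions[1:])]
print(intervals)                    # [800000, 660000] — the two gaps differ
print(intervals[0] - intervals[1])  # 140000 — the first gap is longer
```

Two intervals are simply too few to establish a recurrence schedule, even before considering that volcanoes do not build pressure at a constant rate.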
Earthquakes can be 'overdue' because the stress on fault lines is built up consistently over long periods, which means quakes can be predicted with a relative degree of accuracy. But this is not how volcanoes behave. They do not accumulate magma at constant rates. And the subterranean pressure that causes the magma to erupt does not follow a schedule.
What's more, previous super-eruptions do not necessarily imply future ones. Scientists are not convinced that there ever will be another big eruption at Yellowstone. Smaller eruptions, however, are much likelier. Since the Lava Creek eruption, there have been about 30 smaller outbreaks at Yellowstone, the last lava flow being about 70,000 years ago.
As for the immediate future (give or take a century): the magma chamber beneath Yellowstone is only 5 to 15 percent molten. Most scientists agree that is as un-alarming as it sounds, and that it's statistically more relevant to worry about death by lightning, shark, or piano.
Strange Maps #1041
Got a strange map? Let me know at firstname.lastname@example.org.
The pandemic has many people questioning whether they ever want to go back to the office.
If one thing is clear about remote work, it's this: Many people prefer it and don't want their bosses to take it away.
When the pandemic forced office employees into lockdown and cut them off from spending in-person time with their colleagues, they almost immediately realized that they favor remote work over their traditional office routines and norms.
As remote workers of all ages contemplate their futures – and as some offices and schools start to reopen – many Americans are asking hard questions about whether they wish to return to their old lives, and what they're willing to sacrifice or endure in the years to come.
Even before the pandemic, there were people asking whether office life jibed with their aspirations.
We spent years studying "digital nomads" – workers who had left behind their homes, cities and most of their possessions to embark on what they call "location independent" lives. Our research taught us several important lessons about the conditions that push workers away from offices and major metropolitan areas, pulling them toward new lifestyles.
Legions of people now have the chance to reinvent their relationship to their work in much the same way.
Big-city bait and switch
Most digital nomads started out excited to work in career-track jobs for prestigious employers. Moving to cities like New York and London, they wanted to spend their free time meeting new people, going to museums and trying out new restaurants.
But then came the burnout.
Although these cities certainly host institutions that can inspire creativity and cultivate new relationships, digital nomads rarely had time to take advantage of them. Instead, the high cost of living, time constraints and work demands contributed to an oppressive culture of materialism and workaholism.
Pauline, 28, who worked in advertising helping large corporate clients develop brand identities through music, likened city life for professionals in her peer group to a "hamster wheel." (The names used in this article are pseudonyms, as required by research protocol.)
"The thing about New York is it's kind of like the battle of the busiest," she said. "It's like, 'Oh, you're so busy? No, I'm so busy.'"
Most of the digital nomads we studied had been lured into what urbanist Richard Florida termed "creative class" jobs – positions in design, tech, marketing and entertainment. They assumed this work would prove fulfilling enough to offset what they sacrificed in terms of time spent on social and creative pursuits.
Yet these digital nomads told us that their jobs were far less interesting and creative than they had been led to expect. Worse, their employers continued to demand that they be "all in" for work – and accept the controlling aspects of office life – without providing the development, mentorship or meaningful work they felt they had been promised. As they looked to the future, they saw only more of the same.
Ellie, 33, a former business journalist who is now a freelance writer and entrepreneur, told us: “A lot of people don't have positive role models at work, so then it's sort of like 'Why am I climbing the ladder to try and get this job? This doesn't seem like a good way to spend the next twenty years.'"
By their late 20s to early 30s, digital nomads were actively researching ways to leave their career-track jobs in top-tier global cities.
Looking for a fresh start
Although they left some of the world's most glamorous cities, the digital nomads we studied were not homesteaders working from the wilderness; they needed access to the conveniences of contemporary life in order to be productive. Looking abroad, they quickly learned that places like Bali in Indonesia and Chiang Mai in Thailand had the necessary infrastructure to support them at a fraction of the cost of their former lives.
With more and more companies now offering employees the choice to work remotely, there's no reason to think digital nomads have to travel to southeast Asia – or even leave the United States – to transform their work lives.
During the pandemic, some people have already migrated away from the nation's most expensive real estate markets to smaller cities and towns to be closer to nature or family. Many of these places still possess vibrant local cultures. As commutes to work disappear from daily life, such moves could leave remote workers with more available income and more free time.
The digital nomads we studied often used savings in time and money to try new things, like exploring side hustles. One recent study even found, somewhat paradoxically, that the sense of empowerment that came from embarking on a side hustle actually improved performance in workers' primary jobs.
The future of work, while not entirely remote, will undoubtedly offer more remote options to many more workers. Although some business leaders are still reluctant to accept their employees' desire to leave the office behind, local governments are embracing the trend, with several U.S. cities and states – along with countries around the world – developing plans to attract remote workers.
This migration, whether domestic or international, has the potential to enrich communities and cultivate more satisfying work lives.
The potential of CRISPR technology is incredible, but the threats are too serious to ignore.
- CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) is a revolutionary technology that gives scientists the ability to alter DNA. On the one hand, this tool could mean the elimination of certain diseases. On the other, there are concerns (both ethical and practical) about its misuse and the yet-unknown consequences of such experimentation.
- "The technique could be misused in horrible ways," says counter-terrorism expert Richard A. Clarke. Clarke lists biological weapons as one of the potential threats, "Threats for which we don't have any known antidote." CRISPR co-inventor, biochemist Jennifer Doudna, echos the concern, recounting a nightmare involving the technology, eugenics, and a meeting with Adolf Hitler.
- Should this kind of tool even exist? Do the positives outweigh the potential dangers? How could something like this ever be regulated, and should it be? These questions and more are considered by Doudna, Clarke, evolutionary biologist Richard Dawkins, psychologist Steven Pinker, and physician Siddhartha Mukherjee.
Measuring a person's movements and poses, smart clothes could be used for athletic training, rehabilitation, or health-monitoring.