Participatory democracy is presumed to be the gold standard. Here’s why it isn’t.

Political activism may get people invested in politics and effect urgently needed change, but it can come at the expense of tolerance and healthy democratic norms.

  • Polarization and extreme partisanship have been on the rise in the United States.
  • Political psychologist Diana Mutz argues that we need more deliberation, not political activism, to keep our democracy robust.
  • Despite increased polarization, Americans still have more in common than it appears.


Imagine everyday citizens engaging in the democratic process. What images spring to mind? Maybe you thought of town hall meetings where constituents address their representatives. Maybe you imagined mass sit-ins or marches in the streets to protest unpopular legislation. Maybe you pictured grassroots organizations gathering signatures for a popular referendum. Though they vary in means and intensity, all of these have one thing in common: participation.

Participatory democracy is a democratic model that emphasizes civic engagement as paramount for a robust government. For many, it's both the "hallmark of social movements" and the gold standard of democracy.

But all that glitters may not be gold. While we can all point to historical moments in which participatory democracy was critical to necessary change, such activism can have deleterious effects on the health of a democracy, too. One such byproduct, political psychologist Diana Mutz argues, can be a lessening of political tolerance.

Participation or deliberation?

In her book Hearing the Other Side: Deliberative Versus Participatory Democracy, Mutz argues that participatory democracy is best supported by close-knit groups of like-minded people. Political activism requires fervor to rouse people to action. To support such passions, people surround themselves with others who believe in the cause and view it as unassailable.

Alternative voices and ideologies — what Mutz calls "cross-cutting exposures" — are counterproductive to participation because they don't reinforce the group's beliefs and may soften the image of the opposing side. This can dampen political zeal and discourage participation, particularly among those averse to conflict. To prevent this from happening, groups can become increasingly intolerant of the other side.

"You can have a coup and maximize levels of participation, but that wouldn't be a great thing to do. It wouldn't be a sign of health and that things were going well."

As the book's title suggests, deliberative democracy fosters a different outlook for those who practice it. This model holds up deliberation, communication, compromise, and consensus as the signs of a resilient democracy. While formal deliberation is the purview of politicians and the courts, it's worth noting that deliberative democracy doesn't mean inactivity from constituents. It's a philosophy we can use in our daily lives, from community memberships to interactions on social media.

"The idea is that people learn from one another," Mutz tells Big Think. "They learn arguments from the other side as well as learn more about the reasons behind their own views. [In turn], they develop a respect for the other side as well as moderate their own views."

Mutz's analysis leads her to support deliberation over activism in U.S. politics. She notes that the homogeneous networks required for activism can lead to positive changes — again, there are many historical examples to choose from. But such networks also risk developing intolerance and extremism within their ranks, examples of which are also readily available on both the right and left.

Meanwhile, the cross-cutting networks required for deliberative democracy offer a bounty of benefits, with the only risk being lowered levels of participation.

As Mutz writes: "Hearing the other side is also important for its indirect contributions to political tolerance. The capacity to see that there is more than one side to an issue, that political conflict is, in fact, a legitimate controversy with rationales on both sides, translates to greater willingness to extend civil liberties to even those groups whose political views one dislikes a great deal."

Of politics and summer camp

(Photo by Fox Photos/Getty Images) Take that! A boxing bout between two members of a schoolboys' summer camp at Pendine, South Wales, takes place in a field within a ring of cheering campmates.

Of course, listening openly and honestly to the other side doesn't come naturally. Red versus blue. Religious versus secular. Rural versus cosmopolitan. We divide ourselves into polarized groups that seek to silence cross-cutting communication in the pursuit of political victory.

"The separation of the country into two teams discourages compromise and encourages an escalation of conflict," Lilliana Mason, assistant professor of Government and Politics at the University of Maryland, writes in her book Uncivil Agreement: How Politics Became Our Identity. "The cooperation and compromise required by democracy grow less attainable as partisan isolation and conflict increase."

Mason likens the current situation to Muzafer Sherif's famous Robbers Cave Experiment.

In 1954, Sherif gathered a group of boys for a fun summer camp at Robbers Cave State Park, Oklahoma. At least, that was the pretense. In reality, Sherif and his counselors were conducting an experiment in intergroup conflict that would now be considered unethical.

The 20 boys were divided into two groups, the Rattlers and the Eagles. For a while, the counselors kept the groups separate, allowing the boys to bond only with their assigned teammates. Then the two groups were brought together to compete in a tournament. They played competitive games, such as baseball and tug-of-war, with the winning team promised the summer camp trophy.

Almost immediately, the boys identified members of the other team as intruders. As the tournament continued, the conflict escalated beyond sport. The Eagles burned a Rattlers flag. The Rattlers raided the Eagles' cabin. When asked to describe the other side, both groups showed in-group favoritism and out-group aggression.

Most troubling, the boys wholly assumed the identity of an Eagle or Rattler despite having never been either before that very summer.

"We, as modern Americans, probably like to think of ourselves as more sophisticated and tolerant than a group of fifth-grade boys from 1954. In many ways, of course, we are," Mason writes. "But the Rattlers and the Eagles have a lot more in common with today's Democrats and Republicans than we would like to believe."

Like at Robbers Cave, signs of incendiary conflict are easy to spot in U.S. politics today.

"Political Polarization in the American Public", Pew Research Center, Washington, D.C. (June 12, 2014)

A 2014 Pew survey found that the ideological overlap between Democrats and Republicans has shrunk considerably: more Republicans now lie to the right of the median Democrat than in the past, and vice versa. The survey also found that partisan animosity had more than doubled since 1994.

In her book, Mason points to research that shows an "increasing number of partisans don't want party leaders to compromise," blame "the other party for all incivility in government," and abhor the idea of dating someone from outside their ideological group.

And let's not forget Congress, which has grown increasingly divided along ideological lines over the past 60 years.

A dose of daily deliberation

Horace, Virgil and Varius at the house of Maecenas. Painting by Charles-François Jalabert, 1846. Beaux-Arts Museum, Nîmes, France. (Photo by Leemage/Corbis via Getty Images)

A zero-sum mindset may be inevitable in a summer camp tournament, but it's detrimental if taken into wider society and politics. Yet if participatory democracy leads to the silencing of oppositional voices, a zero-sum mindset is exactly what we get. Conversely, creating networks that tolerate and support differing opinions offers positive-sum benefits, like greater tolerance and a better understanding of complicated issues.

Mutz wrote her book in 2006, but as she told us in our interview, the intervening years have only strengthened her conviction that deliberation improves democratic health:

"Right now, I'm definitely on the side of greater deliberation rather than just do whatever we can to maximize levels of participation. You can have a coup and maximize levels of participation, but that wouldn't be a great thing to do. It wouldn't be a sign of health and that things were going well. Democracy [must be] able to absorb differences in opinion and funnel them into a means of governing that people were okay with, even when their side didn't win."

Unfortunately, elected officials and media personalities play up incivility and a sense of national crisis for attention and ratings. That certainly doesn't help promote deliberation, but as Mutz reminded us, people perceive political polarization to be much higher than it actually is. In our daily lives, deliberative democracy is more commonplace than we realize, and it's something we can promote in our communities and social groups.

Remember that 2014 Pew survey that found increased levels of partisan animosity? Its results showed the divide to be strongest among those most engaged and active in politics. The majority of those surveyed did not hold uniform left or right views, did not see the opposing party as an existential threat, and believed in the deliberative process in government. In other words, the extremes were pulling hard at the poles.

Then there's social media. The popular narrative is that social media is a morass of political hatred and clashing identities. But most social media posts have nothing to do with politics. An analysis of Facebook posts from September 2016, the middle of an election year, found the most popular topics centered on football, Halloween, Labor Day, country music, and slow cookers.

And what of political partisanship and prejudice? In an analysis of polarization and ideological identity, Mason found that labels like "liberal" and "conservative" had less to do with values and policy attitudes – as the majority of Americans agree on a substantial number of issues – and more to do with social group identification.

Yes, we all know those maps that media personalities dust off every election year, the ones that show the U.S. carved up into competing camps of red and blue. The reality is far more intricate and complex, and Americans' intolerance for the other side varies substantially from place to place and across demographics.

So while participation has its place, a healthy democracy requires deliberation, a recognition of the other side's point of view, and the willingness to compromise. Tolerance may not make for good TV or catchy political slogans, but it's something we all can foster in our own social groups.

High-fat diets change your brain, not just your body

Unhealthy diets cause the part of your brain responsible for appetite to become inflamed, encouraging further eating and obesity.

  • Anyone who has tried to change their diet can tell you it's not as simple as waking up one day and deciding to eat differently.
  • New research sheds light on a possible explanation: high-fat diets can cause inflammation in the hypothalamus, which regulates hunger.
  • Mice fed high-fat diets tended to eat more and become obese due to this inflammation.

Your wardrobe won't be the only thing a bad diet will change in your life — new research published in Cell Metabolism shows that high-fat and high-carbohydrate diets physically change your brain and, correspondingly, your behavior. Anyone who has tried to change their diet can tell you that it's far more challenging than simply deciding to change. That may be because of the impact high-fat diets have on the hypothalamus.

Yale researcher Sabrina Diano and colleagues fed mice a high-fat, high-carb diet and found that the animals' hypothalamuses quickly became inflamed. This small region of the brain releases hormones that regulate many autonomic processes, including hunger. It appears that high-fat, high-carb diets create a vicious cycle: the inflammation caused the mice to eat more and gain more weight.

"There are specific brain mechanisms that get activated when we expose ourselves to specific type of foods," said Diano in a Yale press release. "This is a mechanism that may be important from an evolutionary point of view. However, when food rich in fat and carbs is constantly available it is detrimental."

A burger and a side of fries for mice


The main driver of this inflammation appeared to be how the high-fat diet changed the mice's microglial cells. Microglia are one type of glial cell, the non-neuronal cells of the central nervous system (CNS) that play a supporting role in the brain by providing structure, supplying nutrients, insulating neurons, and destroying pathogens. Microglia in particular work as part of the CNS's immune system, seeking out and destroying foreign bodies as well as plaques and damaged neurons or synapses.

Within just three days of starting the high-fat diet, the mice's microglia activated, causing inflammation in the hypothalamus. As a result, the mice started to eat more and became obese. "We were intrigued by the fact that these are very fast changes that occur even before the body weight changes, and we wanted to understand the underlying cellular mechanism," said Diano.

In mice fed the high-fat diet, the researchers found that the mitochondria of the microglia had shrunk. They suspected that a specific protein called Uncoupling Protein 2 (UCP2) was the likely culprit, since it helps regulate the amount of energy microglia use and tends to be highly expressed in activated microglia.

To test whether UCP2 was behind the hypothalamus inflammation, the researchers deleted the gene responsible for producing that protein in a group of mice. Then, they fed those mice the same high-fat diet. This time, however, the mice's microglia did not activate. As a result, they ate significantly less food and did not become obese.

An out-of-date adaptation

When human beings did not have reliable access to food, this kind of behavioral change would have been beneficial. If an ancient human stumbled across a high-fat, calorically dense meal, it made sense for that individual to eat as much as they could, not knowing where their next meal would come from.

But there were no Burger Kings during the Pleistocene. Humanity has been extraordinarily successful in changing its environment, but our genome has yet to catch up. The wide availability of food, and especially high-fat foods, means that this adaptation is no longer a benefit for us.

If anything, research such as this underscores how difficult it is to truly change bad habits. A poor diet isn't a moral failing — it's a behavior our biology demands. Fortunately, the same big brains that gave us this abundance of food can also exert control over our behavior, even if those brains seem to be working against us.
