Does It Matter Who Wins in November?

Arguments on both sides of this question were aired at a thought-provoking colloquium sponsored by the Hannah Arendt Center at Bard College on September 21-22: “Does the President Matter? A Conference on the American Age of Disrepair.” Video recordings of the proceedings are available online.

Let’s begin with three reasons to be skeptical about the importance of the election’s outcome.

1. Mitt Romney and President Obama are uncannily similar in a host of ways.


Ordinarily, only people on the far right or far left see two relatively centrist candidates as identical. But consider: both Romney and Obama are politically pragmatic Harvard graduates, and neither has served in the military. Nor are they far apart on many issues. This may come as a surprise to anyone who watched the party conventions, where speakers drew bright, clear lines between their side and the Other, but the two men converge on an impressive number of issues. There is also reason to believe that candidate Romney’s rhetoric against Obamacare and in favor of radical Medicare reform will soften once he is in office, for both ideological and structural reasons.

2. Structural constraints on the presidency radically limit the ability of chief executives to act in the domestic arena. 

The founders sought to instill “energy” in the executive branch, but they were careful to establish limits on a president’s power as well. In Federalist 51, James Madison made the case for the separation and limitation of federal power:

Ambition must be made to counteract ambition. The interest of the man must be connected with the constitutional rights of the place. ... In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself. A dependence on the people is, no doubt, the primary control on the government; but experience has taught mankind the necessity of auxiliary precautions.

The Founders thought checks and balances were essential, but a frozen, dysfunctional government was not what they had in mind. They would be surprised by the level of acrimony defining the relationship between the legislative and executive branches today, and by the intransigence Congress displays in considering a president’s agenda. They’d be stunned by procedural rules the Senate has adopted for itself, provisions nowhere to be found in the Constitution, which make it impossible to pass legislation by a simple majority vote. Here is Jon Stewart’s incredulity at the Senate’s failure last week to pass a jobs bill for veterans despite 58 votes in favor:

Things are only getting worse, as Daniel Drezner wrote in the New York Times on Friday:

Congress has become increasingly sclerotic. During the 1950s, for example, Congress passed an average of 800 laws per session; in the post-cold-war era, that figure has declined to fewer than 400. Based on the 112th Congress, that figure will continue to decline in the future.

The party not in the White House has been increasingly obstructionist — and if you doubt this, look up the filibuster statistics. Any president trying to accomplish something with Congressional approval either needs a majority of the House and 60 votes in the Senate, or needs to compromise with an opposition party ever further away on the ideological spectrum. Short of a landslide, presidents have a brief honeymoon period in which to push major domestic policy initiatives through Congress.

3. Presidential campaigns do not attract candidates with true leadership skills.

Eric Liu, a domestic policy advisor under Bill Clinton, noted at the conference that presidents are “trailing indicators” rather than “leading indicators” of social mores. And Roger Berkowitz, the Center’s director, argues that recent presidents have not been effective leaders:

The last two presidents who successfully amassed large majorities to pass transformative legislation were Lyndon Johnson and Ronald Reagan. What unites Johnson and Reagan — different in temperament and politics — was an uncanny quality of leadership. They were able to bring opposing sides together to accomplish grand and important visions. It is just such political leadership that we desperately need and clearly lack today.

Berkowitz continues:

Politics, Hannah Arendt argued, requires courage. It demands a risky and rare willingness to experiment and seek to bring about new directions in the world. To act politically demands doing things that are spontaneous and new; politics requires actions that are surprising and thus attract attention and generate interest, drawing people together around a common idea...

Leaders are those who take risks and are willing to fail. To look at Mitt Romney and President Obama is to see what happens when leaders are afraid to lose. We must now confront the fact that the need to raise money and the rise of consultants and the dominance of public relations have sapped politics of the spontaneity, thoughtfulness, and fun that can and should be at the center of political action.

I'm not sure thoughtfulness has evaporated as completely as Berkowitz argues, and the conventions did give the party faithful a chance for some spectacle and a tiny bit of spontaneity (witness Clint Eastwood's inane, unscripted performance). But Berkowitz is right that courage and risk-taking, along with genuine creative thinking about solutions to national challenges, seem to be in short supply in recent election campaigns.

Given all this, does it matter who wins the election this fall? In my view, absolutely.

As hampered as presidents are in the domestic arena, they can still get some things done. The limited recovery from the economic crisis that the stimulus produced might have been far more halting, or we might still be in recession, had John McCain won the election in 2008. General Motors might be history. There would certainly be no Affordable Care Act providing health insurance coverage for millions of Americans.

And from a broader perspective, the identity of the president matters. For Leon Botstein, president of Bard, the shadow of race hangs heavily, if imperceptibly, over this election. If Obama were white, Botstein said on Friday, he’d be crushing Romney in the polls. Anne Norton, a political theorist at the University of Pennsylvania, argued that Barack Obama’s presidency was significant from a symbolic perspective. She notices less discomfort and more civility in her racially integrated Philadelphia neighborhood than there was before 2009. “The United States is still a nation of white supremacy,” she said, “but now it’s a nation of white supremacy with a black president.”

Expanding on this point, Eric Liu argued that a president matters crucially on a symbolic level. For Liu, the president “sets a frame of narrative for the rest of the country,” not only in terms of legislative proposals but on questions of national identity and the polity’s challenges and aspirations. Think of President Kennedy's goal of putting a man on the moon, President Reagan’s “it’s morning in America,” President Clinton’s pledge to “end welfare as we know it” or President Obama’s frank address on race and politics late in his first campaign. In the age of electronic media in which “all kinds of memes can spread instantly,” Liu said, “the president has an unparalleled ability to set off cascades of memes at will.” Even when he faces a hostile Congress or low approval ratings, the president “embodies our sense of unity,” appealing to the people’s “tribal need” to belong.

It matters a great deal who wins the election this fall, but no president has the capacity to heal our political disrepair all by himself.

Follow Steven Mazie on Twitter: @stevenmazie


Are we really addicted to technology?

Fear that new technologies are addictive isn't a modern phenomenon.


This article was originally published on our sister site, Freethink, which has partnered with the Build for Tomorrow podcast to go inside new episodes each month.

In many ways, technology has made our lives better. Through smartphones, apps, and social media platforms we can now work more efficiently and connect in ways that would have been unimaginable just decades ago.

But as we've grown to rely on technology for many of our professional and personal needs, many of us are asking tough questions about the role it plays in our lives. Are we becoming so dependent on technology that it's actually harming us?

In the latest episode of Build for Tomorrow, host and Entrepreneur Editor-in-Chief Jason Feifer takes on the thorny question: is technology addictive?

Popularizing medical language

What makes something addictive rather than just engaging? It's a meaningful distinction because if technology is addictive, the next question could be: are the creators of popular digital technologies, like smartphones and social media apps, intentionally creating things that are addictive? If so, should they be held responsible?

To answer those questions, we've first got to agree on a definition of "addiction." As it turns out, that's not quite as easy as it sounds.

"Over the past few decades, a lot of effort has gone into destigmatizing conversations about mental health, which of course is a very good thing," Feifer explains. It also means that medical language has entered our vernacular — we're now more comfortable using clinical words outside of a specific diagnosis.

"We've all got that one friend who says, 'Oh, I'm a little bit OCD' or that friend who says, 'Oh, this is my big PTSD moment,'" Liam Satchell, a lecturer in psychology at the University of Winchester and guest on the podcast, says. He's concerned about how the word "addiction" gets tossed around by people with no background in mental health. An increased concern surrounding "tech addiction" isn't actually being driven by concern among psychiatric professionals, he says.

"These sorts of concerns about things like internet use or social media use haven't come from the psychiatric community as much," Satchell says. "They've come from people who are interested in technology first."

The casual use of medical language can lead to confusion about what is actually a mental health concern. We need a reliable standard for recognizing, discussing, and ultimately treating psychological conditions.

"If we don't have a good definition of what we're talking about, then we can't properly help people," Satchell says. That's why, according to Satchell, any definition of addiction we use needs to include the psychiatric criteria: experiencing distress or significant family, social, or occupational disruption.

Too much reading causes... heat rashes?

But as Feifer points out in his podcast, both popularizing medical language and the fear that new technologies are addictive aren't totally modern phenomena.

Take, for instance, the concept of "reading mania."

In the 18th century, an author named J. G. Heinzmann claimed that people who read too many novels could experience something called "reading mania." This condition, Heinzmann explained, could cause many symptoms, including: "weakening of the eyes, heat rashes, gout, arthritis, hemorrhoids, asthma, apoplexy, pulmonary disease, indigestion, blocking of the bowels, nervous disorder, migraines, epilepsy, hypochondria, and melancholy."

"That is all very specific! But really, even the term 'reading mania' is medical," Feifer says.

"Manic episodes are not a joke, folks. But this didn't stop people a century later from applying the same term to wristwatches."

Indeed, an 1889 piece in the Newcastle Weekly Courant declared: "The watch mania, as it is called, is certainly excessive; indeed it becomes rabid."

Similar concerns have echoed throughout history about the radio, telephone, TV, and video games.

"It may sound comical in our modern context, but back then, when those new technologies were the latest distraction, they were probably really engaging. People spent too much time doing them," Feifer says. "And what can we say about that now, having seen it play out over and over and over again? We can say it's common. It's a common behavior. Doesn't mean it's the healthiest one. It's just not a medical problem."

Few today would argue that novels are in and of themselves addictive — regardless of how voraciously you may have consumed your last favorite novel. So, what happened? Were these things ever addictive — and if not, what was happening in these moments of concern?

There's a risk of pathologizing normal behavior, says Joel Billieux, professor of clinical psychology and psychological assessment at the University of Lausanne in Switzerland, and guest on the podcast. He's on a mission to understand how we can suss out what is truly addictive behavior versus what is normal behavior that we're calling addictive.

For Billieux and other professionals, this isn't just a rhetorical game. He uses the example of gaming addiction, which has come under increased scrutiny over the past half-decade. The language used around the subject of gaming addiction will determine how behaviors of potential patients are analyzed — and ultimately what treatment is recommended.

"For a lot of people, you can realize that the gaming is actually a coping mechanism for social anxiety or trauma or depression," says Billieux.

"In those cases, of course, you will not necessarily target gaming per se. You will target what caused the depression. And then, as a result, if you succeed, gaming will diminish."

In some instances, a person might legitimately be addicted to gaming or technology, and require the corresponding treatment — but that treatment might be the wrong answer for another person.

"None of this is to discount that for some people, technology is a factor in a mental health problem," says Feifer.

"I am also not discounting that individual people can use technology such as smartphones or social media to a degree where it has a genuine negative impact on their lives. But the point here to understand is that people are complicated, our relationship with new technology is complicated, and addiction is complicated — and our efforts to simplify very complex things, and make generalizations across broad portions of the population, can lead to real harm."

Behavioral addiction is a notoriously complex thing for professionals to diagnose — even more so since the latest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the book professionals use to classify mental disorders, introduced a new idea about addiction in 2013.

"The DSM-5 grouped substance addiction with gambling addiction — this is the first time that substance addiction was directly categorized with any kind of behavioral addiction," Feifer says.

"And then, the DSM-5 went a tiny bit further — and proposed that other potentially addictive behaviors require further study."

This might not sound like that big of a deal to laypeople, but its effect was massive in medicine.

"Researchers started launching studies — not to see if a behavior like social media use can be addictive, but rather, to start with the assumption that social media use is addictive, and then to see how many people have the addiction," says Feifer.

Learned helplessness

The assumption that a lot of us are addicted to technology may itself be harming us by undermining our autonomy and belief that we have agency to create change in our own lives. That's what Nir Eyal, author of the books Hooked and Indistractable, calls 'learned helplessness.'

"The price of living in a world with so many good things in it is that sometimes we have to learn these new skills, these new behaviors to moderate our use," Eyal says. "One surefire way to not do anything is to believe you are powerless. That's what learned helplessness is all about."

So if it's not an addiction that most of us are experiencing when we check our phones 90 times a day or wonder what our followers are saying on Twitter — then what is it?

"A choice, a willful choice, and perhaps some people would not agree or would criticize your choices. But I think we cannot consider that as something that is pathological in the clinical sense," says Billieux.

Of course, for some people technology can be addictive.

"If something is genuinely interfering with your social or occupational life, and you have no ability to control it, then please seek help," says Feifer.

But for the vast majority of people, thinking about our use of technology as a choice — albeit not always a healthy one — can be the first step to overcoming unwanted habits.

For more, be sure to check out the Build for Tomorrow episode here.
