The Teaching Method of Respect for Texts

Someone might say—and libertarian skeptics often do—that classes in philosophy and literature are given quite an arbitrarily inflated value by according them credit. Do away with the credit system and give degrees based on real demonstration of measurable competencies valuable in the 21st-century marketplace, and you'll find out what studying Plato’s Republic is really worth. I admit that's a humbling thought, one that I'm sure my college's administrators would like me to have at least once in a while. And I've heard that our professors of finance and accounting and computer science (and even the political SCIENTIST) think I should be having it a lot more often than that.


Now to be fair to libertarian techno-skeptics, they almost all believe (and many have discovered for themselves) that it's really worthwhile to read the Republic or Shakespeare. It's just that you can do that on your own time and for free.

Well, I agree you should do that on your own time and for free. But it's pretty hard—if not quite impossible—to know why you should spend your precious time that way without a good teacher. It turns out that openness to books—and so openness to the truth—usually depends on trusting a personal authority to some extent. That person—your teacher—has to earn your trust. And in some ways that's harder than ever.

Alexis de Tocqueville explains pretty well why Americans have an "issue" with trusting personal authority—especially personal intellectual authority. The Americans, he says, are Cartesians who've never read a word of Descartes. They hit upon the "Cartesian method" because it's identical to the "democratic method." If you want to reduce that method to one word, it would be "doubt." That means being really, really skeptical of the words of other persons. If I trust you, then I let you rule me. And ruling myself is what democracy means for me. I have to think for myself.

Thinking for myself, so understood, is a mixture of anti-authoritarian paranoia and a kind of unwarranted or excessive self-confidence in one's own "critical thinking skills." We techno-Americans tend to believe there’s a method for everything, even or especially, as Descartes said, thinking. But the question remains: what should I think about? Surely I’m stuck with thinking about who I am and what I’m supposed to do. And just as surely it’s asking too much for me to answer that “who” and that "what" question all by myself.  Even God himself didn’t create himself out of nothing.

Now there's something admirable and something ridiculous in this "hermeneutic of suspicion." It has a noble Protestant origin, after all, in the determination to trust no one but myself—and not those Satanic deceivers who call themselves priests—in interpreting the word of God. But even choosing to read the Bible depends on taking someone's word that the Bible is God's word. And to submit even to the word of God, after all, is undemocratic. Christianity might teach we're all equal under God, but we won't really be free until we free ourselves from being under God's personal thumb.

So at a certain point democrats dispense with the Bible and other people and try to find God in themselves. But the religion of me—all alone—turns out to be pretty empty, and certainly not the foundation of much "critical thinking"—toughly judgmental thinking—about who I am and what I'm supposed to do. It's this personal emptiness, Tocqueville explains, that causes democratic religion to morph into pantheism—or the denial of real personal identity.

Obviously we'd know more about ourselves if we read the Bible as if it might be true—or not from the point of view of detached tourists who believe that what this or that "culture" once believed has nothing to do with us these days—or, more precisely, with me these days. And obviously Americans would know a lot more about what genuinely critical thinking requires if they read Descartes. But to privilege his book on method over others requires submitting to the personal authority of those who have read and recommend it. We democrats really see the despotic danger of such submission. We’ve all read, for example, that Leo Strauss got his “neocon” students to read Plato to impose his own personal agenda on them.

It's easy to respond that to not read the Bible or Descartes is to be even more thoroughly or thoughtlessly dominated by those books. The personal egalitarianism that drives most moral thinking today is full of Biblical premises, and to think with those premises with no awareness of their foundation is, obviously, not really to think for yourself.  We defenders of “human rights” assert that every human person is unique and irreplaceable.  But we have no idea why. Certainly most of modern science is incapable of even beginning to explain why.

The same goes, of course, for Descartes' audacious choice of the modern technological project. Every transhumanist is a Cartesian, whether he knows it or not. If you actually become liberally educated, you can actually start to make connections between the Bible and Descartes. Then you will actually start to think clearly about how techno-liberation depends on the Bible's view of the person even as it rejects the Bible's personal and relational God. A critical thinker full of theological and philosophical content might exclaim: "How reasonable is that!?"

The big point here is that the excessively resolute determination to doubt personal authority doesn't really lead to freeing oneself altogether from authority. What rushes in, in the absence of personal authority or relational personal identity, is impersonal authority. It's too hard—too dizzying and disorienting—to think all by yourself. Because you don't know who you are, you really don't know what to do. So what fills the void and makes action possible, Tocqueville observes, is usually either public opinion—or trendy opinion—or the impersonal expertise of science.

When we defer to public opinion, we, in fact, become relativists. We say there’s no standard—when it comes to truth, beauty, justice, and so forth—higher than what sophisticated public intellectuals assert these days. When we defer to experts and what their “studies” or “data” show, we find ourselves in the thrall of scientism. We too easily accept neuroscience or evolutionary psychology or rational-choice theory as explanations for everything, as the definitive sources of knowledge of who we are and what we’re supposed to do. It’s not denying the truth and utility of science to be aware that scientism is the ideology that results when popularizing scientists speculate authoritatively beyond the limits of what they can really know through their methods.

Both deferring to public opinion and deferring to “popular science” are ways of denying what you really can see with your own eyes about who you are and what you’re supposed to do. It’s, as the philosopher Heidegger and the novelist Walker Percy observe, surrendering oneself to what “they”—to what no one in particular—say.

Public opinion, let me emphasize, doesn't only mean what the majority thinks. It means the opinion of your public. As Rousseau incisively pointed out, sophisticated intellectuals in democratic times flee from "the vulgar" by being witty and fashionable—and so by not being critical of what the witty and fashionable believe at any particular time. We can see now, for example, that neuroscience as a comprehensive explanatory system—one that incorporates neurotheology and the neuro-humanities—peaked as scientism around 2008. But progress in the real science of neuroscience continues, even if it's given less attention by the witty and fashionable.

Insofar as scientism affects the teaching of the humanities, public intellectuals, and even our art and literature, we find, as our theologian-novelist Marilynne Robinson observes, "persons understood as having radically limited self-awareness, a minimum of meaningful inwardness, very little ability to choose or appraise their actions." What we find, in other words, is the denial of real personal identity. Personal responsibility, if you think about it, depends on personal authority.

In Robinson's words: "The flourishing of these [impersonal] ideas, of neo-Darwinism in general, would not be possible except in the absence of the vigorous and critical study of the humanities. Its 'proofs' are nothing except the failure of education, in the schools but also in the churches." Neurotheology makes sense only to someone who hasn't really attended to what the Bible, St. Augustine, Calvin (especially in Robinson's case), Shakespeare, Melville, and so forth really say about who we are. So the return to "real books" is, for us, the only way to be saved from being deformed by degrading nonsense promulgated by experts, not to mention by the despotism of fashion.

Studies show, E.D. Hirsch (the cultural literacy guy) has reported, that the key to flourishing in the world as someone with a strong personal identity, in touch with the world as it actually is and ready to take responsibility for himself and others, is to have a huge vocabulary and an exact and imaginative understanding of what those words mean. The only way to achieve this competency reliably in primary and secondary school is through reading lots of "real books." And another study shows that a very reliable clue to how a child will fare in school and life is the number of bookshelves his parents have at home or, in other words, whether his parents have raised him in a seriously "bookish" environment, in a home where reading is privileged as a form of civilized enjoyment.

What "teaching method" works best to inculcate that respect? That's a topic for another time. But here's a beginning: There should be nothing in the classroom except a professor, students, and a great or at least really good book (a Supreme Court opinion or a classic political speech count as a really good book). No PowerPoint, no laptops, no smart phones, and so forth. And the professor should be calling constant attention to the text, reading aloud and dramatically, from time to time. At least the occasional class should be devoted to a single page or even a single paragraph, just to make clear how much there is for us to know.

Freud is renowned, but his ideas are ill-substantiated

The Oedipal complex, repressed memories, penis envy? Sigmund Freud's ideas are far-reaching, but few have withstood the onslaught of empirical evidence.

  • Sigmund Freud stands alongside Charles Darwin and Albert Einstein as one of history's best-known scientists.
  • Despite Freud's claim of creating a new science, his psychoanalysis is unfalsifiable and based on scant empirical evidence.
  • Studies continue to show that Freud's ideas are unfounded, and Freud has come under scrutiny for fabricating his most famous case studies.

Few thinkers are as celebrated as Sigmund Freud, a figure as well-known as Charles Darwin and Albert Einstein. A neurologist and the founder of psychoanalysis, Freud didn't simply shift the paradigms in academia and psychotherapy; his ideas indelibly disseminated into our cultural consciousness. Ideas like transference, repression, the unconscious iceberg, and the superego are ubiquitous in today's popular discourse.

Despite this renown, Freud's ideas have proven to be ill-substantiated. Worse, it is now believed that Freud himself may have fabricated many of his results, opportunistically disregarding evidence with the conscious aim of promoting preferred beliefs.

"[Freud] really didn't test his ideas," Harold Takooshian, professor of psychology at Fordham University, told ATI. "He was just very persuasive. He said things no one said before, and said them in such a way that people actually moved from their homes to Vienna and study with him."

Unlike the work of Darwin and Einstein, Freud's brand of psychology gives the impression of a scientific endeavor but ultimately lacks two vital scientific components: falsifiability and empirical evidence.

Psychoanalysis

Freud's therapeutic approach may be unfounded, but at least it was more humane than other therapies of the day. In 1903, this patient was being treated in an "auto-conduction cage" as part of his electrotherapy. (Photo: Wikimedia Commons)

The discipline of psychotherapy is arguably Freud's greatest contribution to psychology. In the post-World War II era, psychoanalysis spread through Western academia, influencing not only psychotherapy but even fields such as literary criticism in profound ways.

The aim of psychoanalysis is to treat mental disorders housed in the patient's psyche. Proponents believe that such conflicts arise between conscious thoughts and unconscious drives and manifest as dreams, blunders, anxiety, depression, or neurosis. To help, therapists attempt to unearth unconscious desires that have been blocked by the mind's defense mechanisms. By raising repressed emotions and memories to the conscious fore, the therapist can liberate the patient and help them heal.

That's the idea at least, but the psychoanalytic technique stands on shaky empirical ground. Data leans heavily on a therapist's arbitrary interpretations, offering no safeguards against presuppositions and implicit biases. And the free-association method offers no buttress to the idea of unconscious motivation.

Don't get us wrong. Patients have improved and even claimed to be cured thanks to psychoanalytic therapy. However, the lack of methodological rigor means the division between effective treatment and placebo effect is ill-defined.

Repressed memories

Sigmund Freud, circa 1921. (Photo: Wikimedia Commons)

Nor has Freud's concept of repressed memories held up. Many papers and articles have been written to dispel the confusion surrounding repressed (aka dissociated) memories. Their arguments center on two facts of the mind neurologists have become better acquainted with since Freud's day.

First, our memories are malleable, not perfect recordings of events stored on a biological hard drive. People forget things. Childhood memories fade or are revised to suit a preferred narrative. We recall blurry gists rather than clean, sharp images. Physical changes to the brain can result in loss of memory. These realities of our mental slipperiness can easily be misinterpreted under Freud's model as repression of trauma.

Second, people who face trauma and abuse often remember it. The release of stress hormones imprints the experience, strengthening neural connections and rendering it difficult to forget. It's one of the reasons victims continue to suffer long after. As the American Psychological Association points out, there is "little or no empirical support" for dissociated memory theory, and potential occurrences are a rarity, not the norm.

More worryingly, there is evidence that people are vulnerable to constructing false memories (aka pseudomemories). A 1996 study found that suggestion could lead one-fifth of participants to believe in a fictitious childhood memory in which they were lost in a mall. And a 2007 study found that a therapy-based recollection of childhood abuse "was less likely to be corroborated by other evidence than when the memories came without help."

This has led many to wonder if the expectations of psychoanalytic therapy may inadvertently become a self-fulfilling prophecy with some patients.

"The use of various dubious techniques by therapists and counselors aimed at recovering allegedly repressed memories of [trauma] can often produce detailed and horrific false memories," writes Chris French, a professor of psychology at Goldsmiths, University of London. "In fact, there is a consensus among scientists studying memory that traumatic events are more likely to be remembered than forgotten, often leading to post-traumatic stress disorder."

The Oedipal complex

The Blind Oedipus Commending His Children to the Gods by Benigne Gagneraux. (Photo: Wikimedia Commons)

During the phallic stage, children develop fierce erotic feelings for their opposite-sex parent. This desire, in turn, leads them to hate their same-sex parent. Boys wish to replace their father and possess their mother; girls become jealous of their mothers and desire their fathers. Since they can do neither, they repress those feelings for fear of reprisal. If unresolved, the complex can result in neurosis later in life.

That's the Oedipal complex in a nutshell. You'd think such a counterintuitive theory would require strong evidence to back it up, but that isn't the case.

Studies claiming to prove the Oedipal complex look to positive sexual imprinting — that is, the phenomenon in which people choose partners with physical characteristics matching their same-sex parent. For example, a man's wife and mother have the same eye color, or a woman's husband and father sport a similar nose.

But such studies don't often show strong correlation. One study reporting "a correlation of 92.8 percent between the relative jaw width of a man's mother and that of [his] mates" had to be retracted for factual errors and incorrect analysis. Studies showing causation seem absent from the literature, and as we'll see, the veracity of Freud's own case studies supporting the complex is openly questioned today.

Better supported, yet still hypothetical, is the Westermarck effect. Also called reverse sexual imprinting, the effect predicts that people develop a sexual aversion to those they grow up in close proximity with, as a means to avoid inbreeding. The effect isn't just seen between parents and siblings; even step-siblings will grow sexually averse to each other if they grow up together from early childhood.

An analysis published in Behavioral Ecology and Sociobiology evaluated the literature on human mate choice. The analysis found little evidence for positive imprinting, citing study design flaws and an unwillingness of researchers to seek alternative explanations. In contrast, it found better support for negative sexual imprinting, though it did note the need for further research.

The Freudian slip

Mark notices Deborah enter the office whistling an upbeat tune. He turns to his coworker to say, "Deborah's pretty cheery this morning," but accidentally blunders, "Deborah's pretty cherry this morning." Simple slip up? Not according to Freud, who would label this a parapraxis. Today, it's colloquially known as a "Freudian slip."

"Almost invariably I discover a disturbing influence from something outside of the intended speech," Freud wrote in The Psychopathology of Everyday Life. "The disturbing element is a single unconscious thought, which comes to light through the special blunder."

In the Freudian view, Mark's mistaken word choice resulted from his unconscious desire for Deborah, as evidenced by the sexually charged meanings of the word "cherry." But Rob Hartsuiker, a psycholinguist from Ghent University, says that such inferences miss the mark by ignoring how our brains process language.

According to Hartsuiker, our brains organize words by similarity and meaning. First, we must select the word in that network and then process the word's sounds. In this interplay, all sorts of conditions can prevent us from grasping the proper phonemes: inattention, sleepiness, recent activation, and even age. In a study co-authored by Hartsuiker, brain scans showed our minds can recognize and correct for taboo utterances internally.
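For readers who want that mechanism made concrete, here is a minimal toy sketch in Python of the two-stage picture: a word is first selected by meaning from a small lexicon, and its sounds are then assembled under a bit of noise. This is not Hartsuiker's actual model; the mini-lexicon, the phoneme-overlap measure, and the noise parameter are all invented for illustration. The point is only that when selection and sound assembly are separate, slightly noisy steps, a sound-alike neighbor such as "cherry" can occasionally displace "cheery" without any unconscious desire doing the work.

```python
# Toy sketch only (not Hartsuiker's actual model): a two-stage picture of
# speech production in which a word is first selected by meaning and its
# sounds are then assembled. A little noise at the second stage is enough
# for a sound-alike neighbor ("cherry" for "cheery") to slip out.
import random

# Hypothetical mini-lexicon: a crude meaning tag plus a rough phoneme list.
LEXICON = {
    "cheery": {"meaning": "happy", "phonemes": ["ch", "ee", "r", "ee"]},
    "cherry": {"meaning": "fruit", "phonemes": ["ch", "e", "r", "ee"]},
    "upbeat": {"meaning": "happy", "phonemes": ["u", "p", "b", "ee", "t"]},
}

def phoneme_overlap(a, b):
    """Very rough phonological similarity: fraction of shared phonemes."""
    pa, pb = set(LEXICON[a]["phonemes"]), set(LEXICON[b]["phonemes"])
    return len(pa & pb) / max(len(pa), len(pb))

def produce(intended, noise=0.3, rng=random):
    """Stage 1: the intended word is selected by meaning.
    Stage 2: its sound form is assembled; with probability proportional to
    noise times phonological similarity, a sound-alike neighbor wins instead."""
    for candidate in LEXICON:
        if candidate == intended:
            continue
        similarity = phoneme_overlap(intended, candidate)
        if similarity >= 0.5 and rng.random() < noise * similarity:
            return candidate  # a slip: the sound-alike word comes out
    return intended

if __name__ == "__main__":
    random.seed(1)  # fixed seed so the toy run is repeatable
    outputs = [produce("cheery") for _ in range(20)]
    slips = [w for w in outputs if w != "cheery"]
    print(f"{len(slips)} slips in {len(outputs)} attempts:", outputs)
```

Turn up the noise or enlarge the lexicon and the slips get more frequent, which is the article's point in miniature: ordinary properties of word selection, not hidden desires, are enough to produce a "Freudian" slip.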

"This is very typical, and it's also something Freud rather ignored," Hartsuiker told BBC. He added that evidence for true Freudian slips is scant.

Freud's case studies

Sergej Pankejeff, known as the "Wolf Man" in Freud's case study, claimed that Freud's analysis of his condition was "propaganda."

It's worth noting that there is much debate as to the extent that Freud falsified his own case studies. One famous example is the case of the "Wolf Man," real name Sergej Pankejeff. During their sessions, Pankejeff told Freud about a dream in which he was lying in bed and saw white wolves through an open window. Freud interpreted the dream as the manifestation of a repressed trauma. Specifically, he claimed that Pankejeff must have witnessed his parents in coitus.

For Freud this was case closed. He claimed Pankejeff was successfully cured and held up his case as evidence of psychoanalysis's merit. Pankejeff disagreed. He found Freud's interpretation implausible and said that Freud's handling of his story was "propaganda." He remained in therapy on and off for over 60 years.

Many of Freud's other case studies, such as the "Dora" and "Rat Man" cases, have come under similar scrutiny.

Sigmund Freud and his legacy

Freud's ideas may not stand up to scientific scrutiny, but their long shelf-life in film, literature, and criticism has created some fun readings of popular stories. Sometimes a face is just a face, but that face is a murderous phallic symbol. (Photo: Flickr)

Of course, there are many ideas we've left out. Homosexuality originating from arrested sexual development in anal phase? No way. Freudian psychosexual development theory? Unfalsifiable. Women's penis envy? Unfounded and insulting. Men's castration anxiety? Not in the way Freud meant it.

If Freud's legacy is so ill-informed, so unfounded, how did he and his cigars cast such a long shadow over the 20th century? Because there was nothing better to offer at the time.

When Freud came onto the scene, neurology was engaged in a giddy free-for-all. As New Yorker writer Louis Menand points out, the era's treatments included hypnosis, cocaine, hydrotherapy, female castration, and institutionalization. By contemporary standards, it was a horror show (as evidenced by these "treatments" featuring so prominently in our horror movies).

Psychoanalysis offered a comparably clement and humane alternative. "Freud's theories were like a flashlight in a candle factory," anthropologist Tanya Luhrmann told Menand.

But Freud and his advocates trumpeted his techniques as a science, and this is wrong. The empirical evidence for his ideas is limited and arbitrary, and his conclusions are unfalsifiable. The theory that explains every possible outcome explains none of them.

With that said, one might consider Freud's ideas to be a proto-science. As astrology heralded astronomy, and alchemy preceded chemistry, so too did Freud's psychoanalysis popularize psychology, paving the way for its more rapid development as a scientific discipline. But, like astrology and alchemy, Freud's ideas should be recognized as the historic artifacts they are.
