The Role of the Novelist: How Jonathan Franzen Won the Book Publicity Game
I can still vividly remember reading, back in 2001, the New York Times Magazine write-up on the release of The Corrections. It began:
Some days, Jonathan Franzen wrote in the dark. He did so in a spartan studio on 125th Street in East Harlem, behind soundproof walls and a window of double-paned glass. The blinds were drawn. The lights were off. And Franzen, hunched over his keyboard in a scavenged swivel chair held together with duct tape, wore earplugs, earmuffs and a blindfold. “You can always find the ‘home’ keys on your computer,” he says in an embarrassed whisper, explaining how he managed to type under such constraints. “They have little raised bumps.”
What could drive a man to such madness? Later in the piece, I learned:
“I’m very concerned with providing a maximally enthralling experience,” Franzen says of his work. “Another 20 years of boring literary novels, and the thing’s dead.”
Even then, this struck me as a wonderful piece of theater. Imagine persuading the Times that you’ve personally saved the novel—blindfolded!
I don’t mean to suggest that Franzen was fibbing about his work habits. Actually, I’d be shocked if the anecdote weren’t true. What I’m suggesting is that it’s possible to inhabit a role so fully that you disappear into it; that’s as true of authors and corporate executives as it is of actors. Yeats thought Oscar Wilde was forever playing the part of Oscar Wilde, and in fact we remember Wilde’s persona better than his brilliant, but spotty, literary output. More to the point, we still view his persona as authentic.
Something similar has happened with Franzen. He is now the literary celebrity of our age: a presidential favorite, a “Great American Novelist” according to the cover of Time magazine. Increasingly he’s called upon by journalists to speak for book culture as a whole. Yet he may be best known for expressing ambivalence about publicity—at least, a certain kind of publicity—in advance of his first appearance on The Oprah Winfrey Show. He told NPR at the time:
So much of reading is sustained in this country, I think, by the fact that women read while men are off golfing or watching football on TV or playing with their flight simulator...I continue to believe that, and now, I'm actually at the point with this book that I worry...I had some hope of actually reaching a male audience, and I've heard more than one reader in signing lines now in book stores that said, "If I hadn't heard you, I would have been put off by the fact that it is an Oprah pick. I figure those books are for women and I would never touch it." Those are male readers speaking. So, I'm a little confused about the whole thing now.
Was this snobbery? Calculated provocation? Genuine distaste for the way books are marketed? I’d guess primarily the third, but it hardly mattered: the cancellation of his appearance earned him more notoriety than an enthusiastic acceptance ever would have. It didn’t even cost him a chat with Oprah; he ended up appearing on the show in 2010.
It is this tortured relationship with fame that has made Jonathan Franzen so famous. He has won the book publicity game because part of him—but only part—despises it. So great is his anxiety about the role of the novelist in our culture that it has become integral to his literary persona. And as happened with Wilde, Norman Mailer, even Hemingway, his persona now threatens to overshadow his work.
Understanding this persona requires looking back, as the Times Magazine article did, to a well-known essay he published in Harper’s in 1996. In some ways this is the best thing Franzen’s ever written; certainly it’s the most central to his project. It begins a bit preciously, with the title “Perchance to Dream” and the following sentence:
My despair about the American novel began in the winter of 1991, when I fled to Yaddo, the artists’ colony in upstate New York, to write the last two chapters of my second book.
By the end, however, it has transformed into a probing meditation on the connection between art and society, between the American novelist and America itself. Observing that “just as the camera drove a stake through the heart of serious portraiture and landscape painting, television has killed the novel of social reportage,” Franzen confronts the hopelessness of "bringing the news" to his culture. After much soul-searching, he concludes—buoyantly and accurately—that this loss is no loss at all. Great novelists hone their insights into society by observing human character, not cultural ephemera:
I'm amazed, now, that I'd trusted myself so little for so long…as if, in peopling and arranging my own little alternate world, I could ignore the bigger social picture even if I wanted to.
Every literary critic who is also a literary author writes to some degree with the goal of clearing ground for his own work. Franzen is no exception, but his Harper’s essay doesn’t so much shill for his aesthetic as dissect it. He ventures a thoughtful self-critique—or self-correction—framed by a personal narrative of depression and recovery.
Yet the essay tells another story, too. Even as Franzen was pondering his aesthetic, he was also thinking through his book publicity strategy:
The writer for whom nothing matters but the printed word is, ipso facto, an untelevisable personality, and it's instructive to recall how many of our critically esteemed older novelists have chosen, in a country where publicity is otherwise sought like the Grail, to guard their privacy. Roth, McCarthy, Don DeLillo, William Gaddis, Anne Tyler, J. D. Salinger, Thomas Pynchon, Cynthia Ozick, and Denis Johnson all give few or no interviews, do little if any teaching or touring, and in some cases decline even to be photographed…for some of these writers, reticence is integral to their artistic creed.
…In 1955, before television had even supplanted radio as the regnant medium, Gaddis recognized that no matter how attractively subversive self-promotion may seem in the short run, the artist who's really serious about resisting a culture of inauthentic mass-marketed image must resist becoming an image himself, even at the price of certain obscurity.
For a long time, trying to follow Gaddis's example, I took a hard line on letting my work speak for itself. I refused to teach, to review for the Times, to write about writing, to go to pub-industry parties. To speak extranovelistically in an age of personalities seemed to me a betrayal….I had a cosmology of silent heroes and gregarious traitors.
Silence, however, is a useful statement only if someone, somewhere, expects your voice to be loud. Silence in the Nineties seemed only to guarantee that I would be alone. And eventually it dawned on me that the despair I felt about the novel was less the result of my obsolescence than of my isolation.
A quiet humor runs through this passage, and it’s not clear that the author is in on the joke. Franzen claims that his initial posture—the august silence of a Pynchon or DeLillo—hindered his loftiest goals as an artist: engaging with the culture, making a “useful statement.” An alternative reading would be that he found this posture less marketable than he’d hoped, and decided he’d better try another one. “Certain obscurity” is a steep price to pay over the long haul; besides, self-advertisement had worked for Mailer…
And so with a new book in the works, Franzen took to the pages of Harper’s, opining on the talent in the room, the condition of the novel, the condition of his novel. Nor did he court success through public gestures alone; his private life, too, he shaped ergonomically to the purpose. The very furniture of that tiny Harlem studio—the drawn blinds, the duct-taped chair—set the stage for his coup. Ears plugged, eyes covered, he willed himself to become the figure he’d dreamed of being, a figure he feared might go extinct: the celebrated novelist.
As for the novel he created, it didn’t live up to its grandiose billing, but it hardly flopped. I consider James Wood’s review for The New Republic definitive and won’t try to top it; suffice to say, The Corrections contains passages of great depth and passages (for example, all the sex scenes) that set my teeth on edge. At any rate, it was good enough: Franzen had called his shot and hit, if not a home run, then at least a solid double. He was tapped for the National Book Award and, of course, Oprah's Book Club. As a literary name and a popular favorite, he had arrived.
At the same time he had become, as a social reporter, out of date. In a cruel irony, 9/11 struck a week after the publication of The Corrections, turning the late-'90s culture he had carefully field-researched into an instant fossil. True, some of the book’s topical concerns survived, but the whole scale of its landscape had changed. Wood’s essay, published just six weeks later, pointed out passages that already looked obsolete. More broadly, it traced The Corrections’ internal divide between breezy journalism and authentic character study, suggesting that the book’s fate would depend—sooner than expected—on its commitment to the latter.
Franzen may have disagreed: in fact he’s never quite taken the medicine he prescribed in his Harper’s essay. His fiction remains chock full of social reportage, sometimes to the point of congestion. I have yet to read Freedom, but the excerpts quoted in reviews were not encouraging in this regard. Part of Franzen clearly likes playing social reporter; and not just reporter, but critic; and not just in his fictional worlds, but in the real one. During the past year Franzen has weighed in on everything from Twitter to e-books to Occupy Wall Street, causing more than one stir in the process. The man who once scorned "gregarious" authors is now the only novelist in America more famous for what he says than what he writes.
But he is famous, too, for what he writes, and there’s something admirable in that double achievement. I don’t love Franzen’s fiction and I don’t care for all of his opinions, but I can’t help paying attention to both. He knows that a writer’s first obligation is to his audience (“The reader is a friend,” he has said, “not an adversary, not a spectator”), and he is willing to extend that obligation beyond the page. Even as many of his fellow novelists continue to hide in their sanctuaries, he has taken off his shoes, waded into the culture, and mucked around a little.
At the same time, he hasn’t disguised the wariness of his steps, his occasional disgust with the muck of it all. On paper he can seem brash; on camera, shy. But he has yet to become overly prolific on the one hand, or to give up appearing before cameras on the other. Being interviewed “isn’t my favorite thing,” he told the Onion A.V. Club in 2010, “but I like TV interviews…[even though] I hate seeing myself.”
In short, Franzen is a man as internally divided as any of his characters, an impossible combination of Salinger and Mailer, and as such he compels us. The novel never really needed saving, nor would he have been a likely savior; but the role of the novelist—as suffering artist, as public intellectual—was in its death throes, and he has restored it to an eminence not seen on the American stage in decades.
[Image: Dan Winters, Time. Courtesy time.com.]
All this from a wad of gum?
- Researchers recently uncovered a piece of chewed-on birch pitch in an archaeological dig in Denmark.
- A genetic analysis of the material left in the birch pitch offered a plethora of insights into the individual who last chewed it.
- The gum-chewer has been dubbed Lola. She lived 5,700 years ago and had dark skin, dark hair, and blue eyes.
Some 5,700 years ago, "Lola" — a blue-eyed woman with dark skin and dark hair — was chewing on a piece of pitch derived from heating birch bark. She then spat her chewing gum into the mud of an island in Denmark that we now call Syltholm, where archaeologists unearthed it thousands of years later. A genetic analysis of the chewing gum has provided us with a wealth of information about this nearly six-thousand-year-old Violet Beauregarde.
This marks the first time that a complete ancient human genome has been extracted from anything other than human remains. "It is amazing to have gotten a complete ancient human genome from anything other than bone," said lead researcher Hannes Schroeder in a statement.
"What is more," he added, "we also retrieved DNA from oral microbes and several important human pathogens, which makes this a very valuable source of ancient DNA, especially for time periods where we have no human remains."
In the pitch, researchers identified the DNA of the Epstein-Barr virus, which infects about 90 percent of adults. They also found DNA belonging to hazelnuts and mallards, which were likely the most recent meal that Lola had eaten before spitting out her chewing gum.
Insights into ancient peoples
The birch pitch was found on the island of Lolland (the inspiration for Lola's name) at a site called Syltholm. "Syltholm is completely unique," said Theis Jensen, who worked on the study for his PhD. "Almost everything is sealed in mud, which means that the preservation of organic remains is absolutely phenomenal.
"It is the biggest Stone Age site in Denmark and the archaeological finds suggest that the people who occupied the site were heavily exploiting wild resources well into the Neolithic, which is the period when farming and domesticated animals were first introduced into southern Scandinavia."
Since Lola's genome doesn't show any of the markers associated with the agricultural populations that had begun to appear in the region around her time, she adds to the growing evidence that hunter-gatherers persisted alongside farming communities in northern Europe longer than previously thought.
Her genome supports additional theories about northern European peoples. For example, her dark skin bolsters the idea that northern populations acquired their light-skinned adaptation to low winter sunlight only relatively recently. She was also lactose intolerant, which researchers believe was the norm for most humans before the agricultural revolution. Most mammals lose their tolerance for lactose once they are weaned off their mother's milk; only after humans began keeping cows, goats, and other dairy animals did lactose tolerance persist into adulthood. Descended from hunter-gatherers, Lola wouldn't have carried this adaptation.
A hardworking piece of gum
A photo of the birch pitch used as chewing gum.
These findings are encouraging for researchers focusing on ancient peoples from this part of the world. Before this study, ancient genomes had only ever been recovered from human remains, but now scientists have another tool in their kit. Birch pitch is commonly found in archaeological sites, often with tooth imprints.
Ancient peoples used and chewed on birch pitch for a variety of reasons. It was commonly heated to make it pliable, so it could be molded and used as an adhesive or hafting agent before it set. Chewing the pitch may have kept it pliable as it cooled. It also contains a natural antiseptic, so chewing birch pitch may have served as a folk remedy for dental issues. And, considering that we chew gum today for no reason other than to pass the time, ancient peoples may have chewed pitch simply for fun.
Whatever their reasons, chewed and discarded pieces of birch pitch offer us the mind-boggling possibility of learning what someone ate for lunch several thousand years ago, what color their hair was, what diseases they carried, and where their ancestors came from. It's an unlikely treasure trove of information to find in a mere piece of gum.
The Inglehart-Welzel World Cultural Map replaces geographic proximity with closeness in values.
- This map replaces geography with another type of closeness: cultural values.
- Although the groups it depicts have familiar names, their shapes are not.
- The map makes for strange bedfellows: Brazil next to South Africa and Belgium neighboring the U.S.
Some countries value self-expression more than others. Credit: Robyn Beck / AFP via Getty Images
Question: On what map is Lithuania a neighbor of China, does Poland lie next to Brazil, and do Morocco and Yemen touch?
Answer: The Inglehart-Welzel World Cultural Map. To be precise, the 2017 map. Because on the 2020 version, each of those pairs has drifted apart significantly.
These are not, strictly speaking, maps but rather scatterplot diagrams. Each dot represents a country, the position of which is based on how it ranks on two different values (discussed below). The dots are corralled together into geo-cultural groups:
- Catholic Europe, which comprises countries as diverse and far apart as Hungary and Andorra
- Protestant Europe, taking in both Iceland and Germany
- The Orthodox world, from Belarus all the way to Armenia
- The three Baltic states
- The English-speaking world, including both the U.S. and Northern Ireland
- The huge African-Islamic world, ranging from Azerbaijan to South Africa
- Latin America, which goes from Mexico to Argentina
- South Asia, which comprises both India and Cyprus
- The Confucian world, encompassing China and Japan.
The placement of the dots indicates cultural proximity or distance. Some countries from different groups can be more similar than other countries in the same group.
See the examples indicated above: cultural neighbors China and Lithuania belong to the Confucian and Baltic groups, respectively. Poland is part of Catholic Europe; its 2017 neighbor Brazil is in Latin America. Morocco and Yemen are closer culturally to Armenia, in the Orthodox group, than they are to Qatar, despite all belonging to the African-Islamic group.
The 2017 version of the map places Malta deep inside South America and lets Vietnam, Portugal, and Macedonia meet. Credit: World Values Survey, public domain.
Creating a culture map
So, what exactly are the criteria used for plotting these dots in the first place?
These maps are part of the World Values Survey (WVS), a research project led by political scientist Ronald Inglehart, who published the first version of the map in the late 1990s. With his colleague Christian Welzel, he produced an update in 2005. The map has been revised several times since, most recently in 2020.
The WVS asserts that there are two fundamental dimensions to cross-cultural variation across the world. These are used as the axes to plot the various countries on the diagram.
- The X-axis measures survival versus self-expression values.
Survival values focus on economic and physical security. There is not much room for trust and tolerance of "others." Self-expression values prioritize well-being, quality of life, and self-expression. There is more room for tolerating ethnic, religious, and sexual minorities.
- The Y-axis measures traditional versus secular-rational values.
Traditional values include deference to religion and parental authority as well as traditional social and family values. Societies that score high on traditions typically also are highly nationalistic. In more secular-rational societies, science and bureaucracy replace faith as the basis for authority. Secular-rational values include high tolerance of things like divorce, abortion, euthanasia, and suicide.
As indicated by the significant changes on the 2020 map, the cultural values of nations are not static:
- Countries that move up on the map are shifting from traditional to more secular-rational values.
- Countries that move to the right on the map are shifting from survival values to self-expression values.
- And, of course, vice versa in both cases.
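Since each country is just a point in this two-dimensional value space, "cultural distance" can be made literal by measuring straight-line distance between coordinates. The sketch below illustrates the idea; the coordinates are purely illustrative placeholders, not actual WVS scores:

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Each country is a point: (survival <-> self-expression, traditional <-> secular-rational).
# These numbers are invented for illustration only -- NOT real WVS scores.
scores = {
    "Sweden":   (2.0, 1.9),    # strongly self-expressive and secular
    "Japan":    (0.8, 1.8),    # secular, moderately self-expressive
    "Zimbabwe": (-1.4, -1.5),  # survival-oriented and traditional
}

def cultural_distance(a: str, b: str) -> float:
    """Straight-line distance between two countries on the value map."""
    return dist(scores[a], scores[b])

# On this toy map, Japan sits closer to Sweden than to Zimbabwe, even though
# Japan and Sweden belong to different named groups -- the same effect that
# makes Lithuania a "neighbor" of China on the 2017 map.
print(cultural_distance("Japan", "Sweden") < cultural_distance("Japan", "Zimbabwe"))
```

A shift toward secular-rational values would simply increase a country's y-coordinate, and a shift toward self-expression its x-coordinate, which is why successive editions of the map produce new neighbors.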
According to the authors of the map, changes in cultural outlook can be the result of socioeconomic changes — increasing levels of wealth, for example. But the religious and cultural heritage of each country also plays a part.
The world's cultural landscape is dynamic — you could even say promiscuous, producing new bedfellows every few years. Credit: World Values Survey, public domain.
Some notable features of the 2020 map:
- The Baltic group has been dissolved; Lithuania is now part of Catholic Europe, Estonia a lone Protestant island in a Catholic sea. More worryingly, Latvia seems to have dissolved completely.
- In general, survival values are strongest in African-Islamic countries, self-expression values in Protestant Europe.
- Traditional values are strongest in African-Islamic countries and Latin America, while secular values dominate in Confucian countries and Protestant Europe.
- The United States is an atypical member of the English-speaking group, scoring much lower on both scales (that is to say, lower and more to the left). In other words, the U.S. is more into traditional and survival values than the group's other members.
- Shifting attitudes don't just separate; they also unite. Belgium and the U.S. are now culture buddies, as are New Zealand and Iceland. Kazakhstan is virtually indistinguishable from Bosnia.
The Inglehart-Welzel map is not without its critics. It has been decried as Eurocentric, simplistic, and culturally essentialist (that is, the assumption that certain cultural characteristics are essential and fixed, and that some are superior to others). Which is, of course, a very self-expressive thing to say.
For more on these maps, on the WVS surveys, and on the methodology used, visit the World Values Survey.
Strange Maps #1098
Got a strange map? Let me know at firstname.lastname@example.org.
A study finds that baby mammals dream about the world they are about to experience, priming their senses before birth.
- Researchers find that babies of mammals dream about the world they are entering.
- The study focused on neonatal retinal waves in mice before they first opened their eyes.
- Scientists believe human babies also prime their visual motion detection before birth.
Imagine opening your eyes for the first time as a brand new baby. The world is so mysterious, full of obstacles and strange shapes. And yet it does not take babies all that long to get their bearings, to latch on to their parents, and to start interacting. How do they do this so quickly? A new study published in Science proposes that babies of mammals dream about the world they are about to enter before being born, developing important skills.
The team, led by professor Michael Crair, who specializes in neuroscience, ophthalmology, and visual science, wanted to understand why mammals are born already somewhat prepared to interact with the world.
"At eye opening, mammals are capable of pretty sophisticated behavior," said Crair, "But how do the circuits form that allow us to perceive motion and navigate the world? It turns out we are born capable of many of these behaviors, at least in rudimentary form."
Unusual retinal activity
The scientists observed waves of activity radiating from the retinas of newborn mice before their eyes first open. Imaging shows that soon after birth, this activity disappears. In its place matures a network of neural transmissions that carries visual stimuli to the brain, as explained by a Yale press release. Once it reaches the brain, the information is encoded for storage.
What's particularly unusual about this neonatal activity is that its pattern resembles the one that would arise if the animal were moving forward through its environment. As the researchers write in the study, "Spontaneous waves of retinal activity flow in the same pattern as would be produced days later by actual movement through the environment."
Crair explained that this "dream-like activity" makes sense from an evolutionary standpoint, as it helps the mouse get ready for what will happen to it after it opens its eyes. It allows the animal to "respond immediately to environmental threats," Crair shared.
[Video: Retinal waves in a newborn mouse prepare it for vision, via YouTube]
What is creating the waves?
The scientists also probed what is responsible for creating the retinal waves that mimic the forward motion. They turned on and off the functionality of starburst amacrine cells — retinal cells that release neurotransmitters — and discovered that blocking them stopped the retinal waves from flowing, which hindered the mouse from developing the ability to react to visual motion upon birth. These cells are also important to an adult mouse, affecting how it reacts to environmental stimuli.
Graphic showing the origin and functionality of directional retinal waves. Credit: Michael C. Crair et al., Science, 2021.
What about human babies?
While the study focused on mice, human babies also seem to be able to identify objects and motion right after birth. This suggests the presence of a similar phenomenon in babies before they are born.
"These brain circuits are self-organized at birth and some of the early teaching is already done," Crair stated. "It's like dreaming about what you are going to see before you even open your eyes."