A Penny, if that Much, For Your Thoughts: The De-Professionalization of Intellectual Labor
For the reader, art connoisseur, music fan, or student, this is a “money for nothing and your chicks for free” kind of world (Google the lyrics if you’re too young to know the song).
Thousands of book reviews come up for free. Do a search and all of your questions about the universe are answered for free. Worried about a local bottle tax? Never mind. A dozen bored citizens have blogged about it, for free.
For the writer, artist, musician, or scholar, however, it’s a world of no money for nothing and your work for free.
Most “content providers” online aren’t paid. The ones who are paid might get $5.00 per 1,000 “page views” for the first 20,000. That works out to half a cent per view: the writer doesn’t get even a penny for his thoughts. As the old song says, a ha’penny will do.
Colleges and universities are moving toward the adjunctification of faculty: underpaying Ph.D.s, despite their years of education and debt burdens, to teach a few classes part-time. A New York university advertised last year that it would pay instructors only for their “contact hours” with students (the tip of the iceberg of teaching), at a rate of $64.84 per hour for 45 contact hours per class, to teach weighty courses such as contemporary political thought. That works out to under $3,000 to teach a class of college students.
The internship economy has long been an engine of free intellectual labor, but it is apparently relied upon even more heavily now to supplant actual, costly employees. Q: How many interns does it take to screw in a lightbulb? A: Who cares? They’re free.
At the most subtle levels of “creativity,” Jaron Lanier argues in Who Owns the Future? that the most quotidian, prosaic acts of creation (the artful folding of a shirt, or a simple mechanical action) are now, under the aegis of “sharing,” instructing robots in how to do the same job. And all of it, you guessed it, for free. The human muse and instructor receives no compensation.
But, we get “exposure” in return for working for free! Exposure, we’ve got. It’s a living we need.
These are all examples of the de-professionalization of intellectual and creative labor. Work that was once supported and remunerated within a professional system of school and employment is now performed by amateurs and dispersed widely through new social media and search engines, without the usual constraints—or quality controls and certification—that a profession implies.
What is true for Walmart is true for ideas. Sure, you can get them real cheap, whether “them” is a thought or a t-shirt, but both are dashed off by some miserable little waif, in a coffee shop or an Indonesian processing zone, respectively.
You get what you pay for. I was in a keenly competitive Honors seminar program in a keenly competitive college as an undergraduate. An acquaintance of mine in this program once wrote and presented a paper to his seminar, as was the custom. At the start he disclaimed with false modesty, “I wrote this paper while watching a Mets game.” To which the professor responded, “And you think it doesn’t show?”
Writers write without the benefit of editors, copy editors, proofreaders, or fact checkers, in the increasingly fast-paste (just joking—I know it’s fast-paced, but that’s just the sort of error we writers like editors to catch) world of online media. And, as the wise professor said, it’s not like it doesn’t show.
A variety of fields of intellectual and creative labor that were structured into bona fide professions in the late 19th and 20th centuries are being de-professionalized.
The labor is conducted by amateurs—in a descriptive not a pejorative sense—who aren’t schooled in the professional ethics and methods of, say, journalism, or investigative reporting, or interview techniques, or even the legacies and disciplinary heritage of photography or the creative arts.
We’ve got reviewers galore—of food, movies, music, wine, books, restaurants—who have never had to grapple with an editor’s advice or thought about the self-referential traditions in which they write. They can dash off bile on Amazon and never think about the book reviewer’s responsibility, or what a book review means, or the literary tradition of the critical essay, or even accommodate the thoroughly professional checks and balances of respect and reciprocity that once kept book reviewers from being complete, raging lunatics. Hell, they don’t even need to put their name to their work.
Where does this all leave us? Anis Shivani, in a brilliant book of criticism, notes parenthetically that the best thing financially a writer can do, especially if s/he wants to avoid the soul-deadening academic departments, is to marry well, and marry rich.
Ironically, the new media age might be catapulting us back to a neo-Victorian economy for intellectual labor, where only those who are independently wealthy, whether through birth, marriage, or the Lotto, can afford to write, or even pursue the indulgent luxury of a Ph.D.
The neo-Victorian economy brings us the hobbyist, amateur, and genteel dilettante, who paints, writes, investigates, researches, and plays music, more or less for free, as a happy pursuit underwritten by personal wealth and independent resources, since no one else is inclined to support the work.
The economy of the hobbyist or the trust fund literati had its advantages, certainly. Arguably, it’s that world, of the curious dabbler-empiricist who did precisely what s/he wanted to do, that brought us Darwin’s theories of evolution.
We’re more likely, however, to get a neo-Victorian economy of hobbyists in a complex post-modern world, without the educational resources for them to make good on their leisure enterprises.
Journalism is the most profound example of de-professionalization. Journalism was professionalized in the 1900s. Schools to perfect it and study it sprang up, where once there had been only ink-stained wretches, schooled in nothing but hard knocks. Not everyone had to pursue a journalism degree to become a reporter, but with professionalization came the notion that the journalist had to adhere to codes of conduct and methods. Among other things, this professionalization brought us the thoroughly modern notion of journalistic “objectivity.”
Back then, in the 1900s, a very few people could make a good living writing and doing journalism. Now, very many more people cannot make a living doing it. At a party some time ago, I chatted with a local newspaper reporter, who realized that she was a dead man walking. “A housewife in suburban Rochester thinks that she knows how to do local reportage,” she commented. “She does it for free, and doesn’t know what she’s talking about, but people read her. I can’t compete with that.”
In journalism there is an infrastructure that undergirds these efforts, and someone, somewhere, has to pay for that infrastructure, if we don’t want our reportage coming from a mommy blog in the anteroom of Rochester, and if we don’t want our assessment of scientific research coming from someone who knows nothing about vaccines, or nothing about climate change, but thinks that their opinion counts as fact. The infrastructure includes professional training, and schooling. It includes editors. It includes resources to support the travel and time required to get an honest view of a complex topic.
The adage comes to mind: First they came for the manufacturing jobs, and I said nothing; then they came for the customer service call line jobs, and I said nothing; then they came for the high tech support jobs and I said nothing… Now, they’ve come for creative and intellectual labor, too. And there’s no one left to fight.
It sounds, and I suppose is, anti-democratic, but we’ve bent over backwards so far to dignify the amateur voice that we’ve forgotten to say, at some point, these simple words: You don’t know what the hell you’re talking about.