A Penny, if that Much, For Your Thoughts: The De-Professionalization of Intellectual Labor

For the reader, art connoisseur, music fan, or student, this is a “money for nothing and your chicks for free” kind of world (Google the lyrics if you’re too young to know the song).


Thousands of book reviews come up for free. Do a search and all of your questions about the universe are answered for free. Worried about a local bottle tax? Never mind. A dozen bored citizens have blogged about it, for free.

For the writer, artist, musician, or scholar, however, it’s a world of no money for nothing and your work for free.

Most “content providers” online aren’t paid. The ones who are paid might get $5.00 per 1,000 “page views” for the first 20,000. That works out to half a penny per view, so the writer doesn’t get even a penny for his thoughts. As the folk song says, a half-a-penny will do.

Colleges and universities are moving toward the adjunctification of faculty—underpaying Ph.D.s, in light of their years of education and debt burdens, to teach a few classes part-time. A New York university advertised last year that it would pay instructors only for their “contact hours” with students (the tip of the iceberg of teaching) at a rate of $64.84 per hour, with 45 contact hours per class, to teach weighty courses such as contemporary political thought. That amounts to under $3,000 to teach a class of college students.

The internship economy has long been an engine of free intellectual labor, but it is apparently now relied upon even more heavily to supplant actual, costly employees. Q: How many interns does it take to screw in a lightbulb? A: Who cares? They’re free.

At the most subtle levels of “creativity,” Jaron Lanier argues in Who Owns the Future? that the most quotidian, prosaic acts of creation—the artful folding of a shirt, or a simple mechanical action—are now, under the aegis of “sharing,” instructing robots in how to do the same job. And all—you’ve got it—for free. The human muse and instructor receives no compensation.

Suzanne Moore puts her foot down against the idea of working for free in the digital economy, of allowing one’s work to be used with no royalties or compensation. So does a friend of mine.

But, we get “exposure” in return for working for free! Exposure, we’ve got. It’s a living we need.

These are all examples of the de-professionalization of intellectual and creative labor. Work that was once supported and remunerated within a professional system of school and employment is now performed by amateurs and dispersed widely through new social media and search engines, without the usual constraints—or quality controls and certification—that a profession implies.

What is true for Walmart is true for ideas. Sure, you can get them—whether “them” is a thought or a t-shirt—real cheap, but they’re both things dashed off by a miserable little waif in a coffee shop or an Indonesian processing zone, respectively.

You get what you pay for. I was in a keenly competitive Honors seminar program in a keenly competitive college as an undergraduate. An acquaintance of mine in this program once wrote and presented a paper to his seminar, as was the custom. At the start he disclaimed with false modesty, “I wrote this paper while watching a Mets game.” To which the professor responded, “And you think it doesn’t show?”

Writers write without the benefit of editors, copy editors, proofreaders, or fact checkers, in the increasingly fast-paste (just joking—I know it’s fast-paced, but that’s just the sort of error we writers like editors to catch) world of online media. And, as the wise professor said, it’s not like it doesn’t show.

A variety of fields of intellectual and creative labor that were structured into bona fide professions in the late 19th and 20th centuries are being de-professionalized.

The labor is conducted by amateurs—in a descriptive not a pejorative sense—who aren’t schooled in the professional ethics and methods of, say, journalism, or investigative reporting, or interview techniques, or even the legacies and disciplinary heritage of photography or the creative arts.

We’ve got reviewers galore—of food, movies, music, wine, books, restaurants—who have never had to grapple with an editor’s advice or thought about the self-referential traditions in which they write. They can dash off bile on Amazon, and never think about the Book Reviewer’s responsibility, or what a book review means, or think about the literary tradition of the critical essay, or even accommodate the thoroughly professional checks and balances of respect and reciprocity that once kept book reviewers from being complete, raging lunatics. Hell, they don’t even need to put their name to their work.

Where does this all leave us? Anis Shivani, in a brilliant book of criticism, notes parenthetically that the best thing financially a writer can do, especially if s/he wants to avoid the soul-deadening academic departments, is to marry well, and marry rich.

Ironically, the new media age might be catapulting us back to a neo-Victorian economy for intellectual labor, where only those who are independently wealthy, whether through birth, marriage, or the Lotto, can afford to write, or even pursue the indulgent luxury of a Ph.D.

The neo-Victorian economy brings us the hobbyist, amateur, and genteel dilettante, who paints, writes, investigates, researches, and plays music, more or less for free, as a happy pursuit underwritten by their own wealth and independent resources, since no one else is inclined to support their work.

The economy of the hobbyist or the trust fund literati had its advantages, certainly. Arguably, it’s that world, of the curious dabbler-empiricist who did precisely what s/he wanted to do, that brought us Darwin’s theories of evolution.

We’re more likely, however, to get a neo-Victorian economy of hobbyists in a complex post-modern world, without the educational resources for them to make good on their leisure enterprises.

Journalism is the most profound example of de-professionalization. Journalism was professionalized in the 1900s. Schools to perfect it and study it sprang up, where once there had only been the ink-stained wretches, schooled in nothing but hard knocks. Not everyone had to pursue a journalism degree to become a reporter, but with professionalization came the notion that the journalist had to adhere to codes of conduct and methods. Among other things, this professionalization brought us the thoroughly modern notion of journalistic “objectivity.”

Before, in the 1900s, a very few people could make a good living writing and doing journalism. Now, very many more people cannot make a living doing it at all. At a party some time ago, I chatted with a local newspaper reporter, who realized that she was a dead man walking. “A housewife in suburban Rochester thinks that she knows how to do local reportage,” she commented. “She does it for free, and doesn’t know what she’s talking about, but people read her. I can’t compete with that.”

In journalism there is an infrastructure that undergirds these efforts, and someone, somewhere, has to pay for that infrastructure, if we don’t want our reportage coming from a mommy blog in the anteroom of Rochester, and if we don’t want our assessment of scientific research coming from someone who knows nothing about vaccines, or nothing about climate change, but thinks that their opinion counts as fact. The infrastructure includes professional training, and schooling. It includes editors. It includes resources to support the travel and time required to get an honest view of a complex topic.

The adage comes to mind: First they came for the manufacturing jobs, and I said nothing; then they came for the customer service call line jobs, and I said nothing; then they came for the high tech support jobs and I said nothing… Now, they’ve come for creative and intellectual labor, too. And there’s no one left to fight.

It sounds, and I suppose is, anti-democratic, but we’ve bent over backwards so far to dignify the amateur voice that we’ve forgotten to say, at some point, these simple words: You don’t know what the hell you’re talking about.


Two MIT students just solved Richard Feynman’s famed physics puzzle

Richard Feynman once asked a silly question. Two MIT students just answered it.

Surprising Science

Here's a fun experiment to try. Go to your pantry and see if you have a box of spaghetti. If you do, take out a noodle. Grab both ends of it and bend it until it breaks in half. How many pieces did it break into? If you got two large pieces and at least one small piece, you're not alone.


Two-thirds of parents say technology makes parenting harder

Parental anxieties stem from the complex relationship between technology, child development, and the internet's trove of unseemly content.

Sex & Relationships
  • Today's parents believe parenting is harder now than 20 years ago.
  • A Pew Research Center survey found this belief stems from the new challenges and worries brought by technology.
  • With some schools going remote next year, many parents will need to adjust expectations and re-learn that measured screen usage won't harm their children.

Parents and guardians have always endured a tough road. They are the providers of an entire human being's subsistence. They keep that person fed, clothed, and bathed; they help them learn and invest in their enrichment and experiences; they also help them navigate social life in their early years. And they do all this with limited time and resources, while simultaneously balancing their own lives and careers.

Add to that a barrage of advice and reminders that they can always spend more money, dedicate more time, or flat-out do better, and it's no wonder that psychologists worry about parental burnout.

But is parenting harder today than it was, say, 20 years ago? The Pew Research Center asked more than 3,600 parents this question, and a majority (66 percent) believe the answer is yes. While some classic complaints made the list—a lack of discipline, a disrespectful generation, and the changing moral landscape—the most common reason cited was the impact of digital technology and social media.

A mixed response to technology

Parents worry that their children spend too much time in front of screens while also recognizing technology's educational benefits. (Photo: Chris Hondros/Getty Images)

This parental concern stems not only from the ubiquity of screens in children's lives, but the well-publicized relationship between screen time and child development. Headlines abound citing the pernicious effects screen time has on cognitive and language development. Professional organizations, such as the American Academy of Child and Adolescent Psychiatry, issue warnings that too much screen time can lead to sleep problems, lower grades, weight problems, mood problems, poor self-image, and the fear of missing out—to name a few!

According to Pew's research, parents—whom Pew defines as adults or guardians with at least one child under their care, though they may also have adult children—have taken these warnings to heart. While 84 percent of those surveyed are confident they know how much screen time is appropriate, 71 percent worry their child spends too much time in front of screens.

To counter this worry, most parents take the measured approach of setting limits on how long children can access screens. Others limit which technologies children have access to. A majority of parents (71 percent) view smartphones as potentially harmful to children, believing the devices impair children's ability to learn effective social skills, develop healthy friendships, or be creative. As a result, about the same percentage of parents believe children should be at least 12 years old before owning a smartphone or using social media.

But a deeper concern than screen time seems to be the content those screens can access. An overwhelming 98 percent of those surveyed say parents and guardians shoulder the responsibility of protecting children from inappropriate online content. Far fewer place that responsibility on tech companies (78 percent) or the government (65 percent).

Parents of young children say they check the websites and apps their children use and set parental controls to restrict access. A minority of parents admit to looking at call and text records, tracking their child's location with GPS, or following their child on social media.

Yet parents also recognize the value of digital technology or, at least, have acquiesced to its omnipresence. The poster child for this dichotomy is YouTube, with its one billion hours watched daily, many of them before children's eyes. Seventy-three percent of parents with young children are concerned that their child will encounter inappropriate content on the platform, and 46 percent say their child already has. Yet 80 percent still let their children watch videos, many letting them do so daily. Some reasons cited are that children can learn new things or be exposed to different cultures. The number one cited reason, however, is to keep children entertained.

For the Pew Research Center's complete report, check out "Parenting Children in the Age of Screens."

Screens, parents, and pandemics

Perhaps most troubling, Pew's survey was conducted in early March. That's before the novel coronavirus spread wildly across the United States. Before shelter-in-place orders. Before schools shuttered their doors. Before desperate parents, who suddenly found themselves their child's only social and educational outlet, needed a digital lifeline to help them cope.

The COVID-19 pandemic has led many parents to rely on e-learning platforms and YouTube to supplement their children's education—or just let the kids enjoy their umpteenth viewing of "Moana" so they can eke out a bit more work. With that increase in screen time comes a corresponding increase in guilt, anxiety, and frustration.

But are these concerns overblown?

As Jenny Radesky, M.D., a pediatrician and expert on children and the media at the University of Michigan's C.S. Mott Children's Hospital, told the New York Times, parents don't always need to view screen time as a negative. "Even the phrase 'screen time' itself is problematic. It reduces the debate to a black and white issue, when the reality is much more nuanced," Radesky said.

Radesky helped the American Academy of Pediatrics craft its statement about screen time use during the pandemic. While the AAP urges parents to preserve offline experiences and maintain limits, the organization acknowledges that children's media use will, by necessity, increase. To make it a supportive experience, the statement recommends parents make a plan with their children, be selective of the quality of media, and use social media to maintain connections together. It also encourages parents to adjust their expectations and notice their own technology use.

"We are trying to prevent parents from feeling like they are not meeting some sort of standard," Radesky said. "There is no science behind this right now. If you are looking for specific time limits, then I would say: Don't be on it all day."

This is good advice for parents, now and after the pandemic. While some studies show that excessive screen time is deleterious, others show no harm from measured, metered use. For every fear that screens make our kids stupid, there's a study showing the kids are all right. If we maintain realistic standards and learn to weigh quality and quantity within those standards, maybe parenting in the digital age won't seem so darn difficult.

How meditation can change your life and mind

Reaching beyond the stereotypes of meditation and embracing the science of mindfulness.

Videos
  • There are a lot of misconceptions when it comes to what mindfulness is and what meditation can do for those who practice it. In this video, professors, neuroscientists, psychologists, composers, authors, and a former Buddhist monk share their experiences, explain the science behind meditation, and discuss the benefits of learning to be in the moment.
  • "Mindfulness allows us to shift our relationship to our experience," explains psychologist Daniel Goleman. The science shows that long-term meditators have higher levels of gamma waves in their brains even when they are not meditating. The effect of this altered response is yet unknown, though it shows that there are lasting cognitive effects.
  • "I think we're looking at meditation as the next big public health revolution," says ABC News anchor Dan Harris. "Meditation is going to join the pantheon of no-brainers like exercise, brushing your teeth and taking the meds that your doctor prescribes to you." Closing out the video is a guided meditation experience led by author Damien Echols that can be practiced anywhere and repeated as many times as you'd like.