Let’s Stop Competing Over Who Can Stay at Work the Longest
Real productivity doesn’t always look anything like “first to work, last to leave.”
Every company is different. As I type this, I am sitting in an open, loft-style office in which everyone is working in complete silence. There are video producers silently editing videos, editors silently editing text, sales and product people silently organizing their client databases. In the background is the ambient hum of a giant server that houses thousands of hours of precious interview video, the core property of the company I work for. This is a nice room in which to work: It’s contemporary, airy, and open. We’re not in miserable, carpeted cubicles. In fact, one of my bosses recently bought everyone who wanted one an exercise ball to sit on, which I did, for a month straight until it was punctured by a staple.* And we do, periodically, stand up and reconfigure ourselves, entering the studio at one end of this floor-through for video or podcast taping, or the conference room at the other end for meetings. The people who work here are generally friendly and helpful to one another, and management is understanding of the fact that people have lives, and that unexpected and non-work related things happen in those lives: births, deaths, vacations. In short, it’s a really good place to work, as workplaces go.
But here’s the thing:
Even in a funky, startupy media office like mine, everybody works long hours. And they spend most of those hours hunched over their desks, eyes glued to their computer screens. It’s the nature of the work, of course, that it requires so much screen time. We’re part of the “idea economy,” and that’s not going to change. But countless organizational psychologists and management gurus have passed through these doors and told us and our audience that people are happiest, most creative, and most productive when they take regular breaks, get up periodically and walk, and when they are not in a constant state of sleep deprivation with 600 tasks in various states of completion.
Yet of all the people who work here, I may be the only one who regularly eats lunch away from my desk. The production team seems to subsist entirely on Red Bull and Lara Bars. There’s a Bloc of Three who sip Liquiteria shakes with chia seeds while they work, and must each by now have amassed an obscenely large collection of those expensive-looking blue cloth bags they come in. Two others don’t eat anything, ever, as far as I can tell. And while nobody has ever said anything to me about it, it takes real willpower on my part not to conform to what feels like a cultural expectation of total-all-the-time-hunched-over-your-deskness. But I do it: Almost every day I eat lunch somewhere other than at my desk and, when the weather permits, I take a longish walk afterward**, wondering always if one day, when other things aren’t going so well, it’ll come up in a performance review.
Once, in a previous job (in publishing), we had a visiting colleague from the offices in Spain. She was genuinely shocked and horrified to see people eating at their desks. In Spain, she said, it would be unthinkable not to take a one- to two-hour midday break for lunch. This eating at your desk was one of the saddest office sights she had ever seen.
Beyond lunch: Many people here are working 10- to 12-hour days, myself included. That, too, is definitively not good for you, according to every Nobel-laureate psychologist we’ve ever interviewed. And everybody here is fully aware of that. Still, there are regularly people here until 8 or 9 o’clock at night, and this is also the kind of work you can’t help but take home with you.
Let me be clear: This is not company policy. It just evolved this way. And I suspect that this is the situation in thousands of other hip, modern workplaces much like mine, nationwide. Taking work home isn’t necessarily a problem. Neither is working hard. It’s the performance of work at the expense of real productivity that is foolish, and harmful. Real productivity doesn’t always look anything like “first to work, last to leave.”
Of course, all this is as nothing to the proudly punishing Silicon Valley work ethic gorily exposed in that recent New York Times piece on Amazon. Amazon pushed back on that article, but it was no hatchet job. It is well and widely known that startup culture favors the young and childless for their ability to work (or appear to work) ceaselessly. I once reached the interview stage for a job with Coursera, a massive online learning company based in Palo Alto, California. More than once, the interviewer (who was half my age) bragged about how Coursera employees tend to be “obsessive” and “workaholics,” never leaving their posts until the job is done, which is never. For three days after that interview I couldn’t sleep, terrified that I might actually get the job, move my family to California, and never see my kid again.
In the current startup climate (and more established companies, too, are trying to “think like startups” in order to keep lithe and competitive), with management neither encouraging nor actively counteracting this state of affairs, you have two choices:
1. Conform to be on the safe side.
2. Do what works best for you and take your chances, job-wise.
Some reading this article will believe in the culture of hard work for its own sake and roundly reject the notion that balance is possible or desirable for those who want to “succeed” professionally. Marissa Mayer’s two-week working maternity leave and Mindy Kaling’s new book offer versions of this counterargument. I’d argue instead that valuing hard work over freedom of mind, physical health, and good relationships is a sickness, and a highly contagious one at that. At this point, in the America I live in, in spite of all the books and articles and TED talks and Big Think videos counseling us to the contrary, and all the think pieces on how “millennials” are changing the world for the better, this collective slide into workaholism is starting to feel like an epidemic.
Is anybody out there working on a cure?
*Which, it now strikes me, is pretty suspicious. Staples don’t just position themselves on the floor, pointy side up, waiting for an exercise ball to roll on top of them...
**The fact is, walking is almost never “downtime” for me. It’s creative time — I’m writing headlines in my head, reworking a story, thinking through all the things that get tied up in knots when I’m stationary.
@jgots is me on Twitter