Many workers moved home on the promise or hope that they'd be able to keep working remotely at least some of the time after the pandemic ended.
The CEO of a Washington, D.C., magazine suggested in a recent op-ed that workers could lose benefits like health care if they insist on continuing to work remotely as the COVID-19 pandemic recedes. Her staff reacted by refusing to publish for a day.
While the CEO later apologized, she isn't alone in appearing to bungle the transition back to the office after over a year in which tens of millions of employees were forced to work from home. A recent survey of full-time corporate or government employees found that two-thirds say their employers either have not communicated a post-pandemic office strategy or have only vaguely done so.
As workforce scholars, we are interested in teasing out how workers are dealing with this situation. Our recent research found that this failure to communicate clearly is hurting morale, culture and retention.
We first began investigating workers' pandemic experiences in July 2020 as shelter-in-place orders shuttered offices and remote work was widespread. At the time, we wanted to know how workers were using their newfound freedom to potentially work virtually from anywhere.
We analyzed a dataset that a business and technology newsletter obtained by surveying its 585,000 active readers. It asked them whether they planned to relocate during the next six months and to share their stories about why they were moving, and where from and to.
After a review, we had just under 3,000 responses, including 1,361 people who were planning to relocate or had recently done so. We systematically coded these responses to understand their motives and, based on distances moved, the degree of ongoing remote-work policy they would likely need.
We found that a segment of these employees would require a full remote-work arrangement based on the distance moved from their office, and another portion would face a longer commute. Woven throughout this was the explicit or implicit expectation of some degree of ongoing remote work among many of the workers who moved during the pandemic.
In other words, many of these workers were moving on the assumption – or promise – that they'd be able to keep working remotely at least some of the time after the pandemic ended. Or they seemed willing to quit if their employer didn't oblige.
One of the authors explains the research.
We wanted to see how these expectations were being met as the pandemic started to wind down in March 2021. So we searched online communities on Reddit to see what workers were saying. One forum proved particularly useful: a member asked, "Has your employer made remote work permanent yet or is it still in the air?" and went on to share his own experience. The post generated 101 responses with a good amount of detail on what the commenters' companies were doing.
While this qualitative data is only a small sample that is not necessarily representative of the U.S. population at large, these posts allowed us to develop a richer understanding of how workers feel, which a simple statistic can't provide.
We found a disconnect between workers and management that starts with but goes beyond the issue of the remote-work policy itself. Broadly speaking, we found three recurring themes in these anonymous posts.
1. Broken remote-work promises
Others have also found that people are taking advantage of pandemic-related remote work to relocate far enough from the office that they would need partial or full-time remote work even after offices reopen.
A recent survey by consulting firm PwC found that almost a quarter of workers were considering or planning to move more than 50 miles from one of their employer's main offices. The survey also found 12% have already made such a move during the pandemic without getting a new job.
Our early findings suggested some workers would quit their current jobs rather than give up their new locations if their employers required it, and in March we saw this actually start to occur.
One worker planned a move from Phoenix to Tulsa with her fiancé to get a bigger place with cheaper rent after her company went remote. She later had to leave her job for the move, even though "they told me they would allow me to work from home, then said never mind about it."
Another worker indicated the promise to work remotely was only implicit, but he still had his hopes up when leaders "gassed us up for months saying we'd likely be able to keep working from home and come in occasionally" and then changed their minds and demanded employees return to the office once vaccinated.
2. Confused remote-work policies
Another constant refrain we read in the worker comments was disappointment in their company's remote-work policy – or lack thereof.
Whether workers said they were staying remote for now, returning to the office or still unsure, we found that nearly a quarter of the people in our sample said their leaders were not giving them meaningful explanations of what was driving the policy. Even worse, the explanations sometimes felt confusing or insulting.
One worker complained that the manager "wanted butts in seats because we couldn't be trusted to [work from home] even though we'd been doing it since last March," adding: "I'm giving my notice on Monday."
Another, whose company issued a two-week timeline for all to return to the office, griped: "Our leadership felt people weren't as productive at home. While as a company we've hit most of our goals for the year. … Makes no sense."
After a long period of office closures, it stands to reason that workers would need time to readjust to office life, a point borne out by recent survey results. Employers that quickly flip the switch in calling workers back and do so without a clear rationale risk appearing tone-deaf.
It suggests a lack of trust in productivity at a time when many workers report putting in more effort than ever and being strained by the increased digital intensity of their job – that is, the growing number of online meetings and chats.
And even when companies said they wouldn't require a return to the office, workers still questioned their motives, which many described as purely financial.
"We are going hybrid," one worker wrote. "I personally don't think the company is doing it for us. … I think they realized how efficient and how much money they are saving."
Only a small minority of workers in our sample said their company asked for input on what employees actually want from a future remote work policy. Given that leaders are rightly concerned about company culture, we believe they are missing a key opportunity to engage with workers on the issue and show their policy rationales aren't only about dollars and cents.
3. Corporate culture 'BS'
A company's culture is essentially its values and beliefs shared among its members. That's harder to foster when everyone is working remotely.
That's likely why corporate human resource executives rank maintaining organizational culture as their top workforce priority for 2021.
But many of the forum posts we reviewed suggested that employer efforts to do that during the pandemic by orchestrating team outings and other get-togethers were actually pushing workers away, and that this type of "culture building" was not welcome.
One worker's company "had everyone come into the office for an outdoor luncheon a week ago," according to a post, adding: "Idiots."
Surveys have found that what workers want most from management, on the issue of corporate culture, are more remote-work resources, updated policies on flexibility and more communication from leadership.
As another worker put it, "I can tell you, most people really don't give 2 flips about 'company culture' and think it's BS."
Kimberly Merriman, Professor of Management, Manning School of Business, University of Massachusetts Lowell; David Greenway, Doctoral Candidate in Leadership/Organization Studies, University of Massachusetts Lowell, and Tamara Montag-Smit, Assistant Professor of Business, University of Massachusetts Lowell
Curious about the most used emoji on social media?
Already, 217 new emojis have been announced for release in 2021, which will bring the total to 3,353. Users can look forward to sending emojis like the flaming heart, a bearded woman and interracial couples later in the year.
What emojis appear on people's phones and on their social media platforms is not arbitrary but has been coordinated by the Unicode Consortium since 1995, when the first 76 pictograms were adopted by the U.S. nonprofit. The Consortium has been overseeing the character inventory of electronic text processing since 1991 and sets a standard for symbols, characters in different scripts and – last but not least – emojis, which are encoded uniformly across different platforms even though styles may vary between providers.
Even though the first Unicode listings predate them, a 1999 set of 176 simple pictograms invented by interface designer Shigetaka Kurita for a Japanese phone operator is considered to be the precursor of modern-day emojis. The concept gained popularity in Japan, and by 2010 Unicode rolled out a massive release of more than 1,000 emojis to catch up with the burgeoning trend – the rest is history.
Different skin colors have been available for emojis since 2015. In 2014, the anti-bullying emoji "eye in speech bubble" was released in cooperation with The Ad Council, which produces public service announcements in the U.S. Same-sex couples and families have been available since the first major emoji release in 2010.
Today it is estimated that more than 5 billion emojis are used every day on Facebook and in Facebook Messenger, with New Year's Eve being the most popular day to use them, according to the social network. The most popular emoji on Facebook, as well as on Twitter, is the "face with tears of joy," as it is officially called, while the heart emoji reigns supreme on Instagram.
Humans may have evolved to be tribalistic. Is that a bad thing?
- From politics to everyday life, humans have a tendency to form social groups that are defined in part by how they differ from other groups.
- Neuroendocrinologist Robert Sapolsky, author Dan Shapiro, and others explore the ways that tribalism functions in society, and discuss how—as social creatures—humans have evolved for bias.
- But bias is not inherently bad. The key to seeing things differently, according to Beau Lotto, is to "embody the fact" that everything is grounded in assumptions, to identify those assumptions, and then to question them.
Hunter-gatherers probably had more spare time than you.
- For the species Homo sapiens, the Agricultural Revolution was a good deal, allowing the population to grow and culture to advance. But was it a good deal for individuals?
- Hunter-gatherers likely led lives requiring far less daily work than farmers, leading one anthropologist to call them the "original affluent society."
- The transition from hunter-gatherers to farmers may have occurred as a kind of trap in which the possibility of surplus during good years created population increases that had to be maintained.
Global warming is on track to drive lots of changes in the future. At the darkest end of the spectrum of possibilities is no future at all. That doesn't mean that humanity goes extinct, but it does mean the big project of civilization we've been working on since the Agricultural Revolution 10,000 years ago might collapse. Given that scary possibility, it's an opportune moment to look at that project with a critical eye. Yes, we have accomplished so much since we first domesticated ourselves by farming (e.g., villages, cities, empires, law, science, etc.). But is modern life worth it?
In other words, was the Agricultural Revolution a good idea?
For context, Homo sapiens appeared as a separate species about 300,000 years ago. During our entire tenure, the Earth has been undergoing a series of Ice Ages, long periods of intense glaciation when the planet was cold and dry (much of its water locked up in ice), followed by shorter interglacial periods that were warm and moist. Throughout most of those 300 millennia, human beings existed as bands of nomadic hunter-gatherers. It was only after the ice melted at the beginning of the current interglacial period (a geologic epoch called the Holocene) that we humans invented a new way of being human: farming. It was indeed a revolution, changing every aspect of being human, from how many people we might see in our lifetimes to how we spent those lifetimes.
The usual way the Agricultural Revolution gets characterized is as a glorious triumph. Consider this telling of the tale:
Humans once subsisted by hunting and gathering, foraging for available food wherever it could be found. These early peoples necessarily moved frequently, as food sources changed, became scarce or moved in the case of animals. This left little time to pursue anything other than survival and a peripatetic lifestyle. Human society changed dramatically … when agriculture began… With a settled lifestyle, other pursuits flourished, essentially beginning modern civilization.
Hooray! Thanks to farming we could invent museums and concert halls and sports stadiums and then go visit them with all our free time.
The problem with this narrative, according to some writers and scholars, like Jared Diamond and Yuval Noah Harari, is that while the Agricultural Revolution may have been good for the species by turning surplus food into exponential population growth, it was terrible for individuals – that is, you and me.
Hunter-gatherers worked about five hours per day
Consider this. Anthropologist Marshall Sahlins once estimated that the average hunter-gatherer spent about five hours a day working at, well, hunting and gathering. That's because nature was actually pretty plentiful. It didn't take that long to gather what was needed. (Gathering was actually a much more important food source than hunting.) The rest of the day was probably spent hanging out and gossiping as people are wont to do. If nature locally stopped being abundant, the tribe just moved on. Also, hunter-gatherers appear to have lived in remarkably horizontal societies in terms of power and wealth. No one was super-rich and no one was super-poor. Goods were distributed relatively equally, which is why Sahlins called hunter-gatherers the "original affluent society."
Stationary farmers, on the other hand, had to work long, backbreaking days. They literally had to tear up the ground to plant seeds and then tear it up again digging irrigation trenches that brought water to those seeds. And if it doesn't rain enough, everyone starves. If it rains too much, everyone starves. And on top of it all, the societies that emerge from farming end up being wildly hierarchical with all kinds of kings and emperors and dudes-on-top who somehow end up with the vast majority of surplus wealth generated by all the backbreaking, tearing-up-the-ground work.
A woman harvesting wheat. Credit: Yann Forget via Wikipedia
Did we domesticate wheat, or did wheat domesticate us?
So how did this happen? How did the change occur, and why did anyone volunteer for the switch? One possibility is that it was a trap.
Historian Yuval Noah Harari sees human beings getting domesticated in a long process that closed doors behind it. During periods of good climate, some hunter-gatherers began staying near wild wheat outcroppings to harvest the cereal. Processing the grains inadvertently spread the plant around, producing more wheat next season. More wheat led to people staying longer each season. Eventually, seasonal camps became villages with granaries that led to surpluses, which in turn let people have a few more children.
So farming required far more work, but it allowed for more children. In good times, this cycle worked out fine and populations rose. But four or five generations later, the climate shifted a little, and now those hungry mouths require even more fields to be cleared and irrigation ditches to be dug. The reliance on a single food source, rather than multiple sources, also leaves everyone more prone to famine and disease. But by the time anyone gets around to thinking, "Maybe this farming thing was a bad idea," it's too late. There's no living memory of another way of life. The trap has been sprung. We had gotten caught by our own desire for the "luxury" of owning some surplus food. For some scholars, like Samuel Bowles, it was the idea of ownership itself that trapped us.
Of course, if you could ask the species Homo sapiens whether this was a good deal, the answer – as with the wild wheat plants of yore – would be a definitive yes! So many more people. So much advancement in technology and so many peaks reached in culture. But for you and me as individuals, in terms of how we get to spend our days or our entire lives, maybe the answer is not so clear. Yes, I do love my modern medicine and video games and air travel. But living in a world of deep connections with nature and with others, one that included a lot of time not working for a boss – that sounds nice too.
So, what do you think? Was the trade-off worth it? Or was it a trap?
A 50-year study reveals the changing values children learn from pop culture.
- A new study tracked changes in values tweens (8-12 years old) get from popular culture.
- The researchers compared 16 values over a 50-year period.
- The report was created by UCLA's Center for Scholars and Storytellers.
A new report from UCLA's Center for Scholars and Storytellers focused on the values espoused by television programs that were popular with children ages 8 to 12 over half a century, from 1967 to 2017. The researchers looked at how 16 values changed in importance during that span.
The most important value in 2017, the last year the study looked at, was achievement, with self-acceptance, image, popularity, and belonging to a community rounding out the top five.
Another interesting finding charted the value of fame, which ranked near the bottom for nearly 40 years (15th until 1997), then skyrocketed to No. 1 in 2007. By 2017, it had dropped to No. 6.
Why was fame so important in 2007? The researchers tie it to the growth of social media platforms like Facebook (launched in 2004) and YouTube (launched in 2005). Teens quickly adopted these platforms, and many content creators from that first decade of the 2000s made "fame-focused tween shows," the researchers concluded. These platforms, "which for the first time allowed anyone and everyone to seek a large audience, were brand new and seemed an essential part of the zeitgeist," they wrote in the paper.
Studies have also shown that this rise in fame-seeking corresponded to a rise in narcissism, while empathy decreased.
Change in values from tween television.
Other values, like community feeling and benevolence, also fluctuated in significance through the years, according to the study. These shifts in importance corresponded to changes in the culture at large. The importance of community was No. 1 or No. 2 in four of the decades studied but fell to No. 11 in 2007. Being kind and helpful was No. 2 in 1967 and 1997 but only No. 12 in 2007. By 2017, it was back up to No. 8.
Psychology professor Yalda Uhls, one of the report's authors and the founder and executive director of the Center for Scholars and Storytellers, remarked on the trends:
"I believe that television reflects the culture, and this half-century of data shows that American culture has changed drastically," said Uhls. "Media plays an important role as young people are developing a concept of the social world outside of their immediate environment."
One big value shift the study pointed out has to do with the kinds of messages kids get from reality shows (evaluated since 2007) versus scripted fictional shows.
The most popular shows among tweens in 2017, based on Nielsen ratings, were "America's Got Talent" and "American Ninja Warrior." Among scripted shows, the top two were "Thundermans" and "Girl Meets World." The scripted shows conveyed values like self-acceptance, community belonging and being kind, while the reality shows promoted values like fame, self-centeredness and image.
Reality shows, which are created for a wide audience but watched frequently by tweens, tend to center on competition and the value of being the winner at something. They also celebrate tactics like bullying and cheating in order to win.
Most watched tween TV shows from 1967-2017 in the U.S.
The report's lead author, Agnes Varghese, a fellow of the center and a UC Riverside graduate student, explained how shows influence the kids:
"If tweens watch, admire and identify with people who mostly care about fame and winning, these values may become even more important in our culture," shared Varghese. "Reality television shows continued to reflect the same trend we saw in 2007, with self-focused values such as fame ranking highest."
Researchers also point out that shows, whether scripted or reality, are misleading in that they don't really show the hard work it takes to achieve fame, especially for an average person. This is a crucial shortcoming, considering that tweens form lifelong belief systems during these years based on what they perceive as desirable to achieve in the future.
Check out the full report here: "The Rise and Fall of Fame: Tracking the Landscape of Values Portrayed on Television from 1967 to 2017" (PDF).