Facebook: Valuable Business Tool or Waste of Time?
Clay Shirky: There are several different trends at work on the workday. My friend Dalton Conley over at NYU, the sociologist, has in fact just written a book about the way in which the formerly relatively sharp dividing line between work and home has blurred. That was a trend that started long before the Internet, although the Internet has certainly accelerated it. In a way, Minesweeper, right, the old time-waster, has been replaced by Facebook, the new time-waster. But Facebook is certainly a more pleasantly addictive pastime than Minesweeper was.
But to the larger point about going into your workday, spending all day answering emails, dealing with interruptive things, and then leaving feeling as if you've gotten nothing done... it seems to me that we are at the crux of a fairly significant social change in the way we conduct ourselves in the workplace because, to make a bold prediction, things that can't last, don't. Right? Since it takes longer to answer a question than to ask one, we can actually all make each other too busy to get anything done just by asking each other a bunch of questions. And the initial assumption when email, and later instant messaging and other forms of group communication, came into the workplace was that now, finally, we could be better coordinated. But better coordination means more and more communications interfaces, leaving your friend, and in fact all of us, leaving the workday feeling like, oh my god, all I did today was communicate but I accomplished nothing.
What we've seen in the kind of vanguard of the social movement—the open source software movement is the largest sort of collection of participatory tools—is that open source software projects have consistently grown to such a size that they can't actually host all of the internal communications. And what they do is they then subdivide themselves and they develop tools, not to help them communicate, but rather to help them not communicate. Which is to say, tools which allow individual workers to get their job done with a minimum of coordination. And there's going to be a competition among businesses as to who can create the best environment for their workers, one that minimizes interrupt logic and minimizes coordination. Because I think that the pain your friend is feeling, and again, that all of us feel, is really indicative of something quite deep, which is that we can now communicate as much as we always thought we needed to in the business environment, and it turns out to be catastrophic.
So, in large-scale enterprises, the trick is now starting to be to figure out which kinds of communication are critical and which are just sort of "cover your ass," constantly-"cc"-everybody, occupational-spam uses of the tool, and to start fairly rigorously stamping out that second category. Because if we all communicate with one another as much as we think we need to, we'll all swamp each other. Right? The source of your friend not getting anything done is other people, including him, on instant messages and email threads. But he is also himself the source of other people not getting anything done. And it's going to take coordinated action, probably by the leadership of those companies, to put the company back on a footing where you can limit coordination and collaboration to the critical moments rather than having it swamp everybody.
Question: How should companies deal with these online distractions?
Clay Shirky: You know, different companies deal with it differently. I think increasingly, between the cultural expectations and the difficulty of shutting off access, this is becoming like the personal computer, like email, like instant messaging. Every one of those things—and, you know, now Facebook and Twitter—every one of those things was brought into the business not because somebody in the executive suite said, "Now we have to have personal computers." They were dragged into the business because the accountants hated talking to the mainframe guys. And so, once VisiCalc came along, they just brought their own PCs into the enterprise and hid them for a while.
If you went and talked to somebody about email in the mid-'90s, you know, maybe they'd heard about it, maybe they hadn't. There would be some, "Oh, maybe someday we'll get an email address." Right? But you go down and you talk to the sales guys, and their business cards all have AOL addresses on them because their clients have demanded it.
Instant messaging: if you talked to the Wall Street guys about instant messaging in the late '90s—"Do you ever talk to your clients on IM?" "Oh, no, no. The SEC would never let us do that." Right? And yet the brokers all have an ICQ number. So, the second phase of all of that is the business then panicking and saying, our employees are doing something that we didn't allow them to do. At which point the hurdle the technology has to cross is: this is embedded enough in the cultural and business logic of this company that you can't not do it.
People in call centers will lose that battle. Right? If you're in a call center—you're in a cubicle farm and you've got your script—and you're, you know, spending a lot of time on Facebook when you should be on the phone, they're going to shut that down. People in magazines, people in newspapers, people in the media are at the other extreme. Of course they're going to have maximum access. But my guess is that, as with the personal computer, email, and instant messaging, participating in social networks as a way of figuring out what your customers are doing, figuring out what your vendors are doing, figuring out what your clients are doing, recruiting new hires—all of these kinds of uses are going to seem to have enough value that after a while most companies are going to capitulate and reopen the firewall inasmuch as they've shut it down.
Recorded on May 26, 2010
Interviewed by Victoria Brown
Like the personal computer, e-mail and instant messaging, social networks are now vital for businesses—even if they are also distractions.
Permanent flooding has become commonplace on this low-lying peninsula, nestled behind North Carolina's Outer Banks. The trees growing in the water are small and stunted. Many are dead.
Throughout coastal North Carolina, evidence of forest die-off is everywhere. Nearly every roadside ditch I pass while driving around the region is lined with dead or dying trees.
As an ecologist studying wetland response to sea level rise, I know this flooding is evidence that climate change is altering landscapes along the Atlantic coast. It's emblematic of environmental changes that also threaten wildlife, ecosystems, and local farms and forestry businesses.
Like all living organisms, trees die. But what is happening here is not normal. Large patches of trees are dying simultaneously, and saplings aren't growing to take their place. And it's not just a local issue: Seawater is raising salt levels in coastal woodlands along the entire Atlantic Coastal Plain, from Maine to Florida. Huge swaths of contiguous forest are dying. They're now known in the scientific community as “ghost forests."
Deer photographed by a remote camera in a climate change-altered forest in North Carolina. Emily Ury, CC BY-ND
The insidious role of salt
Sea level rise driven by climate change is making wetlands wetter in many parts of the world. It's also making them saltier.
In 2016 I began working in a forested North Carolina wetland to study the effect of salt on its plants and soils. Every couple of months, I suit up in heavy rubber waders and a mesh shirt for protection from biting insects, and haul over 100 pounds of salt and other equipment out along the flooded trail to my research site. We are salting an area about the size of a tennis court, seeking to mimic the effects of sea level rise.
After two years of effort, the salt didn't seem to be affecting the plants or soil processes that we were monitoring. I realized that instead of waiting around for our experimental salt to slowly kill these trees, the question I needed to answer was how many trees had already died, and how much more wetland area was vulnerable. To find answers, I had to go to sites where the trees were already dead.
Rising seas are inundating North Carolina's coast, and saltwater is seeping into wetland soils. Salts move through groundwater during phases when freshwater is depleted, such as during droughts. Saltwater also moves through canals and ditches, penetrating inland with help from wind and high tides. Dead trees with pale trunks, devoid of leaves and limbs, are a telltale sign of high salt levels in the soil. A 2019 report called them “wooden tombstones."
As the trees die, more salt-tolerant shrubs and grasses move in to take their place. In a newly published study that I coauthored with Emily Bernhardt and Justin Wright at Duke University and Xi Yang at the University of Virginia, we show that in North Carolina this shift has been dramatic.
The state's coastal region has suffered a rapid and widespread loss of forest, with cascading impacts on wildlife, including the endangered red wolf and red-cockaded woodpecker. Wetland forests sequester and store large quantities of carbon, so forest die-offs also contribute to further climate change.
Researcher Emily Ury measuring soil salinity in a ghost forest. Emily Bernhardt, CC BY-ND
Assessing ghost forests from space
To understand where and how quickly these forests are changing, I needed a bird's-eye perspective. That perspective comes from satellites like those in NASA's Earth Observing System, which are important sources of scientific and environmental data.
A 2016 Landsat 8 image of the Albemarle-Pamlico Peninsula in coastal North Carolina. USGS
Since 1972, Landsat satellites, jointly operated by NASA and the U.S. Geological Survey, have captured continuous images of Earth's land surface that reveal both natural and human-induced change. We used Landsat images to quantify changes in coastal vegetation since 1984 and referenced high-resolution Google Earth images to spot ghost forests. Computer analysis helped identify similar patches of dead trees across the entire landscape.
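The kind of change detection described above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual pipeline: it computes NDVI, a standard vegetation index derived from the red and near-infrared bands of imagery like Landsat's, on small synthetic arrays standing in for real rasters, and flags pixels whose index has dropped sharply between two dates.

```python
# Hypothetical sketch of NDVI-based vegetation-loss detection.
# The band arrays below are synthetic stand-ins for real Landsat
# surface-reflectance rasters (which would typically be read with
# a library such as rasterio).
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, guarded against divide-by-zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

def vegetation_loss_mask(nir_t0, red_t0, nir_t1, red_t1, drop=0.3):
    """Pixels whose NDVI fell by more than `drop` between the two dates."""
    return (ndvi(nir_t0, red_t0) - ndvi(nir_t1, red_t1)) > drop

# Synthetic "1984" scene: healthy forest (high NIR, low red reflectance)
nir_1984 = np.full((4, 4), 0.50)
red_1984 = np.full((4, 4), 0.05)

# Synthetic recent scene: reflectance collapses in the left half (ghost forest)
nir_now = nir_1984.copy()
red_now = red_1984.copy()
nir_now[:, :2] = 0.20
red_now[:, :2] = 0.18

loss = vegetation_loss_mask(nir_1984, red_1984, nir_now, red_now)
print(f"{loss.mean():.0%} of pixels flagged as forest loss")  # 50%
```

In practice the thresholding and patch identification would be far more involved (cloud masking, atmospheric correction, classification), but the core signal is this per-pixel drop in greenness over time.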
Google Earth image of a healthy forest on the right and a ghost forest with many dead trees on the left. Emily Ury
The results were shocking. We found that more than 10% of forested wetland within the Alligator River National Wildlife Refuge was lost over the past 35 years. This is federally protected land, with no other human activity that could be killing off the forest.
Rapid sea level rise seems to be outpacing the ability of these forests to adapt to wetter, saltier conditions. Extreme weather events, fueled by climate change, are causing further damage from heavy storms, more frequent hurricanes and drought.
We found that the largest annual loss of forest cover within our study area occurred in 2012, following a period of extreme drought, forest fires and storm surges from Hurricane Irene in August 2011. This triple whammy seemed to have been a tipping point that caused mass tree die-offs across the region.
Should scientists fight the transition or assist it?
As global sea levels continue to rise, coastal woodlands from the Gulf of Mexico to the Chesapeake Bay and elsewhere around the world could also suffer major losses from saltwater intrusion. Many people in the conservation community are rethinking land management approaches and exploring more adaptive strategies, such as facilitating forests' inevitable transition into salt marshes or other coastal landscapes.
For example, in North Carolina the Nature Conservancy is carrying out some adaptive management approaches, such as creating “living shorelines" made from plants, sand and rock to provide natural buffering from storm surges.
A more radical approach would be to introduce marsh plants that are salt-tolerant in threatened zones. This strategy is controversial because it goes against the desire to try to preserve ecosystems exactly as they are.
But if forests are dying anyway, having a salt marsh is a far better outcome than allowing a wetland to be reduced to open water. While open water isn't inherently bad, it does not provide the many ecological benefits that a salt marsh affords. Proactive management may prolong the lifespan of coastal wetlands, enabling them to continue storing carbon, providing habitat, enhancing water quality and protecting productive farm and forest land in coastal regions.
A new study used functional near-infrared spectroscopy (fNIRS) to measure brain activity as inexperienced and experienced soccer players took penalty kicks.
- The new study is the first to use in-the-field imaging technology to measure brain activity as people delivered penalty kicks.
- Participants were asked to kick a total of 15 penalty shots under three different scenarios, each designed to be increasingly stressful.
- Kickers who missed shots showed higher activity in brain areas that were irrelevant to kicking a soccer ball, suggesting they were overthinking.
In a 2019 soccer match, Swansea City was down 1-0 against West Brom late in the first half. A penalty was called against West Brom. Swansea midfielder Bersant Celina was preparing to deliver a penalty kick. He scuttled up to the ball, but his foot only made partial contact, lobbing it weakly to the right.
Was it a simple mistake? Maybe. But there might be deeper explanations for why professional athletes choke under high-pressure situations.
A new study published in Frontiers in Computer Science used functional near-infrared spectroscopy (fNIRS) to analyze the brain activity of inexperienced and experienced soccer players as they missed penalty shots. Although past research has explored why soccer players miss penalty shots, the recent study is the first to do so using in-the-field fNIRS measurement.
The results showed that kickers who choked were activating parts of their brain associated with long-term thinking, self-instruction, and self-reflection. The chokers, in other words, were overthinking it.
The psychology of penalty kicks
Penalty shots offer an interesting case study of how mental pressure affects physical performance. After all, there's a lot at stake, not only because the kick can sometimes render a win or loss, but also because there are sometimes millions of people anxiously watching, some of whom might have a financial interest in the outcome.
That pressure is no joke. For example, research on Men's World Cup penalty shoot-outs has shown that when the score is tied and a goal means an immediate win, players score 92 percent of kicks. But when teams are facing elimination in a shootout, and the kick determines an immediate tie or loss, players only score 60 percent of the time.
"How can it be that football players with a near perfect control over the ball (they can very precisely kick a ball over more than 50 meters) fail to score a penalty kick from only 11 meters?" study co-author Max Slutter, of the University of Twente in the Netherlands, said in a press release.
"Obviously, huge psychological pressure plays a role, but why does this pressure cause a missed penalty? We tried to answer this by measuring the brain activity of football players during the physical execution of a penalty kick."
In the new study, the researchers aimed to answer two key questions about choking under pressure among both experienced and inexperienced players: (1) What is the difference in brain activity between success (scoring) and failure (missing) when taking a penalty kick? (2) What brain activity is associated with performing under pressure during a penalty kick situation?
To find out, the researchers asked ten experienced soccer players and twelve inexperienced players to participate in a penalty-kicking task. The task was divided into three rounds, each of which was designed to be increasingly stressful:
- Round 1 had no goalkeeper and was labeled as a practice round.
- Round 2 had a friendly goalkeeper who wasn't allowed to distract the kicker.
- Round 3 had a competitive goalkeeper who was allowed to distract the kicker, and kickers were also competing for a prize.
Participants kicked five shots in each round. They wore a fNIRS-equipped headset during the task that measured activity in various parts of the brain.
All participants performed worse in the second and third rounds and reported experiencing the most pressure in the third round. Inexperienced players performed worse than experienced players, which might suggest that they were less able to deal with the mental stress.
The locations in which experienced and inexperienced players kicked the ball in each round. Red dots represent missed penalties and green dots represent scored penalties. Slutter et al., Frontiers in Computer Science, 2021.
The neuroscience of choke artists
So, what types of brain activity were associated with missed shots?
The most noticeable result was that kickers missed more shots when they showed higher activity in their prefrontal cortex (PFC), an area of the brain associated with long-term planning. This was especially true among participants who reported higher levels of anxiety. More specifically, experienced soccer players who missed shots showed high activity in the left temporal cortex, which is related to self-instruction and self-reflection.
"By activating the left temporal cortex more, experienced players neglect their automated skills and start to overthink the situation," the researchers wrote. "This increase can be seen as a distracting factor."
Also, when players of all experience levels felt anxious and missed shots, they showed less activity in the motor cortex, which is the brain area most directly associated with kicking a penalty shot.
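The pattern reported above — elevated prefrontal activity and suppressed motor-cortex activity on missed kicks — can be illustrated with a toy comparison. Everything below is simulated: the numbers, group sizes, and region labels are invented for illustration and are not the study's data or analysis code.

```python
# Toy illustration (NOT the study's pipeline): comparing simulated mean
# fNIRS activity per brain region between scored and missed penalty kicks.
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-kick mean activity (arbitrary units) for each region
scored = {
    "motor_cortex": rng.normal(1.0, 0.1, 30),       # task-relevant: high
    "prefrontal_cortex": rng.normal(0.2, 0.1, 30),  # task-irrelevant: low
}
missed = {
    "motor_cortex": rng.normal(0.6, 0.1, 30),       # suppressed under anxiety
    "prefrontal_cortex": rng.normal(0.9, 0.1, 30),  # "overthinking": elevated
}

# Group-mean difference per region: positive means more activity on misses
for region in scored:
    diff = missed[region].mean() - scored[region].mean()
    print(f"{region}: missed - scored = {diff:+.2f}")
```

Under these made-up parameters, misses show a positive prefrontal difference and a negative motor-cortex difference, mirroring the direction of the study's finding; a real analysis would of course test such differences statistically.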
Don't overthink it
The results suggest that mental pressure can activate parts of the brain that are irrelevant to the task at hand. In general, expert athletes show more efficient brain activity — that is, more activity in relevant areas, and less activity in irrelevant areas — and therefore experience fewer distractions. This is likely one reason why they were more successful at penalties than inexperienced players in high-stress situations.
This principle is described by neural efficiency theory, and it applies not only to athletes but experts in any field. As you gain mastery over something, you can rely more on automatic brain processes rather than deliberate thinking, which can lead to distractions. The authors of the study concluded that their results provide supporting evidence for neural efficiency theory.
Still, as long as our experts are human, it seems that high-pressure situations can turn anyone into a choke artist.
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been one of the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be considered on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
What's the difference between brainwashing and rehabilitation?
- The book and movie A Clockwork Orange powerfully ask us to consider the murky lines between rehabilitation, brainwashing, and dehumanization.
- There are a variety of ways, from hormonal treatment to surgical lobotomies, to force a person to be more law-abiding, calm, or moral.
- Is a world with less free will but also with less suffering one in which we would want to live?
Alex is a criminal. A violent and sadistic criminal. So, we decide to do something about it. We're going to "rehabilitate" him.
Using a new and exciting "Ludovico" technique, we'll change his brain chemistry to make him an upstanding, moral citizen. Alex will be forced to watch violent movies as his body is pumped with nausea-inducing drugs. After a while, he'll come to associate violence with this horrible sickness. And, after a course of Ludovico, Alex can happily return to society, never again doing an immoral or illegal act. He'll no longer be a danger to himself or anyone else.
This is the story of A Clockwork Orange by Anthony Burgess, and it raises important questions about the nature of moral decisions, free will, and the limits of rehabilitation.
Today's Clockwork Orange
This might seem like unbelievable science fiction, but it might be truer — and nearer — than we think. In 2010, Dr. Molly Crockett did a series of experiments on moral decision-making and serotonin levels. Her results showed that people with more serotonin were less aggressive or confrontational and much more easy-going and forgiving. When we're full of serotonin, we let insults pass, are more empathetic, and are less willing to do harm.
The idea that biology affects moral decisions is obvious. Most of us are more likely to be short-tempered and spiteful if we're tired or hungry, for instance. Conversely, we have the patience of a saint if we've just received some good news, had half a bottle of wine, or had sex.
If our decision-making can be manipulated or determined by our biology, should we not try various interventions to prevent the criminally inclined from harming others?
What is the point of prison? This is itself no easy question, and it's one with a rich philosophical debate. Surely one of the biggest reasons is to protect society by preventing criminals from reoffending. This might be achievable by manipulating a felon's serotonin levels, but why not go even further?
Today, we know enough about the brain to have identified a very particular part of the prefrontal cortex responsible for aggressive behavior. We know that certain abnormalities in the amygdala can result in anti-social behavior and rule breaking. If the purpose of the penal system is to rehabilitate, then why not "edit" these parts of the brain in some way? This could be done in a variety of ways.
Credit: Otis Historical Archives National Museum of Health and Medicine via Flickr / Wikipedia
Electroconvulsive therapy (ECT) is a surprisingly common practice in much of the developed world. Its supporters say that it can help relieve major mental health issues such as depression or bipolar disorder as well as alleviate certain types of seizures. Historically, and controversially, it has been used to "treat" homosexuality and was used to threaten those misbehaving in hospitals in the 1950s (as notoriously depicted in One Flew Over the Cuckoo's Nest). Of course, these early and crude efforts at ECT were damaging, immoral, and often left patients barely able to function as humans. Today, neuroscience and ECT are much more sophisticated. If we could easily "treat" those with aggressive or anti-social behavior, then why not?
Ideally, we might use techniques such as ECT or hormonal supplementation, but failing that, why not go even further? Why not perform a lobotomy? If the purpose of the penal system is to change the felon for the better, we should surely use all the tools at our disposal. With one fairly straightforward surgery to the prefrontal cortex, we could turn a violent, murderous criminal into a docile and law-abiding citizen. Should we do it?
Is free will worth it?
As Burgess, who penned A Clockwork Orange, wrote, "Is a man who chooses to be bad perhaps in some way better than a man who has the good imposed upon him?"
Intuitively, many say yes. Moral decisions must, in some way, be our own. Even if we know that our brains determine our actions, it's still me who controls my brain, no one else. Forcing someone to be good, by molding or changing their brain, is not creating a moral citizen. It's creating a law-abiding automaton. And robots are not humans.
And yet, it raises the question: is "free choice" worth all the evil in the world?
If my being brainwashed or "rehabilitated" means children won't die malnourished or the Holocaust would never happen, then so be it. If lobotomizing or neuro-editing a serial killer will prevent them from killing again, is that not a sacrifice worth making? There's no obvious reason why we should value free will above morality or the right to life. A world without murder and evil — even if it meant a world without free choices for some — might not be such a bad place.
As Fyodor Dostoyevsky wrote in The Brothers Karamazov, if the "entrance fee" for having free will is the horrendous suffering we see all around us, then "I hasten to return my ticket." Free will's not worth it.
Do you think the Ludovico technique from A Clockwork Orange is a great idea? Should we turn people into moral citizens and shape their brains to choose only what is good? Or is free choice more important than all the evil in the world?