Why Unfriending People on Facebook Is Immature and Counterproductive
Engaging with the world might not be comfortable, but it's much healthier than ignoring what you don't want to see.
You’ve most likely seen this: If you don’t believe in XXX, unfriend me now. Remove ‘don’t’ and a similar scenario unfolds. You can fill in XXX with anything: veganism; stopping Trump; stopping Clinton; racism; gun rights. The daily banter can be stifling, as can an inability to manage opinions other than your own.
Part of this is understandable. We all have strong beliefs regarding any number of issues. Often one or two such issues rise to the top of our mental catalog of things to care about. Some are benign pet peeves, such as proper grammar on social media. Others are quite relevant and potentially dangerous, such as the reverberating effects of racism or the next person to be elected president.
Content is almost irrelevant here, however. What we’re discussing is the attitude of someone who says: If you don’t agree with me, I don’t ever want to see or hear from you again! This coddling has been well documented on college campuses in recent years. Yet this mindset is not restricted to universities. The attitude is apparent everywhere.
British anthropologist and evolutionary psychologist Robin Dunbar is famous for his 'number': 150. He estimated that this is roughly how many relationships humans are able to keep track of without taxing our cognitive limits. Everywhere he looked, the evidence panned out: Neolithic Mesopotamian tribes; eleventh-century villages in Wales; ancient Roman army corps. His mean group size of 148 is, for simplicity's sake, rounded up.
Within the folds of his theory lies another, lesser-known idea. How humans evolved from primates is a longstanding matter of contention. Dunbar believes that social interaction is the main driver. This makes sense given the other candidates: language, an advanced form of communication that depends upon others to listen and talk with; nutrition, which greatly advanced thanks to group hunting; and technology, since even simple stone tools require input and debate.
If the average tribe size is 150 people, it makes sense that you'd want to get along with all of them, whether for protection, communal sharing, storytelling, or play. Of course, there are always other tribes to worry about, which is where the unfriending phenomenon originates. If you believe XXX, you're not even human: a sentiment at the roots of culture and religion since time immemorial. Disgust is a strong emotion with evolutionary benefits. Applying it blindly, though, is not helpful.
The larger your social network, the weaker its ties. Intimate connections are usually counted on one, perhaps two, hands. My five thousand Facebook friends and thousands of other connections from pages I manage? I wouldn’t recognize them if I walked right into them. But—and this is important—if they mention we’re connected on social media, an emotional link has been made. There is some reference point, no matter how hazy, that immediately pushes through a barrier of uncertainty. At the very least, a conversation has begun.
Social we are: two-thirds of all conversation is gossip, either about those immediately present or about others who are absent. Wharton management professor Eric Foster found that women gossip no more than men do. Given the amount of discussion I hear in the locker room about what this or that coach or player should have done last night, this is not surprising. Men simply choose a word other than gossip, but that changes nothing.
Most discussion of unfriending concerns why you should do it. In this rather juvenile take, you're told to dump political ranters, negative people, attention seekers, and, my favorite, "anyone who makes you feel really crappy about yourself." This is exactly how many people already react to foreign ideas. If someone holds up a mirror that makes me question something about myself, get a rock ready. Throw. Just don't look.
What is being lost in this age of unfriending is debate. What might seem commonsensical to you might not be so to others. Or they might simply have a different opinion. I would not expect any belief of mine to be shared by seven billion others on this planet, nor even by my 150 closest friends. Honest dialogue and discussion only make us stronger.
That's impossible when you're clicking the unfriend button at the first sign of distress. As I wrote earlier this week regarding education, our brains are nearly fully formed by age six; it takes another twenty years for them to become fully myelinated, acquiring the fatty insulation that connects every neural region like a superhighway. This means our emotional, reptilian brain is not chatting regularly with our prefrontal cortex, the seat of reason and one region implicated in Dunbar's theory of social relationships. Little is nuanced; we lash out at what frustrates us. Worse, we hide from it.
Ideally, education is a lifelong pursuit. This means coming to terms with views that challenge your own. By engaging with others of differing opinions, you might find your viewpoint strengthened. You might feel neutral. You might even change your mind, which has the potential to reshape the course of your life.
None of this happens when the unfriend syndrome ravages your thoughts. Perhaps some brains are too myelinated: the insulation no longer lets any air pass through. That's a shame. Debate is an essential component of community building. When it's lost, well, so is much more.
Derek Beres is a Los Angeles-based author, music producer, and yoga/fitness instructor at Equinox Fitness. Stay in touch @derekberes.
Researchers hope the technology will further our understanding of the brain, but lawmakers may not be ready for the ethical challenges.
- Researchers at the Yale School of Medicine successfully restored some functions to pig brains that had been dead for hours.
- They hope the technology will advance our understanding of the brain, potentially developing new treatments for debilitating diseases and disorders.
- The research raises many ethical questions and puts to the test our current understanding of death.
The image of an undead brain coming back to life is the stuff of science fiction. Not just any science fiction, but specifically B-grade sci-fi. What instantly springs to mind are the black-and-white horrors of films like Fiend Without a Face. Bad acting. Plastic monstrosities. Visible strings. And a spinal cord that, for some reason, is also a tentacle?
But like any good science fiction, it's only a matter of time before some manner of it seeps into our reality. This week's Nature published the findings of researchers who managed to restore function to pigs' brains that were clinically dead. At least, what we once thought of as dead.
What's dead may never die, it seems
The researchers did not hail from House Greyjoy ("What is dead may never die") but largely from the Yale School of Medicine. They connected 32 pig brains to a system called BrainEx, an artificial perfusion system: one that takes over the circulatory functions the body normally provides to the organ. The pigs had been killed four hours earlier at a U.S. Department of Agriculture slaughterhouse, and their brains had been completely removed from their skulls.
BrainEx pumped an experimental solution into the brain that essentially mimics blood flow. It brought oxygen and nutrients to the tissues, giving brain cells the resources to resume many normal functions. The cells began consuming and metabolizing sugars. The brains' immune systems kicked in. Neuron samples could carry an electrical signal. Some brain cells even responded to drugs.
The researchers have managed to keep some brains active for up to 36 hours, and do not yet know whether BrainEx could sustain them longer. "It is conceivable we are just preventing the inevitable, and the brain won't be able to recover," said Nenad Sestan, Yale neuroscientist and the lead researcher.
As a control, other brains received either a fake solution or no solution at all. None showed revived activity, and all deteriorated as normal.
The researchers hope the technology can enhance our ability to study the brain and its cellular functions. One of the main avenues for such studies would be brain disorders and diseases. This could point the way to new treatments for brain injuries, Alzheimer's, Huntington's, and other neurodegenerative conditions.
"This is an extraordinary and very promising breakthrough for neuroscience. It immediately offers a much better model for studying the human brain, which is extraordinarily important, given the vast amount of human suffering from diseases of the mind [and] brain," Nita Farahany, a bioethicist at the Duke University School of Law who wrote the study's accompanying commentary, told National Geographic.
An ethical gray matter
Before anyone gets an Island of Dr. Moreau vibe, it's worth noting that the brains' neural activity never came anywhere near consciousness.
The BrainEx solution contained chemicals that prevented neurons from firing. To be extra cautious, the researchers also monitored the brains for any such activity and were prepared to administer an anesthetic if they saw signs of consciousness.
Even so, the research signals a massive debate to come regarding medical ethics and our definition of death.
Most countries define death, clinically speaking, as the irreversible loss of brain or circulatory function. This definition was already at odds with some folk- and value-centric understandings, but where do we go if it becomes possible to reverse clinical death with artificial perfusion?
"This is wild," Jonathan Moreno, a bioethicist at the University of Pennsylvania, told the New York Times. "If ever there was an issue that merited big public deliberation on the ethics of science and medicine, this is one."
One possible consequence involves organ donations. Some European countries require emergency responders to use a process that preserves organs when they cannot resuscitate a person. They continue to pump blood throughout the body, but use a "thoracic aortic occlusion balloon" to prevent that blood from reaching the brain.
The system is already controversial because it raises concerns about what caused the patient's death. But what happens when brain death becomes readily reversible? Stuart Younger, a bioethicist at Case Western Reserve University, told Nature that if BrainEx were to become widely available, it could shrink the pool of eligible donors.
"There's a potential conflict here between the interests of potential donors — who might not even be donors — and people who are waiting for organs," he said.
It will be a while before such experiments go anywhere near human subjects. A more immediate ethical question relates to how such experiments harm animal subjects.
Ethical review boards evaluate research protocols and can reject any that cause undue pain, suffering, or distress. Since dead animals feel no pain and suffer no trauma, they are typically approved as subjects. But how do such boards make a judgement regarding the suffering of a "cellularly active" brain? The distress of a partially alive brain?
The dilemma is unprecedented.
Setting new boundaries
Another science fiction story that comes to mind when discussing this story is, of course, Frankenstein. As Farahany told National Geographic: "It is definitely has [sic] a good science-fiction element to it, and it is restoring cellular function where we previously thought impossible. But to have Frankenstein, you need some degree of consciousness, some 'there' there. [The researchers] did not recover any form of consciousness in this study, and it is still unclear if we ever could. But we are one step closer to that possibility."
She's right. The researchers undertook their research for the betterment of humanity, and we may one day reap some unimaginable medical benefits from it. The ethical questions, however, remain as unsettling as the stories they remind us of.