Could a pill make you more moral? Should you take it if it could?
- Moral enhancement is the idea that technology can be used to make us more moral people.
- Proponents argue that we need to be better people in order to solve global problems.
- Ideas on how to use this ethically abound, but no solid consensus exists yet.
People have been artificially enhancing themselves for a long time. Caffeine and other stimulants improve our cognitive performance and might have made the Enlightenment possible. More controversially, some athletes use steroids to enhance their athletic performance beyond what would naturally be possible for them.
These aren't the only ways that we can use science and technology to improve our performance, of course. In the last few years, some philosophers have argued that we can, and perhaps should, use these tools to enhance our moral abilities to become a more cooperative, empathetic, or properly motivated species.
Moral enhancement explained
The term "moral enhancement" was first used in a 2008 essay by Tom Douglas. It generally refers to biomedical enhancements but can refer to any technological attempt to make humans more moral. While one could debate what "more moral" means, the literature on the subject focuses on ideas of making people more cooperative, altruistic, and the like.
I reached out to Dr. Joao Fabiano, a Visiting Fellow at Harvard University's Safra Center for Ethics, for more information. He expanded on the idea of moral enhancement and provided the motivation for it.
We all sometimes behave worse than we think we should but have a hard time improving. Moral enhancement would be a technological intervention that helps us behave as we should. There is often a certain pattern to our moral failures shared by most of us. As the neuroscience of morality progresses, we might be able to fix these failures with technology. In fact, we urgently need moral enhancement given the grave social problems these moral failures create and their ingrained biological nature...
...Many of these recurrent moral failures are connected to grave problems in society, such as our inability to tackle global threats (global warming, nuclear proliferation, and pandemics) and grave injustices. Often, these failures can be explained by evolutionary science; they are deep-seated adaptations hardwired in our brains which we can, sometimes, costly and partially control with improved social norms. For instance, many forms of group favoritism and discrimination, such as racism, are to some degree evolved adaptations to an ancestral environment where groups were small and at constant war, and long-distance trade was limited. As neuroscience continues to uncover the biological modulators of our moral behaviour, we might soon be able to reliably influence that behavior with technological interventions.
Ways to make people more moral
Several studies have demonstrated that the moral actions people take can be influenced with biomedical interventions. One found that people will be more aggressive and more likely to violate social norms when their serotonin levels are artificially lowered. Another found that increasing serotonin levels made people harm-averse and more likely to stick to ideas of fairness. Lowering the amount of tryptophan, a precursor to serotonin and melatonin, that people have in their system makes them less cooperative.
Outside the laboratory, some commonly used drugs, such as painkillers and antidepressants, are also known to subtly alter moral decision-making. Acetaminophen, the active ingredient in Tylenol, has been shown to blunt empathy. Remember that the next time you make a decision after taking a painkiller.
Dr. Fabiano points out that the widespread use of these drugs means that "technology is already interfering with our morality, sometimes in undesirable and unpredictable ways." He adds, "We should, at the very least, try to take control of that to produce desirable changes."
He also mentioned, however, that no drug that can reliably enhance moral behavior currently exists. So you shouldn't get the idea that you'll be able to enhance yourself tomorrow.
While philosophers have only been discussing this idea for the last decade or so, plenty of them have argued both for and against moral enhancement.
The basic argument for moral enhancement has already been outlined: humans are prone to certain moral failures, those failures can be corrected, and technological interventions give us the means to correct them. Some thinkers, such as Julian Savulescu and Ingmar Persson, go further and argue that we have a moral imperative to do so, since the possibility of even a single person causing widespread destruction is greater now than it has ever been.
On the other hand, some thinkers, like Allen Buchanan, suggest that while the problems that many proponents of moral enhancement want to solve are real, moral enhancement isn't likely to be a feasible solution to these problems.
Instead, these thinkers propose that non-medical interventions, such as adopting more progressive and accepting attitudes toward out-groups, have proven that our moral natures are not fixed and can be improved without technological intervention — even if the process is a little slow. They additionally have a few doubts about the feasibility or desirability of relying on technology to improve our morals and conclude that focusing on traditional methods is the better bet.
Of course, these are not mutually exclusive options, and it is possible that moral enhancement can be used in tandem with more traditional methods of making people more moral.
The many problems with moral enhancement
The problem of how to actually implement any technological solution remains unsolved. While some philosophers, including Dr. Fabiano, have developed frameworks to guide our use of this technology, there is no real consensus on it. This is a bit of a problem, as simplistic variations of moral enhancement, such as the use of chemical castration as a tool to try to reform sexual offenders, are already in use today in ways that are controversial.
Moral enhancement raises many other ethical questions. Which traits should be enhanced (or suppressed)? What are the side effects of taking a drug that alters your moral behavior? Should such treatments be required for some people, like violent criminals?
Ironically, there is even the chance that improving in-group cooperation, a possible excellent application of moral enhancement, could cause other problems. As Dr. Fabiano explains, "[T]here is a lot of empirical evidence indicating that a drug increasing cooperation between individuals would likely decrease cooperation between groups. Highly cooperative groups tend to be highly discriminatory. Such a drug would create more problems than it would solve."
On the other hand, the possible benefits of moral enhancement are obvious. People could become more cooperative, empathetic, or altruistic without the years of work that our current moral improvement systems require. Problems we currently face could vanish in the face of an enhanced population. As Dr. Savulescu argues, this is enough of a benefit to make moral enhancement a worthwhile consideration.
If offered to you, would you take the pill?
Study confirms the existence of a special kind of groupthink in large groups.
- Large groups of people everywhere tend to come to the same conclusions.
- In small groups, there's a much wider diversity of ideas.
- The mechanics of a large group make some ideas practically inevitable.
People make sense of the world by organizing things into categories and naming them. "These are circles." "That's a tree." "Those are rocks." It's one way we tame our world. There's a weird correspondence between different cultures, though — even though we come from different places and very different circumstances, cultures everywhere develop largely the same categorizations.
"But this raises a big scientific puzzle," says Damon Centola of the University of Pennsylvania. "If people are so different, why do anthropologists find the same categories, for instance for shapes, colors, and emotions, arising independently in many different cultures? Where do these categories come from and why is there so much similarity across independent populations?"
Centola is the senior investigator of a new study in the journal Nature Communications from the Network Dynamics Group (NDG) at the Annenberg School for Communication that explores how such categorization happens.
Some have theorized that these categories are innate—pre-wired in our brains—but the study says "nope." Its authors hypothesize that it has more to do with the dynamics of large groups, or networks.
The grouping game
Some of the shapes used in the experiment
Credit: Guilbeault, et al./University of Pennsylvania
The researchers tested their theory with 1,480 people playing an online "Grouping Game" via Amazon's Mechanical Turk platform. The individuals were paired with another participant or made a member of a group of 6, 8, 24, or 50 people. Each pair and group were tasked with categorizing the symbols shown above, and they could see each other's answers.
The small groups came up with wildly divergent categories—the entire experiment produced nearly 5,000 category suggestions—while the larger groups came up with categorization systems that were virtually identical to each other.
Says Centola, "Even though we predicted it, I was nevertheless stunned to see it really happen. This result challenges many long-held ideas about culture and how it forms."
Nor was this unanimity a matter of having teamed up like-minded individuals. "If I assign an individual to a small group," says lead author Douglas Guilbeault, "they are much more likely to arrive at a category system that is very idiosyncratic and specific to them. But if I assign that same individual to a large group, I can predict the category system that they will end up creating, regardless of whatever unique viewpoint that person happens to bring to the table."
Why this happens
The many categories suggested by small groups on the left, the few from large groups on the right
Credit: Guilbeault, et al./Nature Communications
The striking results of the experiment correspond to a previous study done by NDG that investigated tipping points for people's behavior in networks.
That study concluded that after an idea enters a discussion among a large network of people, it can gain irresistible traction by popping up again and again in enough individuals' conversations. In networks of 50 people or more, such ideas eventually reach critical mass and become a prevailing opinion.
The same phenomenon does not happen often enough within a smaller network, where fewer interactions offer an idea less of an opportunity to take hold.
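The convergence dynamic described above can be sketched with a toy "naming game" simulation. To be clear, this is an illustrative model, not the researchers' actual code or experiment: agents repeatedly pair up at random, a speaker proposes a label from its inventory, and on a successful match both agents drop all competing labels. Given enough interactions, a population tends to collapse onto a single shared convention, while limited interaction leaves many rival labels in play.

```python
import random

def naming_game(n_agents, n_rounds, seed=0):
    """Toy naming-game simulation: agents repeatedly pair up and
    negotiate a label until (ideally) one convention dominates.
    Returns the number of distinct labels surviving at the end."""
    rng = random.Random(seed)
    # Each agent starts with an empty inventory of candidate labels.
    inventories = [set() for _ in range(n_agents)]
    for _ in range(n_rounds):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not inventories[speaker]:
            # An agent with no label yet invents a unique one.
            inventories[speaker].add(f"label-{speaker}")
        word = rng.choice(sorted(inventories[speaker]))
        if word in inventories[hearer]:
            # Success: both agents drop all competing labels.
            inventories[speaker] = {word}
            inventories[hearer] = {word}
        else:
            # Failure: the hearer merely learns the proposed label.
            inventories[hearer].add(word)
    # Count how many distinct labels survive across the population.
    return len(set().union(*inventories))

# Many interactions tend to yield one shared convention;
# few interactions leave many competing labels alive.
print(naming_game(n_agents=50, n_rounds=20000))
print(naming_game(n_agents=50, n_rounds=200))
```

The group sizes and round counts here are arbitrary choices for illustration; the qualitative point is simply that convergence is driven by repeated interaction within a large enough network, not by any pre-wired agreement among the agents.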
The study's finding raises an interesting practical possibility: Would categorization-related decisions made by large groups be less likely to fall prey to members' individual biases?
With this question in mind, the researchers are currently looking into content moderation on Facebook and Twitter. They're investigating whether the platforms would be wiser when categorizing content as free speech or hate speech if large groups were making these decisions instead of lone individuals working at these companies.
Similarly, they're also exploring the possibility that larger networks of doctors and healthcare professionals might be better at making diagnoses that would avoid biases such as racism or sexism that could cloud the judgment of individual practitioners.
"Many of the worst social problems reappear in every culture," notes Centola, "which leads some to believe these problems are intrinsic to the human condition. Our research shows that these problems are intrinsic to the social experiences humans have, not necessarily to humans themselves. If we can alter that social experience, we can change the way people organize things, and address some of the world's greatest problems."
We have the money to change the world. What's standing in the way?
- What does it actually take to drive large-scale change? Co-Impact founder and CEO Olivia Leland argues that it takes more than money, voting in elections, and supporting your favorite nonprofit. Solving complex global issues takes philanthropy in concert with community advocacy, support from businesses, innovation, an organized vision, and a plan to execute it.
- Leland has identified three areas that need to be addressed before real and meaningful change can happen. To effectively provide support, we must listen to the people who are already doing the work, rather than trying to start from scratch; make it easier for groups, government, and others to collaborate; and change our mindsets to think more long-term so that we can scale impact in ways that matter.
- Through supporting educational programs like Pratham and its Teaching at the Right Level model, Co-Impact has seen how these collaborative strategies can be employed to successfully tackle a complex problem like child literacy.
It turns out gender assumptions have been around for quite some time.
- A recent archaeological dig in the Peruvian mountains uncovered evidence of ancient female big-game hunters.
- This adds to a growing consensus that women played a much bigger role in hunting than previously assumed.
- Gender assumptions are a constant throughout history, with culture often playing a more important role than biology.
You've likely heard it like this: for most of history, women foraged, secured water, and partook in minor agriculture while men went out to hunt. Even if that were the whole story, women still provided an inordinate amount of calories for the tribe, as fruits, vegetables, and nuts accounted for the bulk of sustenance.
As with many myths, this longstanding story might not be completely accurate. Recent archaeological findings from Peru's Andes Mountains, published in the journal Science Advances, suggest that up to half of the big-game hunters in mobile groups in the Americas may have been women.
University of California, Davis archaeologist Randall Haas started shifting his view of ancient hunting practices in 2018 while leading his crew 13,000 feet above sea level at Wilamaya Patjxa. Upon uncovering the remains, he initially assumed one body was male because of the weaponry buried alongside it.
He was wrong.
The team unearthed a total of over 20,000 artifacts, including the remains of six bodies in five burial pits. One pit, which contained a teenage woman, included a toolkit with spearpoints and shafts. Tools for butchering game were also discovered. In total, 24 stone tools were unearthed, including projectile points for killing large game, heavy rocks for stripping hides and cracking bones, and red ocher to preserve hides.
Previously, such tools were thought to be used for cutting or scraping when discovered near female remains. Haas says we need to rethink that approach, which is likely the result of modern bias. Buried near these pits were the remains of Andean deer and vicuña, two commonly hunted animals in Peru.
Haas's group then reviewed the remains of 429 bodies spread over 107 sites in the Americas. These individuals lived between 6,000 and 12,500 years ago. Big-game hunting tools were buried with 11 women and 16 men. The Wilamaya Patjxa dig is not an outlier.
Credit: Randall Haas, University of California, Davis
Extrapolating from the most recent dataset, Haas estimates that between 30 and 50 percent of big-game hunters were women. This doesn't imply that it's a global phenomenon, although female warriors were recently identified in California, dating back roughly 5,000 years. Likewise, women warriors were discovered in Mongolia 1,500 years ago and in Scandinavia about a millennium ago.
Researchers say these findings challenge our understanding of gender identities. Modern analysis can determine the biological sex of these individuals, though we cannot make assumptions about the roles of men and women by current standards. As University of Miami archaeologist Pamela Geller says:
"With few exceptions, the researchers who study hunting and gathering groups—regardless of which continent they work on—presume that a sexual division of labor was universal and rigid. And because it is commonsensical, they then have a hard time explaining why female-bodied individuals also bear the skeletal markers of hunting or have hunting tool kits as grave goods."
There's always the possibility that hunting tools were ritualistically buried alongside varied members of the tribe, including women. Yet we also have to remember that there were no supermarkets on the savanna. Tribal life was an all-hands-on-deck affair. Female hunters should surprise us no more than stay-at-home dads today. Societies are fluid, shifting with circumstance, and the ancient world posed challenges we can only dream of today.
Stay in touch with Derek on Twitter and Facebook. His new book is "Hero's Dose: The Case For Psychedelics in Ritual and Therapy."