A belief in meritocracy is not only false: it’s bad for you
'We are true to our creed when a little girl born into the bleakest poverty knows that she has the same chance to succeed as anybody else …' Barack Obama, inaugural address, 2013
'We must create a level playing field for American companies and workers.' Donald Trump, inaugural address, 2017
Meritocracy has become a leading social ideal. Politicians across the ideological spectrum continually return to the theme that the rewards of life – money, power, jobs, university admission – should be distributed according to skill and effort. The most common metaphor is the "even playing field" upon which players can rise to the position that fits their merit. Conceptually and morally, meritocracy is presented as the opposite of systems such as hereditary aristocracy, in which one's social position is determined by the lottery of birth.
Under meritocracy, wealth and advantage are merit's rightful compensation, not the fortuitous windfall of external events.
Most people don't just think the world should be run meritocratically; they think it is meritocratic. In the U.K., 84 percent of respondents to the 2009 British Social Attitudes survey stated that hard work is either 'essential' or 'very important' when it comes to getting ahead, and in 2016 the Brookings Institution found that 69 percent of Americans believe that people are rewarded for intelligence and skill. Respondents in both countries believe that external factors, such as luck and coming from a wealthy family, are much less important. While these ideas are most pronounced in these two countries, they are popular across the globe.
Although widely held, the belief that merit rather than luck determines success or failure in the world is demonstrably false. This is not least because merit itself is, in large part, the result of luck. Talent and the capacity for determined effort, sometimes called 'grit', depend a great deal on one's genetic endowments and upbringing.
This is to say nothing of the fortuitous circumstances that figure into every success story. In his book Success and Luck (2016), the US economist Robert Frank recounts the long-shots and coincidences that led to Bill Gates's stellar rise as Microsoft's founder, as well as to Frank's own success as an academic. Luck intervenes by granting people merit, and again by furnishing circumstances in which merit can translate into success. This is not to deny the industry and talent of successful people. However, it does demonstrate that the link between merit and outcome is tenuous and indirect at best.
According to Frank, this is especially true where the success in question is great, and where the context in which it is achieved is competitive. There are certainly programmers nearly as skillful as Gates who nonetheless failed to become the richest person on Earth. In competitive contexts, many have merit, but few succeed. What separates the two is luck.
The belief in meritocracy is not only false: a growing body of research in psychology and neuroscience suggests that holding it makes people more selfish, less self-critical and even more prone to acting in discriminatory ways. Meritocracy is not only wrong; it's bad.
The 'ultimatum game' is an experiment, common in psychological labs, in which one player (the proposer) is given a sum of money and told to propose a division between him and another player (the responder), who may accept the offer or reject it. If the responder rejects the offer, neither player gets anything. The experiment has been replicated thousands of times, and usually the proposer offers a relatively even split. If the amount to be shared is $100, most offers fall between $40 and $50.
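The payoff structure of the game is simple enough to sketch in a few lines of Python. This is a minimal illustration only, not code from any of the studies discussed; the $100 pot matches the example above, while the responder's acceptance threshold is an arbitrary assumption:

```python
def ultimatum_round(pot, offer, min_acceptable):
    """One round of the ultimatum game.

    The proposer keeps `pot - offer` and offers `offer` to the responder.
    The responder accepts only if the offer meets their private threshold;
    a rejection leaves both players with nothing.
    """
    if offer >= min_acceptable:
        return pot - offer, offer   # (proposer payoff, responder payoff)
    return 0, 0                     # rejected: neither player gets anything

# With a $100 pot, a typical offer of $45 against a hypothetical $30
# threshold is accepted; a lowball $10 offer is rejected and both lose.
print(ultimatum_round(100, 45, min_acceptable=30))  # → (55, 45)
print(ultimatum_round(100, 10, min_acceptable=30))  # → (0, 0)
```

The puzzle the game exposes is that a purely self-interested responder should accept any nonzero offer, yet real responders routinely reject stingy splits, which is why proposers converge on near-even divisions.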
One variation on this game shows that believing one is more skilled leads to more selfish behaviour. In research at Beijing Normal University, participants played a fake game of skill before making offers in the ultimatum game. Players who were (falsely) led to believe they had 'won' claimed more for themselves than those who did not play the skill game. Other studies confirm this finding. The economists Aldo Rustichini at the University of Minnesota and Alexander Vostroknutov at Maastricht University in the Netherlands found that subjects who first engaged in a game of skill were much less likely to support the redistribution of prizes than those who engaged in games of chance. Just having the idea of skill in mind makes people more tolerant of unequal outcomes. While this was found to be true of all participants, the effect was much more pronounced among the 'winners'.
By contrast, research on gratitude indicates that remembering the role of luck increases generosity. Frank cites a study in which simply asking subjects to recall the external factors (luck, help from others) that had contributed to their successes in life made them much more likely to give to charity than those who were asked to remember the internal factors (effort, skill).
Perhaps more disturbing, simply holding meritocracy as a value seems to promote discriminatory behaviour. The management scholar Emilio Castilla at the Massachusetts Institute of Technology and the sociologist Stephen Benard at Indiana University studied attempts to implement meritocratic practices, such as performance-based compensation in private companies. They found that, in companies that explicitly held meritocracy as a core value, managers assigned greater rewards to male employees over female employees with identical performance evaluations. This preference disappeared where meritocracy was not explicitly adopted as a value.
This is surprising because impartiality is the core of meritocracy's moral appeal. The 'even playing field' is intended to avoid unfair inequalities based on gender, race and the like. Yet Castilla and Benard found that, ironically, attempts to implement meritocracy lead to just the kinds of inequalities that it aims to eliminate. They suggest that this 'paradox of meritocracy' occurs because explicitly adopting meritocracy as a value convinces subjects of their own moral bona fides. Satisfied that they are just, they become less inclined to examine their own behaviour for signs of prejudice.
Meritocracy is a false and not very salutary belief. As with any ideology, part of its draw is that it justifies the status quo, explaining why people belong where they happen to be in the social order. It is a well-established psychological principle that people prefer to believe that the world is just.
However, in addition to legitimation, meritocracy also offers flattery. Where success is determined by merit, each win can be viewed as a reflection of one's own virtue and worth. Meritocracy is the most self-congratulatory of distribution principles. Its ideological alchemy transmutes property into praise, material inequality into personal superiority. It licenses the rich and powerful to view themselves as productive geniuses. While this effect is most spectacular among the elite, nearly any accomplishment can be viewed through meritocratic eyes. Graduating from high school, artistic success or simply having money can all be seen as evidence of talent and effort. By the same token, worldly failures become signs of personal defects, providing a reason why those at the bottom of the social hierarchy deserve to remain there.
This is why debates over the extent to which particular individuals are 'self-made' and over the effects of various forms of 'privilege' can get so hot-tempered. These arguments are not just about who gets to have what; they are about how much 'credit' people can take for what they have, and about what their successes allow them to believe about their inner qualities. That is why, under the assumption of meritocracy, the very notion that personal success is the result of 'luck' can be insulting. To acknowledge the influence of external factors seems to downplay or deny the existence of individual merit.
Despite the moral assurance and personal flattery that meritocracy offers to the successful, it ought to be abandoned both as a belief about how the world works and as a general social ideal. It's false, and believing in it encourages selfishness, discrimination and indifference to the plight of the unfortunate.
This article was originally published at Aeon and has been republished under Creative Commons.
Health officials in China reported that a man was infected with bubonic plague, the infectious disease that caused the Black Death.
- The case was reported in the city of Bayannur, which has issued a level-three plague prevention warning.
- Modern antibiotics can effectively treat bubonic plague, which spreads mainly by fleas.
- Chinese health officials are also monitoring a newly discovered type of swine flu that has the potential to develop into a pandemic virus.
Bacteria under a microscope. Image: needpix.com

Today, bubonic plague can be treated effectively with antibiotics.

"Unlike in the 14th century, we now have an understanding of how this disease is transmitted," Dr. Shanthi Kappagoda, an infectious disease physician at Stanford Health Care, told Healthline. "We know how to prevent it — avoid handling sick or dead animals in areas where there is transmission. We are also able to treat patients who are infected with effective antibiotics, and can give antibiotics to people who may have been exposed to the bacteria [and] prevent them [from] getting sick."
This plague patient is displaying a swollen, ruptured inguinal lymph node, or bubo. Image: Centers for Disease Control and Prevention

Still, hundreds of people develop bubonic plague every year. In the U.S., a handful of cases occur annually, particularly in New Mexico, Arizona and Colorado, where habitats allow the bacteria to spread more easily among wild rodent populations, according to the CDC. But these cases are very rare, mainly because you need to be in close contact with rodents in order to get infected. And though plague can spread from human to human, this only occurs with pneumonic plague, and such transmission is also rare.
A new swine flu in China

Last week, researchers in China also reported another public health concern: a new virus that has "all the essential hallmarks" of a pandemic virus.

In a paper published in the Proceedings of the National Academy of Sciences, researchers say the virus was discovered in pigs in China, and that it descended from the H1N1 virus, commonly called "swine flu." That virus was able to transmit from human to human, and it killed an estimated 151,700 to 575,400 people worldwide from 2009 to 2010, according to the Centers for Disease Control and Prevention.

There's no evidence showing that the new virus can spread from person to person. But the researchers did find that 10 percent of swine workers had been infected by the virus, called G4 reassortant EA H1N1. This level of infectivity raises concerns, because it "greatly enhances the opportunity for virus adaptation in humans and raises concerns for the possible generation of pandemic viruses," the researchers wrote.
So far, 30 student teams have entered the Indy Autonomous Challenge, scheduled for October 2021.
- The Indy Autonomous Challenge will task student teams with developing self-driving software for race cars.
- The competition requires cars to complete 20 laps within 25 minutes, meaning cars would need to average about 110 mph.
- The organizers say they hope to advance the field of driverless cars and "inspire the next generation of STEM talent."
Image: Indy Autonomous Challenge

Completing the race in 25 minutes means the cars will need to average about 110 miles per hour. So, while the race may end up being a bit slower than a typical Indy 500 competition, in which winners average speeds of over 160 mph, it's still set to be the fastest autonomous race featuring full-size cars.

"There is no human redundancy there," Matt Peak, managing director for Energy Systems Network, a nonprofit that develops technology for the automation and energy sectors, told the Pittsburgh Post-Gazette. "Either your car makes this happen or smash into the wall you go."
Illustration of the Indy Autonomous Challenge. Image: Indy Autonomous Challenge

The Indy Autonomous Challenge describes itself as a "past-the-post" competition, which "refers to a binary, objective, measurable performance rather than a subjective evaluation, judgement, or recognition."

This competition design was inspired by the 2004 DARPA Grand Challenge, which tasked teams with developing driverless cars and sending them along a 150-mile route in Southern California for a chance to win $1 million. But that prize went unclaimed: within a few hours of starting, every vehicle had suffered some kind of critical failure.
Indianapolis Motor Speedway. Image: Indy Autonomous Challenge

One factor that could prevent a similar outcome in the upcoming race is the ability to test-run cars on a virtual racetrack. The simulation software company Ansys Inc. has already developed a model of the Indianapolis Motor Speedway on which teams will test their algorithms as part of a series of qualifying rounds.

"We can create, with physics, multiple real-life scenarios that are reflective of the real world," Ansys President Ajei Gopal told The Wall Street Journal. "We can use that to train the AI, so it starts to come up to speed."

Still, the race could reveal that self-driving cars aren't quite ready to race at speeds of over 110 mph. After all, regular self-driving cars already face enough logistical and technical roadblocks, including crumbling infrastructure, communication issues and the fateful moral decisions driverless cars will have to make in split seconds.

But the Indy Autonomous Challenge says its main goal is to advance the industry by challenging "students around the world to imagine, invent, and prove a new generation of automated vehicle (AV) software and inspire the next generation of STEM talent."
A new Harvard study finds that the language you use affects patient outcomes.
- A study at Harvard's McLean Hospital claims that using the language of chemical imbalances worsens patient outcomes.
- Though psychiatry has largely abandoned DSM categories, professor Joseph E Davis writes that the field continues to strive for a "brain-based diagnostic system."
- Chemical explanations of mental health appear to benefit pharmaceutical companies far more than patients.
Video: Challenging the Chemical Imbalance Theory of Mental Disorders — Robert Whitaker, Journalist

This is a far cry from Howard Rusk's 1947 NY Times editorial calling for mental health disorders to be treated similarly to physical diseases (such as diabetes and cancer). This mindset—not attributable to Rusk alone; he was merely relaying the psychiatric currency of the time—has dominated the field for decades: mental anguish is a genetic and/or chemical-deficiency disorder that must be treated pharmacologically.

Even as psychiatry untethered itself from DSM categories, the field still used chemistry to validate its existence. Psychotherapy, arguably the most efficient means of managing much of our anxiety and depression, is time- and labor-intensive. Counseling requires an empathetic and wizened ear to guide the patient to do the work. Ingesting a pill to do that work for you is more seductive, and easier. As Davis writes, even though the industry abandoned the DSM, it continues to strive for a "brain-based diagnostic system."

That language has infiltrated public consciousness. The team at McLean surveyed 279 patients seeking acute treatment for depression. As they note, the causes of psychological distress have constantly shifted over the millennia: humoral imbalance in the ancient world; spiritual possession in medieval times; early childhood experiences around the time of Freud; maladaptive thought patterns dominant in the latter half of the last century.
While the team found that psychosocial explanations remain popular, biogenetic explanations (such as the chemical imbalance theory) are becoming more prominent.

Interestingly, the 80 people Davis interviewed for his book predominantly relied on biogenetic explanations. Instead of doctors diagnosing patients, as you might expect, they increasingly serve to confirm what patients come in suspecting. Patients arrive at medical offices confident in their self-diagnoses. They believe a pill is the best course of treatment, largely because they saw an advertisement or listened to a friend. Doctors too often oblige without further curiosity as to the reasons for their distress.
Image: Illustration Forest / Shutterstock

While medicalizing mental health softens the stigma of depression—if a disorder is inheritable, it was never really your fault—it also disempowers the patient. The team at McLean writes:

"More recent studies indicate that participants who are told that their depression is caused by a chemical imbalance or genetic abnormality expect to have depression for a longer period, report more depressive symptoms, and feel they have less control over their negative emotions."

Davis points out the language used in the direct-to-consumer advertising prevalent in America. Doctors, media, and advertising agencies converge around common messages: everyday blues is a "real medical condition," everyone is susceptible to clinical depression, and drugs correct underlying somatic conditions that you never consciously control. He continues:

"Your inner life and evaluative stance are of marginal, if any, relevance; counseling or psychotherapy aimed at self-insight would serve little purpose."

The McLean team discovered a similar phenomenon: patients expect little from psychotherapy and a lot from pills. When depression is treated as the result of an internal and immutable essence instead of environmental conditions, behavioral changes are not expected to make much difference. Chemistry rules the popular imagination.