Why a more diverse workplace is also a more talented one
Ram Charan has spent his working life as a business mentor and consultant to CEOs of global companies. He's the one Coca-Cola, KLM, GE, and Bank of America (just to name a few) call when they need help. And he's a firm believer in a diverse workplace. If a 90-year-old can do the job best, then why not hire them? Raw talent doesn't just exist in Ivy League business schools, he says, and that applies to the whole company... from the work floor to the boardroom. Ram's latest book is Talent Wins: The New Playbook for Putting People First, and he is brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
Would companies be more diverse if A.I. did the hiring?
The best hiring manager might just be the computer sitting on your desk. AI and ethics expert Joanna Bryson posits that artificial intelligence can go through all the resumes in a stack and find what employers are missing. Most humans, on the other hand, will rely on biases — whether they are aware of them or not — to get them through the selection process. This is sadly why those with European-sounding names get more calls for interviews than others. AI, she says, can change that. Joanna is brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
How equal parental leave can help close the gender pay gap
It's no small secret that America is far behind the rest of the world when it comes to maternal leave. But studies are finding that paternal leave shouldn't be overlooked, either. Lauren Smith Brody, former editor of Glamour magazine and now a full-time author and founder of The Fifth Trimester movement, makes the case here that dads need time off, too, to bond with their newborns, and that modern companies need to understand and appreciate that. Lauren's latest book is The Fifth Trimester: The Working Mom's Guide to Style, Sanity, and Success After Baby. This video is brought to you by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
Real talk at work: How Amway created a better office for more people
Most people approach talking about difficult subjects as if they were at a debate. That is, arriving at the table (metaphorically speaking) with preconceived notions and ideas. But Amway's VP of Global Litigation and Corporate Law, Claire Groen, knew there had to be a better way. She and the leaders at Amway devised what they call RealTalk, which brings people together to hold conversations on current topics. And when those topics turned to hot-button issues like immigration and the racist violence in Charlottesville, these talks became an incredible conduit to a more inclusive office. People were heard, and in turn listened more to ideas outside their comfort zone. This resulted in a better and more inclusive culture at Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
Breaking the ice: How astronauts overcome their differences aboard the ISS
Look up—you can see the greatest feat of human cooperation orbiting 254 miles above Earth. As commander of Expedition 35 aboard the International Space Station (ISS), Canadian astronaut Chris Hadfield understands the difficulty of cultural barriers in teamwork, and the life-or-death necessity of learning to communicate across those divides. The ISS is a joint project between five space agencies, built by people from 15 different nations—and each of them has a different take on what is "normal." Hadfield explains the scale of cultural differences aboard the station: "What do you do on a Friday night? What does 'yes' mean? What does 'uh-huh' mean? What is the day of worship? When do you celebrate a holiday? How do you treat your spouse or your children? How do you treat each other? What is the hierarchy of command? All of those things seem completely clear to you, but you were raised in a specific culture that is actually shared by no one else." Here, Hadfield explains his strategy for genuine listening and communication. Whether it's money, reputation, or your life that's at stake, being sensitive and aware of people's differences helps you accomplish something together—no matter where you’re from. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance. Chris Hadfield features in the new docuseries One Strange Rock and is the author of An Astronaut's Guide to Life on Earth: What Going to Space Taught Me About Ingenuity, Determination, and Being Prepared for Anything.
How experiencing discrimination in VR can make you less biased
What would it be like to live in the body of someone else? Since the dawn of mankind, people have imagined inhabiting another body, just for a day or even for a few minutes. Thanks to the magic of VR, we can now do that. Jeremy Bailenson, the creator of the Virtual Human Interaction Lab, has designed a VR experience called 1000 Cut Journey that may change the way people see race: by experiencing it firsthand. Jeremy explains: "You start out as an elementary school child and you’re in a classroom. You then become a teenager and you’re interacting with police officers. You then become an adult who’s going on a job interview, and what you experience while wearing the body of a black male is implicit bias that happens repeatedly and over time." Jeremy is brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
When data drives diversity and inclusion, good things happen
What makes a company a great place to work? A sense of equity and ownership, says Michael Bush, the CEO of the conveniently named Great Place to Work. It's a global consulting and analytics firm that produces the annual Fortune 100 Best Companies to Work For list, the 100 Best Workplaces for Women list, the Best Workplaces for Diversity list, and dozens of other distinguished workplace rankings around the world. Michael's new book is A Great Place to Work for All: Better for Business, Better for People, Better for the World, and he's brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image and drive performance.
Neurodiversity: Many mental 'deficits' are really hidden strengths
Color-blindness. Left-handedness. Dyslexia. Autism. These are all ways in which some brains are wired differently from the norm. But Heather Heying, evolutionary biologist and former professor at Evergreen State College, argues that these so-called deficits are really strengths. For example, she relates a story about her autistic students being far more adept at spotting social dynamics emerging in the classroom, long before non-autistic students noticed them. And left-handed people are often more creative than their right-handed counterparts. Evolution might suggest that we need these differences to be stronger as a whole. Be sure to follow Heather on Twitter: @HeatherEHeying and through her website, heatherheying.com. Heather is brought to you today by Amway. Amway believes that diversity and inclusion are essential to the growth and prosperity of today’s companies. When woven into every aspect of the talent life cycle, companies committed to diversity and inclusion are the best equipped to innovate, improve brand image, and drive performance.
- Deep Nostalgia uses machine learning to animate static images.
- The AI can animate images by "looking" at a single facial image, and the animations include movements such as blinking, smiling and head tilting.
- As deepfake technology becomes increasingly sophisticated, some are concerned about how bad actors might abuse the technology to manipulate the public.
A new service gives new life to the past by using artificial intelligence to convert static images into moving videos.
Called Deep Nostalgia, the service creates animations by using deep learning to analyze a single facial photo. Then, the system animates the facial image through a "driver" — a pre-determined sequence of movements and gestures, like blinking, smiling and head-turning. The process is completely automated, and the service enhances the images to make the animations run more smoothly.
Launched in February by the Israeli genealogy company My Heritage, Deep Nostalgia has already produced some impressive early results.
My Heritage/Deep Nostalgia
But that's not to say the animations are perfect. As with most deepfake technology, there's still an uncanny air to the images, with some of the facial movements appearing slightly unnatural. What's more, Deep Nostalgia can only create deepfakes of one person's face from the neck up, so you couldn't use it to animate group photos, or photos of people doing any sort of physical activity.
My Heritage/Deep Nostalgia
But for a free deepfake service, Deep Nostalgia is pretty impressive, especially considering you can use it to create deepfakes of any face, human or not.
"I just ran the @Warcraft cover through the Deep Nostalgia Tool, and this happened... https://t.co/1eD3bb7fAN" — Solitaire #The06, via Twitter
"Generated with MyHeritage. #DeepNostalgia https://t.co/gNX3wLHsS8" — Andrey Frolov, via Twitter
So, is creating deepfakes of long-dead people a bit creepy? Some people seem to think so.
"Some people love the feature with Deep Nostalgia™ and consider it magical while others think it is scary and dislike it," My Heritage wrote on its website. "In fact, the results can be controversial and it is difficult to be indifferent to this technology. We invite you to create movies using this feature and share them on social media to see what your friends and relatives think. This feature is intended for nostalgic use, that is, to give life back to beloved ancestors."
Deep Nostalgia isn't the first project to create deepfakes from single images. In 2019, researchers working at the Samsung AI Center in Moscow published a paper describing how machine-learning techniques can produce deepfakes after "looking" at only one or a few images. Using a framework known as a generative adversarial network, the researchers trained a pair of computer models to compete with each other to create convincing deepfakes.
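The adversarial setup described above can be sketched in miniature. The toy example below (plain NumPy; an illustration of the generative-adversarial idea only, not the Samsung or My Heritage code, and every parameter in it is an assumption for the sketch) pits a tiny linear "generator" against a logistic "discriminator," so that the generator gradually learns to mimic samples drawn from a target distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Generator g(z) = gw*z + gb tries to mimic samples from N(4, 1);
# the discriminator is a logistic classifier on a single scalar.
gw, gb = 1.0, 0.0   # generator parameters
dw, db = 0.1, 0.0   # discriminator parameters
lr = 0.05           # learning rate for both players

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(2000):
    real = rng.normal(4.0, 1.0, size=32)     # samples from the "data"
    z = rng.normal(0.0, 1.0, size=32)        # generator noise input
    fake = gw * z + gb                       # generated samples

    # Discriminator ascent: push D(real) toward 1 and D(fake) toward 0.
    p_real, p_fake = sigmoid(dw * real + db), sigmoid(dw * fake + db)
    dw += lr * np.mean((1 - p_real) * real - p_fake * fake)
    db += lr * np.mean((1 - p_real) - p_fake)

    # Generator ascent (non-saturating loss): push D(fake) toward 1.
    p_fake = sigmoid(dw * fake + db)
    gw += lr * np.mean((1 - p_fake) * dw * z)
    gb += lr * np.mean((1 - p_fake) * dw)
```

After training, the generator's output mean (`gb`) has been pulled toward the data mean, which is the essence of the competition the Samsung paper scales up to photorealistic faces.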
While the results from the Samsung researchers were impressive, the Deep Nostalgia project shows how deepfake technology is advancing at a rapid pace. As these tools have become increasingly sophisticated, media experts have raised concerns about how bad actors might use deepfakes and "cheap fakes" to manipulate the public.
My Heritage seemed to sense Deep Nostalgia's potential for abuse, writing:
"Please use this feature on your own historical photos and not on photos of living people without their consent."
- How far should we defend an idea in the face of contrarian evidence?
- Who decides when it's time to abandon an idea and deem it wrong?
- Science carries within it the seeds of its origins in ancient Greece, including certain prejudices about how reality should or shouldn't be.
From the perspective of the west, it all started in ancient Greece, around 600 BCE. This is during the Axial Age, a somewhat controversial term coined by German philosopher Karl Jaspers to designate the remarkable intellectual and spiritual awakening that happened in different places across the globe roughly within the span of a century. Apart from the Greek explosion of thought, this is the time of Siddhartha Gautama (aka the Buddha) in India, of Confucius and Lao Tzu in China, of Zoroaster (or Zarathustra) in ancient Persia—religious leaders and thinkers who would reframe the meaning of faith and morality. In Greece, Thales of Miletus and Pythagoras of Samos pioneered pre-Socratic philosophy, (sort of) moving the focus of inquiry and explanation from the divine to the natural.
To be sure, the divine never quite left early Greek thinking, but with the onset of philosophy, trying to understand the workings of nature through logical reasoning—as opposed to supernatural reasoning—would become an option that didn't exist before. The history of science, from its early days to the present, could be told as an increasingly successful split between belief in a supernatural component to reality and a strictly materialistic cosmos. The Enlightenment of the 17th and 18th centuries, the Age of Reason, means quite literally 'to see the light,' the light here clearly being the superiority of human logic above any kind of supernatural or nonscientific methodology to get to the "truth" of things.
To what extent we can understand the workings of nature through logic alone is not something science can answer. It is here that the complication begins. Can the human mind, through the diligent application of scientific methodology and the use of ever-more-powerful instruments, reach a complete understanding of the natural world? Is there an "end to science"? This is the sensitive issue. If the split that started in pre-Socratic Greece were to be completed, nature in its entirety would be amenable to a logical description, with the complete collection of behaviors that science studies identified, classified, and described by means of perpetual natural laws. All that would be left for scientists and engineers to do would be practical applications of this knowledge, inventions, and technologies that would serve our needs in different ways.
This sort of vision—or hope, really—goes all the way back to at least Plato who, in turn, owes much of this expectation to Pythagoras and Parmenides, the philosopher of Being. The dispute between the primacy of that which is timeless or unchangeable (Being), and that which is changeable and fluid (Becoming), is at least that old. Plato proposed that truth was in the unchangeable, rational world of Perfect Forms that preceded the tricky and deceptive reality of the senses. For example, the abstract form Chair embodies all chairs, objects that can take many shapes in our sensorial reality while serving their functionality (an object to sit on) and basic design (with a sittable surface and some legs below it). According to Plato, the Forms hold the key to the essence of all things.
Plato used the allegory of the cave to explain that what humans see and experience is not the true reality.
Credit: Gothika via Wikimedia Commons CC 4.0
When scientists and mathematicians use the term Platonic worldview, that's what they mean in general: The unbound capacity of reason to unlock the secrets of creation, one by one. Einstein, for one, was a believer, preaching the fundamental reasonableness of nature; no weird unexplainable stuff, like a god that plays dice—his tongue-in-cheek critique of the belief that the unpredictability of the quantum world was truly fundamental to nature and not just a shortcoming of our current understanding. Despite his strong belief in such underlying order, Einstein recognized the imperfection of human knowledge: "What I see of Nature is a magnificent structure that we can comprehend only very imperfectly, and that must fill a thinking person with a feeling of humility." (Quoted by Dukas and Hoffmann in Albert Einstein, The Human Side: Glimpses from His Archives (1979), 39.)
Einstein embodies the tension between these two clashing worldviews, a tension that is still very much with us today: On the one hand, the Platonic ideology that the fundamental stuff of reality is logical and understandable to the human mind, and, on the other, the acknowledgment that our reasoning has limitations, that our tools have limitations and thus that to reach some sort of final or complete understanding of the material world is nothing but an impossible, semi-religious dream.
This kind of tension is palpable today when we see groups of scientists passionately arguing for or against the existence of the multiverse, the idea that our universe is one among a huge number of universes; or for or against the final unification of the laws of physics.
Nature, of course, is always the final arbiter of any scientific dispute. Data decides, one way or another. That's the beauty and power at the core of science. The challenge, though, is to know when to let go of an idea. How long should one wait until an idea, seductive as it may be, is deemed unrealistic? This is where the debate gets interesting. Data to support more "out there" ideas such as the multiverse or extra symmetries of nature needed for unification models has refused to show up for decades, despite extensive searches with different instruments and techniques. On the other hand, we only find if we look. So, should we keep on defending these ideas? Who decides? Is it a community decision or should each person pursue their own way of thinking?
In 2019, I participated in an interesting live debate at the World Science Festival with physicists Michael Dine and Andrew Strominger, hosted by physicist Brian Greene. The theme was string theory, our best candidate for a final theory of how particles of matter interact. When I completed my PhD in 1986, string theory was the way. The only way. But by 2019, things had changed, and quite dramatically, due to the lack of supporting data. To my surprise, both Mike and Andy were quite open about the fact that the certainty of the past was no more. String theory has taught physicists many things, and perhaps that was its real use. The Platonic outlook was in peril.
The dispute remains alive, although with each experiment that fails to show supporting evidence for string theory the dream grows harder to justify. Will it be a generational thing, as celebrated physicist Max Planck once quipped, "Ideas don't die, physicists do"? (I paraphrase.) I hope not. But it is a conversation that should be held more in the open, as was the case with the World Science Festival. Dreams die hard. But they may die a little easier when we accept the fact that our grasp of reality is limited, and doesn't always fit our expectations of what should or shouldn't be real.
Does this mean you can make your way through the world as you did in the old days, without fear of spreading the virus? Deborah Fuller is a microbiologist at the University of Washington School of Medicine working on coronavirus vaccines. She explains what the science shows about transmission post-vaccination – and whether new variants could change this equation.
1. Does vaccination completely prevent infection?
The short answer is no. You can still get infected after you've been vaccinated. But your chances of getting seriously ill are almost zero.
Many people think vaccines work like a shield, blocking a virus from infecting cells altogether. But in most cases, a person who gets vaccinated is protected from disease, not necessarily infection.
Every person's immune system is a little different, so when a vaccine is 95% effective, that just means 95% of people who receive the vaccine won't get sick. These people could be completely protected from infection, or they could be getting infected but remain asymptomatic because their immune system eliminates the virus very quickly. The remaining 5% of vaccinated people can become infected and get sick, but are extremely unlikely to be hospitalized.
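The article's simplified reading of "95% effective" can be turned into a quick back-of-envelope calculation. This is a toy illustration only, not an epidemiological model; the cohort size is an assumption chosen for round numbers:

```python
# Back-of-envelope reading of "95% effective" as described above:
# roughly 95% of a vaccinated group won't get sick at all, while the
# remaining ~5% can get sick but are extremely unlikely to be hospitalized.
vaccinated = 100_000                      # hypothetical cohort size
efficacy = 0.95                           # headline trial efficacy

protected = int(vaccinated * efficacy)    # no symptomatic disease
may_get_sick = vaccinated - protected     # can still fall ill, mildly

print(protected, may_get_sick)            # prints: 95000 5000
```

The point the paragraph makes is that the 95,000 "protected" people are a mix of two outcomes the trial can't distinguish: fully blocked infection and quickly cleared, asymptomatic infection.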
Vaccination doesn't 100% prevent you from getting infected, but in all cases it gives your immune system a huge leg up on the coronavirus. Whatever your outcome – whether complete protection from infection or some level of disease – you will be better off after encountering the virus than if you hadn't been vaccinated.
Vaccines prevent disease, not infection. (National Institute of Allergy and Infectious Diseases, CC BY)
2. Does infection always mean transmission?
Transmission happens when enough viral particles from an infected person get into the body of an uninfected person. In theory, anyone infected with the coronavirus could potentially transmit it. But a vaccine will reduce the chance of this happening.
In general, if vaccination doesn't completely prevent infection, it will significantly reduce the amount of virus coming out of your nose and mouth – a process called shedding – and shorten the time that you shed the virus. This is a big deal. A person who sheds less virus is less likely to transmit it to someone else.
This seems to be the case with coronavirus vaccines. In a recent preprint study, which has yet to be peer-reviewed, Israeli researchers tested 2,897 vaccinated people for signs of coronavirus infection. Most had no detectable virus, but people who were infected had one-quarter the amount of virus in their bodies as unvaccinated people tested at similar times post-infection.
Less coronavirus means less chance of spreading it, and if the amount of virus in your body is low enough, the probability of transmitting it may reach almost zero. However, researchers don't yet know where that cutoff is for the coronavirus, and since the vaccines don't provide 100% protection from infection, the Centers for Disease Control and Prevention recommends that people continue to wear masks and social distance even after they've been vaccinated.
3. What about the new coronavirus variants?
New variants of the coronavirus have emerged in recent months, and recent studies show that vaccines are less effective against certain ones, like the B.1.351 variant first identified in South Africa.
Every time SARS-CoV-2 replicates, it gets new mutations. In recent months, researchers have found new variants that are more infective – meaning a person needs to breathe in less virus to become infected – and other variants that are more transmissible – meaning they increase the amount of virus a person sheds. And researchers have also found at least one new variant that seems to be better at evading the immune system, according to early data.
So how does this relate to vaccines and transmission?
For the South Africa variant, vaccines still provide greater than 85% protection from getting severely ill with COVID-19. But when you count mild and moderate cases, they provide, at best, only about 50%-60% protection. That means at least 40% of vaccinated people will still have a strong enough infection – and enough virus in their body – to cause at least moderate disease.
If vaccinated people have more virus in their bodies and it takes less of that virus to infect another person, there will be a higher probability that a vaccinated person could transmit these new strains of the coronavirus.
If all goes well, vaccines will very soon reduce the rate of severe disease and death worldwide. To be sure, any vaccine that reduces disease severity is also, at the population level, reducing the amount of virus being shed overall. But because of the emergence of new variants, vaccinated people still have the potential to shed and spread the coronavirus to other people, vaccinated or otherwise. This means it will likely take much longer for vaccines to reduce transmission and for populations to reach herd immunity than if these new variants had never emerged. Exactly how long that will take is a balance between how effective vaccines are against emerging strains and how transmissible and infectious these new strains are.
- It's difficult to overstate the impact of technology and artificial intelligence. Smart machines are fundamentally reshaping the economy—indeed, society as a whole.
- Seemingly overnight, they have changed our roles in the workplace, our views of democracy—even our family and personal relationships.
- In my latest book, I argue that we can—and must—rise to this challenge by developing our capacity for "human work," the work that only humans can do: thinking critically, reasoning ethically, interacting interpersonally, and serving others with empathy.
Until now, it's fair to say that technology and artificial intelligence have tended to make people more passive participants in society. Too many have lost the ability to play an active role in the economy as AI has disrupted the workplace. Too many have become passive consumers of information and are living in self-imposed bubbles of belief. And too many have withdrawn into passive lives of isolation apart from any meaningful engagement in their communities or, in some cases, even their families.
How does one escape the temptations of an AI-created bubble of information and belief? The answer, obviously, is to burst the bubble—to escape by being exposed to ideas and experiences that are fundamentally different from our own.
This begins by being exposed to people who are different from us—who have different beliefs, values, cultures, and life experiences. Human work offers this chance because it is built on human attributes such as empathy, openness, and flexibility—precisely those needed for strong communities and a strong society. The results we need to assure through human work are not just higher incomes but also openness to different cultures, willingness to engage individuals with different ideologies and perspectives, increased likelihood to vote and volunteer, and recognition of the value of open markets and free, democratic systems of government. The characteristics of human work have much more than economic consequences; they are the lifeblood of free people and societies.
People with higher levels of education are less inclined toward authoritarian political preferences.
Credit: Georgetown University Center on Education and the Workforce analysis of data from the World Values Survey (WVS), 1994–2014.
When considering human work and the future of democracy, it's impossible to avoid the rise of authoritarianism throughout the world. According to new research from the Georgetown University Center on Education and the Workforce, the alarming increase of authoritarianism on a global scale can't be considered in isolation.
The postwar world order was based on the expectation in the West that democracy was spreading throughout the world, country by country, and would eventually become the preferred form of government everywhere. Foreign relations were based on the broad consensus that established democracies should be vigilant and unwavering in offering military and cultural support to emerging democracies. Democracy spread throughout Latin America and even appeared likely to take root in China. The end of the Cold War seemed to confirm the inevitability of democracy's spread, with only a few old-style authoritarian systems left in Cuba, North Korea, and other poor, isolated countries.
Today, the tide seems to be turning in the opposite direction. Authoritarianism—particularly in the form of populist nationalism—has returned to Russia and parts of Eastern Europe, Asia, and Latin America. China appears resolute in maintaining state control over political and cultural expression. And we now understand clearly that not even the United States and Western Europe are immune from authoritarianism's allure.
Of course, much of that allure is based on fear—fear of change, fear of loss of advantage, fear of the other. Authoritarian leaders and wannabes exploit this fear by appealing to group identity and cohesion and by defining those who appear different as a threat. We should recognize that authoritarianism is not just imposed from above—at least, not at first. It is an individual worldview that everyone to a greater or lesser extent is susceptible to. Research on authoritarianism supports the idea that preferences for conformity and social cohesion are among the psychological tendencies that predispose people toward preferring strong hierarchical leadership styles. In other words, individuals who have a greater preference for group cohesion are more inclined to feel threatened by diversity, be intolerant of outsiders, and react by supporting authoritarian leaders.
With its preference for conformity, authoritarianism is a clear threat to liberal democracy and the diversity of expression, belief, and ways of living that it is designed to protect. But the same education system that prepares people for work can play a role in protecting our democratic way of life. Numerous studies going back decades and conducted throughout the world have shown that higher levels of education are inversely correlated with authoritarianism.
Today, nearly a third of Americans who haven't gone to college believe that having a "strong leader" is good for the country, compared to only about 13% of those with a bachelor's degree. Meanwhile, according to a 2017 Pew Research Center study, about a quarter of people with a high school diploma or less say "military rule would be a good way to govern our country." Only 7% of college grads support that view.
Why does education thwart authoritarian attitudes? At its best, higher education strives to promote independent thought and critical examination of established orthodoxy, not to mention inquisitiveness and curiosity. All this stands in stark contrast to the blind acceptance of information and opinion from authorities. Higher education also exposes people to diverse ideas and cultures, showing that differences are not as bad or as dangerous as people may have been conditioned to believe. Education helps people to better understand abstract principles of democracy and equality and how to deal with complexity and differences in society. Education also helps improve interpersonal communication skills—essential for civic participation in a democracy.
But perhaps the most powerful reason education is an antidote to authoritarianism lies even deeper. People with higher levels of education are much less likely to be authoritarian in their child-rearing preferences than others. The shift toward raising children who themselves are more tolerant, independent, and inquisitive may be education's most profound effect on society.
Of course, formal learning cannot on its own change the equation, but absent well-informed citizens who can critically judge the ideas and perspectives of those who hold office, the consequences will be chilling. When a former president of the United States invents "facts" or tells outright lies, dismisses scientific evidence, and demonstrates a stunning ignorance of history, the dangers are real for those who have not developed their own critical-thinking capacities.
So, the greatest contribution of a better-educated population to shared prosperity is that educated citizens are the best defense against the threats to our democratic way of life. The debate about President Donald Trump's and others' perceived threats to democracy will linger, but for democracy to prosper in the long term, we need more people to reach higher levels of education.
This is an edited excerpt from Chapter 6 of Human Work in the Age of Smart Machines, by Jamie Merisotis.
- Simulation theory proposes that our world is likely a simulation created by beings with super-powerful computers.
- In "A Glitch in the Matrix," filmmaker Rodney Ascher explores the philosophy behind simulation theory, and interviews a handful of people who believe the world is a simulation.
- "A Glitch in the Matrix" premiered at the 2021 Sundance Film Festival and is now available to stream online.
Are you living in a computer simulation?
If you've spent enough time online, you've probably encountered this question. Maybe it was in one of the countless articles on simulation theory. Maybe it was during the chaos of 2020, when Twitter users grew fond of saying things like "we're living in the worst simulation" or "what a strange timeline we're living in." Or maybe you saw that clip of Elon Musk telling an audience at a tech conference that the probability of us not living in a simulation is "one in billions."
It might sound ludicrous. But Twitter memes and quotes from "The Matrix" aside, simulation theory has some lucid arguments to back it up. The most cited explanation came in 2003, when Oxford University philosopher Nick Bostrom published a paper claiming at least one of the following statements is true:
- The human species is very likely to go extinct before reaching a "posthuman" stage
- Any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof)
- We are almost certainly living in a computer simulation
The basic idea: Considering that computers are growing exponentially powerful, it's reasonable to think that future civilizations might someday be able to use supercomputers to create simulated worlds. These worlds would probably be populated by simulated beings. And those beings might be us.
In the new documentary "A Glitch in the Matrix", filmmaker Rodney Ascher sends viewers down the rabbit hole of simulation theory, exploring the philosophical ideas behind it, and the stories of a handful of people for whom the theory has become a worldview.
The film features, for example, a man called Brother Laeo Mystwood, who describes how a series of strange coincidences and events — a.k.a. "glitches in the matrix" — led him to believe the world is a simulation. Another interviewee, a man named Paul Gude, says the turning point for him came in childhood when he was watching people sing at a church service; the "absurdity of the situation" caused him to realize "none of this is real."
But others have darker reactions after coming to believe the world is a simulation. For example, if you believe you're in a simulation, you might also think that some people in the simulation are less real than you. A few of the film's subjects describe the idea of other people being "chemical robots" or "non-player characters," a video-game term used to describe characters who behave according to code.
The documentary's most troubling sequence features the story of Joshua Cooke. In 2003, Cooke was 19 years old and suffering from an undiagnosed mental illness when he became obsessed with "The Matrix." He believed he was living in a simulation. On a February night, he shot and killed his adoptive parents with a shotgun. The murder trial spawned what's now known as the "Matrix defense," a version of the insanity defense in which a defendant claims to have been unable to distinguish reality from simulation when they committed a crime.
Of course, Cooke's case lies on the extreme side of the simulation theory world, and there's nothing inherently nihilistic about simulation theory or people who believe in it. After all, there are many ways to think about simulation theory and its implications, just as there are many different ways to think about religion.
And as with religion, a key question in simulation theory is: Who created the simulation and why?
In his 2003 paper, Bostrom argued that future human civilizations might be interested in creating "ancestor simulations," meaning that our world might be a simulation of a human civilization that once existed in base reality; it'd be a way for future humans to study their past. Other explanations range from the simulation being some form of entertainment for future humans, to the simulation being the creation of aliens.
"If this is a simulation, there's sort of a half dozen different explanations for what this is for," Ascher told Big Think. "And some of them are completely opposite from one another."
To learn more about simulation theory and those who believe in it, we spoke to Ascher about "A Glitch in the Matrix", which premiered at the 2021 Sundance Film Festival and is now available to stream online. (This interview has been lightly edited for concision and clarity.)
Rodney Ascher / "A Glitch in the Matrix"
Throughout 2020, many people seemed to talk about the world being a simulation, especially on Twitter. What do you make of that?
I see that just as sort of evidence of how deep the idea [of simulation theory] is penetrating our culture. You know, I'm addicted to Twitter, and every day something strange happens in the news, and people make some jokes about, "This simulation is misfiring," or, "What am I doing in the dumbest possible timeline?"
I enjoy those conversations. But two things about them: On the one hand, they're using simulation theory as a way to let off steam, right? "Well, this world is so absurd, perhaps that's an explanation for it," or, "Maybe at the end of the day it doesn't matter that much because this isn't the real world."
But also, when you talk about the strange or horrifying, or bizarre unlikely things that happen as evidence [for the simulation], then that raises the question, well, what is the simulation for, and why would these things happen? They could be an error or glitch in the matrix. [...] Or those strange things that happen might be the whole point [of the simulation].
How do you view the connections between religious ideas and simulation theory?
I kind of went in [to making the film] thinking that this was, in large part, going to be a discussion of the science. And people very quickly went to, you know, religious and sort of ethical places.
I think that connection made itself clearest when I talked to Erik Davis, who wrote a book called "Techgnosis", which is specifically about the convergence of religion and technology. He wanted to make it clear that, from his point of view, simulation theory was sort of a 21st-century spin on earlier ideas, some of them quite ancient.
To say that [religion and simulation theory] are exactly the same thing is sort of pushing it. [...] You could say that if simulation theory is correct, and that we are genuinely in some sort of digitally created world, that earlier traditions wouldn't have had the vocabulary for that.
So, they would have talked about it in terms of magic. But by the same token, if those are two alternative, if similar, explanations for how the world works, I think one of the interesting things that it does is that either one suggests something different about the creator itself.
In a religious tradition, the creator is this omnipotent, supernatural being. But in simulation theory, it could be a fifth-grader who just happens to have access to an incredibly powerful computer [laughs].
Rodney Ascher / "A Glitch in the Matrix"
How did your views on simulation theory change since you started working on this documentary?
I think what's changed my mind the most in the course of working on the film is how powerful it is as a metaphor for understanding the here-and-now world, without necessarily having to believe in [simulation theory] literally.
Emily Pothast brought up the idea of Plato's cave as sort of an early thought experiment that is kind of resonant with simulation theory. And she expands upon it, talking about how, in 21st-century America, the shadows that we're seeing of the real world are much more vivid. You know, the media diets that we all absorb, that are all reflections of the real world.
But there's the danger that the shadows you're seeing aren't accurate—whether that's just signal loss from mistakes made by journalists working in good faith, or whether it's intentional distortion by somebody with an agenda. That leads to a really provocative idea about the artificial world, the simulated world, that each of us creates, and then lives in, based on our upbringing, our biases, and our media diet. That makes me stop and pause from time to time.
Do you see any connections between mental illness, or an inability to empathize with others, and some peoples' obsession with simulation theory?
It can certainly lead to strange, obsessive thinking. [Laughs] For some reason, I feel like I have to defend [people who believe in simulation theory], or qualify it. But you can get into the same sort of non-adaptive behavior obsessing over, you know, the Beatles or the Bible, or anything. [Charles] Manson was obsessed with "The White Album." He didn't need simulation theory to send him down some very dark paths.
Credit: K_e_n via AdobeStock
Why do you think people are attracted to simulation theory?
You might be attracted to it because your peer group is attracted to it, or people that you admire are attracted to it, which lends it credibility. But also like, just the way you and I are talking about it now, it's a juicy topic that extends in a thousand different ways.
And despite the cautionary tales that come up in the film, I've had a huge amount of fascinating social conversations with people because of my interest in simulation theory, and I imagine it's true about a lot of people who spend a lot of time thinking about it. I don't know if they all think about it alone, right? Or if it's something that they enjoy talking about with other people.
If technology became sufficiently advanced, would you create a simulated world?
It'd be very tempting, especially if I could add the power of flight or something like that [laughs]. I think the biggest reason not to—and I just saw this in a comment on Twitter yesterday, and I don't know if it had occurred to me—is all the responsibility I'd feel to all the people within it, right? If this were an accurate simulation of planet Earth, the amount of suffering that occurs here for all the creatures, and what they go through—that might be what stops me from doing it.
If you discovered you were living in a simulation, would it change the way you behave in the world?
I think I would need more information about the nature and purpose of the simulation. If I found out that I was the only person in a very elaborate virtual-reality game, and I had forgotten who I really was, well then I would act very differently than I would if I learned this is an accurate simulation of 21st-century America as conceived by aliens or people in the far future, in which case I think things would stay more or less the same — you know, my closest personal relationships, and my responsibility to my family and friends.
Just knowing that we're in a simulation isn't enough. If all we know is that it's a simulation, kind of the weirdness is that that word "simulation" starts to mean less. Because whatever qualities the real world has that ours doesn't are inconceivable to us. This is still as real as real gets.