New hypothesis argues the universe simulates itself into existence

A physics paper proposes that neither you nor the world around you is real.

Tetrahedrons representing the quasicrystalline spin network (QSN), the fundamental substructure of spacetime according to emergence theory.

Credit: Quantum Gravity Institute
  • A new hypothesis says the universe simulates itself into existence in a "strange loop."
  • A paper from the institute Quantum Gravity Research proposes there is an underlying panconsciousness.
  • The work aims to unify insights from quantum mechanics with a non-materialistic perspective.

How real are you? What if everything you are, everything you know, all the people in your life, and all the events around you were not physically there but part of a very elaborate simulation? Philosopher Nick Bostrom famously considered this in his seminal paper "Are You Living in a Computer Simulation?," where he proposed that all of existence may be the product of very sophisticated computer simulations run by advanced beings whose real nature we may never know. Now a new hypothesis takes the idea a step further: what if there are no advanced beings either, and everything in "reality" is a self-simulation that generates itself from pure thought?

The physical universe is a "strange loop," says the new paper, titled "The Self-Simulation Hypothesis Interpretation of Quantum Mechanics," from the team at Quantum Gravity Research, a Los Angeles-based theoretical physics institute founded by the scientist and entrepreneur Klee Irwin. The authors take Bostrom's simulation hypothesis, which maintains that all of reality is an extremely detailed computer program, and ask: rather than relying on advanced lifeforms to create the staggering technology needed to compose everything within our world, isn't it more efficient to propose that the universe itself is a "mental self-simulation"? They tie this idea to quantum mechanics, seeing the universe as one of many possible quantum gravity models.

One important difference is that Bostrom's original hypothesis is materialistic, seeing the universe as inherently physical. To Bostrom, we could simply be part of an ancestor simulation engineered by posthumans. Even the process of evolution itself could just be a mechanism by which these future beings test countless processes, purposefully moving humans through levels of biological and technological growth. In this way, they also generate the supposed information or history of our world. Ultimately, we wouldn't know the difference.

But where, the researchers ask, does the physical reality that would generate those simulations come from? Their hypothesis takes a non-materialistic approach, holding that everything is information expressed as thought. As such, the universe "self-actualizes" itself into existence, relying on underlying algorithms and a rule they call "the principle of efficient language."

Under this proposal, the entire simulation of everything in existence is just one "grand thought." How did the simulation itself originate? It was always there, say the researchers, invoking a concept they call "timeless emergentism." According to this idea, time isn't there at all. Instead, the all-encompassing thought that is our reality offers a nested semblance of hierarchical order, full of "sub-thoughts" that reach all the way down the rabbit hole to the base mathematics and fundamental particles. This is also where the principle of efficient language comes in: humans themselves are such "emergent sub-thoughts," and they experience and find meaning in the world through other sub-thoughts (called "code-steps or actions") in the most economical fashion.

In correspondence with Big Think, physicist David Chester elaborated: "While many scientists presume materialism to be true, we believe that quantum mechanics may provide hints that our reality is a mental construct. Recent advances in quantum gravity, such as seeing spacetime emergent via a hologram, also is a hint that spacetime is not fundamental. This is also compatible with ancient Hermetic and Indian philosophy. In a sense, the mental construct of reality creates spacetime to efficiently understand itself by creating a network of subconscious entities that can interact and explore the totality of possibilities."

The scientists link their hypothesis to panpsychism, which sees everything as thought or consciousness. The authors think that their "panpsychic self-simulation model" can even explain the origin of an overarching panconsciousness at the foundational level of the simulations, which "self-actualizes itself in a strange loop via self-simulation." This panconsciousness also has free will and its various nested levels essentially have the ability to select what code to actualize, while making syntax choices. The goal of this consciousness? To generate meaning or information.

If all of this is hard to grasp, the authors offer another idea that may link your everyday experience to these philosophical considerations. Think of your dreams as your own personal self-simulations, the team suggests. While they are rather primitive (by the standards of a super-intelligent future AI), dreams tend to provide better resolution than current computer modeling and are a great example of the evolution of the human mind. As the scientists write, "What is most remarkable is the ultra-high-fidelity resolution of these mind-based simulations and the accuracy of the physics therein." They point especially to lucid dreams, in which the dreamer is aware of being in a dream, as instances of very accurate simulations created by your mind that may be impossible to distinguish from any other reality. Now that you're sitting here reading this article, how do you really know you're not in a dream? The experience seems very high in resolution, but so do some dreams. It's not much of a reach to imagine that an extremely powerful computer, one we may be able to build in the not-too-distant future, could duplicate this level of detail.

The team also proposes that in the coming years we will be able to create designer consciousnesses, as advances in gene editing could make our own mind-simulations much more powerful. We may even see minds emerge that do not require matter at all.

While some of these ideas are certainly controversial in mainstream scientific circles, Irwin and his team respond that "we must critically think about consciousness and certain aspects of philosophy that are uncomfortable subjects to some scientists."

Want to know more? You can read the full paper online in the journal Entropy.



Are we really addicted to technology?

Fear that new technologies are addictive isn't a modern phenomenon.

Credit: Rodion Kutsaev via Unsplash
Technology & Innovation

This article was originally published on our sister site, Freethink, which has partnered with the Build for Tomorrow podcast to go inside new episodes each month. Subscribe here to learn more about the crazy, curious things from history that shaped us, and how we can shape the future.

In many ways, technology has made our lives better. Through smartphones, apps, and social media platforms we can now work more efficiently and connect in ways that would have been unimaginable just decades ago.

But as we've grown to rely on technology for a lot of our professional and personal needs, most of us are asking tough questions about the role technology plays in our own lives. Are we becoming too dependent on technology to the point that it's actually harming us?

In the latest episode of Build for Tomorrow, host and Entrepreneur Editor-in-Chief Jason Feifer takes on the thorny question: is technology addictive?

Popularizing medical language

What makes something addictive rather than just engaging? It's a meaningful distinction because if technology is addictive, the next question could be: are the creators of popular digital technologies, like smartphones and social media apps, intentionally creating things that are addictive? If so, should they be held responsible?

To answer those questions, we've first got to agree on a definition of "addiction." As it turns out, that's not quite as easy as it sounds.

If we don't have a good definition of what we're talking about, then we can't properly help people.

LIAM SATCHELL UNIVERSITY OF WINCHESTER

"Over the past few decades, a lot of effort has gone into destigmatizing conversations about mental health, which of course is a very good thing," Feifer explains. It also means that medical language has entered our vernacular — we're now more comfortable using clinical words outside of a specific diagnosis.

"We've all got that one friend who says, 'Oh, I'm a little bit OCD' or that friend who says, 'Oh, this is my big PTSD moment,'" Liam Satchell, a lecturer in psychology at the University of Winchester and guest on the podcast, says. He's concerned about how the word "addiction" gets tossed around by people with no background in mental health. An increased concern surrounding "tech addiction" isn't actually being driven by concern among psychiatric professionals, he says.

"These sorts of concerns about things like internet use or social media use haven't come from the psychiatric community as much," Satchell says. "They've come from people who are interested in technology first."

The casual use of medical language can lead to confusion about what is actually a mental health concern. We need a reliable standard for recognizing, discussing, and ultimately treating psychological conditions.

"If we don't have a good definition of what we're talking about, then we can't properly help people," Satchell says. That's why, according to Satchell, any definition of addiction we use needs to include the psychiatric criteria: experiencing distress or significant family, social, or occupational disruption.

Too much reading causes... heat rashes?

But as Feifer points out in his podcast, both popularizing medical language and the fear that new technologies are addictive aren't totally modern phenomena.

Take, for instance, the concept of "reading mania."

In the 18th century, an author named J. G. Heinzmann claimed that people who read too many novels could experience something called "reading mania." This condition, Heinzmann explained, could cause many symptoms, including: "weakening of the eyes, heat rashes, gout, arthritis, hemorrhoids, asthma, apoplexy, pulmonary disease, indigestion, blocking of the bowels, nervous disorder, migraines, epilepsy, hypochondria, and melancholy."

"That is all very specific! But really, even the term 'reading mania' is medical," Feifer says.

"Manic episodes are not a joke, folks. But this didn't stop people a century later from applying the same term to wristwatches."

Indeed, an 1889 piece in the Newcastle Weekly Courant declared: "The watch mania, as it is called, is certainly excessive; indeed it becomes rabid."

Similar concerns have echoed throughout history about the radio, telephone, TV, and video games.

"It may sound comical in our modern context, but back then, when those new technologies were the latest distraction, they were probably really engaging. People spent too much time doing them," Feifer says. "And what can we say about that now, having seen it play out over and over and over again? We can say it's common. It's a common behavior. Doesn't mean it's the healthiest one. It's just not a medical problem."

Few today would argue that novels are in and of themselves addictive — regardless of how voraciously you may have consumed your last favorite novel. So, what happened? Were these things ever addictive, and if not, what was happening in these moments of concern?

People are complicated, our relationship with new technology is complicated, and addiction is complicated — and our efforts to simplify very complex things, and make generalizations across broad portions of the population, can lead to real harm.

JASON FEIFER HOST OF BUILD FOR TOMORROW

There's a risk of pathologizing normal behavior, says Joel Billieux, professor of clinical psychology and psychological assessment at the University of Lausanne in Switzerland, and guest on the podcast. He's on a mission to understand how we can suss out what is truly addictive behavior versus what is normal behavior that we're calling addictive.

For Billieux and other professionals, this isn't just a rhetorical game. He uses the example of gaming addiction, which has come under increased scrutiny over the past half-decade. The language used around the subject of gaming addiction will determine how behaviors of potential patients are analyzed — and ultimately what treatment is recommended.

"For a lot of people, you can realize that the gaming is actually a coping [mechanism for] social anxiety or trauma or depression," says Billieux.

"In those cases, of course, you will not necessarily target gaming per se. You will target what caused the depression. And then as a result, if you succeed, gaming will diminish."

In some instances, a person might legitimately be addicted to gaming or technology, and require the corresponding treatment — but that treatment might be the wrong answer for another person.

"None of this is to discount that for some people, technology is a factor in a mental health problem," says Feifer.

"I am also not discounting that individual people can use technology such as smartphones or social media to a degree where it has a genuine negative impact on their lives. But the point here to understand is that people are complicated, our relationship with new technology is complicated, and addiction is complicated — and our efforts to simplify very complex things, and make generalizations across broad portions of the population, can lead to real harm."

Behavioral addiction is a notoriously complex thing for professionals to diagnose — even more so since the latest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the book professionals use to classify mental disorders, introduced a new idea about addiction in 2013.

"The DSM-5 grouped substance addiction with gambling addiction — this is the first time that substance addiction was directly categorized with any kind of behavioral addiction," Feifer says.

"And then, the DSM-5 went a tiny bit further — and proposed that other potentially addictive behaviors require further study."

This might not sound like that big of a deal to laypeople, but its effect was massive in medicine.

"Researchers started launching studies — not to see if a behavior like social media use can be addictive, but rather, to start with the assumption that social media use is addictive, and then to see how many people have the addiction," says Feifer.

Learned helplessness

The assumption that a lot of us are addicted to technology may itself be harming us by undermining our autonomy and belief that we have agency to create change in our own lives. That's what Nir Eyal, author of the books Hooked and Indistractable, calls 'learned helplessness.'

"The price of living in a world with so many good things in it is that sometimes we have to learn these new skills, these new behaviors to moderate our use," Eyal says. "One surefire way to not do anything is to believe you are powerless. That's what learned helplessness is all about."

So if it's not an addiction that most of us are experiencing when we check our phones 90 times a day or wonder what our followers are saying on Twitter — then what is it?

"A choice, a willful choice, and perhaps some people would not agree or would criticize your choices. But I think we cannot consider that as something that is pathological in the clinical sense," says Billieux.

Of course, for some people technology can be addictive.

"If something is genuinely interfering with your social or occupational life, and you have no ability to control it, then please seek help," says Feifer.

But for the vast majority of people, thinking about our use of technology as a choice — albeit not always a healthy one — can be the first step to overcoming unwanted habits.

For more, be sure to check out the Build for Tomorrow episode here.
