
Transhumanists want to upload their minds to a computer. They really won’t like the result

Uploading your mind is not a pathway to immortality. Instead, it will create a possibly hostile digital doppelgänger.
Key Takeaways
  • While it is theoretically possible to perfectly model a unique human brain down to the level of its synapses and molecules, doing so will not allow you to become immortal.
  • Instead, you will still be in your body, and the thing in the computer will be your “digital doppelgänger.”
  • The copy would feel just like you feel — fully entitled to own its own property and earn its own wages and make its own decisions. It would claim your name, memories, and even family as its own.

If you are reading these words, your brain is alive and well, stored within the protective confines of your skull where it will reside for the remainder of your life. I feel the need to point this out because there is a small but vocal population of self-proclaimed “transhumanists” who believe that within their lifetimes, technological advances will enable them to “upload their minds” into computer systems, thereby allowing them to escape the limitations of their biology and effectively “live forever.”

These transhumanists are wrong.

To be fair, not all transhumanists believe in “mind uploading” as a pathway to immortality, but there’s enough chatter about the concept within that community that excitement has spilled out into the general public, so much so that Amazon has a comedic TV series, Upload, based on the premise. These may be fun stories, but the notion that a single biological human will ever extend their life by uploading their mind into a computer system is pure fiction.

The science behind transhumanism

The concept of “mind uploading” is rooted in the very reasonable premise that the human brain, like any system that obeys the laws of physics, can be modeled in software if you devote sufficient computing power to the problem. To be clear, we’re not talking about modeling human brains in the abstract, but modeling very specific brains — your brain, my brain, your uncle Herbert’s brain — each one represented in such extreme detail that every single neuron is accurately simulated, including all the complex connections among them.

It is an understatement to say that modeling a unique, individual human brain is a non-trivial task.

There are over 85 billion neurons in your head, each with thousands of links to other neurons. In total, there are about 100 trillion connections, which is unfathomably large — a thousand times more than the number of stars in the Milky Way galaxy. It’s those trillions of connections that make you who you are — your personality, your memories, your fears, your skills, your peculiarities. Your mind is encoded in those 100 trillion connections, and so to accurately reproduce your mind in software, a system would need to precisely simulate the vast majority of those connections down to the most subtle interactions. 
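To put those figures in perspective, here is a quick back-of-envelope check. The numbers are rough, commonly cited estimates rather than measurements, and the average connections-per-neuron value is assumed purely for illustration:

```python
# Order-of-magnitude estimates only -- none of these are measured values.
neurons = 86e9               # commonly cited estimate of neurons in a human brain
synapses_per_neuron = 1.2e3  # assumed average; real counts vary widely by cell type

total_connections = neurons * synapses_per_neuron   # ~1e14, i.e. roughly 100 trillion
milky_way_stars = 1e11       # low-end estimate; surveys suggest 100-400 billion stars

print(f"Estimated connections: {total_connections:.1e}")
print(f"Ratio to Milky Way stars: {total_connections / milky_way_stars:.0f}x")  # ~1,000x
```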

Obviously, that level of modeling will not be done by hand. People who believe in “mind uploading” envision an automated scanning process, likely using some kind of supercharged MRI machine, that captures the biology down to resolutions that approach the molecular level. They then envision the use of intelligent software to turn that scan into a simulation of each unique brain cell and its thousands of connections to other cells. 
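To make the scale of that task concrete, here is a minimal, purely hypothetical sketch of the kind of per-neuron record such a scan-to-simulation pipeline might produce. The class names and fields are invented for illustration; no real scanner or simulator works this way today:

```python
from dataclasses import dataclass, field

# Hypothetical structures for illustration only -- not a real scanning or
# simulation format.

@dataclass
class Synapse:
    target_id: int   # index of the downstream neuron
    weight: float    # connection strength inferred from the scan

@dataclass
class Neuron:
    neuron_id: int
    cell_type: str                                          # e.g., "pyramidal"
    synapses: list[Synapse] = field(default_factory=list)   # thousands per neuron

# A faithful model would require ~86 billion Neuron records holding roughly
# 100 trillion Synapse records in total, before simulating any dynamics at all.
```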

That is an extremely challenging task, but I cannot deny that it is theoretically feasible. If it ever happens, it is not going to happen in the next 20 years, but much, much further out. And with additional time and resources, it is also not crazy to think that large numbers of simulated minds could coexist inside a rich and detailed simulation of physical reality. Still, the notion that this process will offer anyone reading this article a pathway to immortality is utterly absurd.

Digital doppelgänger

As I stated above, the idea that a single biological human will ever extend their life by uploading their mind is pure fiction. The two key words in that sentence are “their life.” While it is theoretically possible, with sufficient technological advances, to copy and reproduce the precise form and function of a unique human brain within a simulation, the original human would still exist in their biological body, their brain still housed within their skull. What would exist in the computer would be a copy: a digital doppelgänger.

In other words, you would not feel like you suddenly transported yourself into a computer. In fact, you would not feel anything at all. The brain copying process could have happened without your knowledge, while you were asleep or sedated, and you would never have the slightest inkling that a reproduction of your mind existed within a simulation. And if you found yourself crossing a busy street with a car racing toward you — you would jump out of the way, because you would not be immortal.

But what about that version of you within a simulation? 

You could think of it as a digital clone or identical twin, but it would not be you. It would be a copy of you, including all your memories up to the moment your brain was scanned. But from that instant on, it would generate its own memories. It might interact with other simulated minds in a simulated world, learning new things and having new experiences. Or it might interact with the physical world through robotic interfaces. At the same time, the biological you would be generating new memories and having new experiences.

In other words, it would only be identical for an instant, and then you and the copy would head off in different directions. Your skills would diverge. Your knowledge would diverge. Your personalities would diverge. After a few years, there would be substantial differences. Your copy might become deeply religious while you are agnostic. Your copy might become an environmentalist while you are an oil executive. You and the copy would retain similar personalities, but you would be different people.

Clone wars

Yes, the copy of you would be a person — but a different person. That’s a critical point, because that copy of you would need to have its own identity and its own rights that have nothing to do with you. After all, that person would feel just as real inside their digital mind as you feel within your biological mind. Certainly, that person should not be your slave, required to take on tasks that you are too busy to do during your biological life. Such exploitation would be immoral.

After all, the copy would feel just like you feel — fully entitled to own its own property and earn its own wages and make its own decisions. In fact, you and the copy would likely have a dispute as to who gets to use your name, as you would both feel like you had used it your entire life. If I made a copy of myself, it would wake up and fully believe it was Louis Barry Rosenberg, a lifelong technologist in the fields of virtual reality and artificial intelligence. If it was able to interact with the real world through digital or robotic means, it would believe it had every right to use the name Louis Barry Rosenberg in the physical world. And it certainly would not feel subservient to the biological version.

In other words, creating a digital copy through “mind uploading” has nothing to do with allowing you to live forever. Instead, it would just create a competitor with skills, capabilities, and memories identical to the biological version’s, one who feels equally justified in claiming ownership of your identity. And yes, the copy would feel equally justified in being married to your spouse and being a parent to your children.

Put simply, “mind uploading” is not a path to immortality. It is a path to creating another you, one who will immediately feel like an equally justified owner of everything you possess and everything you have accomplished. And they would react exactly the way you would react if you woke up one day and were told: “Sorry, but all those memories of your life aren’t really yours but copies, so your spouse is not really your spouse, your kids are not really your kids, and your job is not really your job.”

Is this really something you would want to subject a copy of yourself to?

A dystopian future

Back in 2008, I wrote a graphic novel called Upgrade that explores the absurdity of mind uploading. It takes place in the 2040s, in a future world where everyone spends the vast majority of their lives in the Metaverse, logging in the moment they wake up and logging out the moment they go to sleep. (Coincidentally, the fictional reason why society went in this direction was a global pandemic that drove people inside.) What the inhabitants of this future world didn’t realize is that as they lived their lives in the Metaverse, they were being characterized by AI systems that observed all of their actions, reactions, and interactions. Those systems captured every sentiment and emotional response so they could build a digital model of each person’s mind from a behavioral perspective rather than from molecular scanning.

After 20 years of collecting data in this dystopian metaverse, the fictional AI system had fully modeled every person in this future society with sufficient detail that it didn’t need real people anymore. After all, real humans are less efficient, as we need food and housing and healthcare. The digital copies didn’t need any of that. And so, guess what the fictional AI system decided to do? It convinced all of us biological people to “upgrade ourselves” by ending our own lives and allowing the digital copies to replace us. And we were willing to do it under the false notion that we would be immortal.

That’s what mind uploading really means. It means ending humanity and replacing it with a digital representation. I wrote Upgrade 14 years ago because I genuinely believe we humans might be foolish enough to head in that direction, ending our biological existence in favor of a purely digital one. 

Why is this bad? If you think Big Tech has too much power now, with the ability to track what you do and moderate the information you access, imagine what it will be like when human minds are trapped inside the systems they control, unable to exit. That is the future that mind-uploading advocates are pushing us toward. It’s terrifying. “Mind uploading” is not the path to immortality some believe it to be.

