Don’t Want to Die? Just Upload Your Brain
Oxford researchers say we are only a few decades away from a chance at digital immortality.
I haven’t seen “Her,” the Oscar-nominated movie about a man who has an intimate relationship with a Scarlett Johansson-voiced computer operating system. I have, however, read Susan Schneider’s “The Philosophy of ‘Her’,” a post on The Stone blog at the New York Times looking into the possibility, in the fairly near future, of avoiding death by having your brain scanned and uploaded to a computer. Presumably you’d want to Dropbox your brain file (yes, you’ll need to buy more storage) to avoid death by hard-drive crash. But with suitable backups, you, or an electronic version of you, could go on living forever, or at least for a very, very long time, “untethered,” as Ms. Schneider puts it, “from a body that’s inevitably going to die.”
This idea isn’t the loopy brainchild of sci-fi hacks. Researchers at Oxford University have been on the path to human digitization for a while now, and back in 2008 the Future of Humanity Institute at Oxford released a 130-page technical report entitled Whole Brain Emulation: A Roadmap. Of the dozen or so benefits of whole-brain emulation listed by the authors, Anders Sandberg and Nick Bostrom, one stands out:
If emulation of particular brains is possible and affordable, and if concerns about individual identity can be met, such emulation would enable back‐up copies and “digital immortality.”
Scanning brains, the authors write, “may represent a radical new form of human enhancement.”
Hmm. Immortality and radical human enhancement. Is this for real? Yes:
It appears feasible within the foreseeable future to store the full connectivity or even multistate compartment models of all neurons in the brain within the working memory of a large computing system.
Foreseeable future means not in our lifetimes, right? Think again. If you expect to live to 2050 or so, you could face this choice. And your beloved labrador may be ready for upload by, say, 2030:
A rough conclusion would nevertheless be that if electrophysiological models are enough, full human brain emulations should be possible before mid‐century. Animal models of simple mammals would be possible one to two decades before this.
Interacting with your pet via a computer interface (“Hi Spot!”/“Woof!”) wouldn’t be quite the same as rolling around the backyard with him while he slobbers on your face, or watching him dash off after a tennis ball you toss into a pond. You might be able to simulate certain aspects of his personality with computer extensions, but the look in his eyes, the cock of his head and the feel and scent of his coat will be hard to reproduce electronically. And no longer having to scoop up his messes or feed him heartworm pills would probably not make up for all these limitations. The electro-pet might also make you miss the real Spot unbearably as you try to recapture his consciousness on your home PC.
But what about you? Does the prospect of uploading your own brain allay your fear of abruptly disappearing from the universe? Is it the next best thing to finding the fountain of youth? Ms. Schneider, a philosophy professor at the University of Connecticut, counsels caution. First, she writes, we might find our identity warped in disturbing ways if we pour our brains into massive digital files. She describes the problem via an imaginary guy named Theodore:
If Theodore were to truly upload his mind (as opposed to merely copy its contents), then he could be downloaded to multiple other computers. Suppose that there are five such downloads: Which one is the real Theodore? It is hard to provide a nonarbitrary answer. Could all of the downloads be Theodore? This seems bizarre: As a rule, physical objects and living things do not occupy multiple locations at once. It is far more likely that none of the downloads are Theodore, and that he did not upload in the first place.
This is why the Oxford futurists included the caveat “if concerns about individual identity can be met.” It is the nightmare of infinitely reproducible individuals — a consequence that would, in an instant, undermine and destroy the very notion of an individual.
But Ms. Schneider does not come close to appreciating the extent of the moral failure of brain uploads. She is right to observe an apparent “categorical divide between humans and programs.” Human beings, she writes, “cannot upload themselves to the digital universe; they can upload only copies of themselves — copies that may themselves be conscious beings.” The error here is screamingly obvious: brains are parts of us, but they are not “us.” A brain contains the seed of consciousness, and it is both the bank for our memories and the fount of our rationality and our capacity for language, but a brain without a body is fundamentally different from the human being that possessed both.
It sounds deeply claustrophobic to be housed (imprisoned?) forever in a microchip, unable to dive into the ocean, taste chocolate or run your hands through your loved one’s hair. Our participation in these and infinite other emotive and experiential moments is the bulk of what constitutes our lives, or at least our meaningful lives. Residing forever in the realm of pure thought and memory and discourse doesn’t sound like life, even if it is consciousness. Especially if it is consciousness.
So I cannot agree with Ms. Schneider’s conclusion when she writes that brain uploads may be choiceworthy for the benefits they can bring to our species or for the solace they provide to dying individuals who “wish to leave a copy of [themselves] to communicate with [their] children or complete projects that [they] care about.” It may be natural, given the increasingly virtual lives many of us live in this pervasively Internet-connected world, to think of ourselves mainly in terms of avatars and timelines and handles and digital faces. Collapsing our lives into our brains and offloading the contents of our brains to a supercomputer is a fascinating idea. It does not sound to me, though, like a promising recipe for preserving our humanity.