To Chip or Not to Chip: Would You Get a Biotech Implant?

Many of us balk at the idea of biotech implants as an affront to privacy and good old-fashioned humanity - but what if an implant could restore your eyesight or prevent Alzheimer's? What then?

In the last ten years, our relationship with technology has changed faster than we can keep track of it. In 2006, the term “Crackberry” was coined to describe the addictive quality of the BlackBerry smartphone, one of the first to achieve widespread popularity in the United States. Now it already seems quaint to think of ourselves as “addicted” to our phones - for how can we be addicted to something that has become a virtual extension of ourselves?

The incomparable success of such devices has opened the way to a consumer-product revolution. Companies are trying to connect everything to the rapidly evolving Internet of Things. From fridges to thermostats to home monitoring systems to your car, life is truly becoming hyperconnected. Yet amid the ubiquity of things that talk, think, and sense everything about and around you, the true spiritual successor to the smartphone might be wearable tech. The phone in the pocket evolves into the watch around the wrist, moving our personal technology ever closer to our bodies. Google Glass was a great leap of faith in this direction, a ubiquitous computing platform conceived as a direct extension of our most delicate and treasured sensory organs, our eyes.

Perhaps you can guess where the tech-enabled rabbit hole goes from here. The next step in our increasingly intimate relationship with technology appears to be “embeddable tech,” or “being chipped.” Just as we were getting used to the idea of the Internet of Things, the Internet of Everything might already be on the horizon. Having a microchip implanted under one’s skin could dramatically enhance many of the benefits already provided, or merely hinted at, by smartphones and wearables. A chip under your skin could instantly measure physical metrics such as blood glucose and blood pressure. It could dispense your medication at a precise time and dosage. It could let you unlock the front door, check out at the convenience store register, or pay the bus fare with a touch of the palm. A minimal sketch of what such a chip's logic might look like follows below.
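To make those scenarios concrete, here is a purely hypothetical sketch of the kind of data and logic such a chip might carry. Every name and field in it is an illustrative assumption, not a description of any real implant's firmware or API:

```python
from dataclasses import dataclass
from datetime import datetime, time

# Hypothetical model of an embeddable chip's two headline features:
# continuous vitals readings and scheduled medication dispensing.

@dataclass
class VitalsReading:
    timestamp: datetime
    blood_glucose_mg_dl: float   # e.g., 95.0
    systolic_mm_hg: int          # e.g., 120
    diastolic_mm_hg: int         # e.g., 80

@dataclass
class DoseSchedule:
    drug_name: str
    dose_mg: float
    dispense_at: time            # precise time of day to dispense

def should_dispense(schedule: DoseSchedule, now: datetime) -> bool:
    """Return True when the current time matches the scheduled dose,
    to the minute. A real device would need far more safeguards."""
    return (now.hour, now.minute) == (schedule.dispense_at.hour,
                                      schedule.dispense_at.minute)

if __name__ == "__main__":
    reading = VitalsReading(datetime.now(), 95.0, 120, 80)
    schedule = DoseSchedule("metformin", 500.0, time(8, 0))
    print(reading)
    print("Dispense now?", should_dispense(schedule, datetime.now()))
```

Even this toy version hints at the privacy stakes discussed below: the device's whole value comes from continuously collecting and acting on intimate physiological data.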

Of course, some intrepid folks are already way ahead of the curve. A CNN story from 2014 profiled a small subculture of “self-hackers” who have embedded compasses in their shoulders and magnets in their ears, among other things. One such pioneer, Rich Lee, describes the “almost erotic” quality of detecting a sensation, via embedded technology, that was never there before.

But the rest of us might not be so quick to go under the knife and become first-generation (or second-, if you count pacemakers and prosthetics) cyborgs. Smartphones have already gone a long way in showing us the perils and tradeoffs inherent in keeping our tech so close to the chest. Google Glass, ambitious as it was, only amplified these issues, meeting serious pushback on its first wide release. Use of Glass in public raised hackles about privacy, tech etiquette, and the danger of users becoming distracted by having the Internet always right before their eyes. There were also plenty of slightly juvenile observations about what Google Glass made its wearer look like - a simple but telling reaction suggesting that such a product may simply be ahead of its time.

Putting aside the technology and fashion debates, these under-the-skin devices, whether they take the name implantables or embeddables, will only further magnify such controversies. To be chipped will be to abandon, with a sort of finality, any pure notion of privacy in one’s life. As intrusive as smartphones can be in recording our daily habits, locations, and predilections, at least we have the option to shut them off, leave them at home, or even throw them into the ocean if we find ourselves so compelled. Embeddables might end - or at least fundamentally change - our idea of solitude. We will never be alone again.

So would you have a microchip implanted inside you? If you want your voice heard on the question, you can take an MIT AgeLab two-minute survey on embeddable technology. My personal hunch is that for Baby Boomers, Millennials, and those in between, the answer for many will be an emphatic ‘no,’ or a very tentative ‘maybe.’ For the large multigenerational cohort that met the onset of this revolutionary wave of technology as adults, the jump from smartphone to smartchip might simply be too far, too invasive, perhaps even a violation of certain convictions about what it means to be human. But those who are born into or are just now growing up in this brave new world will almost certainly feel differently. And for those with deep suspicions, the incredible promise of embeddables might prove too hard to resist, even for modern-day Luddites. If a chip could restore your failing eyesight, would your answer to the above question change? If it could reduce your chances of getting Alzheimer’s? If it could give you superpowers… or even replace the need to find the TV remote? But let’s not get ahead of ourselves. Soon all of us will have a choice to make: to simply live within the Internet of Things, or to become part of the Internet of Everything.

MIT AgeLab’s Adam Felts contributed to this article.

Photo Credit: Rhona Wise/Getty Images

Yale scientists restore brain function to 32 clinically dead pigs

Researchers hope the technology will further our understanding of the brain, but lawmakers may not be ready for the ethical challenges.

Still from John Stephenson's 1999 rendition of Animal Farm.
  • Researchers at the Yale School of Medicine successfully restored some functions to pig brains that had been dead for hours.
  • They hope the technology will advance our understanding of the brain, potentially developing new treatments for debilitating diseases and disorders.
  • The research raises many ethical questions and puts to the test our current understanding of death.

The image of an undead brain coming back to life is the stuff of science fiction - and not just any science fiction, but B-grade sci-fi specifically. What instantly springs to mind are the black-and-white horrors of films like Fiend Without a Face. Bad acting. Plastic monstrosities. Visible strings. And a spinal cord that, for some reason, is also a tentacle?

But like any good science fiction, it's only a matter of time before some manner of it seeps into our reality. This week's Nature published the findings of researchers who managed to restore function to pig brains that were clinically dead - or at least, dead by what we once understood the term to mean.

What's dead may never die, it seems

The researchers did not hail from House Greyjoy ("What is dead may never die") but came largely from the Yale School of Medicine. They connected 32 pig brains to a system called BrainEx. BrainEx is an artificial perfusion system - that is, a system that takes over the functions normally regulated by the organ. The pigs had been killed four hours earlier at a U.S. Department of Agriculture slaughterhouse; their brains had been completely removed from their skulls.

BrainEx pumped an experimental solution into the brains that essentially mimicked blood flow. It brought oxygen and nutrients to the tissues, giving brain cells the resources to resume many normal functions. The cells began consuming and metabolizing sugars. The brains' immune systems kicked in. Neuron samples could carry an electrical signal. Some brain cells even responded to drugs.

The researchers have managed to keep some brains alive for up to 36 hours, and they do not yet know whether BrainEx could have sustained the brains longer. "It is conceivable we are just preventing the inevitable, and the brain won't be able to recover," said Nenad Sestan, a Yale neuroscientist and the study's lead researcher.

As a control, other brains received either a fake solution or no solution at all. None showed revived brain activity, and all deteriorated as normal.

The researchers hope the technology can enhance our ability to study the brain and its cellular functions. One of the main avenues for such studies would be brain disorders and diseases. This could point the way to new treatments for brain injuries, Alzheimer's, Huntington's, and other neurodegenerative conditions.

"This is an extraordinary and very promising breakthrough for neuroscience. It immediately offers a much better model for studying the human brain, which is extraordinarily important, given the vast amount of human suffering from diseases of the mind [and] brain," Nita Farahany, the bioethicists at the Duke University School of Law who wrote the study's commentary, told National Geographic.

An ethical gray matter

Before anyone gets an Island of Dr. Moreau vibe, it's worth noting that the brains never exhibited neural activity anywhere near the level of consciousness.

The BrainEx solution contained chemicals that prevented neurons from firing. To be extra cautious, the researchers also monitored the brains for any such activity and were prepared to administer an anesthetic had they seen signs of consciousness.

Even so, the research signals a massive debate to come regarding medical ethics and our definition of death.

Most countries define death, clinically speaking, as the irreversible loss of brain or circulatory function. This definition was already at odds with some folk and values-based understandings, but where do we go if it becomes possible to reverse clinical death with artificial perfusion?

"This is wild," Jonathan Moreno, a bioethicist at the University of Pennsylvania, told the New York Times. "If ever there was an issue that merited big public deliberation on the ethics of science and medicine, this is one."

One possible consequence involves organ donations. Some European countries require emergency responders to use a process that preserves organs when they cannot resuscitate a person. They continue to pump blood throughout the body, but use a "thoracic aortic occlusion balloon" to prevent that blood from reaching the brain.

The system is already controversial because it raises concerns about what caused the patient's death. But what happens when brain death becomes readily reversible? Stuart Younger, a bioethicist at Case Western Reserve University, told Nature that if BrainEx were to become widely available, it could shrink the pool of eligible donors.

"There's a potential conflict here between the interests of potential donors — who might not even be donors — and people who are waiting for organs," he said.

It will be a while before such experiments go anywhere near human subjects. A more immediate ethical question relates to how such experiments harm animal subjects.

Ethical review boards evaluate research protocols and can reject any that cause undue pain, suffering, or distress. Since dead animals feel no pain and suffer no trauma, they are typically approved as subjects. But how do such boards make a judgment about the suffering of a "cellularly active" brain? The distress of a partially alive brain?

The dilemma is unprecedented.

Setting new boundaries

Another science fiction story that comes to mind when discussing this story is, of course, Frankenstein. As Farahany told National Geographic: "It is definitely has [sic] a good science-fiction element to it, and it is restoring cellular function where we previously thought impossible. But to have Frankenstein, you need some degree of consciousness, some 'there' there. [The researchers] did not recover any form of consciousness in this study, and it is still unclear if we ever could. But we are one step closer to that possibility."

She's right. The researchers undertook this work for the betterment of humanity, and we may one day reap some unimaginable medical benefits from it. The ethical questions, however, remain as unsettling as the stories they remind us of.
