How to Implement New Ideas for a Better Old Age

On macro and micro levels, longer life is not working out. Almost all nations are aging and will confront the disruptive demographics of an aging society, some sooner than others. But no nation has it right, because the scope of this challenge has never been seen before. Evidence of a failure to face this fact abounds on every level, from product packaging that is difficult for older hands to open to public policies that simply extend and expand outdated programs built upon yesterday's assumptions about aging. We need change—that is, real innovation in business, government and community—and good ideas and technologies alone will not be sufficient. Translating those ideas into practical application is the challenge of a society that wants to live longer and better.

The problem is that those who do not see the need to do something different often dismiss calls for change, while for others, who prosper under the status quo, innovation is simply a threat. Just consider the number of institutions and professions that will have to change, or even implode, their current business models to adjust to the new realities of an average lifespan of 100 years or more. Will today's retirement planning and public pension programs meet the needs of living to a century tomorrow? Will secondary school and college education as we know it make the grade for graduates embarking on six decades or more of work? Imagine the marketing and media world having to adjust to the reality that most of the world's discretionary income belongs to people who are not under 30 years old.

But take heart: old institutions and practices can change, and have changed in the past. A great lesson on how such change might take place can be found in an unlikely place—the history of modern surgery. Anesthetized surgery may be the single best thing about living today as opposed to 200 years ago. When anesthesia first became available in the mid-1800s, surgeons adopted the practice as quickly as possible. Around the same time, evidence was also mounting that antiseptic techniques could save lives from infection after surgery.

But unlike anesthesia, it took decades for antiseptic surgery to become commonplace. The reason why anesthesia spread so rapidly while antiseptics took decades to catch on is the subject of a fascinating new New Yorker article by Atul Gawande. The upshot has ramifications far beyond medical history: sometimes an innovative solution to a problem can flourish within the existing norms of an industry, government, or even society. But sometimes, when old norms hobble new solutions, it's incumbent upon us to replace those old norms with newer, better ones.

Anesthesia, at first, did violate some existing surgical ideas—for instance, in its early days, some blowhards considered it a "needless luxury," Gawande writes. But its obvious benefits quickly overwhelmed any objections. (Also, once anesthesia became widely available, I imagine that the free market quickly struck down any surgeon who refused to offer it.) But in the case of antiseptic surgery, a number of issues stood in the way of full adoption, chief among them the fact that its benefits weren't immediately apparent in the operating theater, because sepsis set in days after the fact. To make matters worse, surgeons needed to follow antiseptic practices perfectly in order for them to work. That meant that with a small mistake or just a little bad luck, a surgeon could give antiseptic surgery a try, lose a patient anyway, and consequently decide that the whole practice wasn't worth the effort.

Another problem emanated from the very idea of what it meant to be a surgeon in the first half of the 1800s: thick-skinned, blood-spattered and effective, more warrior than scientist. As a sort of badge of honor, surgeons allowed the blood of their patients to cake onto their smocks. Antiseptic surgery required spotless, sterile uniforms, which represented a profound change in the idea of what it meant to be a surgeon. For many, that shift in identity was too much. After the invention of antiseptic surgery, a generation of surgeons would come and go before it became common practice.

For those of us interested in building a better world in which to grow old, we can take two broad lessons from the slow spread of antiseptic surgery. First, it can be hard to sell a solution that offers slow-to-materialize benefits. And second, the norms we all take for granted in any field or industry can turn out to be some of the biggest impediments to progress.

Global aging, although an overwhelmingly good thing, nevertheless has something in common with infection after surgery: its effects take time to develop, and as a result, the value of solutions may be difficult to recognize in the near term. For instance, as more people live longer with chronic health issues, product engineers stand to succeed by designing products that assume end-users who are, say, 67 and arthritic, not 27 and healthy. (That's not to say any industry should design anything that screams "old man’s" or "old woman’s" product or service, which alienates both young and older buyers, but rather, the goal should be to design useful, exciting products for the whole lifespan.) But since the demographic shift to an older population is happening very gradually, it may take a long time for many engineers and designers to realize the benefits of an age-realistic approach—one that reaches across the lifespan, not just a single age group.

That gets me to the second lesson: dealing with norms that impede progress. In the case of disruptive global aging, the challenge is twofold: first to identify problematic norms and then to replace them. Such norms range from the relatively small—say, the misguided idea that arthritis-friendly, touch-free faucets don't belong in the home—to the huge, such as the assumption that one college education will be enough for a career spanning 60 years. Identifying outdated norms, changing them and implementing viable solutions will be an ongoing battle, and those currently doing well delivering products, services and policies according to decades-old ideas about aging will push back the hardest against progress, like those blood-spattered surgeons of yore who wouldn't wear white coats.

But the good news is, norms can be changed. Gawande provides the example of neonatal nurses in modern India who were in the habit of disregarding some widely acknowledged lifesaving practices. When no policy carrot or stick would succeed in changing the way these nurses operated, advocates on the ground managed, through extensive and empathetic outreach, to slowly change the problematic norms at those hospitals. That lesson applies to the broader aging space: when a norm or set of standards is holding back solutions, it's not enough to ask people to change their behavior; we have to change their motivations and expectations first. "Simple 'awareness' isn't going to solve anything," Gawande writes. But it is possible to effect change by pairing awareness with clear demonstrations of how new approaches can benefit everyone involved—no one of any age complains about transportation systems too safe, communities too vibrant, products too easy to use or services too convenient. Yet these examples address just a few of the demands of an aging society. In the context of aging, "everyone" who can catalyze and benefit from change includes not only older adults and their families, but also businesses, policymakers and nonprofits across markets and sectors. If parties such as these have the courage to challenge today's norms, develop new markets and bring innovation, a better life tomorrow is within reach.

MIT AgeLab’s Lucas Yoquinto contributed to this article.

Image from Shutterstock.


New fossils suggest human ancestors evolved in Europe, not Africa

Experts argue the jaws of an ancient European ape reveal a key human ancestor.

  • The jaw bones of an 8-million-year-old ape were discovered at Nikiti, Greece, in the '90s.
  • Researchers speculate it could be a previously unknown species and one of humanity's earliest evolutionary ancestors.
  • These fossils may change how we view the evolution of our species.

Homo sapiens has been on Earth for 200,000 years — give or take a few ten-thousand-year stretches. Much of that time is shrouded in the fog of prehistory. What we do know has been pieced together by deciphering the fossil record through the principles of evolutionary theory. Yet new discoveries contain the potential to refashion that knowledge and lead scientists to new, previously unconsidered conclusions.

A set of 8-million-year-old teeth may have done just that. Researchers recently inspected the upper and lower jaw of an ancient European ape. Their conclusions suggest that humanity's forebears may have arisen in Europe before migrating to Africa, potentially upending a scientific consensus that has stood since Darwin's day.

Rethinking humanity's origin story

The frontispiece of Thomas Huxley's Evidence as to Man's Place in Nature (1863) sketched by natural history artist Benjamin Waterhouse Hawkins. (Photo: Wikimedia Commons)

As reported in New Scientist, the 8- to 9-million-year-old hominin jaw bones were found at Nikiti, northern Greece, in the '90s. Scientists originally pegged the chompers as belonging to a member of Ouranopithecus, a genus of extinct Eurasian ape.

David Begun, an anthropologist at the University of Toronto, and his team recently reexamined the jaw bones. They argue that the original identification was incorrect. Based on the fossil's hominin-like canines and premolar roots, they conclude that the ape belongs to a previously unknown proto-hominin.

The researchers hypothesize that these proto-hominins were the evolutionary ancestors of another European great ape, Graecopithecus, which the same team tentatively identified as an early hominin in 2017. Graecopithecus lived in south-east Europe 7.2 million years ago. If the premise is correct, these hominins would have migrated to Africa 7 million years ago, after undergoing much of their evolutionary development in Europe.

Begun points out that south-east Europe was once occupied by the ancestors of animals like the giraffe and rhino, too. "It's widely agreed that this was the founding fauna of most of what we see in Africa today," he told New Scientist. "If the antelopes and giraffes could get into Africa 7 million years ago, why not the apes?"

He recently outlined this idea at a conference of the American Association of Physical Anthropologists.

It's worth noting that Begun has advanced similar hypotheses before. Writing for the Journal of Human Evolution in 2002, Begun and Elmar Heizmann of the Natural History Museum of Stuttgart discussed a great ape fossil found in Germany that they argued could be the ancestor (broadly speaking) of all living great apes and humans.

"Found in Germany 20 years ago, this specimen is about 16.5 million years old, some 1.5 million years older than similar species from East Africa," Begun said in a statement then. "It suggests that the great ape and human lineage first appeared in Eurasia and not Africa."

Migrating out of Africa

In The Descent of Man, Charles Darwin proposed that hominins originated in Africa. Considering the relatively few fossils available at the time, it is a testament to Darwin's astuteness that his hypothesis remains the leading theory.

Since Darwin's time, we have unearthed many more fossils and discovered new evidence in genetics. As such, our African-origin story has undergone many updates and revisions since 1871. Today, it has splintered into two theories: the "out of Africa" theory and the "multi-regional" theory.

The out of Africa theory suggests that the cradle of all humanity was Africa. Homo sapiens evolved exclusively and recently on that continent. At some point in prehistory, our ancestors migrated from Africa to Eurasia and replaced other subspecies of the genus Homo, such as Neanderthals. This is the dominant theory among scientists, and current evidence seems to support it best — though, say that in some circles and be prepared for a late-night debate that goes well past last call.

The multi-regional theory suggests that humans evolved in parallel across various regions. According to this model, the hominin Homo erectus left Africa to settle across Eurasia and (maybe) Australia. These disparate populations eventually evolved into modern humans thanks to a helping dollop of gene flow.

Of course, these are the broad strokes of very nuanced models, and we're leaving a lot of discussion out. There is, for example, a debate as to whether African Homo erectus fossils should be considered alongside Asian ones or should be labeled as a different subspecies, Homo ergaster.

Proponents of the out-of-Africa model aren't sure whether non-African humans descended from a single migration out of Africa or from at least two major waves of migration followed by a lot of interbreeding.

Did we head east or south of Eden?

Not all anthropologists agree with Begun and his team's conclusions. As noted by New Scientist, it is possible that the Nikiti ape is not related to hominins at all. It may have evolved similar features independently, developing teeth to eat similar foods or chew in a manner similar to early hominins.

Ultimately, the Nikiti ape alone doesn't offer enough evidence to upend the out of Africa model, which is supported by a more robust fossil record and DNA evidence. But additional evidence may be uncovered to lend further credence to Begun's hypothesis or lead us to yet unconsidered ideas about humanity's evolution.