On macro and micro levels, longer life is not working out. Almost all nations are aging and will confront the disruptive demographics of an aging society, some sooner rather than later. But no nation has it right, because the scope of this challenge has never been seen before. Evidence of a failure to face this fact abounds on every level, from product packaging that is difficult for older hands to open to public policies that simply extend and expand outdated programs built upon yesterday's assumptions about aging. We need change—that is, real innovation in business, government and community—and good ideas and technologies alone will not be sufficient. Translating those ideas into practical application is the challenge of a society that wants to live longer and better.
The problem is, those who do not see the need for doing something different often dismiss calls for change. For others, who prosper under the status quo, innovation is simply a threat. Just consider the number of institutions and professions that will have to change or even implode their current business models to adjust to the new realities of an average lifespan of 100 years or more. Will today's retirement planning and public pension programs meet the needs of living to a century tomorrow? Will secondary school and college education as we know it make the grade for graduates embarking on six-plus decades of work? Imagine the marketing and media world having to adjust to the reality that most of the world's discretionary income is not held by people under 30.
But take heart: old institutions and practices can change, and have changed in the past. A great lesson on how such change might take place can be found in an unlikely place—the history of modern surgery. Anesthetized surgery may be the single best thing about living today as opposed to 200 years ago. When anesthesia first became available in the mid-1800s, surgeons adopted the practice as quickly as possible. Around the same time, evidence was also mounting that antiseptic techniques could save lives from infection after surgery.
But unlike anesthesia, antiseptic surgery took decades to become commonplace. The reason why anesthesia spread so rapidly while antiseptics took so long to catch on is the subject of a fascinating recent New Yorker article by Atul Gawande. The upshot has ramifications far beyond medical history: sometimes an innovative solution to a problem can flourish within the existing norms of an industry, government, or even society. But sometimes, when old norms hobble new solutions, it's incumbent upon us to replace those old norms with newer, better ones.
Anesthesia, at first, did violate some existing surgical ideas—for instance, in its early days, some blowhards considered it a "needless luxury," Gawande writes. But its obvious benefits quickly overwhelmed any objections. (Also, once anesthesia became widely available, I imagine that the free market quickly struck down any surgeon who refused to offer it.) But in the case of antiseptic surgery, a number of issues stood in the way of full adoption, chief among them the fact that its benefits weren't immediately apparent in the operating theater, because sepsis set in days after the fact. To make matters worse, surgeons needed to follow antiseptic practices perfectly in order for them to work. That meant that with a small mistake or just a little bad luck, a surgeon could give antiseptic surgery a try, lose a patient anyway, and consequently decide that the whole practice wasn't worth the effort.
Another problem emanated from the very idea of what it meant to be a surgeon in the first half of the 1800s: thick-skinned, blood-spattered and effective, more warrior than scientist. As a sort of badge of honor, surgeons allowed the blood of their patients to cake onto their smocks. Antiseptic surgery required spotless, sterile uniforms, which represented a profound change in the idea of what it meant to be a surgeon. For many, that shift in identity was too much. After the invention of antiseptic surgery, a generation of surgeons would come and go before it became common practice.
For those of us interested in building a better world in which to grow old, we can take two broad lessons from the slow spread of antiseptic surgery. First, it can be hard to sell a solution that offers slow-to-materialize benefits. And second, the norms we all take for granted in any field or industry can turn out to be some of the biggest impediments to progress.
Global aging, although an overwhelmingly good thing, nevertheless has something in common with infection after surgery: its effects take time to develop, and as a result, the value of solutions may be difficult to recognize in the near term. For instance, as more people live longer with chronic health issues, product engineers stand to succeed by designing products that assume end-users who are, say, 67 and arthritic, not 27 and healthy. (That's not to say any industry should design anything that screams "old man’s" or "old woman’s" product or service, which alienates both young and older buyers, but rather, the goal should be to design useful, exciting products for the whole lifespan.) But since the demographic shift to an older population is happening very gradually, it may take a long time for many engineers and designers to realize the benefits of an age-realistic approach—one that reaches across the lifespan, not just a single age group.
That gets me to the second lesson: dealing with norms that impede progress. In the case of disruptive global aging, the challenge is twofold: first to identify problematic norms and then to replace them. Such norms can range from the relatively small—say, the misguided idea that arthritis-friendly, touch-free faucets don't belong in the home—to the large and institutional, such as the assumption that one college education will be enough for a career spanning 60 years. Identifying outdated norms, changing them and implementing viable solutions will be an ongoing battle, and those currently doing well delivering products, services and policies according to decades-old ideas about aging will push back the hardest against progress, like those blood-spattered surgeons of yore who wouldn't wear white coats.
But the good news is, norms can be changed. Gawande provides the example of neonatal nurses in modern India who were in the habit of disregarding some widely acknowledged lifesaving practices. When no policy carrot or stick succeeded in changing the way these nurses operated, advocates on the ground managed, through extensive and empathetic outreach, to slowly change the problematic norms at those hospitals. That lesson applies to the broader aging space: when a norm or set of standards is holding back solutions, it's not enough to ask people to change their behavior; we have to change their motivations and expectations first. "Simple 'awareness' isn’t going to solve anything," Gawande writes. But it is possible to effect change by raising awareness and by clearly demonstrating how new approaches can benefit everyone involved: no one of any age complains that transportation systems are too safe, communities too vibrant, products too easy to use or services too convenient. And solutions like these satisfy just a few of the demands of an aging society. In the context of aging, "everyone" who can catalyze and benefit from change includes not only older adults and their families, but also businesses, policymakers and non-profits across markets. If parties such as these have the courage to challenge today’s norms, develop new markets and bring innovation, a better life tomorrow is within reach.
MIT AgeLab’s Lucas Yoquinto contributed to this article.
Image from Shutterstock.