
The Discovery-Delivery Gap: Why Implementation Does Not Proceed From Invention


The gap between invention and implementation is beset by a bias: when in doubt, we prefer the status quo, even when solutions to its deficiencies are apparent. Is it any wonder that it took Pasteur years to convince doctors and nurses throughout Europe that germs caused infectious diseases? This is especially true when we have a hand in establishing the status quo. B. F. Skinner remained a steadfast behaviorist until his death – well after the cognitive revolution had undermined many tenets of behaviorism. Think about phrenologists, phlogistonists, and alchemists; often, the beliefs that built a career are responsible for its demise.

There is another bias that widens the “discovery-delivery” gap: it’s difficult to see what an invention is good for. Just ask the folks who invented the Segway – they prophesied that the electric scooter would change cities, companies, and just about everything. Today, Segway sightings are few and far between, reserved for lethargic tourists and police officers. It hasn’t revolutionized a thing.

Or consider the story of Joseph Priestley and Horace Wells. Priestley was an English theologian and Unitarian humanist, but you know him from your high school chemistry textbook because he discovered oxygen in 1774.[1] His most practical finding, however, was nitrous oxide, known colloquially as laughing gas. Thanks to Priestley, trips to the dentist are now relatively painless, though the credit goes to an American named Horace Wells – the first dentist to use nitrous oxide in a public showing, when he demonstrated its effects at the Massachusetts General Hospital in Boston in 1845.[2]

What struck me about Wells’ story is how unimaginative dentists are. For nearly 75 years, not one dentist (or doctor) thought of using nitrous oxide as an anesthetic. Nor was the gas unavailable or unknown: it was a popular recreational drug among the British upper class in the early 19th century. Worse, the English chemist Humphry Davy outlined nitrous oxide’s potential for surgeons in a book published in 1800: “As nitrous oxide in its extensive operation appears capable of destroying physical pain, it may probably be used with advantage during surgical operations in which no great effusion of blood takes place.” Can you imagine how many people suffered who didn’t have to?

Nassim Taleb offers a similar example. For years, travelers hauled heavy luggage from one terminal to another. Then, in 1967, a Minnesotan named Jim Muellner established Smarte Carte, Inc., the company that manufactures those handy trolleys that let travelers wheel their luggage rather than schlep it. Muellner’s company is credited with a number of innovations in the self-service industry: dollar-bill acceptors, unattended credit card payments, and computerized electronic lockers. What’s remarkable is what Smarte Carte didn’t think of.

Two decades later, in 1987, a 747 pilot named Robert Plath invented the Rollaboard by affixing two small wheels and a long handle to a suitcase. Plath initially sold them to flight attendants, but eventually everyone wanted one. Today they are ubiquitous, raising the question: Why did we invent a cart with wheels before luggage with wheels?

It’s difficult to say which industry is the more unimaginative, dentistry or luggage. The former deserves more scorn for the number of people who needlessly suffered during surgeries between 1775 and the 1860s, but the luggage industry looks especially foolish considering that the wheel had been around for nearly six thousand years before someone thought to put it on luggage (and that someone was a pilot). Taleb’s point is that implementation does not necessarily proceed from invention – it requires luck and circumstance. I’d add another axiom: the longer a technology exists, the more difficult it is to implement in novel ways, not because no new applications exist (or because it’s inherently difficult to apply an old technology in a contemporary setting) but because in our modern eyes “old” technology is the antithesis of innovation.

This tells us something about technological progress; namely, it’s messier than we think. Ideas are forgotten, reapplied, and not recognized for what they are. Gregor Mendel founded genetics, but the profound implications of his research were not realized until years after his death.[3] No one knew what to do with the first computers[4] – it took someone like Steve Jobs to make them useful, to connect the mouse with a graphical interface and put the computer on your desk and in your lap. Would you ever guess that humanity would put wheels on luggage after putting a man on the moon?

There’s a final bias that widens the gap between discovery and delivery: we often struggle to deliver a discovery because we don’t know what we’ve discovered. Consider the story of penicillin, a serendipitous account in which the untidy habits of Sir Alexander Fleming gave rise to a growth of mold on a culture of staphylococci. The mold, it turned out, was a bacteria-killer, but Fleming did not initially know what he had. Even after publishing his findings in 1929, he doubted that penicillin, as he termed it, could kill pathogenic bacteria in the human body. It would not be mass-produced until the early 1940s. Such is the history of medicine: discovery is followed, much later, by implementation because it’s unclear what’s been discovered.

Think about evolution. Today we say that Darwin “discovered” evolution. It’s more accurate to say that, over many years, a few germinal hunches matured into a presentable theory. Darwin’s journals are littered with notes that capture the principles of natural selection, yet it took him nearly two decades to fully realize his theory. Even after Darwin published On the Origin of Species, he doubted some of his ideas.[5]

I’ll summarize: The gap between invention and implementation, what I’ve also termed the discovery-delivery gap, is plagued by one obvious bias: a commitment to the current state of affairs despite potential improvements, what psychologists term the status-quo bias. But there are two other, less apparent mistakes. Priestley illustrates the first: implementation does not necessarily follow invention because it’s difficult for the inventor to foresee the full potential of his creation. Fleming and Darwin demonstrate the second: it’s often impossible to know that you’ve made a discovery – after all, if you foresee a discovery, you’ve already made it. There is a third category: inventions with implementations pending future technology. Imagine if a Pleistocene hunter-gatherer were fooling around with some materials and by happenstance created the synthetic rubber now used to erase pencil marks. At any point before humans used graphite to draw or write, this invention would have been effectively useless. It required a technology that didn’t exist yet – the pencil.

A timeline of advances in science, technology, and medicine suggests a sequential view of history, in which each invention and innovation stems neatly from the one before it. This linearity is an illusion of the mind, not a fact of history. Unlike a narrative in fiction, the story of discovery is intermittent. It is, in fact, not a story at all but a collection of events and experiences that the mind streamlines into a coherent, easy-to-digest account. For the sake of a more accurate picture of creativity, let’s remember that invention and implementation are distinct relatives, not twins.

Post scriptum 

I omitted a fourth category: implementations before inventions, or deliveries before discoveries. Think about The Wealth of Nations. Economics 101 professors and intellectual historians teach Smith’s tome as if it caused the free market, created a middle class, and launched the Industrial Revolution. No germinal event marks the beginning of the Industrial Revolution, but it was well underway before Smith published The Wealth of Nations in 1776. Smith simply observed what was happening and translated it into theory. In hindsight we mistakenly believe that Smith’s book gave rise to the free market and claim that economics cannot function without the theory it outlines. In other words, we believe that since the theory preceded the practice (it didn’t), the practice requires the theory. This is a mistake.

Taleb terms this “Lecturing Birds on How to Fly.”

The bird flies. Wonderful confirmation! [Harvard professors] rush to the department of ornithology to write books, articles, and reports stating that the bird has obeyed them, an impeccable inference. The Harvard Department of Ornithology is now indispensable for bird flying. It will get government research funds for its contribution.

It also happens that birds write no such papers and books, conceivably because they are just birds, so we never get their side of the story. Meanwhile, the priests keep broadcasting theirs to the new generation of humans who are completely unaware of the conditions of the pre-Harvard lecturing days. Nobody discusses the possibility of the birds’ not needing lectures – and nobody has any incentive to look at the number of birds that fly without such help from the great scientific establishment. 

Another example: think about the literature on cognitive biases and heuristics. In the last several decades, psychologists have accumulated a long list of biases that plague rational thinking. The usual suspects are confirmation bias, framing, and the availability heuristic; I mentioned the status-quo bias in the first paragraph. Non-empirical descriptions of cognitive biases have existed at least since the Ancient Greeks; the only “news” is the empirical evidence. The problem is believing that 1) nobody knew about cognitive biases before empirical accounts existed and 2) we cannot think more rationally (or at least compensate for our biases) without an empirical account of them. In this example, delivery (i.e., Ancient Greek philosophy) preceded discovery (Kahneman and Tversky), while judgment and decision-making enthusiasts, confusing the arrow of causation, conclude: “We can improve rational thinking now that we’ve discovered biases!”

image via Triff/Shutterstock


[1] Well, that’s not exactly true. The Swedish chemist Carl Wilhelm Scheele discovered oxygen two years earlier, in 1772, but didn’t publish his findings until 1777 – two years after Priestley published Experiments and Observations on Different Kinds of Air, which expounds his 1774 discovery. Priestley simply beat his European counterpart to the punch on paper.

[2] Unfortunately, the event was deemed a “humbug affair” (the gas bag was improperly administered), and the potential of nitrous oxide as an anesthetic was ignored for many years. History would treat Wells kindly, though. The American Dental Association posthumously recognized him as the discoverer of anesthesia in 1864 – 16 years after he committed suicide in prison. I should say that it’s debatable who deserves credit for anesthetics. See here for more.

[3] For instance, it explained the variation upon which natural selection acted.

[4] Thomas Watson’s “There is a world market for maybe five computers” is hyperbole but indicative.

[5] For example, for the theory of evolution to work, the Earth needed to be billions of years old, which it is. At the time, however, the best guess was about 100 million years. It wasn’t until the 1920s and ’30s that geologists discovered the Earth was about 4 billion years old. Not knowing about plate tectonics also complicated some of Darwin’s ideas.
