26 Years Later: What the Challenger Disaster Teaches Us
In his book Blind Spots, Professor Max Bazerman of Harvard Business School argues that the Challenger fiasco exploited inconsistencies in the decision-making mechanisms of the brain.
The space shuttle Challenger was destroyed on January 28th, 1986, seventy-three seconds after taking off from Cape Canaveral. The ship disintegrated in midair, sending six astronauts and the schoolteacher Christa McAuliffe plunging into the Atlantic Ocean. The tragedy transfixed America and ended NASA's golden years.
In the days and months that followed, the crash became shrouded in myth and rumor, and misperceptions persist to this day. Many Americans believe, for instance, that the shuttle exploded and that the astronauts were killed instantly. In fact, Challenger broke apart, and only sections of it were destroyed. The occupants of the crew cabin were still alive—though most likely unconscious—when they hit the water at roughly 200 miles an hour.
Many Americans claim to have watched the explosion live on television, but this too is untrue. Only one channel—CNN—was broadcasting the launch when the tragedy occurred; the major networks aired footage of the accident only on tape delay. Another common myth is that the Environmental Protection Agency banned a sealant that could have made Challenger safer. The list goes on and on.
Of all the inaccuracies about the disaster, perhaps the most dangerous is the idea that accidents of this kind are an unavoidable part of space exploration. Travelling to outer space is immensely complex, so the thinking goes, and something is bound to go wrong once in a while. But follow-up investigations found that the tragedy was not the result of a chaotic, low-frequency event; it was the result of an obvious oversight. Flight engineers should have noticed Challenger's mechanical flaws long before the shuttle took off.
Bazerman is an expert in "behavioral ethics," a field that seeks to explain how people react when faced with ethical dilemmas. He argues that NASA's leadership failed because they did not view the launch decision in ethical terms—as a question about the lives of the crew. Instead, they allowed political and managerial considerations to drive their decision-making.
What’s the Significance?
Historians—and journalists—tend to assume that people recognize an ethical dilemma when it is presented to them. When writing about tragedies like the Challenger disaster, we often imply that those who behaved immorally did so deliberately. Bazerman, however, argues that ethical lapses are usually unconscious. In his view, people's emotional needs can be so great that they drown out ethical considerations completely.
We are also prone to "groupthink," the tendency to favor unanimity over careful reasoning. As a result, we often behave immorally without even realizing it. This is why good people do bad things.
Luckily, people and the organizations that employ them are not slaves to human nature. Bazerman believes there are several steps leaders can take to ensure ethical decision-making among their employees. For instance, he tells executives to monitor the incentives and managerial structures they impose on employees, lest conflicts of interest emerge. Leaders should also pay close attention to data that might reveal their organization's biases—using hard numbers, for example, to confirm that their companies are hiring enough women and minorities, rather than relying on gut feeling.
In striving to improve our ethical decision-making, it also helps to be aware that thinking clearly is a lot more difficult in the heat of the moment. During the planning phase of a decision, we tend to rely on cool-headed rationality. When a crisis hits, however, this kind of thinking takes a back seat to powerful emotions—what Bazerman calls the “want” self. In Blind Spots, he writes that thinking through your likely emotional response to a situation beforehand can help you prepare for contingencies. “Thinking about your motivations at the time of a decision can help bring the ‘want’ self out of hiding during the planning stage and thus promote more accurate predictions,” he writes.