Why Must Seeing Be Believing?

Can we pack the entire human race into Missouri, the “Show Me” state? We might as well try, because when it comes to making important decisions, we humans have a bad habit of not heeding warnings when we don’t like the consequences. Why do we acknowledge risks only after we experience the downside firsthand?

In business, government, and our own households, we often ignore warnings that could save us a lot of trouble. For example, I remember seeing a distraught homeowner in New Jersey on the news after Hurricane Sandy hit in 2012. “We didn’t know it would be this bad,” she said, standing in front of her wrecked house. Well, every weather forecast and alert from the local authorities said it would be that bad, and people like her were supposed to evacuate. But for some reason, she didn’t listen.

It was the same story on Wall Street before the financial crisis. A slew of economists, some of them very well known, made dire predictions about the unregulated derivatives markets, the ballooning housing market, and the enormous leverage in financial markets years before the crisis occurred. But the big banks went on incurring debts – both notional and real – and writing subprime loans, until the entire system crashed. The federal government didn’t exactly take the risks seriously either. Since 2000, it had refused even to monitor the biggest derivatives markets, let alone regulate them.

So why won’t we heed the warnings, even when they come from people who ought to know much better than we do? For some people, it may be a case of “I didn’t think it would happen to me,” a sort of exceptionalism or overoptimism that has each of us standing above everyone else. It’s a stance that oncologists are used to hearing when they tell lifetime smokers that they have lung cancer.

Another possibility is that people deliberately make decisions that their future selves will reject. Just like when you hit the snooze button on your alarm in the morning (why didn’t you just set it for later and sleep all the way through?), the decision you make at one point in your life may not be optimal in retrospect. Most people have at least heard about the risks of smoking, yet they smoke anyway because the future consequences seem so remote.

Perversely, some people may actually have an incentive to ignore warnings. On Wall Street, stopping subprime lending would have meant curtailing a profitable business. Rather than sounding the alarm, it was easier just to go along, knowing that no individual would take all the blame if every bank suffered the same fate. All the way up the banking hierarchies, continuing to fly by the seat of their pants could have been a calculated risk.

Indeed, for some companies, flying by the seat of their pants may be an integral part of their strategy. The car-sharing service Zipcar, for example, relies almost exclusively on its members to find problems with its cars. I know this because, as a Zipcar member, I drove a car that was in such bad shape it was later removed from the fleet. Fortunately, I didn’t have an accident, but what if someone else did? What if they had children in the car? There would be a lawsuit, unwelcome publicity, and the exposure of a potentially unsustainable business model. In fact, the business model is already there for all to see; it’s just that nothing has happened… yet.

There’s also the possibility that warnings aren’t being conveyed in ways that are easy to understand. Recently, Joseph Stromberg of Vox wrote that the world is completely unprepared for the possibility that an asteroid will strike our planet. It’s true that asteroid strikes don’t quite have the immediacy of a war in Iraq or health insurance costs, but we humans also have trouble gauging risks that are measured in thousands of miles and years.

For economists, these are problems worth solving. We don’t want people to make decisions that they will eventually regret. Yet the solutions may be more about psychology and communication. How can we make warnings more effective, so they’re almost as vivid as experiencing the bad outcomes themselves? How can we get people to take warnings seriously in a society that often distrusts scientists and experts?

Of course, we might not want to go overboard with our attempts to correct behavior. Humans’ unwillingness to believe that the worst could happen may also be a useful trait; it keeps us taking risks and doing extraordinary things. I just hope one of those extraordinary things won’t be surviving a direct hit by an asteroid or, for that matter, another global financial crisis.
