Julia Galef is a New York-based writer and public speaker specializing in science, rationality, and design. She serves on the board of directors of the New York City Skeptics, co-hosts their official podcast, Rationally Speaking, and co-writes the blog Rationally Speaking along with philosopher of science Massimo Pigliucci. She has moderated panel discussions at The Amazing Meeting and the Northeast Conference on Science and Skepticism, and gives frequent public lectures to organizations including the Center for Inquiry and the Secular Student Alliance. Julia received her B.A. in statistics from Columbia in 2005.
Julia Galef: In a nutshell, we take the most useful research from cognitive science about how the human brain reasons and makes decisions, and about the errors it tends to make along the way, and we turn that research into workshops. People can apply what they learn to their own lives and improve their own decision making about their health, their finances, and their relationships, as well as the decisions they make for society and the world in general: how to vote, how to treat other people, and what they can do to improve the world.
One example of rationality in action, just to give you a sense of what it looks like and how it's relevant: back in 1985 Intel had a large stake in the memory chip manufacturing business, and they'd been losing money on memory chips for years. So the company's leaders, Andy Grove and Gordon Moore, met to figure out what to do, and at one point Andy asked, "What do you think a new CEO would do if the board kicked us out and brought one in?" Without hesitating, Gordon replied, "Oh, he would get out of the memory business." And Andy said, "Well, is there any reason we shouldn't just walk out the door, come back in, and get out of the memory business ourselves?"
And in fact that's exactly what they decided to do, and it was a huge success. This is just one example of a cognitive bias that appears in lots of contexts and domains, called the commitment effect: we stick with a business plan, a career, or a relationship long after it has become quite clear that it's not doing anything for us, or that it's actively destructive, because we have an irrational commitment to whatever we've been doing for a while. We don't like the idea of our past investment having gone to waste, or it's become part of our identity.
And the technique that Andy and Gordon used to snap themselves out of the commitment effect is also a generally useful one: looking at a problem as if you were an outside party. Over the past few decades, cognitive scientists have learned a lot about this and many other biases that human brains are subject to when we try to make decisions. Fortunately, cognitive science has also learned a lot about what we can do to improve. So at the Center for Applied Rationality we're taking that research, teaching people about the biases and when we're vulnerable to them, and then teaching simple mental habits, like looking at a problem as if you're an outsider, to overcome those biases.
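The outsider test described above can be sketched in a few lines of code. This is an illustrative sketch, not CFAR's actual curriculum, and the Intel numbers below are made up: the point is simply that a fresh decision-maker ignores sunk costs and compares options only on expected future value.

```python
# Minimal sketch of the "outsider view" applied to the commitment effect.
# A new CEO has no attachment to past spending, so the decision rule
# considers only each option's expected *future* value.

def outsider_choice(options):
    """Pick the option with the highest expected future value.

    `options` maps an option name to (sunk_cost, expected_future_value).
    The sunk cost is deliberately ignored: money already spent is gone
    either way, so only future prospects should drive the choice.
    """
    return max(options, key=lambda name: options[name][1])

# Hypothetical numbers, loosely inspired by Intel's 1985 dilemma:
options = {
    "stay in memory chips": (300, -50),   # large sunk cost, losing money
    "switch businesses": (0, 80),         # no sunk cost, positive outlook
}
print(outsider_choice(options))  # -> switch businesses
```

The insider feels the 300 already invested; the outsider never sees it, which is exactly why the question "what would a new CEO do?" works.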
Rationality is also a significant public good, and that's one of the main motivations behind the founding of CFAR. Society would look very different if rational thinking and decision making were widespread. Just to name a few of many possible examples: we as a society would demand evidence from politicians for the claims they made, we would notice when politicians were misdirecting us by playing on our emotions, and we would be less vulnerable to prejudice and to stereotypes because we would be wary of confirmation bias. That is a nearly universal bias in which you look for examples that fit a stereotype but not for examples that don't, so you end up confirming and reinforcing the stereotype in your mind.
We would also spend our money much more effectively as a society to stave off important risks. The fact that things like terrorism and crimes like abduction are so vividly portrayed on the news makes them much more salient, and so we overweight those risks the same way we overweight any kind of evidence that is particularly vivid or salient, even when addressing them isn't actually the best bang for our buck in terms of risk reduction.
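The "bang for our buck" comparison above is just an expected-value calculation. Here is a sketch with entirely hypothetical numbers (none of these figures come from the talk or from real data), showing how a vivid risk can lose badly to a mundane one once you divide expected lives saved by cost.

```python
# Illustrative only: comparing risk-reduction spending by expected
# lives saved per dollar, rather than by how vivid a risk feels.

def lives_saved_per_dollar(annual_deaths, fraction_preventable, cost):
    """Expected lives saved per dollar for a proposed intervention."""
    return annual_deaths * fraction_preventable / cost

# All figures below are made up for illustration.
vivid_risk = lives_saved_per_dollar(annual_deaths=100,
                                    fraction_preventable=0.5,
                                    cost=1_000_000_000)
mundane_risk = lives_saved_per_dollar(annual_deaths=40_000,
                                      fraction_preventable=0.1,
                                      cost=100_000_000)

# The vivid risk dominates the headlines, but with these numbers the
# mundane one saves roughly a thousand times more lives per dollar.
print(mundane_risk > vivid_risk)  # -> True
```

The salience bias described in the paragraph amounts to letting the vividness of `annual_deaths`'s coverage, rather than the ratio itself, drive the spending decision.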
Directed / Produced by
Jonathan Fowler & Elizabeth Rodd
One widely useful mental habit that we teach in our classes at the Center for Applied Rationality is called reference class forecasting, and for the most part, in the literature, it's been tested with reference...
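The transcript cuts off here, but the core of reference class forecasting can be sketched: instead of predicting from an optimistic inside view, you forecast by reading the distribution of outcomes for similar past cases. The data and function below are hypothetical illustrations, not CFAR's exercise.

```python
# Minimal sketch of reference class forecasting: estimate an outcome
# from the observed distribution of similar past cases.

def reference_class_forecast(past_outcomes, percentile=0.5):
    """Forecast by reading a percentile off outcomes of similar past cases."""
    ordered = sorted(past_outcomes)
    index = min(int(percentile * len(ordered)), len(ordered) - 1)
    return ordered[index]

# Hypothetical data: weeks that ten comparable past projects actually took.
past_project_weeks = [6, 8, 9, 10, 10, 12, 13, 15, 18, 24]

median_forecast = reference_class_forecast(past_project_weeks, 0.5)
pessimistic = reference_class_forecast(past_project_weeks, 0.9)
print(median_forecast, pessimistic)  # -> 12 24
```

The inside view ("my project is special, it'll take six weeks") ignores the reference class; the forecast above is anchored to what projects like yours actually did.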