Let’s start with a simple question: how many of each animal did Moses take into the ark? If you pounced with the answer “2,” you have fallen into the same trap as most people. (The answer is zero: it was Noah, not Moses, who took animals onto the ark.) Cognitive biases convince us we know things when we don’t, generate absurdly optimistic estimates of what we can achieve, and keep us stuck in bad relationships and bad jobs.
Here are three biases and some strategies for getting out of the traps they set.
1. The Sunk Cost Fallacy
Imagine you have a ticket to the movies for which you forked out 10 bucks, but you are attending with a friend who got hers for free. The weather turns sour and the theater is re-running The Dukes of Hazzard. Which one of you is more likely to cancel? If you say “my friend, duh,” you are trapped by the sunk-cost fallacy.
Your ten bucks is gone (assuming you can’t wangle a refund). You are out ten bucks whether you go or not, so the money should not affect your choice. What matters is the cost-benefit of braving the weather, and whether your movie features more interesting characters than Boss Hogg. (Unlikely. Still.)
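To see why the ten dollars should drop out of the decision, here is a minimal sketch in Python. The ticket price comes from the example above; the other numbers (how much you’d enjoy the movie, how much the storm costs you, how nice a night in would be) are invented purely for illustration.

```python
# Minimal sketch of the sunk-cost logic. The values for enjoyment,
# weather hassle, and a night in are invented; only their relative
# sizes matter.

TICKET_PRICE = 10          # already spent -- a sunk cost
ENJOYMENT_OF_MOVIE = 4     # what the evening at the theater is worth to you
HASSLE_OF_BAD_WEATHER = 7  # the cost of braving the storm
COZY_NIGHT_IN = 5          # the value of staying home instead

# The ten bucks appears in BOTH outcomes, so it cannot change which one wins.
go = -TICKET_PRICE + ENJOYMENT_OF_MOVIE - HASSLE_OF_BAD_WEATHER  # -13
stay = -TICKET_PRICE + COZY_NIGHT_IN                             # -5

print("Go to the movie:", go)
print("Stay home:", stay)
print("Better choice:", "stay home" if stay > go else "go")

# Strip the sunk cost out of both lines and the comparison is unchanged:
# (4 - 7) versus 5 -- staying home still wins.
```

Whatever numbers you plug in, the ticket price subtracts from both options equally, so it can never flip the comparison.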
The sunk-cost fallacy traps people in bad relationships and bad investments, and traps whole countries in destructive, no-win wars. (“We can’t withdraw because we have spent billions and people have given their lives.”) What matters is the future: whether you can turn the relationship around, or whether the next billion dollars and young lives will be squandered.
The sunk-cost fallacy is an example of a cognitive bias: a habitual, predictable way of thinking that leads to error. Wikipedia lists over 100 of them; it seems the amazing human brain has many hard-wired flaws.
Some of these flaws may once have conferred an evolutionary advantage. Whatever the exact conditions our ancestors faced, the hard-wiring of our brains has not changed quickly enough to keep up with the white heat of cultural and technological change of the last 5,000 years (a blink of an eye in genetic evolution).
Conquering the sunk-cost fallacy is tough. Who has not poured time and money into something and wished they hadn’t, only to pour more in on the next occasion? We like to self-justify, to believe that we made good decisions in the past; who likes to say “I was a fool then”? So we look for confirming evidence that things are going our way: “He stopped drinking for a week, and he had a job last year.”
One technique is to create an imaginary scenario. Imagine you parachuted into the (house, relationship, investment) for free, with nothing invested. What would you do then? If the answer is “run for the hills,” then you have your answer.
2. The Planning Fallacy
A second bias that causes enormous stress is the “planning fallacy.” Humans suck at estimating how long things will take. Partly we like to believe we are superhuman, but mostly we are deluded about how complex things get. As a writer, I’m constantly amazed that the last 5% of a project takes 30% of the time. The average overrun on big technology projects is 27%, and many really big ones overrun by 100% or more! When a group of students was asked to estimate how long a term paper would take, their “best case” guess averaged 29 days and their “worst case” (excrement hits the fan) guess was 48 days. They actually took an average of 55 days!
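One rough way to put those numbers to work (not a prescription from this article, just a common rule of thumb) is to compute an overrun ratio from past projects and multiply your own gut estimate by it. A minimal sketch, using the student figures above and an invented 14-day project of my own:

```python
# Minimal sketch: inflate a gut estimate by an observed overrun ratio.
# The 29- and 55-day figures are the student numbers quoted above;
# the 14-day "my next project" estimate is invented for illustration.

best_case_estimate = 29   # days the students predicted (best case)
actual_average = 55       # days the papers actually took

overrun_ratio = actual_average / best_case_estimate  # roughly 1.9x

my_gut_estimate = 14  # days I *think* my next project will take
realistic_estimate = my_gut_estimate * overrun_ratio

print(f"Overrun ratio: {overrun_ratio:.1f}x")
print(f"Gut estimate of {my_gut_estimate} days -> plan for about {realistic_estimate:.0f} days")
```

The exact multiplier will vary with the kind of work; the point is to anchor on what similar projects actually took, not on how this one feels today.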
How much stress and misery, I wonder, comes from people in offices saying “I can do it by Friday,” only to find that a couple more Fridays are required? We like to people-please and to look confident and competent, but we are simply incompetent at estimating how long things take!
3. Optimism Bias
Our final bias rears its head in conflict situations, where everyone is sure of their “facts” and confident in their predictions about how different actions will pan out. This family of biases means we take a rosy view of our own knowledge and a dim view of others’. Nobody is as right as they think they are.
Professor Philip Tetlock has studied expert predictions over a lifetime. He found that experts (real experts, not talk-radio experts) who were 100% sure of an outcome were wrong 25% of the time. Further, when they thought an outcome had “no chance,” it happened 15% of the time. And what percentage of people rate themselves as above-average listeners? 96%!
This “confidence without competence” is one reason conflicts run out of control. People who are dogmatically sure of themselves beget adversaries who become similarly dogmatic. The next time you are in a conflict, make a table with two columns: write the facts (as you see them) in one column and your opinions and conclusions in the other. Ask your adversary to do the same (nicely!). Check off the facts on which you agree, note where you disagree, and do some homework together on the disagreements.
The difficult part of resolving conflict lies in the area of opinions, interpretations, values and predictions, so you are only part of the way there. But going through the process of developing a shared set of facts will diminish the polarization, and allow you to get down to business.
Learning about our biases can help.
The sunk-cost fallacy keeps us stuck in a miserable past, throwing good time and money away after bad decisions. The planning fallacy creates tremendous stress as we struggle to meet unrealistic deadlines. Optimism biases make us feel sure of ourselves when we have no right to be, which leads us to prolong and exacerbate conflict.
We didn’t learn these things in school because they were not well understood, were not part of any college curriculum (unless behavioral economics gives you jollies), and were certainly far from mainstream understanding of how humans work.
Learning about our biases puts us back in the game. Like sharpshooters who correct for wind speed and direction, once we know our thinking is skewed in a particular direction we can correct for it, make better decisions, and get more of what we want in life.