You bought tickets for the cinema. The film is awful. Do you stay until the end?
One of the well-known biases in human judgment and decision making is our tendency to refuse to give up a job, a relationship, a strategy, or a decision we made in the past because we have invested too much time, energy, effort, or money in it. It is known as the sunk cost bias because, as you know, sunk costs are investments we made in the past that cannot be recovered.
Anyone who has taken an introductory microeconomics course has probably learned that unrecoverable past investments should not be taken into account in future investment decisions.
And yet... there is this strong human tendency to want to escalate our commitment to whatever we have invested in the past.
This happens, for example, because we have invested money in a particular product line or because we have taken the time to mentor a person or visit a place.
It can even be as simple as buying cinema tickets for a film that turns out to be lame: most of us still feel we have to stay until the end, even though there would have been better ways to spend that time.
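To make the textbook point concrete, here is a minimal sketch in Python, with invented numbers that are not part of the story above. It shows why the ticket price cannot change the choice: it is sunk, so it appears in both branches and cancels out of the comparison.

```python
# Minimal sketch with made-up numbers: the ticket price is already paid,
# so it is identical in both branches and cannot change the ranking.

TICKET_PRICE = 12          # hypothetical amount, paid and unrecoverable
VALUE_OF_STAYING = 2       # enjoyment of finishing a film you dislike
VALUE_OF_LEAVING = 8       # value of spending the evening differently

def net_outcome(future_value: float, sunk_cost: float) -> float:
    """Total outcome = future value minus the already-paid sunk cost."""
    return future_value - sunk_cost

stay = net_outcome(VALUE_OF_STAYING, TICKET_PRICE)   # 2 - 12 = -10
leave = net_outcome(VALUE_OF_LEAVING, TICKET_PRICE)  # 8 - 12 = -4

# The sunk cost shifts both totals by the same amount, so only the
# future values decide: leaving beats staying regardless of the ticket.
print(f"stay: {stay}, leave: {leave}, better: {'leave' if leave > stay else 'stay'}")
```

Whatever numbers you plug in, the sunk cost drops out of the comparison; only the future values matter.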
So far so good.
But where the sunk cost bias starts to get really interesting is in the research showing that many professionals disagree and do not treat sunk costs as a bias at all. Instead, they consciously base their decisions on non-recoverable investments as well. Why?
Because any leader who backs out of something they have invested in will suffer a profound credibility hit. And a good reputation, along with the trust that comes with it, is very important.
So what’s the verdict?
You may have noticed that the different interpretations of sunk cost reflect different levels of analysis: first, the level of the organization, and second, the level of the individual decision-making leader.
A classic principal-agent problem.
Simply put, leaders are rewarded for sticking to a course of action even when it appears that the course will not succeed. They are rewarded by those around them, who feel they can continue to trust a person who does not break their word. In the long run, however, this does not work well for the organization as a whole, which needs to spend its resources wisely by weighing only the costs and benefits of future investments and deciding on that basis.
The result? On average, companies, and organizations in general, persist with unsuccessful projects longer than they should.
So, how should decision makers structure their environment to avoid this?
To combat this phenomenon and improve organizational decision-making, Harvard University researchers led by Professor Jennifer Lerner have designed specific strategies grounded in decision science and behavioral science.
One such strategy is:
Design, ahead of time, preset intervals for checking progress, and have fixed contingency plans (a rough sketch of such a pre-committed checkpoint follows the example below).
For example:
"Three months from now, we need to be at X level of performance and if we're not, we have a contingency plan for what we're going to do."
When we don't have those kinds of systems in place, we get to the three-month point and say to ourselves:
"You know what, we're not where we really hoped to be, but that's because we're learning so much and we really need to keep doing this just with more effort, or a little bit of improvement or ......."
Behind this strategy is a basic principle of decision-making, one that more and more companies are beginning to grasp as they focus on the decision-making process rather than on the outcome of the decision: a good decision is judged not by its outcome (which involves luck) but by the process that was followed.
It goes without saying that a good decision can also lead to failure. In the long run, however, a set of good decisions (i.e. decisions based on sound processes) will involve far fewer failures than decisions based on bad or non-existent processes.
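A small simulation makes this concrete. Assuming, purely for illustration, that a sound process gives each decision a 70% chance of success while a poor or non-existent process gives 45% (both numbers are made up), any single good decision can still fail, but across many decisions the gap becomes unmistakable.

```python
import random

random.seed(0)

# Assumed probabilities, for illustration only: a sound process does not
# guarantee success on any single decision, it just shifts the odds.
P_SUCCESS_GOOD_PROCESS = 0.70
P_SUCCESS_POOR_PROCESS = 0.45
N_DECISIONS = 1_000

def failures(p_success: float, n: int) -> int:
    """Count how many of n independent decisions end in failure."""
    return sum(random.random() >= p_success for _ in range(n))

good = failures(P_SUCCESS_GOOD_PROCESS, N_DECISIONS)
poor = failures(P_SUCCESS_POOR_PROCESS, N_DECISIONS)

# Any individual decision can go either way, but across many decisions
# the process-driven portfolio fails far less often.
print(f"failures with a good process: {good} / {N_DECISIONS}")
print(f"failures with a poor process: {poor} / {N_DECISIONS}")
```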
For example, some companies in the biotechnology sector reason along the following lines:
"A lot of money has been invested in these products. We cannot afford to spend one more day on something that is not going to perform as well as we want it to. Therefore, we will provide significant incentives to have an excellent analytical process that will evaluate at predetermined intervals whether the experiments are producing the results we expected. We will even reward this process by throwing a party for every product that fails." And so they started to organize failure parties, where they actually celebrate the good analytical process that supports good decision making.
This strategy carries a double benefit for leaders. Good analytics let them reach goals faster, essentially accelerating organizational learning; and by signaling their way of working from the start, they build trust with others in the organization and maintain their good reputation.
Complementary strategies are:
If you're hiring an external consultant, acknowledge that they too have a conflict of interest: they don't want to be the bearer of bad news ("say goodbye to your past investment"), especially if they want to be hired again. So they too will suffer from the sunk cost bias. In that case, consider not telling them what you have already invested, or tell them that this is a one-off engagement, in order to secure their objectivity.
Split your advisory team into two groups. Share past-investment information with one group so they can weigh reputation concerns; give the other group no details about past investments. This way, the uninformed group follows the rational approach of neoclassical economics, while the informed group considers social reputation within your organization. If the two groups offer different recommendations, it is your job as a leader to balance them.
Surround yourself with people who are willing to honestly communicate with authority figures, and create incentives for them to do so.
Question for you
How did you fail today? What did you learn from that?