When I planned my first road trip from Bangalore to Goa, I calculated that the distance (about 560 km) should take a little more than 9 hours. Factoring in stopovers and a few unexpected events like a flat tyre or traffic, I assumed that 12-15 hours should be sufficient for the road trip. It took 15 hours.
“Good job, Anshul!” I patted myself on the back. Not a bad estimate.
Now, based on this, if I had to forecast the time it would take to cover a distance of, say, 5,000 km, a road trip covering the major cities of India, I might be tempted to extrapolate from the Bangalore-Goa trip time. I'd probably reason that since 560 km took one day, 5,000 km should take about 10 days, plus 2-3 more days as a buffer.
Am I being reasonable in my estimation?
What I am forgetting here is that the second road trip is not only longer but also more complex and subject to many more unforeseen events. My estimate is fraught with over-optimism. And I am not alone in making this kind of mistake.
There are many ways a plan can fail, and most of them are too improbable to be anticipated. The likelihood that something will go wrong, especially in a big project, is high. Overly optimistic forecasts of project outcomes are found everywhere.
In fact, how often are you able to complete everything on your to-do list by the end of the day? This shows how absurdly ambitious we are in our planning.
This bias, in which predictions of how much time will be needed to complete a future task display an optimism bias (they underestimate the time needed), is called the planning fallacy. The term was coined by Nobel laureate Daniel Kahneman and his colleague Amos Tversky.
The planning fallacy happens when plans and forecasts are unrealistically close to the best-case scenario and ignore the statistics of similar cases. In his book Seeking Wisdom, Peter Bevelin explains –
We react to stimuli that we personally encounter or that grabs our attention. We react more strongly to the concrete and specific than to the abstract. We overweigh personal experiences over vicarious. We see only what we have names for. We tend to focus only on the present information rather than what information may potentially be missing. For example, when planning, we often place too much importance on the specific future event and not enough on other possible events and their consequences that can cause the event to be delayed or not happen.
The planning fallacy results not only in time overruns, but also in cost overruns and benefit shortfalls.
Examples of the planning fallacy abound, says Kahneman, “in the experiences of individuals, governments, and businesses.” Here is a sample list of a few such horror stories –
- In July 1997, the proposed new Scottish Parliament building in Edinburgh was estimated to cost up to £40 million. By June 1999, the budget for the building was £109 million. In April 2000, legislators imposed a £195 million “cap on costs.” By November 2001, they demanded an estimate of “final cost,” which was set at £241 million. That estimated final cost rose twice in 2002, ending the year at £294.6 million. It rose three times more in 2003, reaching £375.8 million by June. The building was finally completed in 2004 at an ultimate cost of roughly £431 million.
- The conch-shaped Sydney Opera House was planned in 1957 and was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million – 14 times the original estimate.
- The Denver International Airport opened sixteen months later than scheduled with a total cost of $4.8 billion, over $2 billion more than expected.
- In fact, I don’t have to go that far to dig out examples of projects overshooting their deadlines and estimated costs by a huge margin. The Bangalore Metro project was originally scheduled to begin operations in March 2010. Deadlines for completion were repeatedly missed, and the metro was finally opened to the public (with only a short stretch of track) in October 2011.
So what causes people to succumb to the planning fallacy?
The inside view, according to Kahneman and Tversky, is the culprit here. The inside view is the one that all of us spontaneously adopt to assess the future of our projects. We focus on our specific circumstances and search for evidence in our own experiences.
When we have information about an individual case, we rarely feel the need to know the statistics of the class to which the case belongs. We try to forecast based only on the information in front of us. We are unable to foresee the succession of events that could cause the task or project to drag on.
In other words, we conveniently forget about the unknown unknowns. Rolf Dobelli, the author of The Art of Thinking Clearly, writes –
…we focus too much on the project and overlook outside influences. Unexpected events too often scupper our plans. This is true for daily schedules, too: your daughter swallows a fish bone. Your car battery gives up the ghost. An offer for a house lands on your desk and must be discussed urgently. There goes the plan. If you planned things even more minutely, would that be a solution? No, step-by-step preparation amplifies the planning fallacy. It narrows your focus even more and thus distracts you even more from anticipating the unexpected.
Most people overrate their own abilities and exaggerate their capacity to shape the future, writes David Brooks. “That’s fine. Optimistic people rise in this world. The problem comes when these optimists don’t look at themselves objectively from the outside. The planning fallacy is failing to think realistically about where you fit in the distribution of people like you.”
Overcoming Planning Fallacy
Taking an outside view, Kahneman reports, is a way to break the illusion created by the planning fallacy.
To take the outside view is to shift your attention away from your specific task or project and toward a class of similar cases. Look at the success and failure statistics of similar projects in the past.
If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation. If you find that a 5,000 km road trip took other people 30 days (the base rate), what makes you think yours will be over in 13 days?
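To make the outside view concrete, here is a minimal sketch in Python. The reference-class durations are made-up numbers, purely for illustration; the point is the contrast between extrapolating from my own single trip and anchoring on the base rate.

```python
# A minimal sketch of the outside view: instead of extrapolating from my one
# Bangalore-Goa trip, anchor the forecast on how similar trips actually went.
# The reference-class durations below are hypothetical, purely for illustration.

import statistics

# Inside view: scale up my own 560 km / 1 day experience and add a small buffer.
my_trip_km, my_trip_days = 560, 1
planned_km = 5000
inside_view_days = planned_km / my_trip_km * my_trip_days + 3  # roughly 12 days

# Outside view: how long comparable 5,000 km road trips actually took (in days).
reference_class_days = [24, 28, 30, 35, 40]
base_rate_days = statistics.median(reference_class_days)

print(f"Inside-view estimate  : {inside_view_days:.0f} days")
print(f"Outside-view base rate: {base_rate_days:.0f} days")
```

Unless I can point to something that genuinely sets my trip apart from the reference class, the base rate is the better starting point for the forecast.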
More than individuals, the planning fallacy plagues large organizations. Overcoming it requires a check on overconfident optimism.
Gary Klein, a noted psychologist and author of Seeing What Others Don’t, proposes a ‘pre-mortem’ session as a cure for this bias in organizations. When an organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering, for a brief session, a group of individuals who are knowledgeable about the decision.
The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”
The pre-mortem helps overcome the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction. The pre-mortem turns our attention towards what can go wrong. Peter Bevelin sums it up well –
Humans make mistakes, equipment fails, technologies don’t work as planned, unrealistic expectations, biases including sunk cost-syndrome, inexperience, wrong incentives, contractor failure, untested technology, delays, wrong deliveries, changing requirements, random events, ignoring early warning signals are reasons for delays, cost overruns and mistakes.
As with other behavioural biases, just knowing about the planning fallacy doesn’t make one immune to it. Despite attempts to combat it, the planning fallacy persists. Other cognitive errors, such as commitment and consistency bias and the sunk cost fallacy, make its force very hard to counter.
That’s why it’s crucial that the pre-mortem session happens at the very beginning of the project. Once a group of people has committed their time to a project, they find it hard to abandon it even if the pre-mortem points to the infeasibility of the task.
If the process of planning is plagued with so many problems, what’s the utility of making a plan in the first place?
Planning has its utility, but it’s not what most people think it is. Plans are useful in the sense that they are proof that planning has taken place. The planning process forces people to think through the right issues. But as for the plans themselves, Mike Tyson said it best: “Everybody has a plan until they get punched in the face.”
In Investing
When the management of a company shows too much confidence about its future plans and over-optimism about its own ability to execute on those plans, you have strong reason to suspect that the planning fallacy is at play.
A pragmatic management would recognize that most businesses operate in a complex environment and that a host of factors affects short-term earnings. Giving precise earnings guidance without acknowledging the possibility of missing the target is a red flag. The presence or absence of the planning fallacy should give you an important clue about the ability and integrity of management while analysing a company.
Similarly, while building your investment thesis, especially when estimating the value of a business, account for the possibility of things going wrong. That means demanding a sufficient margin of safety to avoid a permanent loss of capital.
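As a rough sketch of that idea (the intrinsic-value figure and the 40% discount below are illustrative assumptions, not recommendations), the margin of safety simply caps the price you are willing to pay well below your own, probably too optimistic, estimate of value:

```python
# A rough sketch of the margin-of-safety idea: since the valuation itself is
# likely infected by over-optimism, act only when the price sits well below it.
# The intrinsic-value estimate and the 40% margin are illustrative assumptions.

estimated_intrinsic_value = 1000.0  # per-share estimate, probably too rosy
margin_of_safety = 0.40             # demand a 40% discount to that estimate

max_buy_price = estimated_intrinsic_value * (1 - margin_of_safety)
print(f"Act only below {max_buy_price:.0f} per share")  # prints 600
```

The wider the margin, the more room there is for the plan, and the valuation behind it, to be wrong.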
This is the reason Buffett preferred businesses that were not subject to much change. The absence of change in a business minimizes the probability of things diverging from the plan.
Conclusion
Most business executives, while forecasting the outcomes of risky projects, are vulnerable to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities.
Overestimating benefits and underestimating costs is the surest sign of the planning fallacy. Under its spell, individuals and organizations imagine scenarios of success, forgetting to make room for the possibility of mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time, or to deliver the expected returns, or even to be completed.
In this view, people often take on risky projects because they are overly optimistic about the odds they face. To deal with the planning fallacy, decision makers need a realistic assessment of the costs and benefits of a proposal before making the final decision to approve it.
Benjamin Franklin aptly said, “He that builds before he counts the cost, acts foolishly; And he that counts before he builds, finds he did not count wisely.”
If you are getting into a boat, you would probably want to know about any holes in it before you start paddling. Right?
Biases are such holes in our reasoning abilities, and they can impair our decision making.
Now, simply noticing these holes isn’t enough. A boat will fill with water whether you are aware of a hole or not. But if you are aware of the holes, you can devise methods to patch them up.
In the same way, if you know how your biases (or faulty brain wiring) can hurt you, you will take precautionary action to safeguard yourself from them.
Take care and keep learning.
PS: For more insights read Richard Zeckhauser’s report Investing in the Unknown and Unknowable.
Sanant says
Great job in this space! The quote was by Mike Tyson, not Muhammad Ali.
Anshul Khare says
Thanks Sanant.
Vishal says
Insightful post Anshul. Along with the Bengaluru Metro, the Bandra Worli Sea Link is also an example of the planning fallacy. It took 2.5-3X more time than estimated, and the final cost went up by almost 400%.
Overconfidence is probably why the term ‘low-hanging fruit’ gets thrown around generously and incorrectly. Accounting for unknown unknowns and allocating extra time and resources, and contingency planning, are important.
I also think it’s important to know who we include in the pre-mortem phase. Many people want the status quo to be retained and believe what they’ve done is the textbook version. But it’s the outsiders who disrupt an industry. How do you think we should identify people to refer to during pre-mortem?
Anshul Khare says
Thanks Vishal.
For identifying people for pre-mortem, Klein proposes gathering a group of individuals who are knowledgeable about the decision. I guess one should also include people who don’t have incentives aligned to success of the project. Do read about the “tenth man rule”.
Vishal says
Thanks Anshul. Having people with no vested interests is an interesting insight.
Will read about the 10th man rule.