
Making sense of uneconomic major projects: the psychology of planning

Posted on April 12, 2014 · Posted in Blog

The East West Link appears to be a classic example of a major project driven by politics rather than planning. While some of the political angles are being explored, we also need to ask how such projects are helped along by planners and economists who should know better.

Many major projects over the last few decades have resulted in massive cost overruns and often minimal benefits. While we have more data and more computing power than ever before, the accuracy of project evaluation seems not to have advanced at all. The psychologists Daniel Kahneman and Amos Tversky described this recurrence of errors in planning and decision-making as the “planning fallacy”.[1]

Kahneman and Tversky found that planners tend to be optimistic about their own projects, a tendency known as “optimism bias”. To compensate for this misjudgement, planners should take the “outside view”: examining similar projects completed in the past, a method known as “reference class forecasting”.

Bent Flyvbjerg applied this forecasting method to practical planning and decision-making. He emphasises that three steps are required when undertaking reference class forecasting:

First, a sufficient number of similar completed projects must be identified. Second, a probability distribution of their outcomes must be established. Finally, the current project must be compared with the reference class distribution. Ideally this results in forecasts based on sound assumptions rather than wishful thinking.
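The three steps above can be sketched in a few lines of code. Everything below is invented for illustration: the overrun ratios are hypothetical, not drawn from any real reference class, and the 80 per cent confidence target is an arbitrary choice.

```python
# Reference class forecasting, sketched with hypothetical data.

# Step 1: identify similar completed projects and record their
# actual-cost / forecast-cost ratios (all figures invented).
overrun_ratios = [1.05, 1.20, 1.45, 1.10, 1.80, 1.35, 1.25, 1.60, 1.15, 1.50]

# Step 2: establish the probability distribution of overruns
# (here, simply the sorted empirical distribution).
overrun_ratios.sort()

def uplift_at(percentile):
    """Return the overrun ratio not exceeded at the given percentile."""
    index = min(int(percentile * len(overrun_ratios)), len(overrun_ratios) - 1)
    return overrun_ratios[index]

# Step 3: compare the current project with the reference class.
# To have roughly 80% confidence of staying within budget, uplift the
# planner's own "inside view" estimate by the 80th-percentile ratio.
inside_view_estimate = 650_000
uplift = uplift_at(0.80)
outside_view_estimate = inside_view_estimate * uplift
print(f"Uplift factor: {uplift}, adjusted budget: ${outside_view_estimate:,.0f}")
```

The point of the uplift is that the budget is no longer set by the planner's own optimism but by how comparable projects actually turned out.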

There seems to be plenty of positive thinking out there. Nine out of ten transport projects not only fail to meet their benefit targets but also run over budget. Transport isn’t unique: museums, exhibition halls, aerospace projects, dams, sports arenas and oil and gas extraction projects are frequently affected by the same problems. Flyvbjerg states that the list of cost overruns and benefit shortfalls seems endless. The Transbay Transit Center in San Francisco, the Skytrain in Bangkok, the Sydney Opera House, the Scottish Parliament, the Berlin airport, the Millennium Dome in London, the 2004 Olympics in Greece and the Eurofighter aircraft are just a few projects characterised by significant miscalculations; and this is by no means an exhaustive list.

Miscalculations also appear on a smaller scale: the installation of lights on Castle Hill in Queensland, for example, was initially estimated to cost $650,000 but is now reported to cost $900,000, an overrun of almost 40 per cent. When asked why the initial figure fell short, the Member for Herbert, Mr Jones, admitted that “when we made a promise for $650,000 we did it in good faith”; a prime example of optimism bias.

Optimism bias is generally unintentional; planners genuinely have the best intentions. It is a different matter when planners deliberately overestimate benefits and underestimate costs, which Flyvbjerg calls “strategic misrepresentation”. The negative results are the same as for optimism bias, but the causes, and the potential cure, are different. Strategic misrepresentation typically arises from political-economic pressures: planners seeking approval for a project compete with other projects for funding, and every project looks more favourable if its costs are low and its benefits are high.

Every planner, economist and engineer is aware of this, so the incentive to generate misleading figures is enormous. In fact, lying is economically rational for planners: they are rewarded when a project is approved, even if the project is not economically worthwhile from the public’s point of view. Strategic misrepresentation results in “a negated Darwinism, with survival of the unfittest”, where the projects that look good on paper, but not in reality, get built.

Although it is hard to find evidence that planners provided misleading figures intentionally, Martin Wachs conducted a small number of interviews with planners in 1990 and published his findings. He points out that erroneous forecasts were produced by planners, engineers and economists who admitted they were “cooking” their calculations in order to get their projects started. The number of such cases in Wachs’ study was not very high, but Flyvbjerg’s findings support Wachs’ claim that lying is commonly used to get projects started.

Unfortunately, reference class forecasting alone would not prevent this type of information asymmetry, since planners with the intent to defraud are not interested in obtaining correct and realistic calculations. Flyvbjerg therefore recommends a set of accountability measures to ensure that forecasts are not defective: capping discretionary grants, establishing independent peer reviews, making project information available to the public, permitting criticism from stakeholders, and halting miscalculated projects. In addition, he argues that projects with realistic estimates should be rewarded, and those who consistently generate inaccurate forecasts penalised.

In the so-called age of big data, more and more data is available and easily accessible. Planners, economists and engineers can obtain data far more easily than 10 or 20 years ago in order to build their own reference classes. With the right amount of data and the correct set of tools, there are certainly fewer excuses for flawed economic forecasts. Political-economic pressures and optimism bias will not disappear overnight, but, as this article shows, reference class forecasting together with increased transparency and accountability are useful measures for overcoming overly optimistic and deceptive economic analysis, in Australia and anywhere else in the world.



[1] The planning fallacy was part of the development of prospect theory, for which Kahneman received the Nobel Memorial Prize in economics in 2002; Tversky died in 1996 at the age of 59.