The crisis in economic forecasting

First they failed to predict the great financial crisis. Then they were wrong about the short-term impact of the Brexit vote. Do economists need to go back to school? Simon Wilson reports.

What’s happened?

“The only function of economic forecasting is to make astrology look respectable.” Canadian-born economist John Kenneth Galbraith’s pithy assessment of his own profession has rarely seemed more apposite, according to the dismal science’s growing chorus of detractors.

Last month, no less a figure than Bank of England chief economist Andy Haldane issued an unusual mea culpa, admitting that forecasting failures – particularly the failure to predict the financial crash of 2008, and mistaken warnings of immediate catastrophic consequences if the UK voted to leave the European Union – meant the economics profession was “to some degree in crisis”.

Haldane characterised the failure to anticipate the 2008 crash as a “Michael Fish moment” for economics – a reference to the TV weatherman’s infamous assurance in October 1987 that southern England would not be hit by a hurricane, just hours before the storm struck.

So economics is in crisis?

As a general caveat, it’s important to remember that economics is far more than macroeconomics, forecasting and finance. Macroeconomics – perhaps the most public face of economics – is the study of how the economy behaves in aggregate; it involves analysis, theorising and prediction about economy-wide phenomena such as inflation, price levels, rates of growth, national income, gross domestic product (GDP) and changes in unemployment.

However, says Oxford professor Simon Wren-Lewis on his blog, if you look at a university economics department, then typically less than a fifth of the staff are macroeconomists, and in some departments there might be just one. Instead, most economists, such as those working on labour economics, experimental economics, behavioural economics, and so on, “would not have felt their sub-discipline was remotely challenged” by the 2008 crisis.

But macro is in the dock?

It’s certainly in the spotlight. Economic output represents the aggregated activity of millions, or billions, of people, influenced by forces seen and unseen. You could argue that, given the variables involved, it would be a marvel if forecasters ever got anything right. The trouble is, the evidence suggests that – when it really matters (ie, when recessions are coming down the track) – they don’t.

What’s the evidence?

In 2001 Prakash Loungani, an economist at the International Monetary Fund (IMF), published a study in the International Journal of Forecasting examining the accuracy of economic forecasts – both public sector and private sector – throughout the 1990s. He concluded that “the record of failure to predict recessions is virtually unblemished”. As Tim Harford notes in the FT, in 2014 Loungani repeated the exercise with a colleague, Hites Ahir. Staggeringly, they found that even in September 2008 – well after the demise of both Bear Stearns and Northern Rock, and the same month that Lehman Brothers collapsed – the consensus among economists “remained that not a single economy would fall into recession in 2009”. In reality, 49 of the 77 countries under consideration fell into recession that year.

A large part of the problem seems to be that forecasts are largely backward looking. A Harvard study by Lant Pritchett and Larry Summers, cited by The Economist, found that forecasters tended to assume that fast-growing countries would keep speeding along, while the economic tortoises would continue to crawl. In fact, the opposite is broadly true: “regression to the mean is perhaps the single most robust and empirically relevant fact about cross-national growth rates”.

What’s the reaction from economists? 

Some seem to think that we shouldn’t take forecasts so seriously. Haldane’s mea culpa on behalf of the profession clearly rankled with former Bank colleague David Miles, for example, who wrote a (friendly) demolition job in the Financial Times. If existing economic theory told us that events such as the financial crisis should be predictable, then “maybe there is a crisis”, he argues. But “economics says no such thing.

“In fact, to the extent that economics says anything about the timing of such events it is that they are virtually impossible to predict; impossible to predict, but most definitely not impossible events. Any criticism of ‘economics’ that rests on its failure to predict the crisis is no more plausible than the idea that statistical theory needs to be rewritten because mathematicians have a poor record at predicting winning lottery ticket numbers.”

So why forecast at all?

Good question. As the plethora of forecasts rolled out to back or oppose Brexit demonstrates, economics, as a social science, can readily be skewed to serve political and business interests. Meanwhile, ever more complex mathematical models and jargon have given the discipline an aura of false precision. Yet, while we cannot know the future, that doesn’t mean we shouldn’t try to plan for it. But perhaps our methods need to change.

Is there a better way?

Forecasting isn’t just a struggle for economists – if 2016’s political shocks taught us anything, it’s that forecasting is hard, says Tim Harford on his Undercover Economist blog. This is nothing new – in 2005, a study by psychologist Philip Tetlock found that across the 1980s and 1990s, expert “political and geopolitical forecasts had been scarcely better than guesswork.”

However, Tetlock has also found that our forecasting skills can be improved. In Superforecasting, written with Dan Gardner, he explains how. One key characteristic of successful forecasters is that they “aren’t afraid to change their minds, are happy to seek out conflicting views and are comfortable with the notion that fresh evidence might force them to abandon an old view of the world”. Given the defensive, political nature of academia, perhaps it’s no surprise that economics has found it hard to adapt post-2008.

