Getting assumptions right in climate modelling

A recent paper by the economist Steve Keen has been causing a storm on Twitter. Titled ‘The appallingly bad neoclassical economics of climate change’, it takes aim at the small-scale ‘Integrated Assessment Models’ (IAMs) that have in the past been used to calculate ‘optimal’ rates of global warming and carbon prices.

The models are not much good and never have been. I gave my own critique in a paper to the UK’s Post-Keynesian Economics Society in 2019. Fortunately, now that most of the world has signed up to the Paris Climate Agreement, the raison d’être of these models, estimating ‘optimal’ rates of global warming, has gone (not that they could do it anyway). The world now needs to know how to get to net-zero emissions, not the optimal amount decided by an economist.

Steve Keen’s critique focuses largely on the choice of damage functions used in the models, particularly Nordhaus’s DICE model. He describes an almost surreal, and at times circular, process by which the models are parameterised.

Notably, the models assume that 87% of the economy will be unaffected by climate change because it takes place in ‘carefully controlled environments’. They also assume that today’s cross-sectional climate and GDP data, generated with no change in the solar energy retained by the biosphere, can be used to predict what will happen when large amounts of additional energy are retained and the climate is transformed.
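For readers who haven’t seen inside these models, it is worth showing how simple the machinery under attack is. In recent versions of DICE, damages are a quadratic function of the rise in global mean temperature T, and net output is gross output scaled down by that fraction; schematically (my notation, not Nordhaus’s exact symbols):

    D(T) = a_1 T + a_2 T^2, \qquad Y_{\text{net}} = \bigl(1 - D(T)\bigr)\, Y_{\text{gross}}

With a_1 set to zero and a_2 calibrated at roughly 0.00236, even 6°C of warming (a level many scientists regard as catastrophic) reduces output by only about 8.5% of GDP. The trouble Keen identifies lies in how such coefficients are chosen, not in the algebra.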

Bad economics

Given the title of the paper and Steve’s previous work on ‘debunking’ neoclassical economics, I was expecting to see criticism of the standard economic assumptions that underpin the IAMs (including the newer generation now used for the IPCC assessment reports).

Let’s be clear though: such criticisms are absent from the paper only for reasons of space. The usual suspects (perfect markets, social planners, rational behaviour based on perfect knowledge, etc.) are all in the models.

This left me wondering – was the title of the paper accurate? Most of the critique relates to the parameterisation of the models rather than to anything directly descended from neoclassical economics. So why mention neoclassical economics explicitly?

The reason (which, ironically, will be missed by neoclassical economists) is that institutions are important. The paper provides a great example of such an institution: the publishing arrangements that allowed neoclassical economists to gain a stranglehold over the field.

Neoclassical economics is itself an institution. It is a form of economics in which theoretical purity is preferred over empirical relevance. It is the form of economics that gives economists a bad reputation for making unrealistic assumptions. It is the form of economics that decided it was superior to all the other social sciences.

In short, neoclassical economics provided the foundations for a discipline in which you could make your own assumptions and impose them on others. The paper is correct to call this out.

But all models have assumptions…

The question we are left with is: can we do better? Can we build models based on a realistic description of the economic system, with sensible parameters?

We are getting there, albeit slowly. First, we must recognise that the low-carbon transition is based on the development and adoption of new technology. If a model doesn’t include technology, it’s no good for this purpose. If it does include technology but treats it as ‘exogenous’ (i.e. not influenced by policy), it is even worse, because it can then lead to misleading conclusions.
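To make the distinction concrete, here is a toy sketch in Python (my own illustrative numbers, not taken from any published model). It contrasts an exogenous cost trend with an endogenous ‘learning curve’, in which cumulative deployment, something policy can accelerate, drives costs down:

    import math

    # Illustrative numbers only; not calibrated to any real technology.
    YEARS = 30
    INITIAL_COST = 100.0     # cost per unit of the new technology
    LEARNING_RATE = 0.20     # 20% cost cut per doubling of cumulative output
    b = -math.log(1 - LEARNING_RATE) / math.log(2)   # Wright's-law exponent

    def endogenous_costs(annual_deployment):
        """Cost path where learning-by-doing responds to deployment."""
        cumulative, costs = 1.0, []
        for d in annual_deployment:
            costs.append(INITIAL_COST * cumulative ** -b)
            cumulative += d
        return costs

    # Exogenous: costs fall 2% a year regardless of what policy does.
    exogenous = [INITIAL_COST * 0.98 ** t for t in range(YEARS)]

    # Endogenous: a support scheme that doubles deployment feeds back into costs.
    baseline = endogenous_costs([1.0] * YEARS)
    with_policy = endogenous_costs([2.0] * YEARS)

    print(f"Cost after {YEARS} years: exogenous {exogenous[-1]:.0f}, "
          f"endogenous baseline {baseline[-1]:.0f}, with support {with_policy[-1]:.0f}")

In the exogenous version the support scheme has no effect on costs at all; in the endogenous version it changes the cost trajectory itself, which is precisely the channel that matters for the transition.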

How technology is modelled is important too. Here, human behaviour matters. Technology means investment and investment means risk. Most people prefer to reduce risk and invest in what they know about. In the energy world that usually means fossil fuels (e.g. for cars, heating systems). This applies to businesses (which are run by people) too.

It therefore takes time for knowledge about new technologies to diffuse. Government can help, not just by changing prices but by helping technologies get established, e.g. through procurement schemes.
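Purely as an illustration (the parameters are invented for exposition), diffusion typically traces an S-curve, and the effect of measures such as procurement can be pictured as bringing the take-off forward rather than just nudging a price:

    import math

    def adoption_share(year, takeoff_year, speed=0.3):
        """Logistic S-curve: share of the market using the new technology."""
        return 1.0 / (1.0 + math.exp(-speed * (year - takeoff_year)))

    for year in (10, 20, 30):
        without = adoption_share(year, takeoff_year=25)
        with_procurement = adoption_share(year, takeoff_year=18)  # earlier take-off
        print(f"year {year}: {without:5.1%} without policy, "
              f"{with_procurement:5.1%} with early procurement")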

The parameters will be tricky though!

Still, in a model all of this needs to be quantified, and we face fundamental uncertainty: how do we know how people will behave in 30 years’ time under a different energy system? We need to avoid the trap of selective guesswork that the IAM modellers fell into. Econometrics at least provides an empirical basis, but a long list of authors have pointed out that behaviour observed in one context cannot simply be extrapolated to another.

Complexity economics and agent-based modelling offer a route into the uncertainty. By modelling interactions between individual agents, they can illuminate some of the uncertainty that aggregate macro data conceals. And what determines individual behaviour is being probed by the fascinating field of neuroeconomics.
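As a minimal sketch of the agent-based idea (a toy Bass-style imitation model, not any specific published ABM), note how the S-curve that was simply assumed above now emerges from individual decisions: a few agents experiment on their own, and the rest imitate what they see others doing.

    import random

    random.seed(1)
    N, STEPS = 1000, 30
    P_INNOVATE = 0.01   # chance an agent tries the technology independently
    Q_IMITATE = 0.35    # extra chance proportional to the share who have adopted
    adopted = [False] * N

    for step in range(STEPS):
        share = sum(adopted) / N
        for i in range(N):
            if not adopted[i] and random.random() < P_INNOVATE + Q_IMITATE * share:
                adopted[i] = True
        if step % 5 == 0:
            print(f"step {step:2d}: {sum(adopted) / N:5.1%} adopted")

Slow at first, then rapid once imitation kicks in, then saturating: here the aggregate S-curve is an outcome of the interactions rather than an assumption.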

However, the honest answer to the question is that we don’t know and model results should be considered in a broader context. As Keynes suggested, a bit more humility from economists could go a long way. Deciding what is best for the future of the planet on the basis of a few equations is not a good way forward.

Cover image is by davisco
