Can 2°C warming be avoided?

Guest comment by Malte Meinshausen, Reto Knutti and Dave Frame

Yesterday’s BBC article on the “Avoiding Dangerous Climate Change” report from last year’s Exeter meeting carried two messages that have left some readers a little confused. On the one hand, it said that stabilizing greenhouse gases at 400-450 ppm CO2-equivalent concentrations is required to keep global mean warming below 2°C, which in turn is assumed to be necessary to avoid ‘dangerous’ climate change. On the other hand, people are quoted as saying that “We’re going to be at 400 ppm in 10 years’ time”.

So given that we will exceed 400 ppm CO2 in the near future, is a target of 2°C feasible? To make a long story short: the answer is yes.

The following paragraphs attempt to shed a little light on why 2°C and 400 ppm are mentioned together. First of all, ‘CO2-equivalent concentration’ expresses the radiative forcing effect of all human-induced greenhouse gases and aerosols as if only CO2 concentrations had changed. We use it as shorthand for the net human perturbation; it is not the same as real CO2 being at 400 ppm, because aerosols have a substantial cooling effect. However, the other greenhouse gases, such as methane and N2O, add to the forcing and compensate somewhat for the aerosol effect. Thus the CO2-equivalent concentration is roughly equal to the current level of real CO2.
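
To see how these contributions roughly cancel, here is a minimal sketch in Python of that bookkeeping. It uses the standard simplified expression ΔF = 5.35 ln(C/C0) for CO2 forcing; the non-CO2 and aerosol forcing values below are round, illustrative numbers chosen for the example, not official estimates.

```python
import math

F_COEFF = 5.35  # W/m^2; simplified CO2 forcing expression dF = 5.35 * ln(C/C0)
C0 = 278.0      # pre-industrial CO2 concentration (ppm)

def co2_forcing(c_ppm, c0_ppm=C0):
    """Radiative forcing (W/m^2) of a CO2 concentration relative to pre-industrial."""
    return F_COEFF * math.log(c_ppm / c0_ppm)

def co2_equivalent(total_forcing, c0_ppm=C0):
    """Invert the forcing expression: the CO2 concentration that alone would produce this forcing."""
    return c0_ppm * math.exp(total_forcing / F_COEFF)

# Illustrative present-day forcing contributions (W/m^2) -- rough, round numbers,
# not official values; the aerosol forcing in particular is very uncertain.
forcings = {
    "CO2 (about 380 ppm)": co2_forcing(380.0),                 # ~1.7 W/m^2
    "other greenhouse gases (CH4, N2O, halocarbons, ozone)": 1.3,
    "aerosols (direct + indirect cooling)": -1.2,
}

net = sum(forcings.values())
print(f"net anthropogenic forcing: {net:.2f} W/m^2")
print(f"CO2-equivalent concentration: ~{co2_equivalent(net):.0f} ppm")
# With these illustrative numbers the CO2-equivalent concentration comes out
# close to the real CO2 concentration, as described in the text.
```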

The ecosystems on our planet are a little like a cat in an oven. We control the heating (the greenhouse gas concentrations) and the cat responds to that temperature. So far, we have turned the controls to a medium level, and the oven is still warming up. If we keep the oven control at today’s medium level, the cat will warm beyond today’s slight fever of 0.8°C. And if we crank the control up a bit further over the next ten years and leave it there, the fever is going to get a little worse; there is even a 1 in 5 chance that it could exceed 2°C.

Where does that probability come from? If we knew the real climate sensitivity, we would know the equilibrium warming of our oven/planet if we left the CO2-equivalent concentration at, say, 400 ppm for a long time. For instance, if the climate sensitivity were 3.8°C, such an oven with its control set to 400 ppm would warm by 2°C in the long term [1]. But what are the odds that the climate sensitivity is actually 3.8°C or higher? The chances are roughly 20%, if one assumes the conventional IPCC 1.5-4.5°C uncertainty range is the 80% confidence interval of a lognormal distribution [2]. Thus, if we want to avoid a 2°C warming with 4-to-1 odds, we have to limit greenhouse gas concentrations to the equivalent of 400 ppm CO2 or below. (Note, though, that one might want to question whether 4-to-1 odds are comforting enough when it comes to avoiding a fever of 2°C or more…)
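
For readers who want to check the arithmetic, here is a short sketch that reproduces both the 3.8°C figure and the roughly 20% probability. It assumes a pre-industrial CO2 concentration of about 278 ppm, the simplified forcing expression ΔF = 5.35 ln(C/C0), and a lognormal sensitivity distribution whose 10th and 90th percentiles sit at 1.5°C and 4.5°C.

```python
import math
from statistics import NormalDist

F2X = 5.35 * math.log(2.0)   # forcing of a CO2 doubling, ~3.7 W/m^2
C0 = 278.0                   # assumed pre-industrial CO2 concentration (ppm)

def equilibrium_warming(c_eq_ppm, sensitivity):
    """Equilibrium warming (deg C) if a CO2-equivalent concentration is held forever,
    for a given climate sensitivity (warming per CO2 doubling)."""
    forcing = 5.35 * math.log(c_eq_ppm / C0)
    return sensitivity * forcing / F2X

# Climate sensitivity needed for 400 ppm CO2-eq to give 2 deg C at equilibrium:
s_crit = 2.0 * F2X / (5.35 * math.log(400.0 / C0))
print(f"sensitivity giving 2C at 400 ppm: {s_crit:.1f} C")     # ~3.8 C

# Treat the IPCC 1.5-4.5 C range as the 80% interval of a lognormal distribution,
# i.e. ln(1.5) and ln(4.5) are the 10th and 90th percentiles of a normal.
z90 = NormalDist().inv_cdf(0.9)
mu = 0.5 * (math.log(1.5) + math.log(4.5))
sigma = (math.log(4.5) - math.log(1.5)) / (2.0 * z90)

p_exceed = 1.0 - NormalDist(mu, sigma).cdf(math.log(s_crit))
print(f"P(sensitivity >= {s_crit:.1f} C): {p_exceed:.0%}")     # roughly 20%
```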

At the heart of the second statement (“We’re going to be at 400 ppm in 10 years’ time”) is the fair judgment that we seem committed to cranking concentrations up not only to 400 ppm CO2 but beyond: anything less implies we would have to switch off our power plants tomorrow. Current CO2 concentrations are already about 380 ppm, and they rose by about 20 ppm over the last ten years.
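
As a back-of-the-envelope check on the “400 ppm in 10 years” figure, a simple linear extrapolation of that recent growth rate (a rough assumption, since the rate has been creeping upward) gives:

```python
current_co2 = 380.0   # ppm, approximate present-day concentration
growth_rate = 2.0     # ppm per year, i.e. roughly 20 ppm over the last decade

for years_ahead in (5, 10, 15):
    projected = current_co2 + growth_rate * years_ahead
    print(f"in {years_ahead:2d} years: ~{projected:.0f} ppm CO2")
```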
