Guest comment by Malte Meinshausen, Reto Knutti and Dave Frame
Yesterday’s BBC article on the “Avoiding Dangerous Climate Change” report from last year’s Exeter meeting carried two messages that have left some readers a little confused. On the one hand, it said that stabilizing greenhouse gases at 400-450 ppm CO2-equivalent concentrations is required to keep global mean warming below 2°C, which in turn is assumed to be necessary to avoid ‘dangerous’ climate change. On the other hand, people are quoted as saying that “We’re going to be at 400 ppm in 10 years’ time”.
So given that we will exceed 400 ppm CO2 in the near future, is a target of 2°C feasible? To make a long story short: the answer is yes.
The following paragraphs attempt to shed a little light on why 2°C and 400 ppm are mentioned together. First of all, ‘CO2-equivalent concentration’ expresses the radiative forcing effect of all human-induced greenhouse gases and aerosols as if we had changed only CO2 concentrations. We use it as shorthand for the net human perturbation – it is not the same as real CO2 being at 400 ppm, because aerosols have a substantial cooling effect. However, the other greenhouse gases, such as methane and N2O, add to the forcing and compensate somewhat for the aerosol cooling. Thus the CO2-equivalent concentration is currently roughly equal to the actual CO2 concentration.
The ecosystems on our planet are a little like a cat in an oven. We control the heating (greenhouse gas concentrations) and the cat responds to that temperature. So far, we have turned the controls to a medium level, and the oven is still warming up. If we keep the oven control at today’s medium level, the cat will warm beyond today’s slight fever of 0.8°C. And if we crank the control up a bit further over the next ten years and leave it there, the fever is going to get a little worse – there is even a 1 in 5 chance that it could exceed 2°C.
Where does that probability come from? If we knew the real climate sensitivity, we would know the equilibrium warming of our oven/planet if we left the CO2-equivalent concentrations at, say, 400 ppm for a long time. For instance, if the climate sensitivity were 3.8°C, an oven with its control set to 400 ppm would warm by 2°C in the long term [1]. But what are the odds that the climate sensitivity is actually 3.8°C or higher? The chances are roughly 20%, if one assumes the conventional IPCC 1.5-4.5°C uncertainty range is the 80% confidence interval of a lognormal distribution [2]. Thus, if we want to avoid a 2°C warming with 4:1 odds, we have to limit greenhouse gas concentrations to something equivalent to 400 ppm CO2 or below. (Note, though, that one might question whether 4:1 odds are comforting enough when it comes to avoiding a fever of 2°C or more…)
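The 20% figure is easy to reproduce. If ln(S) is taken to be normally distributed, with the IPCC 1.5-4.5°C range as the central 80% interval (10th and 90th percentiles), then the probability that the sensitivity exceeds 3.8°C comes out at roughly one in five. The few lines below are a quick sketch of that calculation (the lognormal fit is exactly the assumption stated in the text; everything else is standard):

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Assume ln(S) is normally distributed, with the IPCC 1.5-4.5 degC range
# taken as the central 80% interval (10th and 90th percentiles).
lo, hi = 1.5, 4.5
z90 = 1.2816                                   # 90th percentile of the standard normal
mu = 0.5 * (math.log(lo) + math.log(hi))       # mean of ln(S)
sigma = (math.log(hi) - math.log(lo)) / (2 * z90)

# Probability that the climate sensitivity exceeds 3.8 degC
p_above = 1.0 - normal_cdf((math.log(3.8) - mu) / sigma)
print(p_above)  # about 0.19, i.e. roughly a 1-in-5 chance
```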
At the heart of the second statement (“We’re going to be at 400 ppm in 10 years’ time”) is the fair judgment that we seem committed to cranking concentrations up not only to 400 ppm CO2 but beyond: anything less would imply switching off our power plants tomorrow. Current CO2 concentrations are already about 380 ppm, and they rose by about 20 ppm over the last ten years.
Figure: A schematic representation of (a) fossil CO2 emissions, (b) CO2-equivalent concentrations, and (c) global mean temperature for two scenarios: Firstly, an “immediate stabilization” which implies rising CO2-equivalent concentrations up to around 415 ppm in 2015 and stable levels after that (red dashed line). This scenario is clearly hypothetical as the implied emission reductions in 2015 and beyond would hardly be feasible. Secondly, a peaking scenario (green solid line), which temporarily exceeds and then returns to a 415 ppm stabilization level. Both scenarios manage to stay below a 2°C target – for a climate sensitivity of 3.8°C or lower. This is roughly equivalent to a 4:1 chance of staying below 2°C [3].
Indeed, avoiding concentrations beyond – say – 475 ppm CO2-equivalent (see Figure a) will require quite significant emission reductions. However, as long as we reduce emissions decisively enough, concentrations in the atmosphere could decline again towards the latter half of the 21st century and beyond. The reasons are the relatively short lifetimes of methane, nitrous oxide and other greenhouse gases, plus some CO2 uptake by the oceans (as discussed here).
Now, what is going to happen to our cat if we turn the heat control of our oven up to about 475 ppm and then reduce it again? If we react quickly enough, we might be able to save the cat from some irreversible consequences. In other words, if concentrations are lowered fast enough after peaking at 475 ppm, the temperature might not exceed 2°C. Basically, the thermal inertia of the climate system will shave off the temperature peak that the cat would otherwise have felt if the oven temperature reacted immediately to our control button.
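This peak-shaving effect can be illustrated with a toy one-box energy balance model. The concentration pathway, effective heat capacity and starting temperature below are illustrative choices of ours, not the scenarios of the DEFRA report (which were run with MAGICC); the qualitative point is that the transient peak warming stays well below the roughly 2.9°C equilibrium warming that a sustained 475 ppm would eventually imply, while the temperature ends up near the 2°C equilibrium of the 400 ppm stabilization level:

```python
import math

F2X = 3.7          # radiative forcing for a CO2 doubling, W/m^2 (standard value)
S = 3.8            # climate sensitivity, degC per doubling (as in the text)
LAM = F2X / S      # feedback parameter, W/m^2 per degC
CAP = 40.0         # illustrative effective ocean heat capacity, W yr m^-2 K^-1

def eq_warming(conc):
    """Equilibrium warming for a given CO2-equivalent concentration (ppm)."""
    return S * math.log(conc / 278.0) / math.log(2.0)

def conc_path(year):
    """Illustrative peaking pathway: 380 ppm in 2000, peaking at 475 ppm
    around 2050, back down to 400 ppm by 2100, then held constant."""
    if year <= 2050:
        return 380.0 + (475.0 - 380.0) * (year - 2000.0) / 50.0
    if year <= 2100:
        return 475.0 - (475.0 - 400.0) * (year - 2050.0) / 50.0
    return 400.0

# One-box energy balance model: CAP * dT/dt = F(t) - LAM * T,
# integrated with a simple Euler step from an assumed 0.8 degC in 2000.
dt, T, year = 0.1, 0.8, 2000.0
peak_T = T
while year < 2500.0:
    forcing = F2X * math.log(conc_path(year) / 278.0) / math.log(2.0)
    T += dt * (forcing - LAM * T) / CAP
    peak_T = max(peak_T, T)
    year += dt

print(f"peak transient warming: {peak_T:.2f} C")
print(f"equilibrium warming at 475 ppm: {eq_warming(475):.2f} C")
print(f"equilibrium warming at 400 ppm: {eq_warming(400):.2f} C")
```

How much of the peak gets shaved off depends strongly on the assumed ocean heat uptake, which is why the real scenario calculations constrain those parameters against observations (see footnote 3).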
Thus, to sum up: even under the very likely scenario that we exceed 400 ppm CO2 concentrations in the very near future, temperatures could still be limited to below 2°C with a 4:1 chance – provided that emissions are reduced fast enough for concentrations to peak at 475 ppm CO2-equivalent before sliding back to 400 ppm CO2-equivalent.
Peaking at 475 ppm CO2-equivalent concentrations and returning to 400 ppm certainly comes at a cost, though: our oven will approach 2°C more quickly than under a (hypothetical) scenario where we halt the build-up of CO2 concentrations at 400 ppm (see Figure c). Thus decisions would need to be different if we care more about the rate of warming than about the equilibrium. In fact, some emission pathways and some models also suggest that peaking at 475 ppm CO2-equivalent and returning to 400 ppm might even slightly decrease our chances of staying below 2°C (see Chapter 28 in the DEFRA report). Depending on the actual thermal inertia of the climate system, the peak temperature corresponding to 475 ppm might be very close to the 400 ppm equilibrium temperature. This points to a more fundamental issue: rather than discussing ultimate stabilization levels, it might be more worthwhile for policy and science to focus on the peak level of greenhouse gas concentrations. By the time we manage to peak concentrations, we could still decide whether we want to stabilize at 400 ppm or closer to pre-industrial levels. We will most likely be able to make wiser decisions in the future, given that we will certainly have learnt something about the cat’s behavior during its current fever.
[1] The equilibrium warming dT can easily be estimated from the CO2-equivalent stabilization level C, if one knows the climate sensitivity S, with the following little formula: dT = S * ln(C / 278 ppm) / ln(2)
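As a quick sanity check, here is a minimal sketch evaluating this formula in Python (278 ppm being the pre-industrial CO2 concentration used in the formula above):

```python
import math

def eq_warming(conc_ppm, sensitivity):
    """Equilibrium warming dT = S * ln(C / 278 ppm) / ln(2),
    where 278 ppm is the pre-industrial CO2 concentration."""
    return sensitivity * math.log(conc_ppm / 278.0) / math.log(2.0)

# With a climate sensitivity of 3.8 degC, stabilizing at 400 ppm
# CO2-equivalent gives an equilibrium warming of almost exactly 2 degC:
print(round(eq_warming(400, 3.8), 2))  # 1.99, i.e. about 2 degC
```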
[2] This is of course a probability that merely reflects the uncertainty in our knowledge: the climate sensitivity is not random, it is just unknown. The probability expresses our degree of belief in that outcome, and is of course subject to future revision in the light of new evidence.
[3] The depicted peaking scenario is the EQW-S475-P400 scenario as presented in Chapter 28 of the DEFRA report. The “combined constraint” (see also Chapter 28) was used to find aerosol forcing and ocean diffusivity values for a 3.8°C climate sensitivity that allow an approximate match to the historic temperature and ocean heat uptake records. The historic fossil CO2 emission data are taken from Marland et al., the CO2 observations from Etheridge et al. and others are as given here and here, and the temperature observations and their uncertainties are from Jones, Folland et al. The simple climate model used is MAGICC 4.1, as in Wigley and Raper (2001 – see here).