A paper on climate sensitivity today in Science will no doubt see a great deal of press in the next few weeks. In “Why is climate sensitivity so unpredictable?”, Gerard Roe and Marcia Baker explore the origin of the range of climate sensitivities typically cited in the literature. In particular they seek to explain the characteristic shape of the distribution of estimated climate sensitivities. This distribution includes a long tail towards values much higher than the standard 2-4.5 degrees C change in temperature (for a doubling of CO2) commonly referred to.
In essence, what Roe and Baker show is that this characteristic shape arises from the non-linear relationship between the strength of climate feedbacks (f) and the resulting temperature response (deltaT), which is proportional to 1/(1-f). They show that this places a strong constraint on our ability to determine a specific “true” value of climate sensitivity, S. These results could well be taken to suggest that climate sensitivity is so uncertain as to be effectively unknowable. This would be quite wrong.
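The skew follows directly from the 1/(1-f) relationship: a symmetric (e.g. Gaussian) uncertainty in the feedback strength f maps to an asymmetric, long-tailed distribution of sensitivity S. A minimal Monte Carlo sketch illustrates this; note that the no-feedback sensitivity S0 and the mean and spread assumed for f are illustrative numbers chosen for this example, not Roe and Baker's fitted values:

```python
import random

# No-feedback (Planck-response) sensitivity per CO2 doubling; ~1.2 C is
# the commonly quoted reference value, but treat it as illustrative here.
S0 = 1.2

# Assume total feedback strength f is normally distributed.
# These moments are illustrative, not taken from Roe and Baker.
F_MEAN, F_SD = 0.65, 0.13

random.seed(0)
samples = []
for _ in range(100_000):
    f = random.gauss(F_MEAN, F_SD)
    if f < 1.0:                  # f >= 1 would imply a runaway response
        samples.append(S0 / (1.0 - f))

samples.sort()
n = len(samples)
p05 = samples[int(0.05 * n)]
median = samples[n // 2]
p95 = samples[int(0.95 * n)]
print(f"5th pct {p05:.1f} C, median {median:.1f} C, 95th pct {p95:.1f} C")
```

Even though f is sampled symmetrically, the resulting distribution of S is strongly skewed: the upper tail stretches far above the median, while the lower tail stays compressed near it. This is the characteristic shape Roe and Baker analyze.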
The IPCC Summary for Policymakers shows the graph below for a business-as-usual carbon emissions scenario, comparing temperatures in the 1980s with temperatures in the 2020s (orange) and 2090s (red). The latter period is roughly when CO2 will have doubled under this scenario. The resulting global temperature changes cluster between 2 and 5 degrees C, but with a non-zero probability of a small negative temperature change and a long tail suggesting somewhat higher probabilities of a very high temperature change (up to 8 degrees is shown).
We have very strong evidence for the middle range of climate sensitivities cited by the IPCC. But what Roe and Baker emphasize is that ruling out very high sensitivities is very difficult, because even relatively small feedbacks, if they are highly uncertain, can have a very large impact on our ability to determine S.
Paleoclimate data do provide a means to constrain the tail on the distribution and perhaps to show that the likelihood of large values of S is lower than Roe and Baker’s calculations suggest. In particular, Annan and Hargreaves (2006) used a Bayesian statistical approach that combines information from both 20th century observations and from last glacial maximum data to produce an estimate of climate sensitivity that is much better constrained than by either set of observations alone (see our post on this, here). Their result is a mean value of deltaT close to 3°C, and a high probability that the sensitivity is less than 4.5°C, for a doubling of CO2 above pre-industrial levels. Thus, we emphasize that Roe and Baker’s results do not really tell us that, for example, 11°C of global warming in the next century is any likelier than we have suggested previously.
On the other hand, there is a counterpoint to such a comforting result. Roe and Baker note that the extreme warmth of the Eocene — something that has stymied climate modelers — could in principle be explained by not-very-dramatic changes in the strengths of the feedbacks, again because small changes in f can produce dramatic change in S. The boundary conditions for Eocene climate remain too poorly known to include in a formal calculation of climate sensitivity, but at the very least the extreme climate of this time suggests that we cannot readily cut the tail off the probability distribution of S.
It would be wrong to think that climate scientists have been ignorant of the non-linear nature of feedbacks on climate sensitivity. Several papers dating back a couple of decades show essentially the same result (for example, Hansen et al., 1984; Schlesinger, 1988; see below for full citations). But Roe and Baker’s paper is probably the most succinct and accessible treatment of the subject to date, and is a timely reminder of some very basic points that are not always appreciated. For example, it is often assumed that the tail on the distribution of climate sensitivity is due to the large uncertainty in some feedbacks, particularly clouds. Roe and Baker make it very clear that this is not the case. (The tail in S results from the probability distribution of the feedback strengths, and unless those uncertainties are distributed very, very differently from the Gaussian distribution assumed by Roe and Baker, the tail will remain.) Furthermore, they point out that “uncertainty” in the feedbacks need not mean “lack of knowledge” but may also reflect the complexity of the feedback processes themselves. That is to say, because the strengths of the feedbacks are themselves variable, the true climate sensitivity (not just our ability to know what it is) is inherently uncertain.
What will get the most discussion in the popular press, of course, are the policy implications of Roe and Baker’s paper. Myles Allen and David Frame take a stab at this in their Perspective.* Their chief point is that it is probably a bad idea to assign a specific threshold value for CO2 concentrations in the atmosphere, above which “dangerous interference in the climate system” may result. For example, 450 ppm is an oft-cited threshold since this keeps deltaT below 2°C using standard climate sensitivities. But the skewed nature of the distribution of possible sensitivities means that it is much more likely that 450 ppm will give us more than 4.5°C of global warming rather than less than 2°C.
Allen and Frame suggest that the way to address this is through an adaptive climate change policy, in which there are movable CO2 concentration targets that can be revised downwards if future observations suggest that the climate sensitivity is indeed greater than the middle IPCC range. We agree that adaptive policies are needed. There is no point in continuing to pursue a 450 ppm stabilization goal if temperatures have already exceeded the expected 2°C; more reductions would be called for. Similarly, if temperature rises more slowly than expected, that would buy time. However, in our view, Allen and Frame’s discussion turns the precautionary principle on its head by implying that downward revision can always be done later, after more data are in. But a good adaptive strategy depends on nimble action and forward thinking, both of which are typically in short supply. If reactions to a worse-than-expected climate change are delayed, they make an overshoot of any temperature target very likely, and corrective action very expensive. Thus conservative strategies would seem in order, which probably implies initial targets much lower than 450 ppm, still subject to further revision.
The bottom line is that climate sensitivity is uncertain, but we can pretty much rule out low values that would imply there is nothing to worry about. The possibility of high values will be much harder to rule out. This is something policy makers should recognize and confront.
Hansen, J.E., et al., in Climate Processes and Climate Sensitivity, J. E. Hansen, T. Takahashi, Eds. (Geophysical Monograph 29, American Geophysical Union, Washington, DC, 1984), pp. 130–163.
Schlesinger, M.E., 1988: Quantitative analysis of feedbacks in climate model simulations of CO2-induced warming. In Physically-Based Modelling and Simulation of Climate and Climatic Change, M. E. Schlesinger, Ed., NATO Advanced Study Institute Series, Kluwer, Dordrecht, 653-736.
*See also the news article in Nature. And our congratulations to Myles Allen and his colleagues who won the Euro Prix award for their climateprediction.net work.