Gavin Schmidt & Michael Mann
Extending the instrumental record of climate beyond the late 19th Century, when many of the national weather centers were first started, is an important, difficult and undervalued task. It is often more akin to historical detective work than to climatology and can involve long searches in dusty archives, the ability to read archaic scripts and handwriting, and even Latin translations (for instance, when going through the archives of the Paris Observatory). Sounds like a recent bestseller, only less lucrative, no?
Why bother? Well, it is unfortunate, though probably not coincidental, that the modern record starts at the same time that significant modifications of atmospheric composition (greenhouse gases, aerosols etc.) were occurring on a global scale. Thus this period is not ideal for assessing the magnitude of natural changes (both intrinsic and forced by natural processes like solar variability or volcanic eruptions) since there is likely a contamination from human-related causes. So extending instrumental records back as far as possible is an important approach to providing a context for modern changes.
A paper published earlier this year (Vinther et al, 2006) on extending the records around Southern Greenland was a great example of how this can be successfully done, the problems that occur and the techniques that have been developed to deal with them. The authors painstakingly digitised older archived data (back to before 1800 in one case), worked out the conventions that were used, and with a knowledge of present day climate, pieced together series and spotted potential shifts in observing location or time of day that would otherwise contaminate the record.
Proxy records (ice cores, ocean sediments etc.) that are related to climate (in some imperfectly known way, but with non-climatic ‘noise’) go back further. But they need to be calibrated to the instrumental record in order to be used quantitatively. Given that proxy records don’t always extend to the present (due to collection dates, collection method, the physics or biology of the specific proxy) and that it is important to retain some instrumental data for validation of any calibration, there are only ~60-100 years of record left for the calibration – making it very difficult to assess how well the low frequency component (a few decades or longer) is represented (and much of the recent attention to recent proxy reconstructions really relates to exactly this point, rather than technical arguments about data processing). Of course, proxy records can also be usefully combined in qualitative ways that don’t require calibration (such as Osborn and Briffa, 2006).
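The calibrate-then-validate logic above can be sketched with entirely synthetic data. This is a deliberately minimal illustration, not any actual reconstruction method: a ‘proxy’ is regressed against an overlapping ‘instrumental’ series on the early part of the overlap, and skill is then checked on the withheld recent part. The short overlap is exactly why low-frequency skill is so hard to pin down.

```python
# Minimal sketch (synthetic data, invented numbers) of calibrating a proxy
# series against an instrumental record, holding back part of the overlap
# for validation. Real reconstructions are far more involved.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(1850, 2001)                  # 151 years of overlap
true_temp = 0.005 * (years - 1850) + 0.2 * np.sin(2 * np.pi * years / 60)
instrumental = true_temp + rng.normal(0, 0.05, years.size)
proxy = 2.0 * true_temp + 1.0 + rng.normal(0, 0.15, years.size)  # noisy, rescaled

# Calibrate on the early period only
cal = years < 1950
slope, intercept = np.polyfit(proxy[cal], instrumental[cal], 1)
reconstruction = slope * proxy + intercept

# Validate on the withheld recent years (simple correlation here;
# RE/CE statistics are used in practice)
val = ~cal
r = np.corrcoef(reconstruction[val], instrumental[val])[0, 1]
print(f"validation correlation: {r:.2f}")
```

Note that the validation correlation is dominated by interannual variability; with only ~50 withheld years, it says very little about how well multidecadal variations are captured, which is the point made above.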
The work by Vinther and colleagues in Southern Greenland is therefore key to helping calibrate the Greenland ice core records, and impressively, the correlations to the older data are as good as to the recent record, allowing us to have a little more confidence in the even longer term proxy data for this region. So a good result then, and a paper worth reading for anyone who is actually interested in how these things are done.
However, there is a bit of a cottage industry of people who micro-parse every new paper to see how it projects onto a narrow view of the climate change debate, regardless of its actual relevance. This is a travesty of the way science is supposed to work and all too often ends up getting the story completely wrong. One persistent abuser of this technique is Pat Michaels, and in a recent piece he was unable to resist claiming that the century-scale trends (~0.8 C from 1891-1900 to 1991-2000 in the annual mean) seen in this extended Southern Greenland data apparently invalidate the notion of polar amplification as predicted by the ‘models’. This of course was not the conclusion of the authors themselves (though presumably if they felt that this was true they might have said so).
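For readers unfamiliar with the convention, a century-scale change stated that way is just the difference between two decadal means of an annual series. A tiny sketch with made-up placeholder numbers (not the Vinther et al values):

```python
# Sketch of expressing a century-scale change as the difference between
# two decadal means of an annual-mean series. Data are synthetic
# placeholders, not the actual Southern Greenland record.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1891, 2001)
annual_mean = 0.008 * (years - 1891) + rng.normal(0, 0.3, years.size)

first_decade = annual_mean[(years >= 1891) & (years <= 1900)].mean()
last_decade = annual_mean[(years >= 1991) & (years <= 2000)].mean()
change = last_decade - first_decade
print(f"change: {change:.2f} C")
```

Averaging over decades damps the year-to-year noise, but as the synthetic series shows, a single realisation can still wander a few tenths of a degree either side of the underlying trend.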
So what is wrong with this claim? Firstly, models do indeed predict polar amplification (particularly in the Arctic and particularly in winter) of global warming trends (see our previous piece on this concept) in general. But do they predict it for the 20th Century trend, and specifically in Southern Greenland? Michaels doesn’t enlighten us, preferring generalized vague statements to actual data-model comparisons.
Indeed, the dampened late 20th century winter warming over a substantial part of Greenland, particularly the western and southern regions emphasized by the network of stations analyzed by Vinther et al, is known (see e.g. this NOAA page) to be associated with a trend toward the positive phase of the Arctic Oscillation (‘AO’) pattern. Whether or not this latter trend can, in turn, be related to anthropogenic climate change is not yet agreed upon, but a plausible argument for this has indeed been made in the peer-reviewed literature. Nonetheless, even if the substantial recent trend in the AO pattern is simply a product of natural multidecadal variability in North Atlantic climate, it underscores the fact that western and southern Greenland is an extremely poor place to look, from a signal vs. noise point of view, for the large-scale polar amplification signature of anthropogenic surface warming. This is a fairly basic point.
What would a rational data-model comparison look like? Since the data show southern Greenland temperatures over the last 150 years, it would be most useful to look at model simulations for exactly that period, run with the best guesses for CO2, solar and volcanic forcing etc. Fortunately, over 20 model groups have deposited these simulations in a public database at PCMDI – and anyone who is actually interested in seeing what the models produce can have access (you need to register, but it’s just a formality).
What do they show? Interestingly enough, the models do not predict large trends in the vicinity of Southern Greenland over the last 100 or so years (the figure shows the ensemble mean results just from the GISS model, but others are similar). Mainly this is because these areas are relatively close to both open water and the ice sheet and that keeps temperatures pretty stable. Like a glass of water with ice cubes, any extra energy tends to go into melting rather than temperature changes. And in this region, changes in the AO pattern discussed above also appear to play some role. (It should also be noted that the trends in this region are not larger than the standard deviation, and so any one realisation is likely to have a lot of variability, as is seen in the observations).
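The parenthetical signal-vs-noise point deserves emphasis, and can be illustrated with invented numbers (not taken from any model): when the forced trend is smaller than the internal variability, individual realisations, including the single realisation we actually observe, can easily show little trend or even the opposite sign.

```python
# Sketch (synthetic, illustrative numbers only): a weak forced trend buried
# in large interannual variability. The spread of trends across ensemble
# members shows why any one realisation can look quite different from the
# ensemble mean.
import numpy as np

rng = np.random.default_rng(1)

n_members, n_years = 10, 100
forced = np.linspace(0.0, 0.3, n_years)        # weak forced warming (deg C)
noise_sd = 0.5                                 # interannual variability (deg C)

ensemble = forced + rng.normal(0, noise_sd, (n_members, n_years))

years = np.arange(n_years)
member_trends = np.array([np.polyfit(years, m, 1)[0] * n_years
                          for m in ensemble])  # trend over the full period

print("ensemble-mean trend:", member_trends.mean())
print("spread across members:", member_trends.std())
```

With numbers in this regime the member-to-member spread is comparable to the forced trend itself, which is precisely why a single noisy regional record is a poor test of a model-predicted signal.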
But if the models don’t show much change over the last 100 years, surely the predictions for the future indicate that this area will be hit hard? Again, no. Southern Greenland turns out to have one of the slowest rates of warming of any land area in any of the scenarios (the figure is the mean over all models for the SRES A1B scenario). To some extent, this is again due to the factors mentioned above, but additionally, the models predict that the North Atlantic as a whole will not warm as fast as the rest of the globe (due to both the deep mixed layers in this region which have a large thermal inertia and a mild slowdown in the ocean heat transports). This is of course some positive news for the Greenland ice sheet, but the warming there is already substantial enough to cause significant net melting.
All this demonstrates that when people use vague generalisations – ‘models’ predict this, ‘scientists’ say that – where there are specific data that could be used instead (which model? what period? which scientist?), you should be wary: they are usually trying to pull a fast one.