by William Connolley and Eric Steig
The 10th Feb edition of Nature has a nice paper “Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data” by Anders Moberg, D.M. Sonechkin, K. Holmgren, N.M. Datsenko & W. Karlén (doi:10.1038/nature03265). This paper takes a novel approach to the problem of reconstructing past temperatures from paleoclimate proxy data. A key result is a reconstruction showing more century-scale variability in mean Northern Hemisphere temperatures than previous reconstructions have shown. This result will undoubtedly lead to much discussion and further debate over the validity of previous work. It does not, however, fundamentally change one of the most discussed aspects of that previous work: temperatures since 1990 still appear to be the warmest in the last 2000 years.
The novel thing about this paper is the use of wavelets (a statistical tool common in image-processing software) to separate the low- and high-frequency components of the data. The temperature reconstruction is then formed by combining the high-frequency (< 80 y) signal from tree rings with the low-frequency (> 80 y) signal from lake sediments and other such non-annually resolved proxies. This does two things that may be important: it allows the non-annually resolved proxies to be used (recent previous reconstructions, e.g. MBH98 and Esper et al., used only those with at least one value per year, to allow calibration against the instrumental record; Moberg’s approach allows the use of data that provide only 50 y means); and it throws away the long-term signal from the tree rings, which the authors consider untrustworthy. Other techniques have also been used or suggested for merging the low- and high-frequency signals (Guiot, 1985; Rutherford et al., 2005).
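For readers who want the flavour of this frequency-splitting step, here is a minimal numpy sketch. It uses a simple centred running mean as a stand-in for the wavelet filter the authors actually use, and the two proxy series are made-up illustrations, not real data:

```python
import numpy as np

def lowpass(x, window=80):
    """Crude low-pass filter: a centred running mean, standing in for the
    wavelet decomposition used in the paper."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(0)
years = np.arange(1000, 2000)

# Hypothetical proxies: an annually resolved "tree ring" series (fast
# variability plus noise) and a coarse "sediment" record (slow signal only).
tree_rings = np.sin(2 * np.pi * years / 15) + 0.3 * rng.standard_normal(years.size)
sediments = np.sin(2 * np.pi * years / 400)

# Keep only the high-frequency (< ~80 y) part of the tree rings ...
tree_high = tree_rings - lowpass(tree_rings)
# ... and only the low-frequency (> ~80 y) part of the sediment record.
sediment_low = lowpass(sediments)

# The merged, still non-dimensional, reconstruction.
reconstruction = sediment_low + tree_high
```

The point of the construction is that only the slow component of the coarse proxies and only the fast component of the tree rings survive into the merged series, which is why the result is non-dimensional and still has to be calibrated against the instrumental record afterwards.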
The result is a signal with more long-term variability than, e.g., Mann, Bradley and Hughes (1998; henceforth MBH98). Moberg et al. end up with two “warm peaks” in the smoothed record, around 1000 and 1100 A.D., at 0 ºC on their anomaly scale. A few individual years within these intervals are almost +0.4 ºC warmer than the average. In comparison, the most recent data from the instrumental record, post-1990, peak at +0.6 ºC or a bit more on the same scale. The coldest years of the so-called “Little Ice Age” occur around 1600 and are about −0.7 ºC colder than average, with individual years down to −1.2 ºC.
These results are bound to stir up interest beyond the scientific community, since the “hockey stick” shape of previous reconstructions has become so totemic (although just about everyone agrees that there is no need for this “totemising”). We hope that press reports about this paper that mention the increased variability will also emphasize the other key result: that there is “no evidence for any earlier periods in the last millennium with warmer conditions than the post-1990 period – in agreement with previous similar studies (1-4,7)”, where (1) is MBH98, (2) is MBH99 and (7) is Mann and Jones ’03. The “News” article in Nature explicitly rejects the idea that this means we’re not causing the current warming, and it quotes statistician Hans von Storch (who has been quite critical of the earlier work): “it does not weaken in any way the hypothesis that recent observed warming is a result mainly of human activity”.
There are a couple of technical concerns with the paper that are worth discussing, and which the compressed format of a Nature publication didn’t give the authors space to address fully.
First, it’s not entirely clear that the use of wavelets has added much to the mix. Moberg et al. use the wavelets to merge the high-frequency data from the tree rings with the low-frequency data from the other sources, which have lower temporal resolution. But that means the low-resolution proxies are doing all the work on the century scale, and the tree rings are just adding a fringe of interannual noise that the eye reads as something like error bars.
Second, because they have used the wavelets, they end up with a non-dimensional signal which has to be normalised against the instrumental record from 1859 to 1979. So if (for example) their reconstruction were too flat in that period, the renormalisation would pump it up in the pre-industrial period; if too noisy, it would get toned down. The adjustment is done to match “mean value and variance”, but (this being Nature) the description is a bit brief: do they mean the variance of the smoothed or the full series? If the full series (as we suppose), then most of the variance is probably coming from the interannual variations of the tree rings, *but* the bit that’s really important is the long-term signal. If it’s the long-term signal that is being matched, then there are only 1 or 2 degrees of freedom in the calibration period (1859-1979). To some extent this uncertainty is eased by their figure 2a, which shows the original records used, with the scaling in ºC. Properly nailing down all these issues is going to take a while.
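To make the calibration step concrete, here is one way a “match the mean and variance” rescaling could look. The series, the overlap indices and the function name are our own illustrative choices, not the authors’ actual procedure:

```python
import numpy as np

def calibrate(recon, instrumental, overlap):
    """Linearly rescale a non-dimensional reconstruction so that, over the
    overlap with the instrumental record, it has the instrumental mean and
    variance. (Illustrative sketch, not the paper's own code.)"""
    r = recon[overlap]
    a = instrumental.std() / r.std()          # scale factor
    b = instrumental.mean() - a * r.mean()    # offset
    return a * recon + b

rng = np.random.default_rng(1)
recon = rng.standard_normal(150)                     # non-dimensional, e.g. 1850-1999
instrumental = 0.2 * rng.standard_normal(121) - 0.1  # ºC anomalies, e.g. 1859-1979
overlap = slice(9, 130)                              # indices covering 1859-1979

calibrated = calibrate(recon, instrumental, overlap)
```

Note that only two numbers, a scale and an offset, are fixed by the overlap — which is the sense in which a long-term-signal calibration would leave only one or two effective degrees of freedom.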
These points will undoubtedly be answered, and the authors of other temperature reconstructions (including other authors at RealClimate) will have comments on the method. It is worth noting that, in any case, the results of Moberg et al., if they prove correct, would not require any change in the IPCC TAR summary for policymakers, which says “the increase in temperature in the 20th century is likely to have been the largest of any century during the past 1,000 years. It is also likely that, in the Northern Hemisphere, the 1990s was the warmest decade and 1998 the warmest year (Figure 1b). Because less data are available, less is known about annual averages prior to 1,000 years before present…”. The last statement is based largely on MBH99. All this remains true with the Moberg reconstruction, and is actually somewhat strengthened, since their results extend 1000 years further back than MBH99.