Moberg et al: Highly variable Northern Hemisphere temperatures?

by William Connolley and Eric Steig

The 10th Feb edition of Nature has a nice paper “Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data” by Anders Moberg, D.M. Sonechkin, K. Holmgren, N.M. Datsenko & W. Karlén (doi:10.1038/nature03265). This paper takes a novel approach to the problem of reconstructing past temperatures from paleoclimate proxy data. A key result is a reconstruction showing more century-scale variability in mean Northern Hemisphere temperatures than is shown in previous reconstructions. This result will undoubtedly lead to much discussion and further debate over the validity of previous work. The result, though, does not fundamentally change one of the most discussed aspects of that previous work: temperatures since 1990 still appear to be the warmest in the last 2000 years.

[Figure: Moberg et al. reconstruction; click on image for larger version from original source]

The novel thing about this paper is the use of wavelets (a statistical tool common in image-processing software) to separate the low- and high-frequency components of the data. The temperature reconstruction is then formed by combining the high-frequency (< 80 y) signal from tree rings with the low-frequency (> 80 y) signal from lake sediments and other non-annually resolved proxies. This does two things that may be important: it allows the non-annually resolved proxies to be used (recent previous reconstructions, e.g. MBH98 and Esper et al., used only proxies with at least one value per year, to allow calibration against the instrumental record; Moberg’s approach admits data that provide only 50 y means); and it throws away the long-term signal from the tree rings, which the authors consider untrustworthy. Other techniques have also been used or suggested for merging the low- and high-frequency signals (Guiot, 1985; Rutherford et al., 2005).
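The basic idea of the merge can be sketched in a few lines. This is only an illustration, not the paper’s actual method: Moberg et al. use a wavelet decomposition, whereas the sketch below stands in a simple moving-average low-pass filter at the 80-year cutoff; the function name and filter choice are ours.

```python
import numpy as np

def merge_proxies(tree_rings, coarse_proxies, cutoff_years=80):
    """Illustrative merge of two proxy series (annual resolution assumed):
    keep only the high-frequency (< cutoff) part of the tree-ring series
    and only the low-frequency (> cutoff) part of the coarse proxies.
    A moving average stands in for the paper's wavelet transform."""
    window = cutoff_years
    kernel = np.ones(window) / window

    def lowpass(x):
        # Reflect-pad the edges so the moving average stays full length.
        padded = np.pad(np.asarray(x, dtype=float), window, mode="reflect")
        smoothed = np.convolve(padded, kernel, mode="same")
        return smoothed[window:-window]

    # High-frequency tree-ring signal: residual after removing the low-pass.
    tree_high = tree_rings - lowpass(tree_rings)
    # Low-frequency signal taken from the coarse (e.g. lake-sediment) proxies.
    coarse_low = lowpass(coarse_proxies)
    return coarse_low + tree_high
```

In this toy version the coarse proxies set the century-scale shape of the reconstruction while the tree rings contribute only the year-to-year wiggles, which is exactly the division of labour the paper intends.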

The result is a signal with more long-term variability than, e.g., Mann, Bradley and Hughes (1998; henceforth MBH98). Moberg et al. end up with two “warm peaks” in the smoothed record, around 1000 and 1100 A.D., at 0 ºC on their anomaly scale. A few individual years within these intervals reach almost +0.4 ºC on the same scale. In comparison, the most recent data from the instrumental record, post-1990, peak at +0.6 ºC or a bit more on the same scale. The coldest years of the so-called “Little Ice Age” occur around 1600 and are about -0.7 ºC below average, with individual years down to -1.2 ºC.

These results are bound to stir up interest beyond the scientific community, since the “hockey stick” shape of previous reconstructions has become so totemic (although just about everyone agrees that there is no need for this “totemising”). We hope that press reports about this paper that mention the increased variability will also emphasize the other key result: that there is “no evidence for any earlier periods in the last millennium with warmer conditions than the post-1990 period – in agreement with previous similar studies (1-4,7)”, where (1) is MBH98, (2) is MBH99, and (7) is Mann and Jones ’03. The “News” article in Nature explicitly rejects the idea that this means we’re not causing the current warming. And it quotes statistician Hans von Storch (who has been quite critical of the earlier work): “it does not weaken in any way the hypothesis that recent observed warming is a result mainly of human activity”.

There are a couple of technical concerns with the paper that are worth discussing, and which the compressed format of a Nature letter left the authors little space to address fully.

First, it’s not clear that the use of wavelets has added much to the mix. Moberg et al. use the wavelets to merge the high-frequency data from the tree rings with the low-frequency data from the other, coarser-resolution sources. But that means the low-resolution proxies are doing all the work, and the tree rings merely add a high-frequency fringe that the eye reads as something like error bars.
