The 10 February edition of Nature has a nice paper, “Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data”, by Anders Moberg, D.M. Sonechkin, K. Holmgren, N.M. Datsenko and W. Karlén (doi:10.1038/nature03265). This paper takes a novel approach to the problem of reconstructing past temperatures from paleoclimate proxy data. A key result is a reconstruction showing more century-scale variability in mean Northern Hemisphere temperatures than is shown in previous reconstructions. This result will undoubtedly lead to much discussion and further debate over the validity of previous work. It does not, however, fundamentally change one of the most discussed aspects of that previous work: temperatures since 1990 still appear to be the warmest in the last 2000 years.
Readers of the Feb. 14th, 2005 Wall Street Journal may have gotten the impression that RealClimate is in some way affiliated with an environmental organisation. We wish to stress that although our domain is hosted by Environmental Media Services, and our initial press release was organised for us by Fenton Communications, neither organisation was in any way involved in the initial planning for RealClimate, nor has either ever had any editorial or other control over content. Neither Fenton nor EMS has ever paid any contributor to RealClimate.org any money for any purpose at any time. Nor do they pay our expenses, buy our lunch or contract us to do research. All of these facts have always been made clear to everyone who asked (see for instance: http://www.sciencemag.org/content/vol306/issue5705/netwatch.shtml).
Here’s a curious observation. Some commentators who for years have been vocally decrying the IPCC consensus are lining up to support the ‘Ruddiman’ hypothesis. Bill Ruddiman, a respected paleoceanographer, has recently argued that humans have been altering the levels of important greenhouse gases since the dawn of agriculture (5,000 to 8,000 years ago), and in so doing have prevented a new ice age from establishing itself. This intriguing idea is laid out in a couple of recent papers (Ruddiman, 2003; Ruddiman et al, 2005) and has received a fair degree of media attention (e.g. here and here).
The conference last week in Exeter on “Avoiding Dangerous Climate Change” grew out of a speech by UK Prime Minister Tony Blair. He asked “What level of greenhouse gases in the atmosphere is self-evidently too much?” and “What options do we have to avoid such levels?”. The first question is very interesting, but also very difficult. As Roger Pielke has noted, the conference organisers actually chose three “key questions”:
For different levels of climate change what are the key impacts, for different regions and sectors, and for the world as a whole?
What would such levels of climate change imply in terms of greenhouse gas stabilisation concentrations and emission pathways required to achieve such levels?
What technological options are there for achieving stabilisation of greenhouse gases at different stabilisation concentrations in the atmosphere, taking into account costs and uncertainties?
It is worth thinking about the difference between the initial aim and the “key questions” chosen. Question 1 is essentially IPCC WGII (impacts); question 2 is firmly WGI (how-much-climate-change); question 3 is fairly WGIII (mitigation, including technical options). I guess they switched questions 1 and 2 round to avoid making the identification too obvious. The conference steering committee report makes it very clear that they are building on the IPCC TAR foundation.