


AGU Days 3&4

Filed under: — group @ 9 December 2011

(Day 1) (Day 2)

Sorry for the slow blogging, but with the AGU fun run starting at 6:15am, the awards ending at around 10pm, and the actual science portion of the day squeezed in between, little time was available on Wednesday for reporting. Thursday seemed equally busy, so today you get two days in one.

One session on Wednesday that was really quite good was the session on Earth System Sensitivity. We’ve discussed this before (notably in discussing Hansen’s Target CO2 paper). The main idea is that the response of the climate system to a radiative forcing is not limited to the feedbacks that were included in GCMs in 1979. That is, other feedbacks come into play: vegetation, ice sheets, aerosols, CH4 etc. will all change as a function of warming (or cooling), and these are not included in the standard definition of climate sensitivity. Talks by Eelco Rohling, Dan Lunt, and Jim Hansen all made excellent points on how one should think about constraints on ESS from paleo-climate records. The periods considered were mainly the Pleistocene ice age cycles, the LGM and the Pliocene, but Paul Valdes presented some interesting modelling that also included the Oligocene, the Turonian, the Maastrichtian and the Eocene, indicating the importance of the underlying continental configuration, ice sheet position, and ocean circulation for sensitivity. Vegetation feedbacks were invariably reported as amplifying – which is interesting because that encompasses both ‘fast’ and ‘slow’ feedbacks.
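To make the distinction concrete, here is a minimal numerical sketch of why slow feedbacks raise the Earth System Sensitivity above the fast-feedback (‘Charney’) sensitivity. The gain values are assumed, purely for illustration; they are not numbers from any of the talks.

    # Illustrative feedback-gain arithmetic (assumed values, not from the session)
    S_ref = 1.2    # no-feedback (Planck-only) warming for 2xCO2, deg C
    g_fast = 0.6   # combined gain from fast feedbacks (water vapour, clouds, sea ice)
    g_slow = 0.15  # assumed extra gain from slow feedbacks (ice sheets, vegetation, CH4)

    S_charney = S_ref / (1 - g_fast)           # ~3.0 deg C per doubling
    S_ess = S_ref / (1 - g_fast - g_slow)      # ~4.8 deg C per doubling
    print(f"Charney sensitivity ~{S_charney:.1f} C, ESS ~{S_ess:.1f} C")

The point is simply that the same forcing produces more eventual warming once the slow components are allowed to respond; how much more depends strongly on the base climate state, which is what the Valdes modelling emphasized.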

Wednesday night was the awards, and as we reported, one of us (Gavin) was presented with the inaugural prize for Climate Communication. He will be posting a specific piece on this honor in a couple of days.

Thursday, there was a keynote (video available here) from Ben Santer at the Stephen Schneider event, who persuasively argued that in doing the science necessary to refute baseless claims made in the media and in front of Congress, actual progress can be made beyond simply demonstrating that the original claim was made up. Specifically, he addressed a claim made by Will Happer, a Princeton professor, that no models demonstrate decadal variability in trends (which was not the case), and explored the signal-to-noise ratio in determining climate trends much more comprehensively than had been done previously.
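For readers unfamiliar with the signal-to-noise framing, here is a toy sketch of the basic idea. This is not Santer’s analysis; the noise model and numbers are assumed purely for illustration. A fitted trend is compared to the spread of trends that internal variability alone can generate over the same period.

    import numpy as np

    rng = np.random.default_rng(0)

    def ar1_series(n, phi=0.6, sigma=0.1):
        """Generate AR(1) 'internal variability' noise (assumed parameters)."""
        x = np.zeros(n)
        for i in range(1, n):
            x[i] = phi * x[i - 1] + rng.normal(0.0, sigma)
        return x

    def trend(y):
        """Least-squares linear trend per time step."""
        return np.polyfit(np.arange(len(y)), y, 1)[0]

    n = 120                              # ten years of monthly data
    forced = 0.2 / 120 * np.arange(n)    # hypothetical 0.2 C/decade forced signal
    obs = forced + ar1_series(n)         # 'observations' = signal + noise

    # Null distribution: trends from noise-only realizations of the same length
    null = np.array([trend(ar1_series(n)) for _ in range(2000)])
    print(f"trend signal-to-noise over 10 years: {trend(obs) / null.std():.2f}")

Over a single decade the ratio is typically small, which is why short-term trends on their own are a weak test of the models.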

In sessions, there were a lot of papers on new approaches to estimating the climate of the common era (since 0 AD) – many of them using Bayesian methods of one sort or another. Hugues Goosse gave an interesting talk on paleo-data assimilation. A poster session had some first results from the CMIP5 models – including some intriguing results from Ben Booth looking at the Hadley Centre simulations of the role of aerosols in forcing multi-decadal variability in the North Atlantic.
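As a flavour of what ‘paleo-data assimilation’ means in its simplest form, here is a one-variable sketch: blend a model prior with a proxy estimate, weighting each by its uncertainty. The numbers are assumed; the real schemes use ensembles of model states and proxy forward models.

    # One-variable Kalman-style update (illustrative numbers only)
    prior_T, prior_var = 0.2, 0.25    # model-simulated temperature anomaly and its variance
    proxy_T, proxy_var = -0.1, 0.16   # proxy-derived anomaly and its error variance

    K = prior_var / (prior_var + proxy_var)   # gain: how much to trust the proxy
    post_T = prior_T + K * (proxy_T - prior_T)
    post_var = (1 - K) * prior_var
    print(f"posterior anomaly: {post_T:.2f} +/- {post_var ** 0.5:.2f} C")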

Many of the lectures earlier this week are now available on demand. We hear that the Charney lecture from Graeme Stephens was particularly good.

(Day 5 and wrap up)

AGU 2011: Day 2

Filed under: — group @ 7 December 2011

(Day 1)

Tuesday


There were two interesting themes in the solar sessions this morning. The first was a really positive story about how instrumental differences between rival (and highly competitive) teams can get resolved. This refers to the calibration of measurements of the Total Solar Irradiance (TSI). As is relatively well known, the different satellite instruments over the last 30 or so years have shown good coherence in their variability (especially over the solar cycle), but have differed markedly on the absolute value of the TSI (see the figure). In particular, four currently flying instruments (SORCE, ACRIM3, VIRGO and PREMOS) had offsets as large as 5 W/m2. However, the development of a test facility at the University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado (an effort led by Greg Kopp’s group) has allowed people to test their instruments in a vacuum, with light levels comparable to the solar irradiance, and have the results compared to really high-precision measurements. This was a tremendous technical challenge, but as Kopp stated, getting everyone on board was perhaps a larger social challenge.

The facility has enabled the different instrument teams to calibrate their instruments and check for uncorrected errors, such as excess scattered and diffuse light contamination in the measurement chambers. In doing so, Richard Willson of the ACRIM group reported that they found higher levels of scattering than they had anticipated, which was leading to slightly excessive readings. Combined with a full implementation of an annually varying temperature correction, their latest processed data product has reduced the discrepancy with the TIM instrument from over 5 W/m2 to less than 0.5 W/m2 – a huge improvement. The new PREMOS instrument onboard Picard, a French satellite, was also tested before launch last year, and its calibration was improved as well – the data reported were also very close to the SORCE/TIM data: around 1361 W/m2 at solar minimum.
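To put those numbers in perspective (a rough calculation, not one from the talks): a 5 W/m2 offset is only a few tenths of a percent of the ~1361 W/m2 signal, which is part of why uncorrected scattered light is so hard to spot without a facility like the one at LASP.

    tsi = 1361.0                 # approximate TIM value at solar minimum, W/m2
    for offset in (5.0, 0.5):    # spread among instruments before and after the corrections
        print(f"{offset} W/m2 is {offset / tsi:.2%} of the total signal")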

The errors uncovered and the uncertainties reduced through this process were a great testament to the desire of everyone concerned to work towards finding the right answer, despite initial assumptions about who might have had the best design. The lesson is that space-borne instrumentation is hard to do, and thinking of everything that might go wrong is a real challenge.

The other theme was the discussion of spectral irradiance changes – specifically, how much larger the UV changes over a solar cycle are than the changes in the total irradiance. The SIM/SOLSTICE instruments on SORCE have reported much larger UV changes than previous estimates, and this has been widely questioned (see here for a previous discussion). The reason for the unease is that the UV instruments suffer a very large degradation of their signal over time, and the residual trends are quite sensitive to the large corrections that need to be made. Jerry Harder discussed those corrections and defended the published SIM data, while another speaker made clear how anomalous those data are. Meanwhile, some climate modellers are already using the SIM data to see whether it improves the model simulations of ozone and temperature responses in the stratosphere. However, the ‘observed’ data on this are themselves somewhat uncertain – for instance, comparing the SAGE results (reported in Gray et al, 2010) with the SABER results (Merkel et al, 2011) shows a big difference in how large the ozone response is. So this remains a bit of a stumper.

The afternoon session on water isotopes in precipitation was quite exciting because of the number of people looking at innovative proxy archives, including cave records of 18O in calcite or deuterium in leaf waxes, which are extending the coverage (in time and space) of this variable. Even more notable was the number of these presentations that combined their data with interpretations driven by GCMs that include isotope tracers, allowing for more nuanced conclusions. This is an approach that was pioneered decades ago, but has taken a while to come into routine use.

(Days 3&4) (Day 5 and wrap up)

References

  1. L.J. Gray, J. Beer, M. Geller, J.D. Haigh, M. Lockwood, K. Matthes, U. Cubasch, D. Fleitmann, G. Harrison, L. Hood, J. Luterbacher, G.A. Meehl, D. Shindell, B. van Geel, and W. White, "Solar influences on climate", Rev. Geophys., vol. 48, 2010. http://dx.doi.org/10.1029/2009RG000282
  2. A.W. Merkel, J.W. Harder, D.R. Marsh, A.K. Smith, J.M. Fontenla, and T.N. Woods, "The impact of solar spectral irradiance variability on middle atmospheric ozone", Geophys. Res. Lett., vol. 38, 2011. http://dx.doi.org/10.1029/2011GL047561

Ice age constraints on climate sensitivity

Filed under: — group @ 28 November 2011

There is a new paper on Science Express that examines the constraints on climate sensitivity from looking at the last glacial maximum (LGM), around 21,000 years ago (Schmittner et al, 2011) (SEA). The headline number (2.3°C) is a little lower than IPCC’s “best estimate” of 3°C global warming for a doubling of CO2, but within the likely range (2-4.5°C) of the last IPCC report. However, there are reasons to think that the result may well be biased low, and stated with rather more confidence than is warranted given the limitations of the study.
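The basic logic of an LGM-based estimate can be sketched in a few lines. This is not the SEA method, which fits an ensemble of model runs to the reconstructions; the numbers below are round, illustrative values only. The idea is to scale the 2xCO2 forcing by the ratio of LGM cooling to total LGM forcing.

    dT_lgm = -4.0     # assumed global mean LGM cooling, deg C
    dF_lgm = -7.0     # assumed total LGM forcing (ice sheets, greenhouse gases, dust), W/m2
    F_2xCO2 = 3.7     # canonical forcing for a doubling of CO2, W/m2

    S = F_2xCO2 * dT_lgm / dF_lgm
    print(f"implied equilibrium sensitivity: {S:.1f} C per doubling of CO2")

A reconstruction with a smaller LGM cooling, everything else held fixed, gives a proportionally lower sensitivity, which is why the size of the reconstructed cooling is central to the discussion of this result.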


References

  1. A. Schmittner, N.M. Urban, J.D. Shakun, N.M. Mahowald, P.U. Clark, P.J. Bartlein, A.C. Mix, and A. Rosell-Mele, "Climate Sensitivity Estimated from Temperature Reconstructions of the Last Glacial Maximum", Science, vol. 334, pp. 1385-1388, 2011. http://dx.doi.org/10.1126/science.1203513

2000 Years of Sea Level (+updates)

Filed under: — stefan @ 20 June 2011 - (Deutsch)

A group of colleagues have succeeded in producing the first continuous proxy record of sea level for the past 2000 years. According to this reconstruction, 20th-Century sea-level rise on the U.S. Atlantic coast is faster than at any time in the past two millennia.

Good data on past sea levels are hard to come by. Reconstructing the huge rise at the end of the last glacial (120 meters) is not too bad, because a few meters of uncertainty in sea level or a few centuries in dating don’t matter all that much. But to trace the subtle variations of the last millennia requires more precise methods.


Unforced variations: Apr 2011

Filed under: — group @ 1 April 2011

This month’s open thread. There are some items of potential interest:

or whatever you like.

