

AGU 2011: Day 2

Filed under: — group @ 7 December 2011

(Day 1)

Tuesday


There were two interesting themes in the solar sessions this morning. The first was a really positive story about how instrumental differences between rival (and highly competitive) teams can get resolved. This refers to the calibration of measurements of the Total Solar Irradiance (TSI). As is relatively well known, the different satellite instruments over the last 30 or so years have shown good coherence in their variability – especially over the solar cycle – but have differed markedly on the absolute value of the TSI (see the figure). In particular, four currently flying instruments (SORCE, ACRIM3, VIRGO and PREMOS) had offsets as large as 5 W/m2. However, the development of a test facility at the University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado – an effort led by Greg Kopp’s group – has allowed people to test their instruments in a vacuum, at light levels comparable to the solar irradiance, and have the results compared against very high precision measurements. This was a tremendous technical challenge, but as Kopp stated, getting everyone on board was perhaps a larger social challenge.

The facility has enabled the different instrument teams to calibrate their instruments and check for uncorrected errors, like excessive scattering and diffuse light contamination in the measurement chambers. In doing so, Richard Willson of the ACRIM group reported that they found higher levels of scattering than they had anticipated, which was leading to slightly inflated readings. Combined with a full implementation of an annually varying temperature correction, their latest processed data product has reduced the discrepancy with the TIM instrument from over 5 W/m2 to less than 0.5 W/m2 – a huge improvement. The new PREMOS instrument on board Picard, a French satellite, was also tested before launch last year, and its calibration was improved as well – and the data it reported was also very close to the SORCE/TIM data: around 1361 W/m2 at solar minimum.

The errors uncovered and the uncertainties reduced through this process were a great testament to the desire of everyone concerned to work towards finding the right answer – despite initial assumptions about who might have had the best design. The lesson is that space-borne instrumentation is hard to do, and thinking of everything that might go wrong is a real challenge.

The other theme was the discussion of spectral irradiance changes – specifically, how much larger the UV changes over a solar cycle are than the changes in the total irradiance. The SIM/SOLSTICE instruments on SORCE have reported much larger UV changes than previous estimates, and this has been widely questioned (see here for a previous discussion). The reason for the unease is that the UV instruments suffer a very large degradation of their signal over time, and the residual trends are quite sensitive to the large corrections that need to be made. Jerry Harder discussed those corrections and defended the published SIM data, while another speaker made clear how anomalous those data are. Meanwhile, some climate modellers are already using the SIM data to see whether it improves the model simulations of ozone and temperature responses in the stratosphere. However, the ‘observed’ data on this is itself somewhat uncertain – for instance, comparing the SAGE results (reported in Gray et al., 2010) with the SABER results (Merkel et al., 2011) shows a big difference in how large the ozone response is. So this remains a bit of a stumper.

The afternoon sessions on water isotopes in precipitation were quite exciting because of the number of people looking at innovative proxy archives, including cave records of 18O in calcite and deuterium in leaf waxes, which are extending the coverage (in time and space) of this variable. Even more notable was the number of these presentations that combined their data with interpretations driven by GCMs that include isotope tracers, allowing for more nuanced conclusions. This is an approach that was pioneered decades ago, but has taken a while to come into routine use.

(Days 3&4) (Day 5 and wrap up)

References

  1. L.J. Gray, J. Beer, M. Geller, J.D. Haigh, M. Lockwood, K. Matthes, U. Cubasch, D. Fleitmann, G. Harrison, L. Hood, J. Luterbacher, G.A. Meehl, D. Shindell, B. van Geel, and W. White, "Solar influences on climate", Rev. Geophys., vol. 48, 2010. http://dx.doi.org/10.1029/2009RG000282
  2. A.W. Merkel, J.W. Harder, D.R. Marsh, A.K. Smith, J.M. Fontenla, and T.N. Woods, "The impact of solar spectral irradiance variability on middle atmospheric ozone", Geophys. Res. Lett., vol. 38, 2011. http://dx.doi.org/10.1029/2011GL047561

AGU 2011: Day 1

Filed under: — group @ 6 December 2011

A number of us are at the big AGU meeting in San Francisco this week (among 20,000 other geophysicists). We will try to provide a daily summary of interesting talks and posters we come across, but obviously this won’t be complete or comprehensive.

Other bloggers are covering the event (Twitter hashtag #AGU11). A small number of the posters are viewable on the AGU website as well.

Monday

Two good general talks this morning – Harry Elderfield gave the Emiliani lecture and started off with a fascinating account of the early exchanges between Harold Urey and Cesare Emiliani on isotope thermometry – and showed that even Nobel Prize winners (Urey, for the discovery of deuterium) are sometimes quite wrong – in this case for insisting that the overall isotope ratio in the ocean could never change. (This talk should become available online here.)

The second general talk was by the author Simon Winchester, who gave an excellent demonstration of how to communicate geology through human stories. He offered a number of vignettes from his latest book about the Atlantic Ocean – including the shipwreck of the Dunedin Star on the ‘Skeleton Coast’ of southern Africa, his time on St Helena, and the fate of his book on the Pacific, which apparently sold only 12 copies… He finished with a mea culpa and a gracious apology to the assembled geophysicists for his rather hurried comments on the Tohoku earthquake disaster that caused some consternation earlier this year. In his defense, he had only 90 minutes to write what he was unaware would be the Newsweek cover story that week.

In the science sessions in the afternoon, there were some good talks related to attributing extreme events, including Marty Hoerling discussing the Moscow heat wave and a very different perspective from the cpdn group in Oxford. It would have been good to have had some actual discussion between the different speakers, but AGU is not conducive to much back and forth because of the very tight scheduling. The Oxford group estimated (based on volunteer computing) that the Russian heat wave was something like 3 times more likely with the 2000s background climate than with that of the 1980s. Some good points were made about the non-Gaussian nature of the observed distributions and about the semantic challenges in explaining attribution when there are both proximate and ultimate causes. Kerry Emanuel gave an update of his views on hurricane–climate connections.
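To make the ‘3 times more likely’ statement a little more concrete, here is a minimal sketch in Python (with entirely made-up numbers – this is not the cpdn group’s actual method, ensembles, or event threshold) of how a probability ratio, and the related fraction of attributable risk, can be estimated by counting threshold exceedances in two large ensembles of simulations:

    # Illustrative event-attribution calculation: compare how often a heat-wave
    # threshold is exceeded in simulations of the 1980s climate vs the 2000s climate.
    # All numbers below are hypothetical placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical July-mean temperature anomalies (deg C) from two large ensembles;
    # in practice these would come from the volunteer-computed model runs.
    ens_1980s = rng.normal(loc=0.0, scale=1.5, size=10000)
    ens_2000s = rng.normal(loc=0.8, scale=1.5, size=10000)

    threshold = 3.0  # illustrative heat-wave threshold (deg C anomaly)

    p1 = np.mean(ens_1980s > threshold)  # exceedance probability, 1980s climate
    p2 = np.mean(ens_2000s > threshold)  # exceedance probability, 2000s climate

    probability_ratio = p2 / p1          # how much more likely the event has become
    far = 1.0 - p1 / p2                  # fraction of attributable risk

    print(f"P(1980s) = {p1:.4f}, P(2000s) = {p2:.4f}")
    print(f"Probability ratio = {probability_ratio:.1f}, FAR = {far:.2f}")

With these illustrative numbers the probability ratio comes out close to 3; any real estimate of course depends on the model ensembles used, the event definition, and the chosen threshold.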

In the session next door, there was an interesting discussion on the philosophy of climate modelling (from actual philosophers!) and on the strategies that need to be adopted in dealing with the multi-model ensembles of CMIP3 and CMIP5.

(Day 2) (Days 3&4) (Day 5 and wrap up)

Two-year old turkey

Filed under: — gavin @ 22 November 2011

The blogosphere is abuzz with the appearance of a second tranche of the emails stolen from CRU just before Thanksgiving in 2009. Our original commentary is still available of course (CRU Hack, CRU Hack: Context, etc.), and very little appears to be new in this batch. Indeed, even the out-of-context quotes aren’t that exciting, and they are even less so in context.

A couple of differences this time around are worth noting: the hacker was much more careful to cover their tracks in the zip file they produced – all the file dates are artificially set to Jan 1 2011, for instance – and they didn’t bother to hack into the RealClimate server this time either. Hopefully they have left some trails that the police can trace a little more successfully than they have managed so far with the previous release.

But the timing of this release is strange. Presumably it is related to the upcoming Durban talks, but it really doesn’t look like there is anything worth derailing there at all. Indeed, this might even increase interest! A second release would have been far more effective a few weeks after the first – before the inquiries and while people still had genuine questions. Now, it just seems a little forced, and perhaps a symptom of the hacker’s frustration that nothing much has come of it all and that the media and the conversation have moved on.

If anyone has any questions about anything they see that seems interesting, let us know in the comments and we’ll see if we can provide some context. We anticipate normal service will be resumed shortly.

Conference conversations

Rasmus & Gavin

The reason scientists like going to conferences (despite their often being held in stuffy hotel basements) is the conversations. People can be found who know what they are talking about, and discussions can be focused clearly on what is important, rather than what is trivial. The atmosphere at these conferences is a mix of excitement and expectation, as well as the pleasure of seeing old friends and colleagues.

The two of us just got back from the excellent ‘Open Science Conference’ organised by the World Climate Research Programme (WCRP) in Denver, Colorado. More than 1900 scientists participated from 86 different countries, and the speakers included the biggest names in climate research and many past and present IPCC authors.

Open Science Conference


The Climate Data Guide

Filed under: — Jim @ 30 October 2011

The National Center for Atmospheric Research (NCAR) has, in the last few months, developed an interesting and potentially very useful website, The Climate Data Guide, devoted to the ins and outs of obtaining and analyzing the various existing climate data sets. The site describes itself as “…a focal point for expert-user guidance, commentary, and questions on the strengths and limitations of selected observational data sets and their applicability to model evaluations.”

There are already many climate data set websites in existence, and lists of links to them, including at this site. Some of them host the actual data, while others provide various statistical analysis or graphing/visualization tools, all of which are helpful. What makes this new site unique is that: (1) expert users contribute pages describing and pointing to various existing data sources within certain topic areas, (2) it explains various existing data formats, gridding approaches, etc., (3) it hosts an online discussion forum dealing with the appropriateness of particular data sets for addressing particular scientific questions, and (4) it has a news section as well as links to a very wide range of data repositories, among other things. Here, for example, is the page summarizing the existing reanalysis data sets.

The site, sponsored by the NSF, appears to be a unique and valuable approach to advancing climate data analysis. We encourage everyone to check it out, register as members as appropriate, etc. This would also be a good place to discuss or point to other useful data- and analysis-oriented sites that are out there.

