The long-awaited first paper from the CERN/CLOUD project has just been published in Nature. The paper, by Kirkby et al, describes changes in aerosol nucleation as a function of increasing sulphates, ammonia and ionisation in the CERN-based ‘CLOUD’ chamber. Perhaps surprisingly, the key innovation in this experimental set-up is not the presence of the controllable ionisation source (from the Proton Synchrotron accelerator), but rather the state-of-the-art instrumentation of the chamber, which has allowed the team to see in unprecedented detail what is going on in the aerosol nucleation process (at least according to a couple of aerosol scientists I’ve spoken to about this).
This morning one of the most important (and most delayed) satellite launches in ages took place. The mission was to launch the Glory satellite into a polar orbit, where three key instruments would have been looking at solar irradiance, aerosols and clouds. Unfortunately, one of the stages failed to separate and the satellite did not make orbit.
The irradiance measurements were to be an important continuation of the SORCE mission results, and are needed to stably continue the Total Solar Irradiance (TSI) timeseries. However the big new measurements were those associated with the Aerosol Polarimeter Sensor (APS). A similar instrument has flown in space twice before (the French-developed POLDER instrument), but unfortunately only for short periods. Its uniqueness lies in its ability to detect aerosols over bright surfaces (like land), and more importantly, to distinguish what kind of aerosols it is seeing. (Update: There is a third POLDER instrument, PARASOL, that is currently in orbit, see comments).
It may seem surprising, but despite many different attempts, almost all remote sensing of aerosols from space is only capable of detecting the total optical depth of all aerosols. MISR can provide some discrimination in special cases (picking out dust via a retrieval of non-spherical particles, or using the single scattering albedo to distinguish black carbon), but overall the estimates mix up sulphates, dust, black carbon, sea salt, nitrates and secondary organics. These originate from different processes, have different properties, and have different impacts on both radiation and clouds. Sea salt comes from sea spray over the oceans, dust from dry desert areas, black carbon from the burning of forests and fossil fuels, sulphates from ocean plankton and coal burning, nitrates from fertiliser use, car exhausts and lightning, and secondary organics from the stew of volatile organic compounds from industrial and natural sources alike. There are also pollen grains, fat particles from outdoor cooking, and so on.
Because we can’t easily distinguish what’s what from space, we don’t have good global coverage of exactly how much of the aerosol is anthropogenic and how much is natural. That uncertainty is a big player in the overall uncertainty in the human-caused aerosol radiative forcing. Similarly, we have not been able to tell how much of the aerosol is capable of interacting with liquid or ice clouds (which depends on the different aerosols’ affinity for water), and that impacts our assessment of the aerosol indirect effect. These uncertainties are reflected in the model simulations of aerosol concentrations, which all show similar total amounts but have very different partitions among the different types.
The APS technology is a big step forward on these issues. It turns out that while the reflected SW from many different aerosols is similar, the polarisation of that reflected light depends quite strongly on what kind of aerosol it is. This varies with the geometry of the scattering, so by scanning through viewing angles and measuring the polarisation, we can get a better constraint on the distribution of key aerosols. Scientists have already been working with aircraft-mounted versions of the instrument, and this will continue.
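As a toy illustration of the angle-dependence being exploited (this is not the APS retrieval itself), the degree of linear polarization for pure Rayleigh scattering by molecules follows a simple analytic curve in scattering angle; real aerosols deviate from this curve in ways that depend on particle size, shape and composition, which is exactly the information a multi-angle polarimeter can pick up:

```python
import numpy as np

# Degree of linear polarization (DoLP) for single Rayleigh scattering:
#   P(theta) = sin^2(theta) / (1 + cos^2(theta))
# This peaks at a 90-degree scattering angle. Aerosol scattering departs
# from this molecular baseline in type-dependent ways, which is what a
# multi-angle polarimeter like APS is designed to measure.
def rayleigh_dolp(theta_deg):
    t = np.radians(theta_deg)
    return np.sin(t) ** 2 / (1.0 + np.cos(t) ** 2)

angles = np.array([30, 60, 90, 120, 150])
for a, p in zip(angles, rayleigh_dolp(angles)):
    print(f"scattering angle {a:3d} deg -> DoLP = {p:.3f}")
```

The point of the sketch is just that polarization carries strong angular structure: sampling it at several angles gives independent constraints that total-intensity (optical depth) measurements alone do not provide.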
The story of how this launch actually happened is very long and twisted, and needless to say, has taken far longer than anyone envisaged at the start (over a decade ago). With the failure to make orbit this morning, the wait will unfortunately go on.
This is of course a huge setback for the mission team (many of whom I know), and I can only imagine how frustrating this must be. The loss of OCO two years ago was due to a similar problem, though 3 launches since then have been successful (and the same system is being replicated as OCO-2). With the postponement of CLARREO in the proposed 2012 budget, there is a huge hole building in the US contribution to Earth and Sun observing systems.
Working from space is hard, expensive and risky. We cannot take it for granted, and yet we need that information more than ever.
A little behind schedule, I finally found time to read the article in the July 2010 edition of Physics Today, “Touring the atmosphere aboard the A-Train” by Tristan S. L’Ecuyer and Jonathan H. Jiang. I think this article is a worthwhile read, telling a fascinating story about how new satellite missions lead to greater understanding of our climate system.
Global warming is turning 35! Not only has the current spate of global warming been going on for about 35 years now, but also the term “global warming” will have its 35th anniversary next week. On 8 August 1975, Wally Broecker published his paper “Are we on the brink of a pronounced global warming?” in the journal Science. That appears to be the first use of the term “global warming” in the scientific literature (at least it’s the first of over 10,000 papers for this search term according to the ISI database of journal articles).
In this paper, Broecker correctly predicted “that the present cooling trend will, within a decade or so, give way to a pronounced warming induced by carbon dioxide”, and that “by early in the next century [carbon dioxide] will have driven the mean planetary temperature beyond the limits experienced during the last 1000 years”. He predicted an overall 20th Century global warming of 0.8ºC due to CO2 and worried about the consequences for agriculture and sea level.
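For a back-of-envelope sense of how such a projection can be made (this is not Broecker’s actual 1975 calculation, which also involved a natural cycle), one can combine the standard logarithmic approximation for CO2 radiative forcing with an assumed climate sensitivity parameter; the 280 ppm pre-industrial baseline and the 0.5 K/(W/m²) sensitivity below are illustrative assumptions:

```python
import math

# Back-of-envelope CO2 warming estimate (NOT Broecker's 1975 model).
# Standard logarithmic forcing approximation: dF = 5.35 * ln(C/C0) in W/m^2,
# with C0 = 280 ppm taken as the pre-industrial concentration.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Warming for an assumed sensitivity parameter in K per (W/m^2);
# 0.5 corresponds to roughly 1.9 K per CO2 doubling, on the low side
# of modern estimates, chosen here purely for illustration.
def warming(c_ppm, sensitivity=0.5):
    return sensitivity * co2_forcing(c_ppm)

print(f"forcing at 370 ppm: {co2_forcing(370.0):.2f} W/m^2")
print(f"warming estimate:   {warming(370.0):.2f} K")
```

With a CO2 concentration around 370 ppm (roughly the year-2000 value), this kind of crude estimate lands within shouting distance of Broecker’s 0.8ºC figure, which is part of why his prediction holds up so well in hindsight.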
Guest commentary by Alan Robock – Rutgers University
Bjorn Lomborg’s Climate Consensus Center just released an un-refereed report on geoengineering, An Analysis of Climate Engineering as a Response to Global Warming, by J Eric Bickel and Lee Lane. The “consensus” in the title of Lomborg’s center is based on a meeting of 50 economists last year. The problem with allowing economists to decide the proper response of society to global warming is that they base their analysis only on their own quantifications of the costs and benefits of different strategies. In this report, discussed below, they simply omit the costs of many of the potential negative side effects of producing a stratospheric aerosol cloud to block sunlight, or of brightening marine clouds, and come to the conclusion that these strategies have a benefit/cost ratio of 25–5000 to 1. That the second author works for the American Enterprise Institute, a lobbying group that has been a leading global warming denier, is not surprising, except that now they are in favor of a solution to a problem they have claimed for years does not exist.