RealClimate

Comments


  1. But the world is warming overall and surely that is a statistical fact that cannot be ignored?

    Comment by pete best — 4 Oct 2006 @ 11:28 AM

  2. Very very helpful, as always, Gavin.

    And also conveys why the evidence for human-forced warming does not fit easily into the black-and-white framework that the media, and society more generally, try to apply to it.

    Comment by Andy Revkin — 4 Oct 2006 @ 11:31 AM

  3. I think it would help to include discussion of methane and CO2 feedbacks (from thawing permafrost) and of reduced CO2 absorption (by more acidic, warmer oceans) – in relation to the bit of context you presented above.

    Comment by Pat Neuman — 4 Oct 2006 @ 11:41 AM

  4. It does seem that the forcings and the temperature rise are occurring at the same time. It seems to me that you are suggesting that, since the evidence is ambiguous (in your view), we should continue the experiment of increasing greenhouse gases to see if the global temperature will continue to rise, or at least give us more concrete evidence.
    I would like to suggest a new experiment: reduce the amount of suspected greenhouse gases and see if the temperature of the globe starts to fall, or at least stabilize. Embarking on such an experiment might have some very interesting social and technological side effects.

    Comment by ron jeffers — 4 Oct 2006 @ 12:24 PM

  5. Sure, there may be several different gases or concoctions of gases causing the effect of global warming. But isn’t the evidence clear that we are currently experiencing an unprecedented level of temperature increase, and that it has risen in line with CO2, which is generated by record population levels and demands for energy? Given the lack of other evidence for what is causing this effect, and given common sense, it isn’t hard to surmise that if we continue to live as we do, and if the world population continues growing as it is now, we WILL run out of resources, assuming our own wastes don’t kill us first. If we want to avoid that, we had better address these issues now, or as many of them as we believe we can.

    Comment by Kenneth Ng — 4 Oct 2006 @ 12:38 PM

  6. I recently read somewhere (I thought ClimateArk, but can’t find it) that according to some study atmospheric methane is now found to be on the increase & had been suppressed for some decades by some factors, but now is not only increasing as expected, but even more. That’s what my faulty memory still has.

    And my understanding is that even if methane is short-lived in the atmosphere, it degrades into CO2 (is that right?), which is long-lived in the atmosphere — is that included in your “indirect effects” re methane?

    Then also, if methane (a much more potent GHG) has a short life of up to 10 yrs, the fact that the methane releases & increases are really rapid (geologically speaking) might be something important to consider… which David Archer mentioned a couple of years ago on this site. We may not just be repeating some past GW extinction-level epoch that took hundreds or thousands of years to build up & play out — it seems we’re doing it with a vengeance, as fast as we can. This may be new territory for planet earth, with the (geologically speaking) rapidity being a new factor.

    I’m getting the sense that, as science reports slowly & meticulously grind out, the overall thrust continues to be “It’s worse than we earlier thought.”

    Comment by Lynn Vincentnathan — 4 Oct 2006 @ 2:08 PM

  7. Gavin- You dismissed the analysis that I did too cavalierly in your text

    “Recently, Roger Pielke Sr. came up with a (rather improbably precise) value of 26.5% for the CO2 contribution. This was predicated on the enhanced methane forcing mentioned above (though he didn’t remove the ozone effect, which was inconsistent), an unjustified downgrading of the CO2 forcing (from 1.4 to 1.1 W/m2), the addition of an estimated albedo change from remote sensing (but there is no way to assess whether that was either already included (due to aerosol effects), or is a feedback rather than a forcing). A more appropriate re-calculation would give numbers more like those discussed above (i.e. around 30 to 40%)”

    The number I presented was just the result of an assessment of the relative magnitudes of the radiative forcings that are in the literature; it is these values of radiative forcing that you should comment on (such as tropospheric black carbon; black carbon deposition). I am certainly comfortable with rounding to the nearest value of ten, which is 30%, which is within the range that you present in your weblog! Thus your posting is actually supportive of the short analysis that I completed, but the sense of your post is otherwise.

    I invite interested Real Climate readers to read the several postings on Climate Science on this issue; e.g.

    http://climatesci.atmos.colostate.edu/2006/04/27/what-fraction-of-global-warming-is-due-to-the-radiative-forcing-of-increased-atmospheric-concentrations-of-co2/

    http://climatesci.atmos.colostate.edu/2006/05/05/co2h2o/

    http://climatesci.atmos.colostate.edu/2006/05/10/more-on-the-relative-importance-of-the-radiative-forcing-of-co2/

    Your weblog is also incomplete in focusing on just CO2 as a long-lived climate forcing. Land use/land cover change and several of the aerosol effects (e.g. nitrogen deposition) are also long-term forcings of the climate system; see

    http://climatesci.atmos.colostate.edu/2006/06/27/lags-in-the-climate-system/

    [Response: Hi Roger, Thanks for the comment. The point of the post was to point out how arbitrary some of these percentages are (including yours), not to support one or the other. But related to your specific calculation, I maintain that you misread the Shindell et al paper (on which I was a co-author) - it does not give an increase in methane forcing alone. Instead, it restates the combined forcing of methane and ozone as a forcing by methane + CO and NOx, and since the (negative) NOx term is small (0.1 W/m2), it makes much less of a difference than you conclude. Additionally, your downgrading of the forcing by CO2 was simply wrong. That number (1.4 W/m2 in 2001, 1.5 now) is independently derived and shouldn't be changed to compensate for an increase in methane. I don't think we disagree on the importance of including all 20th Century forcings, and as the physics of new effects becomes better quantified they should be included as well. - gavin]
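    To see how sensitive these single-number attributions are to bookkeeping choices, here is a minimal sketch in Python. The forcing values are illustrative round numbers in the spirit of those quoted in this thread (CO2 near 1.5 W/m2, well-mixed GHGs summing to about 2.4 W/m2); the remaining entries are hypothetical placeholders, not assessed values.

    ```python
    # Minimal sketch: the "percentage due to CO2" depends on what goes in the
    # denominator.  Values are illustrative only (CO2 and the well-mixed GHG
    # total follow the numbers quoted above; the rest are hypothetical).
    forcings = {
        "CO2": 1.5,                    # W/m2
        "other_well_mixed_GHGs": 0.9,  # W/m2 (so well-mixed GHGs total ~2.4)
        "other_positive": 0.5,         # W/m2, hypothetical (e.g. ozone, black carbon)
        "aerosols": -1.0,              # W/m2, hypothetical net negative term
    }

    co2 = forcings["CO2"]
    positive_total = sum(v for v in forcings.values() if v > 0)
    ghg_total = co2 + forcings["other_well_mixed_GHGs"]
    net_total = sum(forcings.values())

    print(f"CO2 / all positive forcings: {100 * co2 / positive_total:.0f}%")
    print(f"CO2 / well-mixed GHGs only : {100 * co2 / ghg_total:.0f}%")
    print(f"CO2 / net total forcing    : {100 * co2 / net_total:.0f}%")
    ```

    With identical inputs, the CO2 share ranges from roughly 50% to nearly 80% depending only on the choice of denominator, which is the sense in which the percentages discussed here are arbitrary.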

    Comment by Roger Pielke Sr — 4 Oct 2006 @ 2:22 PM

  8. It is very difficult to find real skeptics, that is, people who have serious questions regarding the science involved and who are really interested in the issue. 95%+ are simply deniers who keep the focus firmly on basic science that is totally not under debate. When you do try to point out the science, they merely focus on all the “uncertainty” in the conclusions.

    However, they are helpful in that by taking up some extreme points of view (The Royal Society is Stalinist) they push more “moderate” skeptics closer to the science behind the theory of climate change.

    Comment by Mark UK — 4 Oct 2006 @ 3:42 PM

  9. re #3 and #6: The rate of increase of atmospheric methane has leveled off. This is not what one would expect from increased melting of permafrost.

    Comment by David C. Greene — 4 Oct 2006 @ 3:53 PM

  10. Can anyone point me to an explanation for the apparent leveling of the increase in global temperature from 1940 to about 1970?

    Thanks

    Comment by Mark Zimmerman — 4 Oct 2006 @ 4:27 PM

  11. Well, you’ve found a skeptic, although not a denier (that’s a metric).

    I have a problem with the grammar of your statement “global mean surface temperatures”.

    Literally, this means that the average temperature at every point on the earth is rising, which is untrue. You should say something like “The mean temperature of the whole earth”, but I have a problem with that, too.

    Temperature is a poor measure of the energy in a body; most materials, like water and air, have a non-linear relationship between temperature and energy, so I see no basis for an application of the arithmetic mean without further qualification about the distribution of measurements.

    This may seem a pedantic point, but my information is that oxygen disobeys Boyle’s law by about 1 part per thousand over a pressure range of 1 to 2 atm, and the temperature change you are calculating is about 0.6 degrees above 15 degrees Celsius, or 288 kelvin, which is about 2 parts per thousand, so I would welcome your comments on the effects of the non-linearities of the atmosphere on your models.

    Comment by Tim Hughes — 4 Oct 2006 @ 4:39 PM

  12. #3, #6, #9
    Mystery of Methane Levels in 90’s Seems Solved – New York Times
    A new study suggests that the reason why the atmospheric concentration of methane slowed in the 1990’s was related to the collapse of the Soviet Union.
    http://www.nytimes.com/2006/09/28/science/28methane.html?ref=science

    Comment by Hank Roberts — 4 Oct 2006 @ 4:56 PM

  13. > 1940s-1970s
    Google search:
    http://www.google.com/search?q=global+temperature+1940s+1970s+particulates&start=0

    Example:
    http://www.stanfordreview.org/Archive/Volume_XXXVI/Issue_8/Opinions/opinions1.shtml

    —excerpt—
    “While global temperatures show an upward trend since 1860, dimming and cooling started to outweigh the effects of global warming in the late 1940s. Then starting in 1970 with the Clean Air Act in the United States and similar policies in Europe, atmospheric sulfate aerosols declined significantly. The EPA reports that in the U.S. alone from 1970 to 2005, total emissions of the six principal air pollutants, including PM’s, dropped by 53 percent. In 1975, the masked effects of trapped greenhouse gases finally started to emerge and have dominated ever since.”
    —end excerpt—-

    Comment by Hank Roberts — 4 Oct 2006 @ 7:37 PM

  14. Hi Gavin- Thank you for your reply to #7.

    With respect to your comment on the fractional contribution of methane, I used the information from http://www.pollutiononline.com/content/news/article.asp?DocID=%7B92402192-8574-45C2-8319-32A75F1E8ECE%207D&Bucket=Current+Headlines&VNETCOOKIE=NO that stated,

    “According to new calculations, the impacts of methane on climate warming may be double the standard amount attributed to the gas. The new interpretations reveal methane emissions may account for a third of the climate warming from well-mixed greenhouse gases between the 1750s and today. The IPCC report, which calculates methane’s effects once it exists in the atmosphere, states that methane increases in our atmosphere account for only about one sixth of the total effect of well-mixed greenhouse gases on warming.”

    Were your conclusions misrepresented?

    On the magnitude of the radiative forcing of CO2: if the fraction of the radiative forcing attributed to the well-mixed greenhouse gas methane is increased, and the total radiative forcing of the well-mixed greenhouse gases is unchanged, then the fractional contribution attributed to CO2 must decrease. Are you thus suggesting that the IPCC value for the total of the well-mixed greenhouse gases needs to be increased to a value above 2.4 W/m2 (the IPCC figure is presented, for example, at http://darwin.nap.edu/books/0309095069/html/3.html)?

    Finally, there is also the issue of the time over which these radiative forcings have been increasing. CO2 has been a significant radiative forcing since the industrial period began, while the large increase in the input of black carbon into the troposphere, for example, has been more recent. If this is so, the “global average temperature” has responded (i.e. equilibrated) to a fraction of the CO2 that was input in the earlier years, such that the IPCC figure overstates the current contribution of CO2 to the global average radiative forcing (since that figure presents a difference between preindustrial times and 2000, not the current radiative forcing). In the preparation of our 2005 NRC Report, the estimate in our discussions was that 80% is a reasonable estimate of the CO2 added since preindustrial times which has not yet equilibrated.

    I do agree with the theme of your post on the arbitrary aspect of attributing specific numbers to each of the radiative forcings.

    However, it is an important issue to estimate the fractional contribution to positive radiative forcing due to CO2. If it dominates the other radiative forcings (and other “non-radiative” climate forcings), then policy actions that focus on CO2 make good sense. However, if it is only one of several important radiative forcings, as I summarize on Climate Science (in the hyperlinks to my weblogs given in #7), and if we also need to be concerned about the spatial scales of the radiative forcing, as is presented in

    Matsui, T., and R.A. Pielke Sr., 2006: Measurement-based estimation of the spatial gradient of aerosol radiative forcing. Geophys. Res. Letts., 33, L11813, doi:10.1029/2006GL025974.
    http://blue.atmos.colostate.edu/publications/pdf/R-312.pdf,

    as well as the “non-radiative” forcings as reported in the 2005 NRC Report “Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties”,

    then an emphasis on CO2 alone is an inadequate recommendation for us as scientists to give to policymakers.

    [Response: When in doubt, read the original paper (Shindell et al, 2005) - figure 1 is extremely clear. On the second point, if you change the forcing attributed to one gas, why should the total remain the same? There are no constraints on the total - it's merely the sum of the individual contributions. Why the line-by-line calculation of forcing by CO2 should be affected by our atmospheric chemistry calculation is a little puzzling... IPCC used an abundance-based calculation for the current forcings and that's fine. Our point was that for emissions reductions in the future, it is helpful to know the forcing associated with each emitted component, so that targets can take account of atmospheric chemistry changes too. However, CO2 remains the largest single component and is the one with the largest projected growth, and while there is a lot that can be done to reduce the other forcings, the climate change problem in future is in many ways a CO2 problem. You and I clearly disagree on that, and that's fine; we should however be able to agree on 20th Century forcings. -gavin]
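    A tiny sketch of the arithmetic being argued over here, with illustrative numbers only: revising the methane-related forcing upward leaves the independently calculated CO2 forcing untouched; what changes is only CO2's share of the (now larger) total.

    ```python
    # Illustrative numbers only: CO2 forcing from the thread (~1.5 W/m2); the
    # "other" term stands in for CH4 plus related chemistry before and after
    # a hypothetical upward revision.
    co2 = 1.5  # W/m2, unchanged by any revision of the methane bookkeeping

    for label, other in (("before revision", 0.9), ("after revision", 1.3)):
        total = co2 + other
        print(f"{label}: CO2 = {co2:.1f} W/m2, total = {total:.1f} W/m2, "
              f"CO2 share = {100 * co2 / total:.0f}%")
    ```

    The absolute CO2 forcing is the same in both rows; only the fraction moves, because the total is simply the sum of the parts.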

    Comment by Roger Pielke Sr — 4 Oct 2006 @ 10:24 PM

  15. Re #10

    Hi Mark,

    Can anyone point me to an explanation for the apparent leveling of the increase in global temperature from 1940 to about 1970?

    You might try here:
    http://illconsidered.blogspot.com/2006/03/what-about-mid-century-cooling.html

    Comment by Coby — 5 Oct 2006 @ 12:32 AM

  16. Sorry to veer off-topic, but I have a layman’s question that I thought would be best answered here: according to http://www.nzherald.co.nz/section/story.cfm?c_id=2&ObjectID=10404255 a study by the Hadley Centre warns that global warming will lead to excessive drought. That sounds very counter-intuitive to me: higher temperatures should mean higher rates of evaporation and, eventually, higher rates of precipitation. Is it the case that evaporation will increase primarily over land while precipitation will rise mostly over oceans? If not, what am I missing?

    Comment by WinterIsComing — 5 Oct 2006 @ 2:45 AM

  17. Re: 11

    I thought that the average surface temperature of the planet would be a pretty simple concept. Usually this is measured by shaded thermometers in the air, so you are always measuring the same thing, i.e. the air temperature. Regarding Boyle’s law, the 0.2% difference from the ideal would apply to the actual difference in temperature, i.e. 0.6 K, not the absolute temperature. So the given temperatures are indeed consistent.

    It is true that heat could be going elsewhere (i.e. melting ice, deep ocean heating), but most of these would actually suppress a warming signature. It is hence plausible that the equilibrium response of the planet to the forcings already present is higher than the observed temperature increase.

    Comment by Andrew Dodds — 5 Oct 2006 @ 3:11 AM

  18. This is a very important site, because it helps one to understand what the word “fearmongering” really means. It means more money for more fearmongering studies and proxies.

    Comment by Timo — 5 Oct 2006 @ 4:19 AM

  19. Re #8-

    Technically, all climate scientists are skeptics, and should remain skeptics about the issue until theories are proven. If people accept theories about global warming without background and do not form their own opinion based on a fairly comprehensive study, then they are as bad as simple deniers.

    Theories and opinions formed by the scientific process cannot be substituted with blind faith in any scientific issue.

    [Response: This previous post is relevant to this discussion: http://www.realclimate.org/index.php/archives/2005/12/how-to-be-a-real-sceptic/ -gavin]

    Comment by Jonathan Fairman — 5 Oct 2006 @ 4:40 AM

  20. Thanks for a very clear article Gavin. The black carbon contribution in the Hansen paper is very interesting. Another area where improving air quality (by removing dangerous particles from exhausts) will hopefully have a useful impact in reducing climate change.

    [Response: Agreed. There is a lot of scope, particularly in Asia for a reduction in black carbon emissions to have a strong impact on air quality and radiative forcings. - gavin]

    Comment by James Davey — 5 Oct 2006 @ 4:43 AM

  21. I am not sure I understand the point made in (13)

    But, maybe, I am just having the Barbie Doll moment.

    My understanding of global dimming seems to be a little different from yours. It comes from ARM and James Hansen’s references to global dimming developing from clouds being enhanced and/or formed by pollution, sulfates, and/or certain aerosols, and so forth.

    Because the pollution is there, the clouds form. And there is still a lot of uncertainty about what is known, and unknown, about the matter. A lot.

    ARM’s publication:
    “Global Dimming: A Hot Climate Topic”

    http://google.arm.gov/search?q=cache:DlnfzQyEZmQJ:education.arm.gov/outreach/publications/sgp/jul04.pdf+global+dimming&access=p&output=xml_no_dtd&site=default_collection&ie=UTF-8&client=default_frontend&proxystylesheet=default_frontend&oe=ISO-8859-1

    And haven’t you heard of the Asian brown cloud?

    “The Asian Brown Cloud (ABC)”
    Margaret Hsu and Laura Yee

    http://www.sfuhs.org/features/globalization/asian_cloud/
    “A recent comprehensive United Nations Environmental Program (UNEP) report, released on August 12, 2002, indicates that a 2-mile-thick toxic umbrella dubbed the “Asian Brown Cloud” (ABC) stretches over Afghanistan, Pakistan, Bangladesh, Bhutan, India, Maldives, Nepal, and Sri Lanka, which are among the most densely populated places in the world. “…..”The haze, 80 percent man-made, is composed of a grimy cocktail of toxic ash, black carbon, sulfate, nitrates, acids, and aerosols: tiny solid or liquid particles suspended in the atmosphere. The haze also extends far beyond the study zone of the Indian subcontinent. Scientists say that similar clouds exist over East Asia (especially China), South America and Africa. Asian air pollution is unprecedented and will intensify as population increases and countries like China and India rapidly industrialize.”…..”The blanket of pollution is reducing the amount of solar energy hitting the Earth’s surface by as much as 15 percent. This has a direct effect on agriculture, by infringing on the important process of photosynthesis in plants. In fact, research carried out in India indicated that the haze could be reducing the winter rice harvest by as much as 10 percent. Furthermore, heat is trapped in the lower atmosphere, cooling the Earth’s surface while heating the atmosphere. This combination of surface cooling and lower atmosphere heating appears to alter the winter monsoon, leading to a sharp decrease in rainfall over northwestern parts of Asia and an increase in rainfall along the eastern coast of Asia. “…..”The UNEP report seems to suggest that greenhouse gases warm the Earth’s surface while the aerosol-concentrated haze cools it. If this were true, it would worsen the abnormal temperature differences between places, and further disrupt the global climate. The UN report dealt specifically with the cloud; its relationship with global warming still requires further investigation.”…..

    And to me, it seems the gist of all the real debates boils down to how to dissipate, or draw down, the trapped greenhouse gases.

    Dimming and cooling, in the end, are more a matter of backscatter and/or Brownian motion, and the refractive properties of particles, aerosols, and so forth.

    Meanwhile, here on Earth, we still have the same remaining problem of trapped thermal content that cannot escape Earth’s self-contained system, which is maintained by the greenhouse gases surrounding the Earth. Those gases are said to be increasing in content, and because they are increasing, the thermal kinetic capacity (the global warming potential of certain of those gases) will rise with them.

    For example, gases produce kJ of energy when they are thermally (radiatively) vibrated and collide with each other.

    Just as when two molecules of hydrogen gas and one molecule of oxygen gas violently collide, they produce water. It’s a chemical combustion reaction, an exothermic reaction, producing 572 kJ of energy.

    And can and does man make water? Uh, no. Do they try to seed clouds? Yes. But does man make water? No.

    The said gases naturally collide midair, luckily for us Flintstone head humans.

    (But I have read they would like to make recycled pee and poo water to sell for people to drink, but that hardly counts, that’s beyond cheating.)

    So, really, how to draw down the greenhouse gases while divesting ourselves of their energy content, and/or finding a way to tap into it as an energy source….

    Or on-the-ground technology to contain the fossil fuel GHG by-product emissions before they EVER reach the atmosphere.

    That’s OUR main problem in my opinion (besides the natural resource depletion; but if they could contain it on the ground, they could get to the point of recycling the carbon content at some point in time, and do their thermal cracking with geo-storage just as they do today with petroleum supplies. It’s doable. It’s exactly how they plan to go after shale rock, tar sands, and any oil locked up in rock. The issue is making them do it in an environmentally safe way, under NEPA.)

    It would seem to me, that would be our problem to address, because we need to solve it, and it’s something that could be addressed.

    It matters somewhat little if some clouds, some of the time, in some parts of the world, are mirroring away solar EMF in some small ways, if you think about it in a GLOBAL context.

    However, a similar, radical idea has also been put out, besides by the Nobel Prize winners…. by

    Stratospheric Injections Could Help Cool Earth, Computer Model Shows
    September 14, 2006
    http://www.ucar.edu/news/releases/2006/injections.shtml
    …”Wigley calculates the impact of injecting sulfate particles, or aerosols, every one to four years into the stratosphere in amounts equal to those lofted by the volcanic eruption of Mt. Pinatubo in 1991. If found to be environmentally and technologically viable, such injections could provide a “grace period” of up to 20 years before major cutbacks in greenhouse gas emissions would be required, he concludes. “…

    But… where would you put that information on an actionable scale of things we as people could do to effect a tangible, REAL change?

    It’s a band-aid, it’s radical, and it’s a way to do nothing more than buy time, and we have no REAL data on this.

    We have absolutely no way of knowing what might happen if MAN does this. It’s much different if nature does this, when there isn’t any control over it, versus man. But with such a rapid change in the atmosphere in such a short time span, there are bound to be some repercussions, versus a natural volcanic cycle.

    At least with Mr. Pielke’s call for the addition of land use changes (in a Denver newspaper article, I believe), I can appreciate the information he threw out there, and I agree with him on that line of thinking.

    Land use changes are a significant contribution to albedo changes, as is deforestation, and we have the addition of soil degradation and/or erosion, and so on (the loss of carbon stores).

    Just for a comparison to dimming here, we have a study that goes back 1000 years, relating to sunspots and faculae affecting the brightness of the sun. I’d say it offers, in brightness, a very different view from dimming.

    Foukal, P. et al (2006)
    “Variations in solar luminosity and their effect on the Earth’s climate”
    Nature, Volume 443, Issue 7108, pp. 161-166 (2006).
    http://adsabs.harvard.edu/abs/2006Natur.443..161F

    …” In this Review, we show that detailed analysis of these small output variations has greatly advanced our understanding of solar luminosity change, and this new understanding indicates that brightening of the Sun is unlikely to have had a significant influence on global warming since the seventeenth century.”..

    Trying to find other issues, “masking”, or crutches still isn’t ever going to reduce the numbers on the GHG content we already have, or their future global warming potentials, which continue to compound daily…

    When one thinks about how much we are adding daily, on the fossil fuel and cement emissions side, to the volume of the atmospheric gas content, in the way of any possible combinations of thermochemical equations, it’s concerning.

    Water vapor alone increases your heat index umpteen times. That’s why no one likes humidity.

    I won’t go into geek speak.

    So, our GHG side is always compounding daily, regardless of what is happening on the solar side.

    Whether it’s sunspots, faculae, brightness, or dimming.

    The WMO assessments, if you see the PDF below, do have some uncertainties… depending upon where one lives, but the global dimming trend, according to ARM, has reversed.

    But, regardless, according to the WMO, the ozone layer plays a great role, as does air pollution and aerosols, for any attenuation.

    5/11/05 – ARM Research Helps Identify A Brighter Earth
    …”Based on a decade of surface solar energy measurements, the finding is a reversal of the “dimming” trend previously reported for the 1960s through 1990.”…
    http://www.arm.gov/about/newsarchive_aprjun05.stm

    “Executive Summary, Scientific Assessment of Ozone Depletion: 2006” WMO/UNEP 18 August 2006
    http://www.wmo.int/web/arep/reports/ozone_2006/exec_sum_18aug.pdf

    Excerpted here
    http://www.connotea.org/uri/d4b52caf73b87a8d9974ffba8b82b034

    ..”Measurements from some stations in unpolluted locations indicate that UV irradiance (radiation levels) has been decreasing since the late 1990s, in accordance with observed ozone increases. However, at some Northern Hemisphere stations UV irradiance is still increasing, as a consequence of long-term changes in other factors that also affect UV radiation. Outside polar regions, ozone depletion has been relatively small, hence, in many places, increases in UV due to this depletion are difficult to separate from the increases caused by other factors, such as changes in cloud and aerosol. In some unpolluted locations, especially in the Southern Hemisphere, UV irradiance has been decreasing in recent years, as expected from the observed increases in ozone at those sites. Model calculations incorporating only ozone projections show that cloud-free UV irradiance will continue to decrease, although other UV-influencing factors are likely to change at the same time.”…”The previous (2002) Assessment noted that climate change would influence the future of the ozone layer.”…”Climate change will also influence surface UV radiation through changes induced mainly to clouds and the ability of the Earth’s surface to reflect light. Aerosols and air pollutants are also expected to change in the future. These factors may result in either increases or decreases of surface UV irradiance, through absorption or scattering. As ozone depletion becomes smaller, these factors are likely to dominate future UV radiation levels.”….” Air pollutants may counterbalance the UV radiation increases resulting from ozone depletion. Observations confirm that UV-absorbing air pollutants in the lower troposphere, such as ozone, nitrogen dioxide (NO2) and sulfur dioxide (SO2), attenuate surface UV by up to ~20%. This effect is observed at locations near the emission sources. Air pollution exerts stronger attenuation in UV compared with attenuation in total solar irradiance.”…

    Comment by barbie doll moment — 5 Oct 2006 @ 5:48 AM

  22. Gavin, You ask “But does the specific percentage attribution really imply much for the future? (i.e. does it matter that CO2 forced 40% or 80% of 20th Century change?).” You seem to be making the rhetorical point that the percentage doesn’t matter because it depends on how you define it. However, the point of skeptics is, that no matter how you define it, if the modelers mistakenly attribute 80% to CO2 when it is only responsible for 40%, by whatever definition, then the modelers have their climate sensitivities to this particular forcing wrong, and their projections of future warming are likely to be even more wrong.

    Naive statements such as that by Ng above, “But isn’t the evidence clear that we are currently experiencing an unprecedented level of temperature increase and that has risen in line with CO2,” neglect the research showing that solar activity is also at one of its highest levels in the last 8000 years, and that paleo temperature proxies show stronger correlations with solar activity than modelers can currently explain, correlations that skeptics are unwilling to dismiss, especially just on the basis of the current state of modeling technology.

    Hansen, in his 2005 paper, made a point of his model being able to explain all the modern warming with GHGs. It would have been more interesting to see how much of the warming he could have explained with solar, if he had really tried. Runs that doubled the solar forcing, to account for the uncertainty in that forcing, would have been a start. Then, since the focus of the paper was on the importance of the heat storage in the ocean, a possible next step would be to model what the effects might be if the solar and GHG couplings to the oceans were quite different, which they apparently are. This would not give us a more informative answer about what the relative attribution of the 20th century warming is, but would perhaps give us a range of what it could be, given our current lack of knowledge and understanding. We probably will never be able to use the 20th century data to parameterize our models, because we don’t have the coverage and accuracy we need for much of the century. Hopefully, more refined work with recent and future data, and incorporation of research into the coupling mechanisms themselves, will allow us to validate the model climate sensitivities to the various forcings, and confidently reproduce multidecadal internal climate modes. Perhaps then, since we are curious, we may be able to properly attribute the 20th century and recent warming.

    [Response: ??? Try reading Hansen et al 2005 (and the follow-up submitted paper) - all available at the GISS website. We used I think 14 different forcings individually and together to assess how important each one was. Each one is specified as independently as possible (with uncertainties of course) and run through the model. GHGs alone give more warming than observed, solar is a minor component of that, and reflective aerosols (both direct and indirect effects) and black carbon are shown to be important. If you throw it all in together, it does a good job for the 20th Century. My point in the piece is that attribution is a modelling exercise - there can be no attribution without a model to link cause and effect. Your claim that we could have attributed everything to solar 'if we tried' is just ridiculous. We use the best forcing time series that we can get from the rest of the community, and we can't arbitrarily scale them to get your preferred response - that would indeed be pointless. - gavin]

    Comment by Martin Lewitt — 5 Oct 2006 @ 7:17 AM

  23. Gavin,

    This is slightly off-topic, but I wonder if RC might consider a post on the connection between air quality and climate change (e.g. the GHG impact of tropospheric ozone).

    [Response: See Loretta Mickley's guest piece: http://www.realclimate.org/index.php/archives/2005/04/pollution-climate-connections/ -gavin]

    Comment by Marlowe Johnson — 5 Oct 2006 @ 10:46 AM

  24. Gavin, thanks again for this clear, insightful and challenging (to some) entry. Your site does a great service to the global community; not only directly, but also indirectly in that user responses to common misunderstandings such as overattribution to Solar forcing and challenging the measurability of the climate system get dispatched quickly and evenly. If they can’t be so dispatched, that’s where the science comes from.

    With respect to policy decisions, perhaps with these data we can make a stab at “best-path” efforts. Clearing carbon black may in the short term be cheaper and meet less opposition than hitting CO2 on the head. Of course, we’ll have to hammer away at CO2 eventually, but if there’s $250 billion a year worth of established industry on one side of the equation, and nothing but silence on the other, shouldn’t we be hitting carbon black with a vengeance? My concern is that the puny 1 °C warming of the Eemian was enough to raise sea levels 5-8 meters (see Fig. 1 of http://columbia.edu/~jeh1/hansen_re-crichton.pdf and the Wikipedia article on Eemian for this). We’re almost at that now! While we have to hit CO2 soon enough, and sooner is better, it occurs to me that the best path is to hit the cheapest targets first. Does anyone know whether you can scrub soot from, say, coal emissions while leaving aerosols alone? If so, we should probably be doing that already, since political opposition will be disorganized and far smaller – and we don’t want to have to build 8 meter seawalls around the entire inhabited coast of the planet.

    Comment by Steffen Christensen — 5 Oct 2006 @ 11:07 AM

  25. Thanks Gavin for this insightful post showing the nuances, both pro and con, of how the peer-reviewed climate community tries to honestly go about its business. These are insights that the public needs to read.

    Comment by Richard Ordway — 5 Oct 2006 @ 12:19 PM

  26. Gavin, I don’t clearly understand this: “But does the specific percentage attribution really imply much for the future? (i.e. does it matter that CO2 forced 40% or 80% of 20th Century change?).” If you want to estimate climate sensitivity to doubling CO2, don’t you need to estimate as precisely as possible the direct and indirect effects of each forcing on temperature trends? If ΔT2xCO2 is 1.5, 3 or 5 °C, it’s not exactly the same perspective for mitigation / adaptation.

    [Response: Umm.. I agree. But the sensitivity of a model to only CO2 forcing is a very different issue than looking at the relative amount of that forcing compared to all the others. You are correct, though, in thinking that the sensitivity is a more important question for future projections. - gavin]

    Comment by muller.charles — 5 Oct 2006 @ 1:01 PM

  27. Now that Svensmark et al. have shown in their SKY experiment that GCR (galactic cosmic rays) are indeed a possible factor contributing to low cloud cover (which has a net cooling effect), and that the past 100 years’ behaviour of the sun’s high activity, with correspondingly lower cloud cover, might explain 1.2 W/m2 of the forcings (compared to IPCC’s 1.4 W/m2 attributed to anthropogenic greenhouse gases), is it not dishonest to ignore this (possibly major, and CO2-independent) contribution?

    [Response: There is an awfully long way to go from a simple lab experiment demonstrating a proof of concept to showing that this is an important mechanism of aerosol formation (of which there are many), and showing that it affects cloud cover significantly, and that it gives a non-negligible radiative forcing over the 20th Century. Over the last fifty years there has been no trend in cosmic rays anyway, and so for the temperature rises in recent decades it is highly unlikely to play a role. As has been stated often, the long term component associated with solar changes is pretty uncertain, but as the numbers above show, you would need to have the long term trend in solar-related forcings increase by a factor of five to even match CO2, let alone the total from all GHGs. Potential mechanisms for solar impacts on climate are a fascinating subject and may well help explain observed changes in the paleo-climate record; however, hoping that they will somehow magically reduce the effect of GHG increases today is foolish. -gavin]

    Comment by Francis Massen — 5 Oct 2006 @ 2:15 PM

  28. Re #22 (comment):

    Gavin, one of the main problems with current models is the attribution to aerosol forcings. There is a huge offset between aerosols and CO2 sensitivity, as can be seen in Climate sensitivity and aerosol forcings.
    If aerosols have less influence (these are quite uncertain) than currently implemented in climate models, this reduces climate sensitivity for CO2, but not necessarily for solar and volcanic.

    With a simple EBM (energy balance model, used in a course at the University of Oxford) one can see that a halving of the sensitivity for CO2, compensated by a large reduction of the aerosol influence, fits the temperature trend of the previous century as well as the original…
    See here

    [Response: No it doesn't. The sensitivity of any model to CO2 is completely independent of its sensitivity to aerosol forcings. The problem of being able to tune an EBM to get anything you want is exactly why we use physically based models like GCMs, and use spatial and temporal patterns to do formal attribution. -gavin]
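    For readers who want to see the kind of tunability being described, here is a minimal sketch of a zero-dimensional energy-balance model with an idealized, made-up forcing history; the parameter values are chosen purely for illustration and are not taken from any particular model or assessment.

    ```python
    import numpy as np

    # Zero-dimensional energy balance model: C dT/dt = F(t) - lam * T
    # Idealized forcing shapes and parameter values, for illustration only.
    years = np.arange(1900, 2001)
    t = (years - 1900) / 100.0
    f_ghg = 2.5 * t**2      # W/m2, idealized accelerating GHG-like forcing
    f_aer = -1.2 * t**2     # W/m2, idealized negative aerosol-like forcing

    C = 4.2e8               # J/(m2 K), roughly a 100 m ocean mixed layer
    SECONDS_PER_YEAR = 3.15e7

    def run(lam, aerosol_scale):
        """Integrate the EBM with feedback parameter lam (W/m2/K)."""
        T, out = 0.0, []
        for fg, fa in zip(f_ghg, f_aer):
            forcing = fg + aerosol_scale * fa
            T += (forcing - lam * T) / C * SECONDS_PER_YEAR  # one-year Euler step
            out.append(T)
        return np.array(out)

    # Case A: higher sensitivity (smaller lam) offset by the full aerosol forcing.
    # Case B: half the sensitivity (double lam) with most of the aerosol forcing removed.
    T_a = run(lam=1.0, aerosol_scale=1.0)
    T_b = run(lam=2.0, aerosol_scale=0.1)

    for year in (1950, 1975, 2000):
        i = year - 1900
        print(f"{year}: case A {T_a[i]:.2f} K, case B {T_b[i]:.2f} K")
    ```

    The two global-mean curves come out broadly similar even though the assumed CO2 sensitivity differs by a factor of two, which is why global-mean fits alone cannot discriminate between such cases; that is the role of the spatial and temporal patterns used in formal attribution.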

    Comment by Ferdinand Engelbeen — 5 Oct 2006 @ 3:17 PM

  29. Gavin, care to comment about the state of knowledge regarding radiative forcings? As in, the degree to which actual high quality measurements exist of the fluxes? Not only by satellites but also “looking” from other perspectives, across all pertinent interfaces, in all the pertinent dimensions? Also, how about the general heat flows at the various scales, accounting for non radiative transfers as well.

    [Response: Radiative forcings are estimated from radiative transfer models; they are not observed quantities. Radiative fluxes are observed to an increasing level of accuracy, but long, consistent time series are few and far between. It isn't (yet) possible to measure the radiative imbalance at the top of the atmosphere, which is why it is easier to look at heat storage. Kiehl and Trenberth's figure is still a good approximation to what we think we know. - gavin]
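    For reference, the origin of numbers like the 1.4-1.5 W/m2 quoted in this thread can be seen from the widely used simplified expression for CO2 forcing from Myhre et al. (1998), which is itself a fit to detailed radiative transfer calculations; the concentrations below are approximate round values, not precise observations.

    ```python
    import math

    # Simplified expression for CO2 radiative forcing (Myhre et al. 1998):
    #   dF = 5.35 * ln(C / C0)   [W/m2]
    # with C0 the preindustrial CO2 concentration.  The concentrations here
    # are approximate round values for illustration.
    C0 = 280.0  # ppm, preindustrial (approximate)
    C = 370.0   # ppm, early 2000s (approximate)

    dF = 5.35 * math.log(C / C0)
    print(f"CO2 forcing since preindustrial: {dF:.2f} W/m2")
    ```

    This gives roughly 1.5 W/m2, consistent with the numbers quoted above; the point of the response stands, since the expression is a fit to radiative transfer model output rather than a direct measurement.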

    Comment by Steve Sadlov — 5 Oct 2006 @ 3:38 PM

  30. RE: #17 – and conveniently, all the thermometers are located on some sort of equidistant grid, across the whole earth, with no error inducing factors of any kind. Then there is reality.

    Comment by Steve Sadlov — 5 Oct 2006 @ 3:47 PM

  31. re 17

    The following is just an example; my original question was how climate models adjust to non-linearities in the atmosphere. (Could one of the scientists at RealClimate please respond?)

    Thank you for your posting, but if I understand you then I think that you miss the point, which must be my fault for not making myself clear.

    Temperature is not, I’m afraid, a simple concept, which is what I am trying to say. It’s a logarithm ratio of entropy versus energy. Humans think they know what temperature is, because it is quoted at them all of the time, but materials do not show a linear relationship between energy and temperature. This is well known.

    Ground temperatures vary from minus 50 degrees to plus 50 degrees and pressure varies by plus or minus 10%, or thereabouts. Within this range, gases such as oxygen do not obey the gas laws and the non-linearity is roughly of the same order as shown in AGW. The atmosphere as a whole has a much wider range of temperature and pressure and is non-linear over these ranges.

    “It is true that heat could be going elsewhere (i.e. melting ice, deep ocean heating), but most of these would actually suppress a warming signature. It is hence plausable that the equlibrium response of the planet to the forcings already present is higher than the observed temperature increase.”

    Maybe, but if you take a kilo of water at 100 degrees and a kilo of ice at 0 degrees and mix the two, the resultant temperature is not the arithmetic mean (50 degrees). It’s a lot lower; try it yourself.

    The reason for this is that the relationship between energy and temperature for water is highly non-linear around 0 degrees. The energy that you have to remove from water to create ice is roughly the same as that which you have to put in to take it from 0 degrees to 100 degrees.
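    A quick back-of-the-envelope check of that mixing example, using standard textbook values for the latent heat of fusion and the specific heat of liquid water (and ignoring their temperature dependence):

    ```python
    # Mix 1 kg of water at 100 C with 1 kg of ice at 0 C, with no heat lost
    # to the surroundings.  Standard textbook constants (approximate):
    L_FUSION = 334.0  # kJ/kg, latent heat of fusion of ice
    C_WATER = 4.18    # kJ/(kg K), specific heat of liquid water

    m_hot, T_hot = 1.0, 100.0  # kg, deg C
    m_ice, T_ice = 1.0, 0.0    # kg, deg C

    # Heat released by the hot water cooling to T equals the heat needed to
    # melt the ice plus the heat needed to warm the meltwater from 0 C to T:
    #   m_hot*C_WATER*(T_hot - T) = m_ice*L_FUSION + m_ice*C_WATER*(T - T_ice)
    T = (m_hot * C_WATER * T_hot + m_ice * C_WATER * T_ice - m_ice * L_FUSION) / (
        (m_hot + m_ice) * C_WATER
    )
    print(f"Final temperature: {T:.1f} C")
    ```

    The result is about 10 °C, far below the 50 °C arithmetic mean, as the comment says.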

    Solar radiation is an example of a forcing that is a small sinusoid around a larger mean. If the amplitude of the sinusoid increases [if the hot gets hotter and the cold gets colder], the mean of the solar forcing remains the same, since the mean of the sinusoid is always zero, independent of the amplitude.

    But the mean of the atmospheric temperature (which is not linearly related to the input solar radiation) changes, because the distribution of the output is a non-linear, distorted sinusoid around a mean. The mean of a distorted sinusoid is dependent upon the amplitude of the sinusoid and the type of the distortion.

    So if there is more variation in energy from solar radiation today than there was 100 years ago, the mean temperature would change, but the actual energy of the earth would be the same.

    The rise in mean would be an artifact of the relationship between temperature and energy.

    [Response: It wouldn't be an artifact, it would just be a fact. Regardless, models are designed to conserve energy and temperature is a diagnostic field which can be averaged any way you want. We use anomalies of global or hemispheric mean of surface air temperature (usually at 2m or so) to compare to observations because that is what we think we can reliably estimate from the real world. -gavin]
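    The averaging point being debated here can be illustrated numerically. A minimal sketch, assuming for illustration a blackbody-like relation T ∝ E^(1/4) between an energy-like quantity E and temperature, with arbitrary numbers:

    ```python
    import numpy as np

    # If T is a non-linear (here concave, T = E**0.25) function of an
    # energy-like quantity E, then increasing the amplitude of a sinusoidal
    # variation in E shifts the mean of T even though the mean of E is
    # unchanged.  All numbers are arbitrary.
    t = np.linspace(0.0, 2.0 * np.pi, 10000)
    E0 = 390.0  # arbitrary baseline

    for amplitude in (0.0, 50.0, 150.0):
        E = E0 + amplitude * np.sin(t)
        T = E ** 0.25
        print(f"amplitude {amplitude:5.1f}: mean E = {E.mean():6.1f}, "
              f"mean T = {T.mean():.4f}")
    ```

    The mean of E is the same in every case, but the mean of T drifts with the amplitude; whether the effect is large enough to matter for the real atmosphere is the separate question addressed in the response above.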

    Comment by Tim Hughes — 5 Oct 2006 @ 4:01 PM

  32. Gavin,

    Thanks for the link to the previous post, but my question/interest was the opposite of what the post talks about. I’m interested in a discussion about the forcing associated with tropospheric ozone and how this is modeled, not what impact AGW may have on future ozone concentrations. Sorry if that wasn’t clear. My understanding is that the net forcing of tropospheric O3 also takes into account its effect on methane decomposition in the atmosphere, but I’ve never seen a nice summary that explains the science.

    Another reason I bring this up is that people often think that addressing air quality and climate change is an either/or proposition when in fact it isn’t.

    Comment by Marlowe Johnson — 5 Oct 2006 @ 4:36 PM

  33. Re: #22,

    Gavin you state: “Each one is specified as independently as possible (with uncertainties of course) and run through the model.”

    I have read the papers, I don’t recall that you ran the uncertainties in solar forcing through the model, and there was a factor of two in those uncertainties. You only ran one effective forcing through the model, which is the estimate of the forcing reduced by a factor of 0.92.

    You also state “If you throw it all in together, it does a good job for the 20th Century.” The point is that other models with quite different sensitivities also do a “good job” for the 20th century. The problem is poorly constrained.

    You state, “we can’t arbitrarily scale them to get your preferred response – that would indeed be pointless”. I have read published research that did exactly this and claimed that the fit to the recent warming was poor. That was not pointless. It is a problem for the skeptics in fact, although this problem may be partially resolved by the correlated albedo biases in the AR4 models.

    I agree that attribution is a modeling exercise for a small temperature increase such as we have experienced in the 20th century. But the paleo data does suggest a larger sensitivity for solar forcing than for GHG forcing. Exploring the implications of differential climate sensitivities for the forcings is a legitimate exercise, especially since the current climate models have a wide range of sensitivities, and all have simplified, parameterizations of the couplings. The difference in solar and GHG coupling to the ocean does not appear to be properly modeled.

    Comment by Martin Lewitt — 5 Oct 2006 @ 5:09 PM

  34. Re #28 comment:

    Gavin, if the influence of aerosols is overestimated, you can’t fit the past century’s temperature with a 3 deg C / 2xCO2 sensitivity, no matter what model you use, as the 1945-1975 and later temperatures would be way too high. Thus, as you need to fit the past century, one must lower the sensitivity for CO2. And solar sensitivity is anyway underestimated. I don’t know of any physical reason that the sensitivity for solar should be only 0.92 of that for CO2 for the same forcing. To the contrary, as there is an inverse correlation between low cloud cover and solar irradiation, and solar/volcanic have influences in the stratosphere, non-existent for CO2 or human-made aerosols. A sensitivity study for solar in the HadCM3 model showed that it probably underestimates solar by a factor of 2 (see Stott et al.)

    Further, does a full GCM do a better job than a simple EBM? Just curious… And why not use multivariate analyses, to see what the different factors might be, as if climate were just a black box with known inputs and output(s), but unknown mechanisms (of which there are more than enough) in between…

    Comment by Ferdinand Engelbeen — 5 Oct 2006 @ 6:25 PM

  35. Re #33 Martin, you write “I have read the papers, …”

    Which papers? Gavin isn’t the only person reading this blog. He may know which papers you are talking about, but the rest of us don’t. The key to a good post is to know your audience. It is not just Gavin!

    Comment by Alastair McDonald — 5 Oct 2006 @ 8:09 PM

  36. You say he’s wrong on the land use yet Crichton sources Nature. And from the first paragraph of that article:

    “Moreover, our estimate of 0.27 C mean surface warming per century due to land-use changes is at least twice as high as previous estimates based on urbanization alone7,8.”

    What am I missing?

    [Response: The study quoted uses the difference between the weather models and the mostly independent surface temperature record to estimate a residual trend. With no physical reasoning at all they attribute the difference to land use changes, despite the fact that no bottom up study of land use changes produces even a warming, let alone one that matches their numbers. My main problem with that study is that the weather models don't use any forcings at all - no changes in ozone, CO2, volcanos, aerosols, solar etc. - and so while some of the effects of the forcings might be captured (since the weather models assimilate satellite data etc.), there is no reason to think that they get all of the signal - particularly for near surface effects (tropospheric ozone for instance). Residual calculations can sometimes be useful, but you have to be sure that you have taken everything else into account. These studies have not. -gavin]

    Comment by Wacki — 5 Oct 2006 @ 10:21 PM

  37. Thanks for the clear explanation. As mentioned above, it appears that CO2 is presently only 40% of the problem, and CO2 emission is very expensive to reduce in the short term.

    Potentially we can eliminate nearly 60% of the short-term forcing at reasonable cost by radically reducing real pollution (such as O3, NOx, and BC) that also causes immediate health problems. Selling such a plan, where everyone sees benefits, will be a lot easier than selling a very expensive approach with almost no near-term benefits.

    CO2 reduction may be much cheaper using technology developed in the next 20 years.

    Comment by Steve Reynolds — 5 Oct 2006 @ 11:00 PM

  38. Is the following ever accounted for in models — and should it be? Currently the biosphere is largely controlled to produce what humans want. In the paleo record, the biosphere responded to atmospheric, solar and GHG changes in the way that provided selective advantage. If there is significant feedback possible, this would result in the planet responding differently now than it always did before.

    In other words, today we try to squeeze the maximum economic value out of every acre. We do that in the face of changing climate so agricultural systems respond by changing. In the distant past, evolution determined how plant communities responded. That could affect carbon and other uptakes, water cycles, albedo, etc.

    Comment by Tim Ream — 6 Oct 2006 @ 12:42 AM

  39. Re: #34

    Apologies Alastair. I just didn’t want to be repetitive; Gavin and I have been discussing this across several threads and forums. Here are the two abstracts with links to the full text:

    http://pubs.giss.nasa.gov/abstracts/2005/Hansen_etal_2.html

    http://pubs.giss.nasa.gov/abstracts/2005/HansenNazarenkoR.html

    The first is cited in the second and provides basis for the “effective forcings” used in the second.

    Comment by Martin Lewitt — 6 Oct 2006 @ 2:13 AM

  40. Thanks, Gavin, for your very clear discussion about attribution.
    I would like to point out that, recently, non-dynamical modeling (namely neural network modeling) has also shown that it is not possible to reconstruct the temperature course of the last 150 years if anthropogenic forcings are not taken into account. Furthermore, a clear contribution of ENSO has been revealed as far as the correct reconstruction of the inter-annual variability of global T is concerned.
    I point this out because this paper of mine was published in a journal not completely dedicated to climate, and therefore it may be unknown to many of you.
    Please refer to Pasini et al. (2006), Neural network modelling for the analysis of forcings/temperatures relationships at different scales in the climate system, Ecological Modelling 191, 58-67:
    http://dx.doi.org/10.1016/j.ecolmodel.2005.08.012
    Thanks again
    Antonello

    Comment by Antonello Pasini — 6 Oct 2006 @ 3:46 AM

  41. Gavin

    Thank you very much indeed for taking the time to reply, but I’m afraid that your reply does not address the issue.

    I don’t understand your statement that I could apply any average, as that would be mathematically unsound.

    A mean temperature is not a useful fact, since it does not give any information as to the rise in energy of the earth as a whole. Any meaningful information is lost because you have not qualified the non-linearities or the distribution of the energy, and information may well be lost because of the distortion, anyway.

    It is entirely possible for the mean temperature to rise when the total energy of the earth does not, and your old AM radio would not work if this were not the case as it relies on this effect in non-linear regions. Hence my statement that a mean temperature rise would be an artifact, at least in the example given which was only for explanatory purposes.

    Anyway, that wasn’t the original question, which was how your models cope with non-linearities in the atmosphere. If you are not taking into account deviations from the ideal gas laws, then I am concerned that, e.g., rising air will, in your model, end up at the wrong energy level, and your temperatures will then be in error by roughly the same order as the temperature change attributed to AGW.

    Comment by Tim Hughes — 6 Oct 2006 @ 4:06 AM

  42. RE: 24. and 35.

    “Does anyone know whether you can scrub soot from, say, coal emissions while leaving aerosols alone? ”

    An aerosol may, or may not, form clouds. That’s a wide, football-field, Hail Mary, let-it-land-where-it-may generalization.

    One needs to realize, to begin with, that particulate matter is so minuscule, ppm…

    Anyhow…

    The only technologies for scrubbing, or removing, particulates that I am aware of are electrostatic precipitators (ESPs) or fabric filters.

    Particulate Control R&D
    http://www.fe.doe.gov/programs/powersystems/pollutioncontrols/overview_particulatecontrols.html

    So, it’s a matter of improving that technology… and in my opinion it is worth improving, because particulate matter is harmful to human health, toxic, and inhalable, so if we are increasing particulate emissions it adds to the burden of the health care system, Medicare benefits, and disability benefits, an incremental cost to society for health care.

    The logic behind this entire train of thought and thinking process escapes me.

    Stronger Standards for Particles Proposed
    12/21/2005 EPA
    http://yosemite.epa.gov/opa/admpress.nsf/68b5f2d54f3eefd28525701500517fbf/1e5d3c6f081ac7ea852570de0050ae2b!OpenDocument
    …”Numerous studies have associated fine particulate matter with a variety of respiratory and cardiovascular problems, ranging from aggravated asthma, to irregular heartbeats, heart attacks, and early death in people with heart or lung disease. EPA has had national air quality standards for fine particles since 1997 and for coarse particles 10 micrometers and smaller (PM10) since 1987. Particle pollution can also contribute to visibility impairment.”…

    Coal plants have been using sorbents, and the technology, such as amine scrubbers, is improving all the time. Essentially, the scrubbing of coal flue gas emissions incorporates various sorbents, limestone, and amines to scrub out soot, sulfates, and all those lovely things…

    See …
    Key Issues & Mandates
    Secure & Reliable Energy Supplies – Coal Becomes a “Future Fuel”
    http://www.netl.doe.gov/KeyIssues/future_fuel.html

    …”Scrubbers can reduce sulfur emissions by 90 percent or more. They are essentially large towers in which aqueous mixtures of lime or limestone “sorbents” are sprayed through the flue gases exiting a coal boiler. The lime/limestone absorbs the sulfur from the flue gas. “….” In the late 1970s and 1980s, power plant engineers tested a new type of coal burner that fired coal in stages and carefully restricted the amount of oxygen in the stages where combustion temperatures were the highest. This concept of “staged combustion” led to “low-NOx burners.” Low-NOx burners have been installed on nearly 75 percent of large U.S. coal-fired power plants. They have typically been effective in reducing nitrogen oxides by 40 to 60 percent. “…..”In 1990, new amendments to the Clean Air Act mandated that nationwide caps be placed on the release of sulfur dioxide and nitrogen oxides from coal-burning power plants. In some areas of the United States, particularly the eastern portion of the Nation, many states must implement plans to reduce nitrogen oxides to even greater levels than those mandated by the nationwide cap. To reduce NOx pollutants to these levels, scientists have developed devices that work similar to a catalytic converter used to reduce automobile emissions. Called “selective catalytic reduction” systems, they are installed downstream of the coal boiler. Exhaust gases, prior to going up the smokestack, pass through the system where anhydrous ammonia reacts with the NOx and converts it to harmless nitrogen and water.” …

    “The Coal Plant of the Future”
    Key Issues & Mandates
    Clean Power Generation
    http://www.netl.doe.gov/KeyIssues/clean_power.html
    “A new breed of coal plant that relies on coal gasification represents an important trend in coal-fired units, distinctly different from the conventional coal combustion power station. Rather than burning coal, such plants first convert coal into a combustible gas. The conversion process, achieved by reacting coal with steam and oxygen under high pressures, produces a gas that can be cleaned of more than 99 percent of its sulfur and nitrogen impurities using processes common to the modern chemical industry. Trace elements of mercury and other potential pollutants can also be removed from the coal gas; in fact, the coal gas can be cleaned to purity levels approaching, or in some cases, surpassing those of natural gas. “…..”A key to successful carbon sequestration will be to find affordable ways to separate carbon dioxide from the exhaust gases of coal plants. Techniques are being developed that can be applied to conventional combustion plants, but it is likely that capture methods will be even more effective when applied to integrated gasification combined-cycle plants. Integrated gasification combined-cycle plants release carbon dioxide in a much more concentrated stream than conventional plants, making its capture more effective and affordable. “…

    The following two articles

    “New Sorbents For Carbon Dioxide”
    “Metal Sorbent Removes Mercury from Industrial Gas Streams”

    can be found in the
    National Energy Technology Laboratory
    The June 2006 NETL Newsletter
    http://www.netl.doe.gov/newsroom/netlog/july2006/Jul06netlog.html

    And pricewise, it has become competitive [http://midcont.spe.org/images/midcont/articles/28//CO2EOR_15.ppt] to use
    CO2 and to sequester it in relation to oil fields. It actually increases oil field production...and miscible oil field flooding has a long history.

    Improved Displacement and Sweep Efficiency in Gas Flooding
    http://www.cpge.utexas.edu/re/gas_flooding.html
    ..."Oil recovery from miscible gas flooding is the fastest growing improved oil recovery technique in the US. The contribution of miscible and immiscible gas flooding to US production is currently about 330,000 barrels of oil per day."...

    [Beeson, D.M. and G.D. Ortloff, 1959, Laboratory Investigation of the Water-Driven Carbon Dioxide Process for Oil Recovery, trans., AIME 216, p. 388-391.]

    Johnson, J. W. et al (2002) “Geologic CO2 Sequestration: Predicting and Confirming Performance in Oil Reservoirs and Saline Aquifers” American Geophysical Union, Spring Meeting 2002, abstract #GC31A-04 http://adsabs.harvard.edu/abs/2002AGUSMGC31A..04J
    ….”Oil reservoirs offer a unique “win-win” approach because CO2 flooding is an effective technique of enhanced oil recovery (EOR), while saline aquifers offer immense storage capacity and widespread distribution. Although CO2-flood EOR has been widely used in the Permian Basin and elsewhere since the 1980s, the oil industry has just recently become concerned with the significant fraction of injected CO2 that eludes recycling and is therefore sequestered. This “lost” CO2 now has potential economic value in the growing emissions credit market; hence, the industry’s emerging interest in recasting CO2 floods as co-optimized EOR/sequestration projects.”…..

    “BP and Edison Plan California Power Plant with CO2 Sequestration”
    http://www.energyonline.com/Industry/News.aspx?NewsID=7011&BP_and_Edison_Plan_California_Power_Plant_with_CO2_Sequestration
    “LCG, February 15, 2006–Edison Mission Group (EMG), a subsidiary of Edison International, and BP recently announced plans to build a hydrogen-fueled power plant in southern California that would generate electricity from petroleum coke with minimal carbon dioxide (CO2) emissions. The proposed, 500-MW project would utilize new financial incentives included in the Federal Energy Policy Act of 2005 for advanced gasification technologies.”…….”As part of the process, the CO2 gas would be captured and transported via pipeline to oil fields, where the CO2 would be injected underground into the oil reservoirs to improve oil production and sequester the CO2 from the earth’s atmosphere. The CO2 is produced along with the recovered oil, then recycled and reinjected. The companies estimate that about 90 percent of the CO2 would be sequestered. In November 2005, the Department of Energy (DOE) announced that a DOE-funded project had successfully sequestered CO2 into the Weyburn Oilfield in Saskatchewan, Canada, while doubling the fields oil recovery rate.”….The estimated cost of the plant is $1 billion. The companies plan to finish detailed engineering and commercial studies this year and to complete project investment decisions in 2008, with operations commencing in 2011.”

    And the Weyburn project referred to is the four-year geological CO2 sequestration project, the world’s largest. It was also monitored by the IEA GHG programme, which released a study on it.

    “Successful Sequestration Project Could Mean More Oil and Less Carbon Dioxide Emissions”
    DOE Fossil Energy Techlines News (2005 Techlines) 05058-Weyburn Sequestration Project (Nov 15, 2005)
    http://fossil.energy.gov/news/techlines/2005/tl_weyburn_mou.html

    “Weyburn Carbon Dioxide Storage Project Largest in the World”
    http://www.sk25.ca/Default.aspx?DN=94,93,16,1,documents
    ..”The $40-million project was conducted by the PTRC and endorsed by the International Energy Agency Greenhouse Gas R&D Program. The Weyburn project has become a model of international co-operation between the public and private sectors, as well as research organizations from Canada, Europe, Japan and the United States.”…..”The Weyburn project was the world’s first large-scale study on the geological storage of CO2 in a partially depleted oil field,” explained Mike Monea, executive director of the PTRC. “While there are numerous large commercial CO2-enhanced oil-recovery operations around the world, none of these has undertaken the depth and extent of research that we have.”…”The Weyburn project is good news for addressing climate change because it proves that you can safely store 5,000 tonnes of CO2 per day in the ground rather than venting this greenhouse gas into the atmosphere,” said Dr. Wilson.”…

    IEA Greenhouse Gas Programme
    Capture and Storage of CO2
    http://www.ieagreen.org.uk/ccs.html
    ..”Many of these geological traps have already held hydrocarbons or liquids for many millions of years.”..
    ..”storage capacities quoted are based on injection costs of up to 20 US $ per tonne of CO2 stored.”..

    “Report on “IEA GHG Weyburn CO2 Monitoring & Storage Project”
    http://www.ieagreen.org.uk/glossies/weyburn.pdf

    Also see the “NETL Carbon Sequestration Page”
    http://www.netl.doe.gov/technologies/carbon_seq/index.html

    And there are many ongoing DOE efforts on the numerous technologies.

    September 12, 2006
    Critical Carbon Sequestration Assessment Begins:
    Midwest Partnership Looks at Appalachian Basin for Safe Storage Sites
    http://www.fossil.energy.gov/news/techlines/2006/06052-
    ..”Researchers have estimated that the formations may have the capacity to store CO2 for more than 200 years.”…

    “DOE Releases 2006 Carbon Sequestration Technology Roadmap,
    Project Portfolio”
    Department of Energy Techlines News (2006 Techlines)
    06049-Sequestration Roadmap (Aug 22, 2006)
    http://www.fossil.energy.gov/news/techlines/2006/06049-Sequestration_Roadmap_2006.html
    http://www.netl.doe.gov/KeyIssues/future_fuel.html

    …”Among the past year’s program highlights contained in the roadmap are the following: The Regional Carbon Sequestration Partnerships have progressed to a validation phase in which they will conduct 25 field tests involving the injection of CO2 into underground formations where it will be stored and monitored. Pilot-scale tests and modeling of amine-based CO2 capture have shown that operating an amine stripper at vacuum can reduce energy use 5-10 percent per unit of CO2 captured. Novel metal organic frameworks have shown significant potential as CO2 sorbents.”…
    [*MORE INFO 8.76MB pdf file "Carbon Sequestration Technology Roadmap and Program Plan"
    http://www.fossil.energy.gov/programs/sequestration/publications/programplans/2006/2006_sequestration_roadmap.pdf

    As to aerosols, there just isn’t control over whether the aerosol is going to make the cloud form correctly.

    For example, with dry dust one gets little or no rain, due to the LACK of cloud formation:

    ..."The second type of aerosol that may have a significant effect on climate is desert dust. "..."Because the dust is composed of minerals, the particles absorb sunlight as well as scatter it. Through absorption of sunlight, the dust particles warm the layer of the atmosphere where they reside. This warmer air is believed to inhibit the formation of storm clouds. Through the suppression of storm clouds and their consequent rain, the dust veil is believed to further desert expansion."... http://oea.larc.nasa.gov/PAIS/Aerosols.html

    Likewise, with the issue of small particle formation in clouds, you have different refraction
    and a different radiation budget to deal with ["Impacts from Aerosol and Ice Particle Multiplication on Deep Convection Simulated by a Cloud-Resolving Model with a Double-Moment Bulk Microphysics Scheme and Fully Interactive Radiation" Phillips, V T et al (2005)]

    This material explains it well in layman’s terms….

    “NASA Explains Puzzling Impact Of Polluted Skies On Climate” July 14, 2006
    http://www.sciencedaily.com/releases/2006/07/060714082130.htm

    “In a breakthrough study published today in the online edition of Science, scientists explain why aerosols — tiny particles suspended in air pollution and smoke — sometimes stop clouds from forming and in other cases increase cloud cover.”…..”When the overall mixture of aerosol particles in pollution absorbs more sunlight, it is more effective at preventing clouds from forming. When pollutant aerosols are lighter in color and absorb less energy, they have the opposite effect and actually help clouds to form,” said Lorraine Remer of NASA’s Goddard Space Flight Center, Greenbelt, Md.”…..”With these observations alone, the scientists could not be absolutely sure that the aerosols themselves were causing the clouds to change. Other local weather factors such as shifting winds and the amount of moisture in the air could have been responsible, meaning the pollution was just along for the ride.”…”Separating the real effects of the aerosols from the coincidental effect of the meteorology was a hard problem to solve,” said Koren. In addition, the impact of aerosols is difficult to observe, compared to greenhouse gases like carbon dioxide, because aerosols only stay airborne for about one week, while greenhouse gases can linger for decades..”…”Using this new understanding of how aerosol pollution influences cloud cover, Kaufman and co-author Ilan Koren of the Weizmann Institute in Rehovot, Israel, estimate the impact world-wide could be as much as a 5 percent net increase in cloud cover. In polluted areas, these cloud changes can change the availability of fresh water and regional temperatures.”….

    Comment by barbie doll moment — 6 Oct 2006 @ 4:13 AM

  43. Re #18 and “This is very important site, cause it helps one to understand what the word “fearmongering” really means. It means – more money to another fearmongering studies and proxies.”

    If the climatologists wanted more money, they’d be saying the problem needs more study. What they’re saying is that they know what the problem is. So the attribution of global warming warnings to climatologist greed is silly. It’s a much more effective ad hominem the other way; Lindzen and Baliunas and Soon and other AGW skeptics are receiving non-negligible subsidies from oil companies.

    -BPL

    Comment by Barton Paul Levenson — 6 Oct 2006 @ 7:09 AM

  44. Re Comment #37 (CO_2 reductions vs. black carbon and other pollutants):

    Hansen made this point several years ago in a paper which received attention in the media. This was seen as backtracking by a major ‘prophet’ of global warming. But note that his latest warnings suggest that we may not have all that long to start doing something about curbing the growth of CO_2 emissions. It seems to me that resistance to slowing and eventually reducing the growth of CO_2 emissions is mainly political and fueled by certain economic interests who don’t want to change their current practices. As Hansen makes clear, we have to proceed on all fronts and ignoring CO_2 emissions for 20 years is not a viable policy if we are serious about the problem. Consider, for example, how different things would be today if Congress had never allowed the truck exemption on fuel economy standards to be extended to vans and SUVs. The argument today seems to be that we can’t raise fuel economy standards because that would place an unfair restriction on US auto manufacturers. But these manufacturers seem to be doing a good job of losing market share all by themselves. I suspect they would be much better off today if they had been forced to compete on fuel economy since the 70s. The economic arguments for waiting are all based on rather short term considerations and ignore the potentially very large cost of waiting to begin taking appropriate measures. Choosing short term gain over long term advantage is childish, and it is time we grew up.

    The crucial point has been a failure of leadership. Can you imagine Senator Inhofe getting away with the same sort of nonsense, or Crichton being invited to the White House to discuss climate science with a President Gore or a President McCain? And remember that even Bush originally ran on a platform calling for reductions in CO_2 emissions, which he promptly recanted upon being elected.

    Comment by Leonard Evens — 6 Oct 2006 @ 9:59 AM

  45. Re: #16 The Hadley Centre drought study.

    Essentially your understanding is correct. The total amount of precipitation over the whole globe would be expected to increase, but the pattern of this rainfall might not stay the same. In particular, it is often the case that rainfall in one place inhibits rainfall in another place, because of the effect it has on the large-scale circulation [at least in the tropics].

    Consequently, if there is more rainfall [due to greater evaporation] where there is already a lot of rainfall, then the effect that rainfall has on the circulation will be increased.

    The Hadley study is suggesting that rainfall will become more variable, so there will be more droughts and also presumably more floods. These effects of climate change are much harder to predict than global mean temperature, but will have a much greater impact and be much harder to adapt to.

    The climate models aren’t really good enough in their representation of present-day circulation to give you much confidence in the specifics of their predictions [so that you could use them to do a cost-benefit analysis for example], but the risk of widespread change is still there.

    Comment by Timothy — 6 Oct 2006 @ 10:24 AM

  46. RE: #38 – Also consider the thermal inertia of the biomass itself. Examples:
    * Deserts for thousands of years, now irrigated growing 3 crops per year, continuous cover, year round
    * Old marginal farmland in Europe and North America now either reverted to unmanaged / lightly managed forest, or overtly tree farmed
    * The change from only the richest people living in leafy suburban / rural residential areas and the rest in nearly treeless urban squalor, to the masses increasingly also living in suburban areas and urban areas redeveloped into a high density suburban setting (and the corresponding explosion in such leafy residential areas not only in the so called first world, but increasingly, in the developing world)
    * The inexorable and ongoing change in the tropics from slash and burn agriculture to fixed site, managed agribusinesses
    Etc…..

    Comment by Steve Sadlov — 6 Oct 2006 @ 11:51 AM

  47. RE: #45 – One of the proposed “shifts” that would impact me directly where I live has been an expansion of the deserts here in the SW US. Ironically, based on the past 30-odd years of actual realized weather and decade-scale climate, where I see a high desertification potential is in areas not even contiguous with the current SW deserts. Examples: Mississippi Valley, Florida, even pockets along the Eastern Seaboard. In fact, if anything, here in the southwest, when we set aside things such as overgrazing, impacts of wildfires and other human induced local effects, there may actually be current shrinkage of deserts. My own specific setting is telling. I live at an interface between open forest and closed forest. The forest at my location is thickening.

    Comment by Steve Sadlov — 6 Oct 2006 @ 12:00 PM

  48. RE 47 (Sadlov):

    My own specific setting is telling. I live at an interface between open forest and closed forest. The forest at my location is thickening.

    This may be attributable to a number of things, including fire suppression, curtailment of grazing, climate change, etc., and not just purported desert shrinkage.

    Best,

    D

    Comment by Dano — 6 Oct 2006 @ 12:54 PM

  49. I live in Scotland and there has been a clear change in rain over here. It was always wet, but in the last few years the rain is shorter but more intense. This leads to flash floods and the water not actually getting into the soil. Rain might increase, but is it going to be useful rain?

    Comment by Mark UK — 6 Oct 2006 @ 3:28 PM

  50. Re #47/8: Especially since the nearest desert is about 200 miles from Sadlov’s house. Also, I’m fairly sure the claim about shrinking deserts is simply wrong. How about a source for that, Steve S.? Also, desertification “potential” on the East Coast or in the Mississippi Valley? Big news, if true.

    Comment by Steve Bloom — 6 Oct 2006 @ 4:42 PM

  51. Can you recommend any good textbooks in meteorology and climate science appropriate for a mathematics undergraduate who also knows the basics of physics? (Classical mechanics, physics of heat transfer and so on.)

    [Response: Ruddiman's text is good, and so is David Archer's. - gavin]

    Comment by Johan Richter — 6 Oct 2006 @ 4:47 PM

  52. re 10.

    The apparent leveling off in global temperatures from the 1940s to 1970s might be explained as follows. A reduction in global cloud cover and thus an increase in solar radiation reaching the surface occurred during the dust bowl years of the early-mid 1930s. The warm mid 1930s to 1940s period had frequent El Nino conditions, perhaps initiated by the reduced cloud cover during the early 1930s. Given that the 1930s-1940s was unusually warm for the 19th – mid 20th century period, the
    apparent leveling off in global temperatures in the 1950s and 1960s may merely have been a return to meteorological conditions similar to those which existed earlier in the 20th century.

    A recent study (excerpt below) on cosmic rays and clouds seems to support my comments (above).

    Excerpt:
    … ‘A team at the Danish National Space Center has discovered how cosmic rays from exploding stars can help to make clouds in the atmosphere. … , during the 20th Century, the Sun’s magnetic field which shields Earth from cosmic rays more than doubled, thereby reducing the average influx of cosmic rays.’ …

    Cosmic radiation entering Earth’s atmosphere. Credit: Danish National Space Center.
    by Staff Writers
    Copenhagen, Denmark (SPX) Oct 06, 2006
    http://www.spacedaily.com/reports/Exploding_Stars_Influence_Climate_Of_Earth_999.html

    Comment by Pat Neuman — 6 Oct 2006 @ 7:32 PM

  53. RE: #50 – What is the average annual “normal” rainfall at Coalinga or Leemore? (I already know the answer; this is just to make a rhetorical point.) Look up their classification; my preferred source is Landis. What are those two places’ coordinates? They are closer than 200 miles. However, an even more telling type location would be somewhere near the intersection of Interstate 5 and State Route 138. The areas at the edges of the deserts have been experiencing a trend toward above normal rainfall for years. And alluding to Dano’s snipe, the forest densification in coastal central and northern Cal is clearly attributable to moisture abundance. The past 10 years have been, on the whole, very wet – that has been true for all ENSO phases. As for Steve B’s doubting of the possibility of desertification in places like East Central Arkansas, again, look at the recent stats. Then, you could do as I did and take a look. Things have been very, very dry there and for hundreds of miles around, for years. Finally there has been rain there over the past month. But the overall trend over the past few years has been dry. The Dust Bowl was not far from that region. Etc.

    Comment by Steve Sadlov — 6 Oct 2006 @ 7:51 PM

  54. Gavin, another point, but it’s not directly addressed to the core of the present discussion (so, you’re free to ignore :-). You write: ”Indeed, for the last decade, by far the major growth in forcings has come from CO2, and that is unlikely to change in decades to come.” More generally, 1976-2005 is considered as the three decades of “anthropogenic-rather-than-any-other-cause warming”. But are we really sure of that? For just one example in the recent literature, Pinker et al. (2005) and Wild et al. (2005) observed that nebulosity trends as measured by ISCCP (or terrestrial albedo measured by two other means) between 1985-2003 could have imposed a positive forcing of 3-6 W/m2, nearly twice the level of GHG since 1750. For the moment, I did not read any reply to these papers (and graphs of ISCCP for 2003-2005 show quite the same trend for low-level / high-level clouds). So, my (prudent) question is: Three decades, is it enough for definitive conclusions about GHG’s primary role in current and future warming?

    [Response: The big problem with inferring radiative forcings from observations of clouds is that you don't have a complete picture of all of the factors that influence the effect of the cloud changes on the radiation (i.e. SW and LW effects depend on the vertical profile, optical thickness, cloud particle size etc.). Plus these are extremely noisy records. Plus you wouldn't know whether these are forcings or feedbacks. I don't know anyone who would suggest that we have a good idea about what the trends in albedo are - all the records are partial, noisy and prone to confounding factors. In some of them, the huge albedo change due to the Pinatubo aerosols doesn't even appear, in others the trends are opposed. Our understanding particularly of aerosol changes is indeed woefully inadequate, but uncertainty is not the same as knowing that they are changing dramatically (and they'd have to be reducing at a rapid rate to match the changes in CO2). The Pinker and Wild papers are talking about surface fluxes anyway (as I recall), not TOA changes, which are the important ones for radiative forcing. So I would put the statement that the growths in forcings over the last 30 years are predominantly due to GHGs in the highly likely (but not incontrovertible) category. - gavin]

    Comment by muller.charles — 6 Oct 2006 @ 7:58 PM

  55. re 52

    The problem for the sun-cloud-climate connection in the 20th century is that the stabilizing / cooling trend began in the 1940s, whereas the maximum sunspot number of the century occurred in 1959. This 15-20 year lag is not very coherent (or we need a very, very strong albedo effect from aerosols, since between 1940 and 1960 GHGs continued to accumulate in the atmosphere and there was no particularly large volcanic forcing).

    Comment by muller.charles — 6 Oct 2006 @ 8:08 PM

  56. The last statement in 54. should read – there is little or no doubt that the last three decades of warming were driven by the accumulation of anthropogenic greenhouse gases in the atmosphere. Station temperature plots at all northern locations in the Upper Midwest, Alaska and in high latitude areas of the Rocky Mountains show dramatic increases in overnight minimum temperatures during the winter months.
    See:
    http://pg.photos.yahoo.com/ph/patneuman2000/my_photos

    Comment by Pat Neuman — 6 Oct 2006 @ 10:16 PM

  57. re 33

    other models with quite different sensitivities also do a “good job” for the 20th century. The problem is poorly constrained.

    I thought this was a terrific question and was disappointed to not see an answer. Because I sometimes get the sense that GCMers do not care that the problem is poorly constrained. Granted, they’re powerless to do anything about it. But doesn’t that argue for cautious interpretation by those that rely on (but do not develop) these models? [Am I being overly presumptuous here?]

    [Response: The relatively under-constrained nature of the global mean temperature record is precisely why most of the formal attribution studies use spatial patterns and as many different fields as they can. There is obviously much more data available than is contained in the mean temperature record. The whole point of using GCMs is to be able to compare more directly with all sorts of different data. -gavin]

    Comment by bender — 7 Oct 2006 @ 12:44 AM

  58. re 57
    At the risk of trying your patience, I think you may have missed the point (and please correct me if I’m wrong on that) – as #33 was referring to the parameterizing of the GCMs as being a poorly constrained problem. Meaning you have hundreds of parameters (how many I don’t actually know) but very few inputs and outputs with which to constrain the universe of possible parameterizations. The literature on feedbacks, forcings and fluxes constrains you somewhat, but not much in the grand scheme of things. I don’t see that this has much to do with the “under-constrained nature of the global mean temperature record”, but it is always possible I’m missing something. Last post. Thanks.

    [Response: Sorry, I did miss your point. In any model there are dozens of parameterisations - for the bulk surface fluxes, for the entrainment of convective plumes, for mixing etc. The parameters for these processes are generally estimated from direct observations of these small scale processes (though there is some flexibility due to uncertainties in the measurements, sampling issues, confounding effects etc.). The tuning for these processes is generally done at the level of that process. When the model is put together and includes all of this different physics, we evaluate the emergent properties of the model (large scale circulation, variability etc.) and see how well we do. When there are systematic problems we try and see why and learn something (hopefully). The number of parameters that are tuned to improve large scale effects are very small (in the GISS Model, maybe 3 or 4) and those are chosen to adjust for global radiative balance etc. Once the parameters are fixed, we do experiments to evaluate the variability, forced (impact of volcanoes, 20th C forcings, LGM etc.) and internal (NAO, ENSO, MJO etc.). Again, we learn from the comparisons. The amount of data available to evaluate the models is vast - and more than we can handle, though directly assessing individual parameterisations from large scale evaluations is difficult (since it's a complex system). Thus, most parameterisations are tested against small scale data (field experiments etc.). The problem is not so much under-constrained, as under-specified. There is generally more going on than is included within the (usually simple) parameterisation, and so parameterisation development is not really about changing some constant multiplier, but changing the underlying conception to better match the real physics. - gavin]

    Comment by bender — 7 Oct 2006 @ 1:44 AM

  59. #51 (Johan Richter):

    These are more meteorologically oriented than the textbooks Gavin recommended:

    “Atmospheric Science: an Introductory Survey,” by Wallace and Hobbs (1977) is a classic. Also, “Meteorology for Scientists and Engineers,” by Roland Stull (2000?) is fantastic, and gets better each time I read through it.

    Also, if you can find a copy of “Synoptic Climatology of the Westerlies,” by Jay Harman, you will be in for a treat.

    Best of luck.

    Comment by Kenneth Blumenfeld — 7 Oct 2006 @ 3:40 AM

  60. Anyone wanting to learn more, or wanting to try their hand at running climate models can and could join the UCAR community, (and their netcdf group to help you understand any computer coding glitches)…

    Their climate models are free and open to anyone willing or wanting to do so.

    I have been running one for the last couple of years or so. I am running UNIX in the background on my HP Windows machine with an Oracle SQL database along with my Sun Java, and somehow it still works fine for me; in other words, my computer did not blow up with my climate runs and all the other weird stuff I have going on with my PC.

    “The Community Climate System Model (CCSM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth’s past, present, and future climate states. ”
    http://www.cgd.ucar.edu/csm/

    Experiments and Output Data
    http://www.ccsm.ucar.edu/experiments/

    “CCSM3 Experiments and Data”
    http://www.ccsm.ucar.edu/experiments/ccsm3.0/

    “To document and validate CCSM3.0, various multi-century control runs were carried out. All output data from these control runs is available to the public. The Earth System Grid (ESG) http://www.earthsystemgrid.org/ is the primary method for distributing this output data (more information below http://www.ccsm.ucar.edu/experiments/ccsm3.0/#data ). Also available on this web page is a sampling of diagnostic plots http://www.ccsm.ucar.edu/experiments/ccsm3.0/#plots from these control runs.”

    “Another series of CCSM3 experiments is underway which will provide data for an upcoming IPCC Assessment Report. Data from these experiments will be made available after CCSM3 results are submitted to the IPCC. http://www.ipcc.ch/

    However, the Earth System Grid will be assimilated into the Teragrid http://www.teragrid.org/
    but one should be able to find all of these climate models until it’s totally borged together.

    CCSM (Community Climate System Model)
    CCSM POP (modified version of Parallel Ocean Program)
    CSIM (CCSM Sea Ice Model)
    CLM (CCSM Community Land Model)
    PCM (Parallel Climate Model)
    POP (Parallel Ocean Program)
    Scientific Data Processing and Visualization Software

    UCAR’s latest journal papers
    “Special Issue on Climate Modeling”
    Fall 2005, Volume 19, No. 3
    http://hpc.sagepub.com/content/vol19/issue3/

    UCAR’s 11th Annual CCSM Workshop
    http://www.cgd.ucar.edu/csm/news/ws.2006/index.html
    notes
    ….”In addition, there have been over 200 papers written comparing simulations from all the climate models submitted to the Fourth Assessment Report of the IPCC. Many of these are listed at the IPCC Model Output Page.
    http://www-pcmdi.llnl.gov/ipcc/diagnostic_subprojects.php” …

    Incidentally, there are 553 subprojects (many dealing solely with GCMs) listed at the Lawrence Livermore National Laboratory IPCC project page, and a large number have already presented their subsequent publications (links provided on the LLNL IPCC website).
    “Last Updated: 2006-09-15 Current Total: 553 subprojects”
    http://www-pcmdi.llnl.gov/ipcc/diagnostic_subprojects.php

    Comment by barbie doll moment — 7 Oct 2006 @ 5:28 AM

  61. Re #51 and “[Response: Ruddiman's text is good, and so is David Archer's. - gavin]”

    I went to Archer’s site and began to read his fourth chapter, which is available on-line, and I immediately found an error — he says partial pressure is directly proportional to volume fraction, which is just not true. You have to multiply by the molecular weight of the gas and divide by the mean molecular weight of the atmosphere. pCO2 is 57 Pascals, not 38.

    -BPL

    [Response: But since the molecular weights are constant in that calculation, the statement about proportionality is correct. -gavin]

    Comment by Barton Paul Levenson — 7 Oct 2006 @ 7:27 AM

  62. Re: #16 (Summer is coming to New Zealand, isn’t it?)

    My explanation of what I think are the basics is a little too long to write here, so I have placed it at
    http://web.sfc.keio.ac.jp/~masudako/memo_en/precip_evap_warming.html .

    Comment by Kooiti Masuda — 7 Oct 2006 @ 10:55 AM

  63. Re #60,

    The problem with these experiments (as good as the climateprediction.net experiment, which uses the HadCM3 model) is that one doesn’t have the possibility to change the sensitivities oneself. All experiments are prescribed in advance.

    I am looking for a full GCM, running in the background on my own computer, where it is possible to make runs with a reduced sensitivity for GHGs (less feedback) and aerosols (less forcing and feedbacks) and an increased sensitivity for solar. The runs I made with the simple Oxford EBM model (in #28) indicate that it is possible to find a similar (or better) fit to the 20th century temperature trend as with the original 3 K/2xCO2…

    As there is a high probability that the (cooling) influence of aerosols is overestimated and the influence of solar variations is underestimated, some alternative runs might be of interest…

    Comment by Ferdinand Engelbeen — 7 Oct 2006 @ 2:31 PM

  64. Re #27 (comment):

    Gavin, you say:

    Potential mechanisms for solar impacts on climate are a fascinating subject and may well help explain observed changes in the paleo-climate record, however, hoping that they will somehow magically reduce the effect of GHG increases today is foolish

    I don’t think that solar alone can explain the full warming of the past century, but any increase in attribution to solar anyway will reduce the impact of GHGs, as the sum of all forcings with their (individual and common) feedbacks must fit the past century temperature trend.

    Another interesting GCR-cloud finding, based on observations, is described by Harrison and Stephenson (Univ. of Reading): “Empirical evidence for a nonlinear effect of galactic cosmic rays on clouds”. But needs more investigation for longer-term effects.

    Comment by Ferdinand Engelbeen — 7 Oct 2006 @ 2:54 PM

  65. >61, Barton and Gavin
    A plea. Try to provide an explicit reference so readers who aren’t experts can figure out what you’re talking about.

    Re Archer’s book, I looked at Ch. 4; I’m guessing you two are talking about this from the first page:

    “…The mixing ratio of a gas is numerically equal to the pressure exerted by the gas, denoted for CO2 as pCO2. In 2005 as I write this the pCO2 of the atmosphere is reaching 380 μatm.”

    But there’s no “partial pressure” or “Pascal” in the text of the chapter, if Adobe’s search tool isn’t lying. I searched for “38” — and found “380” — so — a plea to you all to try to remember your readers.

    Google: define:Pascal yields this: * a unit of pressure equal to one newton per square meter …
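
    Putting those definitions together, here is a minimal sketch in Python of the two numbers being debated in #61 (illustrative values only: 380 ppmv CO2, a surface pressure of 101325 Pa, and textbook molecular weights):

        x_co2 = 380e-6     # volume (mole) fraction of CO2, i.e. 380 ppmv
        P = 101325.0       # total surface pressure in Pa (1 atm)
        M_co2 = 44.01      # molecular weight of CO2, g/mol
        M_air = 28.97      # mean molecular weight of dry air, g/mol

        # Dalton-style partial pressure: mole fraction times total pressure
        p_dalton = x_co2 * P                        # ~38.5 Pa, i.e. ~380 microatmospheres

        # Mass-weighted figure, the roughly 57 Pa quantity referred to in #61
        p_weighted = x_co2 * (M_co2 / M_air) * P    # ~58 Pa with these particular inputs

        print(p_dalton, p_weighted)

    Both quantities are a constant multiple of the volume fraction, which is the proportionality Gavin points to; they differ only by the fixed molecular-weight factor.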

    Comment by Hank Roberts — 7 Oct 2006 @ 9:51 PM

  66. Re 12:
    Hank, you mention a study which suggests a possible cause why methane levels levelled off in the 1990s. For those of us who don’t have access to the subscription-only reports, would you mind summarising briefly what they suggest? Many thanks!

    Comment by Almuth Ernsting — 8 Oct 2006 @ 8:14 AM

  67. I’ve been looking at the various CO2 mixing ratio measurements at the CDIAC, and particularly the seasonal variation. I think there is useful information in the seasonal variations that I have not seen commented on before. I wrote this up as a “white blog post” and would appreciate comments.

    I have the impression that the approach taken has been to use isotopic ratios to decompose the seasonal contributions.

    Comment by Eli Rabett — 8 Oct 2006 @ 12:20 PM

  68. Dr. Masuda, while humidity (and relative humidity) are controlled by thermodynamics, the equilibrium being quickly established, in my understanding condensation is controlled by kinetic factors, among other things the availability of condensation nuclei. Since the air is swept clear by rain (which is why things appear sharper in the distance after a rainstorm), it is not necessarily true that precipitation will increase.

    Comment by Eli Rabett — 8 Oct 2006 @ 2:46 PM

  69. Re:#11. The very small non-ideal gas contribution from oxygen is not the “major” deviation from ideality of the atmosphere, but rather the small contribution from variable water vapor concentrations is. Even that is only ~0.1%.

    In both cases, if one wishes to relate temperature to the energy of the atmosphere, the small deviations from ideality are very, very small when compared to the linear term. For example, assume that temperature rises by 3 C, and over that range the specific heat, Cp, changes by 0.1%. The “error” in treating the case as linear would be 0.001Cp/(3Cp), or ~0.03%.

    Thus, for practical purposes by Tim Hughes’ own argument, there is a linear relation between temperature and energy content of the atmosphere over the range of temperatures and pressures one encounters in the atmosphere.
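
    As a rough numerical cross-check of that order of magnitude, here is a minimal sketch in Python (the 3 C rise and the 0.1% drift in Cp are the same illustrative figures used above):

        cp0 = 1004.0    # J/(kg K), nominal specific heat of dry air at constant pressure
        dT = 3.0        # K, assumed temperature rise
        drift = 0.001   # assumed fractional change in Cp over that 3 K

        dE_linear = cp0 * dT                        # energy change per kg with constant Cp
        dE_varying = cp0 * dT * (1.0 + drift / 2)   # Cp drifting linearly across the range

        rel_error = (dE_varying - dE_linear) / dE_varying
        print(rel_error)    # ~5e-4, i.e. a few hundredths of a percent

    Whether one frames the comparison this way or as 0.001Cp/(3Cp), the deviation from the linear treatment stays at a few hundredths of a percent, far below the linear term.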

    Comment by Eli Rabett — 8 Oct 2006 @ 3:55 PM

  70. >66, 12
    Almuth, here’s another science news article on the Nature story
    http://www.sciencedaily.com/releases/2006/09/060927201651.htm

    excerpt:
    ———
    …. it was a decline in emissions of methane from human activities in the 1990s that resulted in the recent slower growth of methane in the global atmosphere.

    Since 1999, however, sources of methane from human activities have again increased, but their effect on the atmosphere has been counteracted by a reduction in wetland emissions of methane over the same period.

    According to one of the authors of the Nature paper, Dr Paul Steele from CSIRO Marine and Atmospheric Research, prolonged drying of wetlands — caused by draining and climate change — has resulted in a reduction in the amount of methane released by wetlands, masking the rise in emissions from human activities.

    “Had it not been for this reduction in methane emissions from wetlands, atmospheric levels of methane would most likely have continued rising,” he says.

    “This suggests that, if the drying trend is reversed and emissions from wetlands return to normal, atmospheric methane levels may increase again, worsening the problem of climate change.”

    The researchers used computer simulations of how the gas is transported in the atmosphere to trace back to the source of methane emissions, based on the past 20 years of atmospheric measurements.
    ———

    Nature’s science news site of course is pay-to-view; this is their teaser:

    Methane emissions on the rise
    Quirin Schiermeier
    SUMMARY: Industrial greenhouse-gas increase has been masked by natural declines.
    CONTEXT: Current projections of methane emissions are likely to be too optimistic, an international team of atmospheric scientists reports today in Nature. Methane, which is less abundant in the atmosphere than carbon dioxide but 20 times more…
    News@Nature (25 Sep 2006) News

    You might look for references to Paul Steele from CSIRO Marine and Atmospheric

    Comment by Hank Roberts — 8 Oct 2006 @ 6:08 PM

  71. What I find most interesting about CO2 is the pattern it shows of increasing steadily after the end of the last several glacial maximums, then dropping quickly as glaciation resumes. These cycles were present long before humans had any ability to affect climate.

    I recognize that we humans are now driving a lot of the CO2 increase in the last hundred years but has this natural cycle ever been fully explained?

    Maybe somebody could point me to a post.

    Comment by Jim Cross — 8 Oct 2006 @ 8:37 PM

  72. Re: 71.

    I see a different pattern in viewing the figure on CO2 at realclimate’s 650,000 years of greenhouse gas concentrations (24 Nov 2005) at:
    http://www.realclimate.org/index.php/archives/2005/11/650000-years-of-greenhouse-gas-concentrations/

    Figure:
    http://www.realclimate.org/epica_co2_f4.jpeg

    Comment by Pat Neuman — 8 Oct 2006 @ 10:14 PM

  73. Jim, some of these charts are plotted with older info on the left, others with older on the right.
    You have to look at the label on each chart to see which way the time arrow runs in the sequence.
    Would you look again at whatever you were looking at when you wrote 71, and check that?
    Compare it to the figure Pat links to in 72, noting the time direction.

    Comment by Hank Roberts — 9 Oct 2006 @ 12:46 AM

  74. Re #71:

    The main driver for the pre-industrial temperature-CO2 level correlation seems to be the ocean uptake. Colder water can absorb more CO2, while warmer water releases CO2. Add to this the changes in biosphere (ice ages – less land left for trees/shrubs/pasture, ocean algae production) and physico/chemosphere (ocean currents, deep water formation/uptake, rock weathering, dissolution) and one sees a remarkably stable correlation over the ice ages (~10 ppmv/K), which holds even in recent high-accumulation ice cores up to the LIA.

    The 10 ppmv/K is the reaction of CO2 to temperature changes. That doesn’t tell anything about the influence of CO2 on temperature. According to GCMs, the 10 ppmv/K in the pre-industrial ages includes a feedback of CO2 levels on temperature, but in nearly all cases there is a huge overlap between temperature changes and CO2 changes. That means that it is not possible to separate what influences what. With one exception: the end of the last interglacial (the Eemian). Temperature and methane levels decreased, but CO2 levels remained high until the temperature nearly reached its minimum (see here). The subsequent decrease of CO2 levels (~40 ppmv) had little influence on the temperature. That points to a low influence of CO2 on temperature in that range. Of course, the much higher increase in recent times will have an influence on temperature, but probably less than what is implemented in current GCMs…

    Comment by Ferdinand Engelbeen — 9 Oct 2006 @ 4:31 AM

  75. re #69

    Lots of people seem to make this point, so thanks for putting it; the error is assuming that the temperature change is only 3 degrees.

    Air warming over land creates thermals, which commonly rise to the tropopause, going from surface temperature to minus 75 degrees, from humidities of 100% to 0%, and from pressures of 1 atm to 0.2 atm. If the deviations from standard gas laws are ignored, then the ratio of potential and kinetic energy in the thermal gases will be incorrect by (roughly) the amount attributed to AGW.

    But that is just another example.

    For some reason, you have ignored that gas conditions across the earth are highly variable and that these gases move around the earth due to winds.

    Comment by Tim Hughes — 9 Oct 2006 @ 4:39 AM

  76. While the points are well taken, it remains possible, and I think advisable, to make a much clearer and stronger statement for public consumption.

    If the greenhouse gas forcing is 2.5 W/m^2, and the total forcing is 1.7 W/m^2, then the greenhouse forcing is responsible for 147% of the total forcing. In other words, even leaving aside the lags in the system, we are only seeing about 2/3 of the warming that would be due to greenhouse gases alone.

    Because greenhouse gases are essentially cumulative and aerosols are essentially instantaneous, and because these two terms dominate, simple extrapolation tends to make the casual observer underestimate the sensitivity of the system to future greenhouse forcing.

    (Additionally, the various lags in the system also make matters worse than they might appear at first consideration. This is somewhat off the present point, except in that it also tends to make generalizations either too opaque or too sanguine.)

    This is why I suggest using the number 147% in this context: it is the ratio of all greenhouse forcings to the net of all forcings. To say the number falls between 40% and 80% depending on interpretation seems to me to contribute to systematic understatement of the problem. At least let us say that the number falls between 40% and 150%.
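
    To make the arithmetic explicit, here is a minimal sketch in Python using the illustrative forcing values quoted above (2.5 W/m^2 for greenhouse gases, 1.7 W/m^2 net of all forcings):

        ghg_forcing = 2.5   # W/m^2, sum of greenhouse gas forcings
        net_forcing = 1.7   # W/m^2, net of all forcings (GHGs offset by aerosols etc.)

        share = ghg_forcing / net_forcing
        print(round(100 * share))       # ~147: GHG forcing as a percentage of the net

        realized = net_forcing / ghg_forcing
        print(round(100 * realized))    # ~68: roughly the "2/3 of the warming" noted above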

    Comment by Michael Tobis — 9 Oct 2006 @ 10:49 AM

  77. Eli at #68 – I think evaporation is more important to consider than condensation for this process. Clearly there can be no rain unless it has first evaporated. I think it is also true that the amount of evaporation in a year is >> the humidity content of the atmosphere and, consequently, there is not much scope for there to be an imbalance between evaporation and precipitation on a global scale.

    Therefore, what goes up, must come down.

    Evaporation would be assumed to increase because of the higher temperatures, acting on the 2/3rds of the world covered by ocean. There are other factors, such as cloudiness and surface winds, that might impact this, but it is less obvious that they will have a strong trend.

    Consequently I find it hard to accept that lack of cloud condensation nuclei could act as a limiter on condensation and subsequently precipitation.

    Comment by Timothy — 9 Oct 2006 @ 11:46 AM

  78. >79, 65
    “thermals, which commonly rise to the tropopause”

    What’s your source for this statement? Where did you read it? Why do you consider that source reliable? Can you give us a link to it, so we can look up the references on which it is based?

    I think you’re talking about “thunderstorms” not “thermals” — thunderstorms do rise that high at times, the flat top “anvil cloud” is what you see where a thunderstorm cloud has risen far enough to reach the tropopause.

    Most thermals don’t have enough moisture carried high enough to even make a little puffy white cumulus cloud. A thunderstorm is a heat engine drawing up enough warm moist air, which continues to expand and release energy and punches on up for a while as it expands and cools, and moisture condenses releasing more heat energy to boost its climb.

    Sometimes, yes, til it gets an anvil top.

    Forest fires usually don’t punch up that high — and a forest fire is a much bigger heat source than a single thermal. Typically you get one of these:

    http://www.atmos.washington.edu/2003Q3/101/notes/Pyrocumulus.jpg

    From “Atmosphere 101″ — looks like a good class there: excerpt:
    ——
    The surface heating of the fire destabilizes the column, air (a “parcel”) from the surface rises, it remains warmer than the surrounding air and keeps rising, then it reaches its condensation level where it has cooled to its dew point temperature and is thus saturated, the cumulus cloud begins forming, but the air since now saturated is cooling at the moist adiabatic lapse rate rather than the dry adiabatic lapse rate, but it continues rising because it remains warmer than its surrounding, thus the cloud grows vertically. ”
    — end excerpt—

    Talk to anyone who’s flown hang gliders or regular gliders about this. Most thermals are quite well mixed and broken up within a few thousand feet above the ground.
    https://ntc.cap.af.mil/ops/DOT/school/images/cullift.gif

    Comment by Hank Roberts — 9 Oct 2006 @ 11:47 AM

  79. Re #75

    If models do neglect gas law deviations and other nonlinearities (you haven’t presented any actual evidence that they do), that would lead to a steady state bias that would be fairly easy to diagnose. Even if the magnitudes were similar, the nonlinearity bias wouldn’t look at all like the time-varying AGW signal, so it’s misleading to compare the two as if AGW might be misattributed nonlinearities.

    Comment by Tom Fiddaman — 9 Oct 2006 @ 1:37 PM

  80. re: 72, 73, 74 in response to my post 71

    I’m referring to a chart such as this:

    http://en.wikipedia.org/wiki/Image:Ice_Age_Temperature.png

    that seems to show temperature, CO2, and glaciation almost in lockstep, with higher temperatures and higher CO2 matching reduced glaciation. What’s more, the increase in CO2 and decrease in glaciation seems to occur very quickly in geological timeframes.

    Post 74 seems to be saying the increased CO2 is caused by warmer oceans and less ice. This would seem to be saying the increased CO2 is a response to warming rather than a cause of it.

    If CO2 is the cause of the warming (which was what my initial post had assumed), then my question was what was driving the CO2 increase.

    Comment by Jim Cross — 9 Oct 2006 @ 6:34 PM

  81. Jim, your question is one of the Highlights of this site — see the right side of the main page for the link, or
    http://www.realclimate.org/index.php?p=13

    Comment by Hank Roberts — 9 Oct 2006 @ 7:21 PM

  82. Re: 80.

    Excerpts:
    Ancient Climate Studies Suggest Earth On Fast Track To Global Warming

    Human activities are releasing greenhouse gases more than 30 times
    faster than the rate of emissions that triggered a period of extreme
    global warming in the Earth’s past, according to an expert on
    ancient climates.

    “The emissions that caused this past episode of global warming
    probably lasted 10,000 years. By burning fossil fuels, we are likely
    to emit the same amount over the next three centuries,” said James
    Zachos, professor of Earth sciences at the University of California,
    Santa Cruz.


    During the PETM, unknown factors released vast quantities of methane
    that had been lying frozen in sediment deposits on the ocean floor.
    After release, most of the methane reacted with dissolved oxygen to
    form carbon dioxide, which made the seawater more acidic.


    Santa Cruz CA (SPX) Feb 16, 2006, text posted at:
    http://groups.yahoo.com/group/ClimateArchive/message/2816

    Comment by Pat Neuman — 9 Oct 2006 @ 11:10 PM

  83. Re: #71 Jim’s Question:

    “What is causing the drop in CO2, as the earth cools, from the interglacial warm period to the glacial coldest period?”

    See this review article in Nature, “Glacial/interglacial variations in atmospheric carbon dioxide” by Sigman and Boyle (2000), for an explanation of why there is a 100 ppm drop in atmospheric carbon dioxide (280 ppm to 180 ppm) as the glacial cycle progressed.

    http://scholar.google.com/url?sa=U&q=http://www.atmos.ucla.edu/~gruber/teaching/papers_to_read/sigman_nat_00.pdf

    The 100 ppm drop in CO2 is not primarily due to colder oceans. The following is an explanation of why colder oceans alone cannot account for a 100 ppm drop in CO2. (See the Nature paper for details.)

    As there is a vast amount of fresh water locked up in the new ice sheets during the glacial period, the ocean becomes saltier (by 3%). Saltier water can hold less carbon dioxide (6.5 ppm less for a 3% increase in salt content). Colder water can hold more carbon dioxide; however, the deep ocean is already at an average of 4 C and will freeze (salty or not) at around -1.8 C. The estimated maximum drop in deep ocean temperature is 2.5 C. The surface subtropical oceans were estimated to have cooled by about 5 C. (Note vast areas of the high latitude oceans were covered by ice during the coldest period and could hence no longer absorb carbon dioxide.)

    The reduction in carbon dioxide due to colder oceans is estimated to be at most 30 ppm. Now, as vast areas of land that are currently forested were covered by the glacial-period ice sheets, the temperate forest no longer took up carbon dioxide, which adds carbon dioxide to the atmosphere. In addition, during the glacial period large sections of tropical rain forest changed to savanna (about a third of the tropical forest; the planet is drier when it is colder), and as savanna is less productive than tropical forest, that change also adds carbon dioxide to the atmosphere. The Nature article estimates the temperate forest change and the increase in savanna add 15 ppm of carbon dioxide to the atmosphere. The net for this calculation is therefore -30 ppm + 6.5 ppm + 15 ppm = -8.5 ppm.
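
    For clarity, that bookkeeping can be laid out as a minimal sketch in Python (sign convention: negative means CO2 is removed from the glacial atmosphere; the ppm values are the estimates quoted above):

        terms_ppm = {
            "colder ocean absorbs more CO2": -30.0,
            "saltier ocean holds less CO2": +6.5,
            "reduced forests / more savanna": +15.0,
        }

        net = sum(terms_ppm.values())
        print(net)   # -8.5 ppm, leaving most of the ~100 ppm glacial drawdown to other mechanisms (see below)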

    As there is a 100 ppm drop to explain, the mechanisms above are not the solution. The article attributes the bulk of the change to increased biological production in the ocean.

    During the glacial period there are periodic (200 yr, 500 yr, and 1500 yr cycle) rapid cooling events (RCCE, “Rickeys”), during which there is an increase in dust (800 times above current levels in the Northern Hemisphere, from Greenland ice sheet cores, and about 15 times in the Southern Hemisphere, from Antarctic ice sheet cores). The iron and phosphate in the dust cause an increase in biologic production in regions of the earth’s ocean which are currently almost lifeless due to a lack of nutrients. The increase in biologic production removes the CO2.

    Comment by William Astley — 10 Oct 2006 @ 12:00 AM

  84. re #78

    Sorry, I was using “thermals” badly to mean any thermal column. I was trying to keep things too simple.

    re #79

    I don’t have any evidence, which is why I asked the question “do the models take into account gas law deviations and other non-linearities.”

    [Response: Gas law deviations are too small to matter -W]

    Comment by Tim Hughes — 10 Oct 2006 @ 3:44 AM

  85. Thanks for the responses to my question in #71.

    To quote from one of the links regarding the warming at the end of maximum glaciation:

    “Some (currently unknown) process causes Antarctica and the surrounding ocean to warm. This process also causes CO2 to start rising, about 800 years later.”

    It is the “currently unknown” part of that I find intriguing. It seems that understanding what triggers the CO2 increases and decreases, which may not be the same thing, is critical to understanding our past and future climate.

    Comment by Jim Cross — 10 Oct 2006 @ 7:12 PM

  86. Re: 85.

    Some links about climate change episodes in Earth’s past:

    http://www.geolsoc.org.uk/template.cfm?name=fbasalts

    http://www.geology.sdsu.edu/how_volcanoes_work/

    http://www.scotese.com/

    Also see:
    http://groups.yahoo.com/group/ClimateArchive/links

    Comment by Pat Neuman — 10 Oct 2006 @ 9:27 PM

  87. RE: 86,

    Pat, I will be forever grateful that you provided the link to
    Chris Scotese web page. His graphics and animations are what make earth science so fascinating. And, thanks to Chris as well.

    Every science teacher should have his web page bookmarked.

    Comment by John L. McCormick — 11 Oct 2006 @ 10:02 AM

  88. John, Thank you! The information on the Scotese web page was very helpful to me as I was learning about climate change, and continue to learn of course.

    Maybe someone from the Alaska Climate Research could help me out with this question. When was the last time 9 inches of rain was measured within a two-day period or less in Alaska during the month of October?

    “The National Weather Service reported that 9 inches of rain had fallen
    in Seward between noon Sunday and 5 p.m. Monday. Tom Dang of the
    National Weather Service said the low pressure system that caused the
    storm moved in on the jet stream from the Aleutian Islands, pulling in
    tropical moisture that had welled there. … “Within a half hour there
    were chunks of ice — I can only assume from Exit Glacier — flowing
    down Exit Glacier Road,” …

    (May need to register with The Anchorage Daily News to get to the story)
    October 10, 2006
    http://www.adn.com/news/alaska/kenai/story/8288801p-8185319c.html

    Comment by Pat Neuman — 11 Oct 2006 @ 9:51 PM

  89. If Stu Ostro, Senior Meteorologist with The Weather Channel, can discuss what he sees as the impact of climate change upon day-to-day weather patterns – then how come Tom Dang, NOAA’s National Weather Service, is not allowed (or refuses) to do that … in the public interest?

    Excerpt 1 of 2:
    The Weather Channel Blog

    “I have written on the impacts of climate change upon day-to-day weather patterns in these pages during the past year … and will have more comment on this one when I have the time (was going to allude to this aspect in my entry last night but couldn’t have quickly done it justice). For now, suffice it to say that I think the occurrence of this event in Alaska was not an “accident.” More broadly in regard to global warming’s impact in Alaska, TWC did a feature on this a couple of years ago. The videos and a text piece can still be found online here” :
    http://www.weather.com/aboutus/television/forecastearth/alaska.html
    Comment posted by Stu Ostro, October 11, 2006
    FROM HAWAII TO ALASKA
    October 10, 2006
    The Weather Channel Blog
    http://www.weather.com/blog/weather/?from=wxcenter_news

    Excerpt 2 of 2:
    Anchorage Daily News:

    “Tom Dang of the National Weather Service said the low-pressure system that caused the storm moved in on the jet stream from the Aleutian Islands, pulling in tropical moisture that had welled there. … “Within a half hour there were chunks of ice — I can only assume from Exit Glacier — flowing down Exit Glacier Road,” …
    (May need to register with Anchorage Daily News to get to story)
    October 10, 2006
    http://www.adn.com/news/alaska/kenai/story/8288801p-8185319c.html

    Comment by Pat Neuman — 12 Oct 2006 @ 9:52 AM

  90. For those who may think the flooding in Alaska earlier this week is not out of the ordinary please check out the link (below) to the story and photo.

    Caption reads:

    “Floods pound highway: Waterfalls on Tuesday wash over the Richardson Highway running through Keystone Canyon, north of Valdez. Floodwaters severely damaged a stretch of the highway, closing the road and blocking Valdez from the rest of Alaska.”

    http://www.juneauempire.com/stories/101206/sta_20061012015.shtml

    Comment by Pat Neuman — 12 Oct 2006 @ 8:21 PM

  91. re 87. 88.

    The same storm that dumped nine inches of rain this week in Alaska has now dumped more than nine inches of lake-effect snow in western New York.

    The hydrologic effect of climate change in lowering Great Lakes water levels is a major concern for many people.

    The Oct 13, 2006 Weekly Great Lakes Water Level Update from the Corps
    of Engineers Detroit District Office shows that:

    “Lake Superior’s water level is currently 11 inches lower than it was
    a year ago, while Lake Michigan-Huron is 1 inch below last year.”

    Recorded lake level data (1918-PRESENT)

    Difference from recorded average Oct. levels:
    SUPERIOR (-17 inches), MICH-HURON (-19 inches).

    Diff. from recorded lowest Oct. mean levels:
    SUPERIOR (0 inches), MICH-HURON (+11 inches).

    http://www.lre.usace.army.mil/

    Lakes Michigan and Huron are considered a single lake, hydrologically.

    The hydrologic effects of climate change on Great Lakes water levels include greater evaporation and longer growing seasons. Longer growing seasons result in more transpiration and less inflow to the lakes.

    A combination of high winds and warm lake surface water temperatures has caused the heavy snowfalls in the downwind lake-effect areas of the Great Lakes.

    Lake Superior is currently near a record low level for this time of year (records from 1918 to current).

    Comment by Pat Neuman — 13 Oct 2006 @ 3:07 PM

  92. Pat,

    I observed off-the-scale vertical sun disc expansions to the southwest of Resolute a few days ago; this comes from the Pacific Ocean powering a huge cyclone. North Pacific low-pressure systems seem to be heading northwards because it isn’t so cold in the Arctic. I like your Great Lakes piece; do you think the lack of ice last winter contributed to the decline in Great Lakes water levels?

    Comment by wayne davidson — 14 Oct 2006 @ 1:46 AM

  93. Re # 91
    Erie levels predicted to drop
    Story from the Monday, July 24, Edition of the Chronicle-Telegram (Elyria, Ohio)
    The Associated Press

    CLEVELAND – The newest update to a Lake Erie management plan predicts global warming will lead to a steep drop in water levels over the next 64 years, a change that could cause the lake’s surface area to shrink by up to 15 percent.
    The drop could undo years of shoreline abuse by allowing water to resume the natural coastal circulation that has become blocked by structures, experts said.
    Updated annually, the plan is required by the Great Lakes Water Quality Agreement between the United States and Canada. It is developed by the U.S. Environmental Protection Agency, Environment Canada and state and local governments with help from the shipping industry, sports-fishing operators, farm interests, academics and environmental organizations.
    The newest update addresses, for the first time, when, where and how the shoreline will be reshaped. It says the water temperature of Lake Erie has increased by one degree since 1988 and predicts the lake’s level could fall about 34 inches. It also says the other Great Lakes will lose water.
    If the projections are accurate, Lake Erie would be reduced by one-sixth by late this century, exposing nearly 2,200 square miles of land and creating marshes, prairies, beaches and forests, researchers said.
    Researchers said new islands are appearing in the western basin, where Lake Erie is at its lowest and some reefs are about 2 feet below surface.
    “There is now stronger evidence than ever of human-induced climate change,” states the report, dated this spring. “Our climate is expected to continue to become warmer. This will result in significant reductions in lake level, exposing new shorelines and creating tremendous opportunities for large-scale restoration of highly valued habitats.”
    A predicted drop in water levels also has been addressed by the International Joint Commission, an American-Canadian panel that controls water discharges out of Lake Superior and the St. Lawrence River. The commission told scientists at a workshop in February that research showed water levels should begin decreasing before 2050.
    “We can try to be positive about climate change, really positive,” said Jeff Tyson, a senior fisheries biologist at the Ohio Department of Natural Resources, who helped write a portion of the management plan. “If it continues to be hot, once you lose that meter of water over the top, we get an entirely natural, new shoreline along a lot of the lakefront. If we manage it right, things could look a lot like they did when the first white settlers arrived.”
    The report was written in an effort to spark thought about what the shoreline could become, said Jan Ciborowski, a professor at the University of Windsor who specializes in aquatic ecology and also helped write the plan.
    “There is a lot of opinion among scientists who study the Great Lakes that we need to get the public to start thinking: ‘What are things going to look like?’ ” Ciborowski said.
    The plan monitors issues ranging from pollution to invasive species, said Dan O’Riordan, an EPA manager at the Great Lakes National Program Office in Chicago. He said the agency recognizes the views of experts who predict the lake will shrink.
    “They’ve done the math; I would trust the math,” he said.

    http://www.chroniclet.com/Daily%20Pages/072406local1.html

    Comment by Chuck Booth — 15 Oct 2006 @ 12:12 AM

  94. re. 93

    A plot showing Lake Superior monthly elevations from 2004 to 2006, with the 1925-1926 record low monthly elevations shown for comparison, is at the link below. I requested at cleveland.indymedia that people let me know if they want to see a similar plot for Lake Erie.

    In my opinion, the current near record (1918-2005) low October average level on Lake Superior and the low levels on Michigan-Huron are due to climate change. The severe drought in the Upper Midwest this year was a large contributing factor to the low levels on Lake Superior.

    The level of Michigan-Huron (a single lake hydrologically) has been low for several years already but it will take a tumble over the next few months unless multiple periods of heavy rain develop during that time.

    http://cleveland.indymedia.org/news/2006/10/22834.php

    Comment by Pat Neuman — 15 Oct 2006 @ 9:58 PM

  95. Gavin, my question relates to methane forcing and policy. How much of the atmospheric methane is released by incomplete combustion of fossil fuels? If fossil fuel combustion is a significant source of atmospheric methane as well as the major driver of CO2 increases, then policies that reduce fossil fuel combustion would reduce releases of both gases.

    Comment by Sue Radford — 19 Oct 2006 @ 11:05 PM

  96. So how do we know how much increased CO2 has increased convection? With more heat at the bottom of the column, convection will increase, correct? Won’t that make more cumulus clouds, causing more rain and higher albedo?

    Comment by Steve Hemphill — 21 Oct 2006 @ 11:53 AM

  97. Further, you give a forcing of -0.15 w/m^2 to land use changes. Is this number simply a basic surface albedo difference forcing which does not include feedbacks such as increased cloud cover over vegetated land or increased biochemical production as opposed to sensible heat production by biomass? Is a precision of two significant figures appropriate?

    Thank you for your constructive dialogue.

    Comment by Steve Hemphill — 24 Oct 2006 @ 12:01 AM

  98. Re #97 and “Further, you give a forcing of -0.15 w/m^2 to land use changes. Is this number simply a basic surface albedo difference forcing which does not include feedbacks such as increased cloud cover over vegetated land or increased biochemical production as opposed to sensible heat production by biomass? Is a precision of two significant figures appropriate?”

    Land use changes include deforestation, which releases stored CO2 into the atmosphere. Especially when the deforestation is accomplished by burning the forest, which it frequently is these days.

    -BPL

    Comment by Barton Paul Levenson — 24 Oct 2006 @ 7:15 AM

  99. This explanation of global warming, although maybe well intentioned, is unclear, foggy and dangerous. It pretends to inform and educate but instead obfuscates the real effects of CO2, methane, etc., and supplies a convenient red herring to the companies and individuals that continue to chip away at the land, sea and air of planet earth, making it uninhabitable for all living things.

    Comment by ralph adams — 24 Oct 2006 @ 8:53 PM

  100. Re #98,
    Quite true, and a question is how much of the carbon taken up by the forests and subsequently burned ends up as ash and charcoal instead of returning to the atmosphere as CO2.

    Re #99,
    Can you be more specific than “This explanation”?

    Thanks
    Steve

    Comment by Steve Hemphill — 25 Oct 2006 @ 1:32 AM

  101. CO2 levels appear to have risen steadily only since the 1920s… So what caused the temperature increase from 1900 to the 1920s? And what caused the temperature DECREASE from 1940 to 1970? … Until you explain either situation, forget about convincing anyone that CO2 is the main forcing agent…

    Comment by Jeff — 27 Oct 2006 @ 3:40 PM

  102. re: 101. Goodness, all you have to do is type “1940-1970 cooling” in the search box at the top of the page to see the scientific explanation for that cooling trend! Is it that hard to do? Until you make the simple effort to look for clear results to your basic questions, forget about convincing anyone that CO2 is not the main forcing agent.

    Comment by Dan — 27 Oct 2006 @ 5:45 PM

  103. “Jeff” — what’s your source for your belief that “CO2 levels appear to have risen steadily only since the 1920s”? You wrote that at 3:40 am?

    I can’t find a source for that, searching the web. Who’s your source? And, why do you believe your source? Are you looking at a picture or table in a publication? Or did someone you trust tell you that’s a fact?

    Comment by Hank Roberts — 27 Oct 2006 @ 8:17 PM

  104. Jeff,

    You could even check out

    http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    It seems more-or-less accurate to me.

    Comment by David donovan — 28 Oct 2006 @ 4:46 AM

  105. I don’t find Stocker with the search tool; has this been discussed?

    Broecker, W.S., and T.F. Stocker, 2006: The Holocene CO2 rise: Anthropogenic or natural? Eos, Transactions of the American Geophysical Union, 87, 27-29.

    http://www.climate.unibe.ch/~stocker/papers/broecker06eos.pdf

    Comment by Hank Roberts — 31 Oct 2006 @ 7:54 PM

  106. This is perhaps a little off-topic, but I would very much appreciate a response.

    If the level of our emissions of carbon dioxide does not fall below the biosphere’s capacity to absorb it, is there any way that the concentration of carbon dioxide in the atmosphere can fall?

    The 2001 IPCC Report states that carbon dioxide stays in the atmosphere for around 200 years. What happens to it after that?

    My reason for asking these questions is that, in his latest book – “Heat: How to stop the planet burning” – George Monbiot tells us that the current concentration of greenhouse gases in the atmosphere is a carbon dioxide equivalent of 440 or 450 ppm, as well as asserting (based on research from the Potsdam ICI) that this concentration must not exceed 440 ppm in 2030 if we are to stand a good chance of avoiding a 2 degrees centigrade rise in the average global temperature.

    With my very basic understanding of the issues, if what he says is true it would seem to me that – to avoid a 2 degree rise – we cannot afford to emit any more carbon dioxide than the biosphere will absorb, and we may need to emit less than it will absorb. Monbiot proposes nothing so drastic.

    Am I missing something?

    [Response: See http://www.realclimate.org/index.php/archives/2005/03/how-long-will-global-warming-last/ - gavin]

    Comment by Jack Cregan (Layman) — 2 Nov 2006 @ 2:17 PM

  107. Dear Gavin
    As a layman I would appreciate your comments on the opinions of Lubos Motl on his Physics Blog which I reproduce below – specifically that the forcing response of CO2 has a reducing (1-exp) component.

    Motl says this:
    You should realize that the carbon dioxide only absorbs the infrared radiation at certain frequencies, and it can only absorb the maximum of 100% of the radiation at these frequencies. By this comment, I want to point out that the “forcing” – the expected additive shift of the terrestrial equilibrium temperature – is not a linear function of the carbon dioxide concentration. Instead, the additional greenhouse effect becomes increasingly unimportant as the concentration increases: the expected temperature increase for a single frequency is something like

    1.5 ( 1 – exp[-(concentration-280)/200 ppm] ) Celsius

    The decreasing exponential tells you how much radiation at the critical frequencies is able to penetrate through the carbon dioxide and leave the planet. The numbers in the formula above are not completely accurate and the precise exponential form is not quite robust either but the qualitative message is reliable. When the concentration increases, additional CO2 becomes less and less important.

    The formula above simply does not allow you more than 1.5 Celsius degrees of warming from the CO2 greenhouse effect. Similar formulae based on the Arrhenius’ law predicts a decrease of the derivative “d Temperature / d Concentration” to be just a power law – not exponential decrease – but it is still a decrease.

    He goes on to say:

    When you substitute the concentration of 560 ppm (parts per million), you obtain something like 1 Celsius degree increase relatively to the pre-industrial era. But even if you plug in the current concentration of 380 ppm, you obtain about 0.76 Celsius degrees of “global warming”. Although we have only completed about 40% of the proverbial CO2 doubling, we have already achieved about 75% of the warming effect that is expected from such a doubling: the difference is a result of the exponentially suppressed influence of the growing carbon dioxide concentration.

    Many thanks for your time in advance
    Lawrie Boxall

    [Response: If the temperature-CO2 relation were as simple as Lubos suggests, all would indeed be simple. But it isn't, as he knows full well. That the T-CO2 relation is approximately logarithmic is no surprise; it's why future T increases tend to be approximately linear when CO2 increases exponentially - see for example http://www.grida.no/climate/ipcc_tar/wg1/fig9-5.htm - William]

    Comment by Lawrie Boxall — 5 Nov 2006 @ 8:15 AM

  108. moderator – this is actually about another article “Tropical Glacier Retreat”

    I regard myself as a skeptic, but I don’t have a “skeptic press”, and grouping me in with people whose work I have never even read is slightly offensive.

    Anyway, I thought all scientists were skeptics, and skeptics are your friends.

    My thoughts are that it would be better if your articles did not mention works that you disapprove of, as they are not relevant. Perhaps you should have articles devoted just to those works that you disapprove of.

    Comment by Tim Hughes — 5 Nov 2006 @ 11:46 AM

  109. Re #107: I think there is some confusion here between the direct radiative effect of adding carbon dioxide to the atmosphere, and actual effect felt on Earth because of various feedbacks.

    Lubos correctly points out (confirmed by William) that the effect of greenhouse gases, including carbon dioxide, is logarithmic. That means each doubling of greenhouse gas levels produces the same forcing (and hence roughly the same rise in temperature). It does not mean the greenhouse effect is limited to 1.5 degrees – it seems our physicist is incapable of elementary math.

    He does (sort of) correctly state that the direct, no-feedback warming due to a doubling of carbon dioxide is 1 degree (actually about 1.2 degrees C). He leaves out the fact that feedbacks will amplify this by a factor of two or three (exactly how much is called “climate sensitivity”, and is still a matter of debate). So the amount of warming so far includes these feedbacks, and is not inconsistent with current theory. The future warming from more carbon dioxide that will be “exponentially suppressed” is relatively small.

    His implication that the potential for greenhouse warming is almost all used up makes no sense. There have been much higher levels of carbon dioxide and global temperatures in the distant past, which would not be possible if he was correct.

    Comment by Blair Dowden — 5 Nov 2006 @ 7:25 PM

  110. Re 107, 109

    Actually, Lubos points out the logarithmic effect but applies it incorrectly, because he concludes that at 380 ppm ~75% of the 2x effect has been achieved. Using his own formula, it’s 52%, not 75%, unless there’s a typo. If you calculate it logarithmically, ln(380/280)/ln(560/280) = 44%.

    Second, the language is misleading. Most will read “75% of the warming effect” as 75% of the temperature change, when in fact it’s only 75% (or 44-52%) of the direct forcing that’s been achieved. Taking account of the large thermal mass of the system, the temperature change seen today is an even smaller fraction of the potential, even if you don’t believe in feedbacks (as he apparently does not).

    Comment by Tom Fiddaman — 5 Nov 2006 @ 10:00 PM
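
    The percentages traded in this exchange can be checked with a short calculation. The sketch below (Python, included here only as an illustration) evaluates the formula quoted in comment 107, 1.5 * (1 - exp(-(C - 280)/200)), at 380 and 560 ppm, and compares the resulting fraction of the 2xCO2 effect with the logarithmic form ln(C/280)/ln(560/280). The 280 ppm baseline and 200 ppm scale are taken from the quoted text as written, not verified independently.

    import math

    def quoted_formula_warming(c_ppm):
        # Warming (deg C) from the formula quoted in comment 107, taken as written there.
        return 1.5 * (1.0 - math.exp(-(c_ppm - 280.0) / 200.0))

    def log_fraction(c_ppm, c0=280.0, c_double=560.0):
        # Fraction of the 2xCO2 effect realized at c_ppm under a logarithmic dependence.
        return math.log(c_ppm / c0) / math.log(c_double / c0)

    frac_quoted = quoted_formula_warming(380) / quoted_formula_warming(560)
    print("Quoted formula: %.0f%% of the 2xCO2 effect at 380 ppm" % (100 * frac_quoted))    # ~52%
    print("Logarithmic:    %.0f%% of the 2xCO2 effect at 380 ppm" % (100 * log_fraction(380)))  # ~44%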

  111. Re #110: Well, Tom, he correctly says the effect is logarithmic but incorrectly calculates it. The language is certainly and deliberately misleading. By my calculation, before any feedbacks, a CO2 rise from 270 ppm (the pre-industrial level) to 370 ppm (near today’s levels) gives about 0.45 degrees of warming, while a rise from 440 ppm to 540 ppm (double the pre-industrial level) gives about 0.29 degrees. So it is less, but it does not vanish into nothing as Lubos claims.

    You are right that thermal inertia is delaying the effect of the greenhouse gas increases so far. But that will continue to be the case as long as greenhouse gas levels are increasing.

    Comment by Blair Dowden — 6 Nov 2006 @ 7:46 AM
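
    The figures in this comment can be reproduced with a similar back-of-the-envelope sketch (again only an illustration, not part of the original comment). It assumes the standard simplified forcing expression dF = 5.35 * ln(C2/C1) W/m^2 and a no-feedback response of roughly 1 degree C per CO2 doubling (about 0.27 K per W/m^2, consistent with the value used in comment 109); both constants are assumptions of the sketch. With them it gives approximately 0.45 and 0.30 degrees for the two concentration ranges, matching the comment within rounding.

    import math

    F_PER_DOUBLING = 5.35 * math.log(2.0)       # ~3.7 W/m^2 per CO2 doubling (assumed constant)
    LAMBDA_NOFEEDBACK = 1.0 / F_PER_DOUBLING    # ~0.27 K per W/m^2, i.e. ~1 K per doubling (assumption)

    def no_feedback_warming(c1_ppm, c2_ppm):
        # No-feedback warming (deg C) for a CO2 rise from c1_ppm to c2_ppm.
        return LAMBDA_NOFEEDBACK * 5.35 * math.log(c2_ppm / c1_ppm)

    print("270 -> 370 ppm: %.2f deg C" % no_feedback_warming(270, 370))  # ~0.45
    print("440 -> 540 ppm: %.2f deg C" % no_feedback_warming(440, 540))  # ~0.30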

  112. [...] Estimate of 1750-2000 Climate Forcings [...]

    Pingback by A Musing Environment » Blog Archive » Climate Forcings — 18 Sep 2007 @ 8:56 PM

  113. [...] Attribution of 20th Century climate change to CO2 [...]

    Pingback by FNs klimapanel - IPCC « Stigs klimablogg — 16 Apr 2008 @ 1:08 PM

  114. [...] to David Evans, is since 2001 temperatures around the world have stopped rising. And that’s despite increasing levels of carbon dioxide in the air. So statistically, in the last seven years, the flattening and perhaps even slight [...]

    Pingback by Spot the recycled denial II - 60 Minutes crunch time « BraveNewClimate.com — 20 Aug 2008 @ 1:48 AM

