CMIP5 simulations

Personally, I am far more interested in the inclusion of the paleo component in CMIP5 (see Braconnot et al., p15). Paleo-climate simulations with the same models that are being used for the future projections allow for true ‘out-of-sample’ testing of the models over periods with significant climate changes. Much of the previous work in evaluating the IPCC models has been based on modern-period skill metrics (the climatology, seasonality, interannual variability, the response to Pinatubo etc.), but while useful, this doesn’t encompass changes of the same magnitude as those predicted for the 21st century. Including tests with simulations of the Last Glacial Maximum, the Mid-Holocene or the Last Millennium greatly expands the range of model evaluation (see Schmidt (2010) for more discussion).
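To make the idea of out-of-sample evaluation concrete, here is a minimal sketch of the kind of comparison involved: simulated Last Glacial Maximum temperature anomalies are scored against proxy reconstructions at a handful of sites. All of the numbers and site values below are hypothetical placeholders, not actual PMIP or proxy data.

```python
import numpy as np

# Hypothetical proxy-reconstructed LGM temperature anomalies (deg C relative to
# pre-industrial) at a few sites, with one-sigma uncertainties.
proxy_anom  = np.array([-21.0, -5.5, -3.2, -6.8, -2.1])
proxy_sigma = np.array([  2.0,  1.5,  1.0,  1.5,  1.0])

# Hypothetical model-simulated anomalies interpolated to the same sites.
model_anom  = np.array([-18.5, -4.8, -2.5, -7.9, -1.4])

# Two simple skill measures: root-mean-square error, and an
# uncertainty-weighted misfit (a reduced chi-square).
rmse = np.sqrt(np.mean((model_anom - proxy_anom) ** 2))
chi2 = np.mean(((model_anom - proxy_anom) / proxy_sigma) ** 2)

print(f"RMSE = {rmse:.2f} K, reduced chi-square = {chi2:.2f}")
```

The same kind of score computed for the Mid-Holocene or the Last Millennium provides an evaluation target that is independent of the modern climatology the models are tuned against.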

The CLIVAR newsletter has a number of other interesting articles, on CFMIP (p20), the scenarios being used (RCPs) (p12), the ESG data delivery system (p40), satellite comparisons (p46 and p47) and the carbon-cycle simulations (p27). Indeed, I think the range of issues covered presages the depth and interest that the CMIP5 archive will eventually generate.

There will be a WCRP meeting in October in Denver that will be very focused on the CMIP5 results, and it is likely that much of the context for the AR5 report will be reflected there.


113 comments on this post.
  1. Richard Whiteford:

    This is a must read book for all climate presenters:
    http://www.amazon.com/Living-Denial-Climate-Emotions-Everyday/dp/0262515857/ref=sr_1_1?s=books&ie=UTF8&qid=1313067079&sr=1-1

  2. Dikran Marsupial:

    It is impressive that climate modelling has reached the point where decadal prediction is being seriously considered. The first I heard about it was from the paper by Fildes and Kourentzes on “Validation and forecasting accuracy in models of climate change”. I suspect that those performing decadal predictions are well aware of proper validation procedure already.

    [Response: Actually, that isn't a given. This is a new endeavour and although it can draw on experience with seasonal forecasting, it has its own issues (lack of sufficient examples of usable hindcasts from which to derive bias corrections for instance, also rapidly evolving observational networks etc.). The whole thing is very much at the experimental stage - how to initialise, how to evaluate, how to predict are all up in the air - at least for now. - gavin]
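    As a rough sketch of the kind of hindcast-based drift/bias correction mentioned in the response above (purely illustrative: the hindcast and observation arrays are invented, and operational systems treat start-date- and lead-time-dependent drift far more carefully):

```python
import numpy as np

# Hypothetical ensemble-mean hindcasts, shape (n_start_dates, n_lead_years),
# and the matching observed anomalies.
hindcasts = np.array([[0.10, 0.22, 0.31],
                      [0.15, 0.28, 0.35],
                      [0.05, 0.18, 0.27]])
obs       = np.array([[0.02, 0.10, 0.20],
                      [0.08, 0.15, 0.26],
                      [0.00, 0.07, 0.18]])

# Mean model drift as a function of lead time, estimated from the hindcasts.
drift = (hindcasts - obs).mean(axis=0)

# A new forecast is then corrected by subtracting the lead-dependent drift.
new_forecast = np.array([0.20, 0.33, 0.41])
print(new_forecast - drift)
```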

  3. PeteB:

    Gavin,

    I just skim read Schmidt (2010). Sounds very interesting, and a great way of testing models, but looks like a large project needing interdisciplinary input. Is there any ‘grand plan’ on this with timescales – I guess this is likely to take years to really get going

    [Response: Indeed. Actually this is a big focus for a few people (including me, but it goes quite wide). Getting the paleo runs included in CMIP took a lot of lobbying from PMIP, and getting people to realise the potential for using this resource is ongoing. I'm part of an effort to run a workshop next year to focus exactly on this for instance. You are right that this will take years, but I'm hopeful it will happen. - gavin]

  4. William Freimuth:

    Hindcasting problem? The problem is most of the world is either unaware or in total denial of the emerging crisis. Obviously climate models were not available in the Dark Ages (which we’re perhaps still in) but data exists to make a better case for “change”.

    For Christ’s sake don’t point out the difficulties….point out the problem!

    Could there be a court set up whose job it is to track those who are lying and enforce penalties relative to the damages forthcoming? If nothing else, at least get off the defensive and start charging those in the way with crimes. Perhaps Limbaugh could be sent an assessment of his ‘offenses’ as the damages due to neglect mount. BTW, lying is one of the essential violations of Christendom’s B1G (commandments).

    [Response: Careful, you are sounding a bit like Steve McI, only in reverse. --eric]

  5. Pete Dunkelberg:

    Re # 2: Do Climate Models need Independent Verification and Validation?

  6. Edward Greisch:

    1 Richard Whiteford: Read it. I Agree. Also read “The Authoritarians” by Bob Altemeyer. Free download from:
    http://home.cc.umanitoba.ca/~altemey/

    Also scary: http://thinkprogress.org/romm/2011/08/11/293781/attacks-on-climate-science-education

  7. David Horton:

    Bit hard on William, Eric. It seems to me too not so much like rearranging deck chairs on the Titanic as determining exactly the position of each one before the boat sank. Do any scientists still believe that overcoming the Koch brothers’ funding of a massive disinformation campaign is just a matter of refining models? Do you think that, say, a President Bachmann, Perry, or indeed Obama, will suddenly say “Oh, at last, decadal prediction, now I can act”?

    Of course I know we need to keep refining models, but in the current, and foreseeable, political climate, it seems like merely an academic exercise.

  8. Hero:

    Thanks for all your hard work Gavin, and to the scientists around the globe for their contribution to this important body of work.

  9. Pete H.:

    Re #4, For Christ’s sake DO point out the difficulties and DO refine the models. That’s how science progresses. And if we get to the point where the American public will tolerate witch trials, then the message will have already made it through. So there will be no point in retribution. Your comment makes me wonder if you’re just trying to bait the RC bloggers.

    Re #7 “Do any scientists still believe that overcoming the Koch brothers’ funding of a massive disinformation campaign is just a matter of refining models?” What makes you think refining models is about overcoming the Koch brothers? Or that it is a strictly academic exercise?

    The practical side of refining models? Policy makers need temporally and spatially refined models to make plans to deal with the inevitable (in the pipeline) changes we are going to experience. And let’s not forget, what at one time seemed like strictly academic exercises is what brought us to the current level of knowledge about climate.

    [Response: Thank you for getting it, because a lot of folks (see above) sure don't.--Jim]

  10. richard pauli:

    Pete #8 re: #7…

    Ummm, what policy? I see very little policy that is based on AGW science – most is based on politics and economics and carbon fuel interests. Lots of good ideas, good plans, great science, even a few token laws that impart symbolic action – mpg goals, carbon tracking, etc.

    There is some good legislative success around stopping acid rain and the freon/ozone hole problem – but those were very small and economic no-brainers. But when it comes to global warming — there is NOT MUCH policy in place that exerts real physical interaction with this problem. Especially in corralling the rampaging elephant in the room of carbon combustion. No constraints on coal – and no curtailing of crude. We all talk. But nothing is happening.

  11. Kevin McKinney:

    #8–

    Good points, Pete. But the current case of Dr. Monnett seems to me perilously close to a witch trial *now*–true, criminal prosecution will not be undertaken, but Dr. Monnett is suffering real harm–psychologically and to his reputation at least, and with a significant risk of career and financial harm to come–apparently on the basis of pure speculation. And a significant number of folks are not only ‘tolerating,’ but cheering.

  12. Joe Hunkins:

    Thank you Gavin – great project and nice description, along with your 2008
    “IPCC” modelling intro: http://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/

  13. Jim:

    If some people want to drag this post down into the swamp hole of politics, go elsewhere. This is a science post. The number of interesting and relevant and educational-for-everyone topics that could be discussed here is damn near infinite, but the politics of action against climate change…

    …is not one of them.

    Thanks

  14. Kevin McKinney:

    Good point, Jim, there’s “Unforced Variations” available for such. Or at least, arctic discussion. A (mild) mea culpa.

  15. MS:

    Thank you, Gavin for writing about this topic and for the link to Schmidt (2010).
    I understand that the (mid) Pliocene climate could give us clues to what the long-term consequences of enhanced anthropogenic warming might be.
    I am wondering if the data and the model runs really show something similar to the Pliocene or if it is too early to tell.
    Does the Pliocene seem to be a good “model” for future warming or is it suspected that there would be major differences?

    [Response: There will almost certainly be major differences, but I still think insight can be gained. The key for the Pliocene model/data comparison is the relative imprecision of the CO2 (and other GHG levels), and the uncertainty in the ice sheet/vegetation reconstructions. The Lunt et al (2010) paper discusses some of this. - gavin]

  16. richard pauli:

    #13 Jim – you touch a very important aspect of climate change – human politics. But I cannot agree with ignoring it.

    You may not want to talk politics, but all the climate science I read concludes that humans are a direct cause of excessive warming. So we may not want to remove human influence from this science.

    As we model and propose future scenarios – just how, pray tell, do we remove politics from the discussion? And don’t we want to influence public policy?

    You ask that politics be ignored, but the climate models themselves are specifically predicated on different social reactions to carbon emission change.

    Perhaps we could talk of the psychology of denial and the human reticence to change – something rigorously studied by psychologists who also write peer reviewed papers. (admittedly a vast subject area, this indeed deserves its own web site)

    But I hope you are not saying that we must remove human psychology from modeling climate scenarios.

  17. Edward Greisch:

    13 Jim: What web site would you suggest for the politics of action against climate change? [Not counting climateprogress.org or Daily Kos] And what web site would you suggest for social sciences of climate change? No matter how much you dislike politics, if you want to get anything done, you have to go there. The Contributors should be running for the US Senate because GW would not be happening if the Senate had 60 scientists in it.

    Yes, policy makers do need computer simulations that will tell them what actions would actually work. That is, if any actions would actually work. It is true at the present time that we could put an end to GW. It will not be true some time in the future. The models should tell us how the costs will ramp up and when the last possible moment to save ourselves will be. The social sciences tell us why the public doesn’t get it. That is also something we have to know if we are to make a difference.

    It IS important to continue doing climate science. Research is also part of education. We must have the fully trained scientists to be witnesses before Congress. There must always be new research to show Congress, not something done by a previous generation. As computing power increases, there must be ever better models that will show policy makers in ever increasing detail what their options are.

  18. Hank Roberts:

    > Now that CMIP5 is gearing up for a similar exercise, it is worth
    > looking into what has changed – in terms of both the model
    > specifications, the requested simulations and the data serving
    > to the wider community. Many of these issues are being discussed in …
    > the current CLIVAR newsletter (Exchanges no. 56).
    > (The references below are all to articles in this pdf).

    > The CLIVAR newsletter has a number of other interesting articles,
    > on CFMIP (p20), the scenarios being used (RCPs) (p12), the ESG data
    > delivery system (p40), satellite comparisons (p46, and p47) and the
    > carbon-cycle simulations (p27).

    Ah, homework. Thank you.

  19. One Anonymous Bloke:

    I agree with Jim – there are much better (ie: more effective) places to discuss politics; an email to your local MP/representative, for example.

  20. Radge Havers:

    Climate science communication –> behavioral and social science implications –> Unforced Variations.

    No?

  21. Jim Eager:

    MS @15, although CO2 levels and eventually global mean temperature may be similar, one key difference between the Pliocene and the present is a completely different ocean current regime due to the closure of the Isthmus of Panama.

  22. John Mashey:

    This all sounds good, and sort of makes me wish I were still helping design and sell supercomputers to modelers … well, maybe again.

    But I remain interested in the unusual CO2 drop into 1600AD.
    http://i39.tinypic.com/if0m5g.jpg

    Is anyone specifically modeling that event (given the rarity of a CO2 drop like that)?
    One of Ruddiman’s hypotheses was that this was caused in part by massive post-Columbus die-offs in the Americas and resulting large reforestation. For the latest, see one of the interesting articles in the August issue of The Holocene.

    Without arguing whether or not that hypothesis is good, the question is:
    suppose that the reforestation sequestration of CO2 was within the uncertainties, what effects (especially regional) would climate models show for that period (say 1500-1700), given other known forcings? Would these effects be different if the CO2 drop were mostly caused by other reasons?

    [Response: Thanks for the reminder on this interesting topic, and the good question. John is referring to a topic closely related to those covered in this recent post (which contains a link to The Holocene issue mentioned). See also earlier RC posts here and here--Jim]

  23. CM:

    OK, stupid question: why’d they bump the numbering to 5?

  24. BillS:

    #7

    “You may have heard the French slogan l’art pour l’art – popularly Latinized as ars gratia artis in MGM’s roaring logo – and typically translated into English as art for art’s sake. However, the more common interpretation for the Latin ars is actually ‘science’. Starting from this basic point, here is an argument for scientific literacy… science for your sake… science for science’s sake.” By Chris Garrigues

    The remainder of this interesting essay can be found here:

    http://www.zcommunications.org/science-for-sciences-sake-by-chris-garrigues

    Some do science simply for science’s sake….

  25. Buck Smith:

    Is the source code for any of the models available for download?

  26. Pete H.:

    #17 What web site would you suggest for the politics of action against climate change?

    Richard Rood often discusses politics, policy, and planning.

    http://www.wunderground.com/blog/RickyRood/article.html

    There is also DeSmog Blog

    http://www.desmogblog.com/

  27. Andrew:

    Climate models are generally underestimating the decline of arctic sea ice.
    A recent study found that this could be explained if the decline were attributed half to “natural variability” and the rest to greenhouse gases:

    http://www.nsf.gov/news/news_summ.jsp?cntn_id=121359&WT.mc_id=USNSF_51&WT.mc_ev=click

    However, how can they be sure that the assumed albedo of snow does not need significant improvement?

    There is at least one paper with a better snow albedo model.
    However, it’s not clear if this could also explain the discrepancy between models and recent sea ice observations.

    http://www.agu.org/pubs/crossref/2011/2010JD015507.shtml

  28. One Anonymous Bloke:

    #25 Buck Smith, have you actually looked for any? I doubt it, since at the top of this very page there is a link: ‘Data Sources’.

  29. Aaron Lewis:

    My own observations are consistent with Rampal et al’s conclusions in Journal of Geophysical Research – Oceans. That is, the forecasts of climate models are significantly off: Arctic sea ice is thinning, on average, four times faster than the models say, and it’s drifting twice as quickly. In this case, the ice variability studies that use multiple runs of these models to suggest the possibility of ice recovery may simply reflect the current climate models’ inadequate analysis of heat transport into the Arctic. This suggests that without the near-term feedbacks from loss of Arctic sea ice, these models dramatically understate long-term effects and impacts of global warming.

    Past histories of large programming efforts (i.e., IBM OS/360) suggest that large programming efforts do not make large non-linear strides forward even with massive increases in effort. However, with feedbacks such as albedo effects, water vapor feedbacks, CH4 from melting permafrost, CH4 from clathrate decomposition, and the ice dynamics from large ice sheets, the effects and impacts of AGW are likely to make large non-linear strides forward. Thus, the difference between reality and the model forecasts is likely to increase.

    Moreover, as long as there are new models in the pipeline, risk managers and policy makers will consider the question of global warming open. They will hope that the next model will show that AGW is not as bad as we thought.

  30. Hank Roberts:

    > Buck Smith … source code
    Yes.
    http://www.google.com/search?q=cmip5+“source+code”
    http://scholar.google.com/scholar?q=cmip5+%22source+code%22

  31. Steve Bloom:

    Re 15/21: My understanding is that the Central American Seaway was closed or at least mostly so by 3.3 mya, but there are certainly other things (e.g. IIRC the height of the Rockies) that made the Pliocene warm period climate a little different in its details from what we would get with the same CO2 levels run to equilibrium.

    The trick is that while knowing the details of past higher-CO2 equilibrium states is useful, we won’t be seeing anything like them since what we actually have is an unnaturally rapid CO2 transient that is having/will likely have all sorts of feedback effects (e.g. sink saturation, ocean acidification, possible large methane pulse) that wouldn’t amount to much under natural conditions. As Jim Hansen has forcefully pointed out, even the PETM (the biggest transient nature has been able to manage in the Phanerozoic) is a poor analogy since it started out in ice-free conditions. As we can see from the record of recent deglaciations, ice sheets like to collapse quickly and have all sorts of effects that we would probably prefer to avoid, not that a too-rapid warming in ice-free conditions would be a walk in the park (consider e.g. the implications of a too-fast shift in fertile zones due to expansion of the tropics). It seems clear enough at this point that we won’t be avoiding a lot of them.

  32. Steve Bloom:

    Gavin, I’ve been following with great interest what seems to be a campaign by Knutti (with various co-authors) to in effect decertify some of the poorer-performing models or, I suppose, force improvements. What’s your view of that and how is it likely to be reflected in the AR5?

    [Response: "decertify" is not a useful word in this context. Rather the idea is that it should be possible to find metrics that would show that different models merit different levels of credibility in their projections. Any specific prediction for an outcome would use input from all relevant models but they would be weighted based on prior assessments of their credibility. This is much harder to do than it sounds for various reasons - not least of which is that you have to justify the weighting scheme, find skill metrics that actually correlate to projection outcomes etc. This was all quite well discussed in the IPCC experts meeting report. - gavin]
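    A toy version of the weighting idea described in the response might look like the sketch below. The skill scores and projections are invented for illustration; as the response notes, the hard part is finding metrics that actually correlate with projection skill and justifying the weighting scheme.

```python
import numpy as np

# Hypothetical 21st-century warming projections (deg C) from five models,
# and an (invented) skill score for each model from some evaluation metric.
projections = np.array([2.1, 3.4, 2.8, 4.0, 3.1])
skill       = np.array([0.9, 0.6, 0.8, 0.3, 0.7])

# Normalise the skill scores into weights; equal weights recover the plain mean.
weights = skill / skill.sum()

print(f"Equal-weight mean:   {projections.mean():.2f} C")
print(f"Skill-weighted mean: {np.sum(weights * projections):.2f} C")
```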

  33. prokaryotes:

    What is the climate sensitivity in CMIP5? I’m looking forward to reading some feedback about David Wasdell’s paper on climate sensitivity.

    A few bits…

    In addition to initiating a process of global warming, the anthropogenic disturbance has also triggered the action of a complex web of interconnected feedback mechanisms which amplify the effect of the original disturbance. The value of the amplification factor determines the required increase in average surface temperature if thermal equilibrium is eventually to be restored.

    [..]
    Taking account of the contribution of the CO2 with no feedback amplification we arrive at a correlate temperature 0.8 °C below the pre-industrial benchmark.

    Using the Charney amplification factor of 2.5 the figure decreases to -2.0 °C.
    Incorporating the carbon cycle feedbacks of the Hadley model yields -3.8 °C.
    Adding other slow feedbacks for the Hansen amplification factor yields -4 °C. However, the empirically derived value for the average surface temperature during the depth of the ice ages stands at 5.0 °C below the pre-industrial benchmark.
    This provides us with an anchor point of [180 ppm, -5.0 °C] through which the amplification line representing the sensitivity of the whole earth system must pass. The second point on the line is of course the pre-industrial benchmark of 280 ppm and 0.0 °C. Projecting that forward into the next doubling of CO2 concentration yields an AF of 6.5, a climate sensitivity of 7.8 °C, and a Temperature-Forcing ratio of 2 °C per W m-2. Those figures are just over 2½ times the values derived from the Charney Sensitivity.
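    For readers following the arithmetic in the quoted passage, the quoted figures follow from the two stated anchor points (this just reproduces the calculation as quoted, without endorsing it):

    $$\Delta T_{2\times}=\frac{5.0\ {}^{\circ}\mathrm{C}}{\log_2(280/180)}\approx\frac{5.0}{0.64}\approx 7.8\ {}^{\circ}\mathrm{C},\qquad \mathrm{AF}=\frac{7.8}{1.2}\approx 6.5,\qquad \frac{7.8\ {}^{\circ}\mathrm{C}}{4\ \mathrm{W\,m^{-2}}}\approx 2\ {}^{\circ}\mathrm{C}\ \mathrm{per}\ \mathrm{W\,m^{-2}}.$$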

    [..]

    In 2005, Ferdinand Engelbeen, a Belgian scientist, conducted a regression analysis of the correlated values of temperature and CO2 concentration based on the Vostok records. It was posted on the Real-Climate web site and little further attention was paid to it.

    [..]

    The high level of certainty associated with the Earth System Sensitivity of at least 7.8 °C for a doubling of CO2 requires that the Charney Sensitivity (of 3 °C) should now be abandoned and replaced by that figure for all future strategic negotiations.

    [..]
    The outcome of the COP 15 deliberations affirmed the need to limit temperature rise to the 2 °C ceiling, and this element of the Copenhagen Accord [16] was subsequently embedded in the Cancun Agreement of COP16. These positions are totally dependent on the Charney Sensitivity. Leaving aside for the moment the challenge that even a 2 °C rise in temperature would take us well beyond dangerous climate change and into the domain of “extremely dangerous climate change”, we note that the Earth System Sensitivity indicates that a sustained CO2 concentration of 440 ppm would result in an equilibrium increase of 5 °C above the pre-industrial benchmark. In this case the 2 °C guardrail has already been overwhelmed by some 60 ppm. The 2 °C threshold was passed when the concentration reached 330 ppm.
    [..]
    Utilising the ensemble of climate models that underpin the IPCC Fourth Assessment Report, Malte Meinshausen produced a probability density function (PDF) showing the clustering of the model outputs around a climate sensitivity of 3 °C [13] (co-incident with the Charney Sensitivity). With very few exceptions, that ensemble limits its treatment of feedback mechanisms to the same set of fast feedbacks utilised in the original Charney analysis.
    [..]
    If we explore the Hansen amplification factor of 5.0 (a sensitivity of 6 °C for a doubling of atmospheric CO2) then we see he predicts an equilibrium rise of 4 °C with a stabilised CO2 concentration of 440 ppm. The “safe” ceiling of 2 °C is reached with a concentration of 350 ppm. That is why he consistently asserts that we need to reduce CO2 concentration to below 350 ppm while warning that even then, the temperature rise would expose the system to further amplification from slow feedbacks as well as initiate a dangerous increase in sea level.

    [..]
    Exploring the C-ROADS Simulator
    Using the best available expertise in system dynamics simulation, field leaders from the MIT Sloan School of Management and Ventana Systems created the C-ROADS simulator in preparation for the COP 15 gathering in Copenhagen [21]. The acronym stands for “Climate Rapid Overview and Decision Support”. The simulator provides a visual interface that responds in real time to inputs of proposed reductions in CO2 emissions, relating outcomes to atmospheric concentration trajectories and implications for increase in global temperature. The simulator is now hosted independently at http://ClimateInteractive.org

    Its underlying model architecture has been stringently reviewed by an august panel of leading scientists. They validated its accuracy in representing the “state-of-the-art” climate models used in the preparation of the IPCC Fourth Assessment Report, (the same set utilised by Meinshausen to derive his PDF of Climate Sensitivity). They went on to recommend it for use as the official simulator for the UNFCCC negotiations. ClimateInteractive staff and simulation platform were extensively involved in the preparation of “The Emissions Gap Report” of the UNEP [22], released in Nov. 2010 prior to the COP 16 gathering in Cancun. Following the promulgation of the Copenhagen Accord, nearly 140 countries associated themselves with the document and over 80 countries, representing about 80 per cent of global emissions, have appended targets and/or mitigation actions. (UNEP p.38) These promises and commitments were entered into the C-ROADS platform and the resulting increase in average global temperature by the year 2100 was presented in the form of a Climate Scoreboard thermometer.

    The C-ROADS simulator uses the Charney Sensitivity and limits the effects of positive feedback to the fast mechanisms associated with that value.

    [..]
    All the work on climate sensitivity is based on paleo records of slow, close to equilibrium behaviour at a global level. Those conditions no longer apply.

    [..] the high level of climate sensitivity, combined with rapid change and far-from-equilibrium dynamics, exposes us to a severe risk of triggering an episode of runaway climate change.

    Link to full paper and blog post..
    http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/

  34. JimCA:

    Can anyone shed light for us laymen about the recent NCAR prediction that arctic ice loss might temporarily abate during the next decade?

    In a nutshell: Is this a credible prediction, and if so, is there a simple explanation?

    When I look at the trends over the past 30 years, especially ice volume, it’s hard to believe those losses would suddenly stop for a while.

  35. Didactylos:

    Buck Smith: Yes!

    Click on “data sources” at the top of the page for 7 major climate models, and various others.

  36. Hank Roberts:

    > JimCA
    Asked and answered:
    http://www.realclimate.org/index.php/archives/2011/08/unforced-variations-aug-2011/comment-page-5/#comment-213032

  37. prokaryotes:

    David Wasdell talks about his climate sensitivity assessment and computer models in a recent interview (July 2011)
    http://www.youtube.com/watch?NR=1&v=tDAQ9lXzc1c

  38. prokaryotes:

    Then, one day, to your horror, you open a journal of Uighur studies and find a lead article proving that everybody has been interpreting Uighur wheat production records wrong, and that all previous estimates of what the Uighur numbers mean were off by a factor of two. http://www.realclimate.org/index.php/archives/2005/12/natural-variability-and-climate-sensitivity/

    I got a response from Ken Caldeira and added more content to this blog post … http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/

    When reading about models and simulations I find it would make sense to list which of the climate forcings (or, at thresholds, sub-feedbacks) and what climate sensitivity is considered. The logarithmic approach of Wasdell I think is very reasonable. The question is about thresholds, and there are so many feedbacks which are still not used, and I find it is important to make it clearer that we use really conservative estimates – which do not consider all variables. So I’m still reading into it and looking for more feedback…

  39. Pete Dunkelberg:

    Crowd sourcing science:

    He attributes this chasm to the fact that humans are much better than machines at thinking through the scientific process and making intelligent decisions based on past results. “A computer is completely flummoxed by not knowing the rules, but players are unfazed: they look at the data, they make their designs, and they do phenomenally.”

  40. Pete Dunkelberg:

    On the other hand, Prokaryotes @ 37 “So im still reading into it….”

    Yes, we must all be wary of confirmation bias, not to mention overestimating the size of every imaginable feedback.

  41. Hank Roberts:

    > Ken Caldeira about the Wasdell paper ….
    > http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/

    Caldeira’s response is well worth reading in full.
    Nomination for a FAQ item.

    [Response: Caldeira refers to the 'Eocene methane spikes' as evidence for high sensitivity. The problem with this is that both the temperature change of the Eocene and the greenhouse gas levels are highly uncertain. Contrary to popular view, the Eocene data are simply not good enough to 'challenge' climate models with midrange sensitivity. See e.g. the open access paper The early Eocene equable climate problem revisited by Huber and Caballero (2011) in Climate of the Past. I think it highly unlikely anyone is going to be able to demonstrate high sensitivity. It is even more unlikely we can rule it out. Except by waiting 50 years or so and seeing what happens of course. --eric]

  42. wili:

    I am glad to see some discussion of climate sensitivity.

    It seems to me that historical comparisons have the disadvantage that we are adding GHGs at a much faster rate than at any other point in the history of the earth, if I understand correctly.

    It seems to me that this very fast rate of increase could change the nature of certain feedbacks. For seabed methane in fairly shallow waters, for example, a relatively slow increase in GHG and global temperatures may melt glaciers and polar ice enough to increase sea levels, thus increasing the pressure that helps keep clathrates in place, before the ocean warms up enough to melt the clathrates.

    As it is, we are warming the oceans, particularly the Arctic, very quickly, while sea level rise is still measured in centimeters at most.

    IIRC, Wasdell has been criticized by Archer in the past. Does this recent work seem more solid in Archer’s view?

    reCaptcha: Bremen ogionti

  43. Aaron Lewis:

    Eric inline comment on Hanks comment 41:

    It does not matter what the Eocene conditions were at the start of its CH4 releases, because we are there (or very close). The observed methane plumes from sea beds in the Arctic, observed melt of permafrost, observed release of methane from Arctic wetlands, and large number of high “outlier” methane concentrations in global carbon cycle monitoring programs all suggest that we are seeing the start of carbon-cycle methane releases. AGW has put a lot of heat in our oceans, and there is no doubt that more of that heat will be transferred to permafrost and clathrates resulting in additional melt/decomposition. We are at the start of carbon cycle feedback.

    CH4 concentration in the northern atmosphere (including high outliers) is now close to 2 ppm based on local, current emissions. That is almost like having another 100 or 150 ppm of CO2 in the air (effect calculated on an annual basis), so we need to think about our current GHG concentration as 500 CO2-eq in 2011. I know that means assuming CH4 is 70 times more effective as a greenhouse gas than CO2, but 70 is a good number for one year, and what we are worried about in this case is the local warming that will drive next year’s permafrost melt. We do not need a 20-year greenhouse equivalent number, because in 20 years the permafrost will be mostly seasonal wetlands, all producing CH4. For example, CH4 trapped under ice over the winter and released as the ice melts, just in time to jump-start local warming of the wetlands the next spring.

    We do not need to wait 50 years. Go count those wetland ponds. You can do it from your desk with satellite photos. Now, there are a lot of them, and every year there are more ponds. Every year many of them get a little bigger. Every year, most of them have a little longer melt season. Every year they get a little warmer, so that they produce a little more CH4 per acre. It is a non-linear feedback cycle resulting in higher sensitivity.

    Now is the time to plan for it.

  44. prokaryotes:

    From the newsletter:

    Simplified, fast models (so called intermediate complexity models or EMICs) were also used for these time slice simulations in order to provide a reference for their use for long-term transient experiments and in palaeoclimate studies.
    The main focus for time slice experiments were the Last Glacial Maximum (LGM), 21 000 years before present (BP), and the mid-Holocene (MH), 6 000 years BP: intervals which correspond to times when differences in boundary conditions led to extreme climates that are relatively well documented by palaeoenvironmental data.

    The third phase of PMIP and its role in CMIP5 An important aspect of PMIP3 is the inclusion of three key time periods as part of the Phase 5 Coupled Modelling Intercomparison Project (CMIP5, Taylor et al., 2009; 2011) within Tier 1 (the Last Glacial Maximum 21,000 years ago, and mid-Holocene 6,000 years ago) and Tier 2 (Last Millennium).
    __
    So Wasdell points out that these past climate changes cannot serve as a basis for an accelerated climate change that is outside the equilibrium range and a few orders of magnitude larger.

    Wasdell’s critique on modeling and more bits from his draft:

    Additional change in greenhouse gas concentrations is caused by non-anthropogenic feedbacks which are sensitive to climate change. Additional carbon dioxide, water-vapour and eventually methane, combined with the temperature-driven change in ice and snow albedo, together with complex oceanic, vegetative and cloud-system feedbacks, all contribute to amplify the original disturbance. The value of the eventual equilibrium rise in average surface temperature depends on the amplification factor applied to the original anthropogenic disturbance by the feedback system.

    About Charney Sensitivity
    The report explicitly excludes the role of the biosphere in the carbon cycle (and so takes no note of the carbon-cycle and vegetation feedbacks). It also ignores the transfer of heat to the deep oceans, a position that leads to a fast approach to dynamic thermal equilibrium. Our current observation and understanding of this factor leads to slower predictions of the rate of temperature rise.

    “If the Charney sensitivity, supported by our modern computer models, projects that a doubling of the concentration of atmospheric carbon-dioxide leads to a temperature rise of 3 °C at equilibrium, then why, in the empirically measured behaviour of the planetary system, does an increase of only 56% in CO2 concentration (from 180 ppm to 280 ppm) lead to a 5 °C change in temperature?”

    If we take into account the non-linear relationship between the variables as expressed in the semi-log (base 2) presentation, the 56% increase in absolute value of the CO2 concentration across this particular range equates to only 63.4% of the effect of doubling, i.e. a forcing of 2.54 W m-2. Applying the Charney Sensitivity to this proportion yields an increase of only 2 °C for the change from the coldest point of the ice age to the pre-industrial benchmark. Historical data tells us that the shift should be 5 °C. It just does not compute. The inescapable conclusion is that the computer modelling ensemble, together with the Charney sensitivity and supported by Hansen’s “empirical sensitivity” are all omitting something fundamental. They are grossly under-representing the contribution of the complex feedback system to the amplification of the effects of change in CO2 concentration.
    The implications are profound. The whole international strategic response to climate change is based on the output of the computer ensemble. The basis is recognised to be “conservative”, but to be under-representing the threat by a factor of two-and-a-half is a culpable collusion with a process of collective denial.
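    The forcing figure in the quoted passage follows directly from the logarithmic relation stated elsewhere in the paper (spelled out here for readers checking the arithmetic):

    $$\Delta F = 4\ \mathrm{W\,m^{-2}}\times\log_2\!\frac{280}{180}\approx 2.5\ \mathrm{W\,m^{-2}},\qquad \Delta T_{\mathrm{Charney}}\approx\frac{3\ {}^{\circ}\mathrm{C}}{4\ \mathrm{W\,m^{-2}}}\times 2.5\ \mathrm{W\,m^{-2}}\approx 1.9\ {}^{\circ}\mathrm{C},$$

    i.e. roughly the 2 °C figure quoted, against the ~5 °C glacial-interglacial change.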

    In an attempt to close the gap between computer modelling and empirical measurement, Hansen et al offered a hybrid solution in their earlier paper “Target Atmospheric CO2: Where Should Humanity Aim?”, published March 2008 [7]. They started with the assertion: “Paleoclimate data show that climate sensitivity is ~3 °C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6 °C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica.” [op. cit. p1]

    The methodological approach is summarised in the paragraph:
    “Climate models alone are unable to define climate sensitivity more precisely, because it is difficult to prove that models realistically incorporate all feedback processes. The Earth’s history, however, allows empirical inference of both fast feedback climate sensitivity and long-term sensitivity to specified GHG change including the slow ice sheet feedback.”

    After careful and technical evaluation of the long-term slow feedback mechanisms, they conclude that:
    “Global climate sensitivity including the slow surface albedo feedback is 1.5 °C per W m-2 or 6 °C for doubled CO2, twice as large as the Charney fast-feedback sensitivity.”

    This “Hansen Upgrade” is represented by the green line on the semi-log (base 2) scale. The sensitivity of 6 °C for a doubling of CO2 yields an Amplification Factor of 5.0. However, it still falls short (by some 4 W m-2) of the forcing required to balance a 5 °C rise in temperature.

    Towards the end of 2009, Mark Pagani et al published a paper on “High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations” [Nature Geoscience Letters 20 December 2009] [10]. They concluded that “the Earth-system climate sensitivity has been significantly higher over the past five million years than estimated from fast feedbacks alone”.

    Writing in the “Perspectives” section of the Jan. 2011 edition of Science, Jeffrey Kiehl reviewed current peer-review academic papers reporting on the reconstruction of values of atmospheric CO2 concentration reaching back through ~100 million years. The authors also derived values for earth system climate sensitivity across this period. Kiehl’s summary conclusion was that the data for 30 to 40 million years before the pre-industrial benchmark indicate that Earth’s climate feedback factor is ~2 °C per W m-2. That is equivalent to a climate sensitivity of 8 °C for a doubling of atmospheric concentration of CO2, with an amplification factor of 6.7.

    Re-working Kiehl’s figures using the graphic simulator leads to a marginally higher outcome.
    Earth surface temperature decreased by 16 °C during the period, requiring a shift of 52.8 W m-2 of forcing to balance the dynamic thermal equilibrium. During the same period, CO2 concentration declined from 1000 ppm to 280 ppm, equivalent to 1.87 of the doubling/halving forcing from CO2 alone. CO2 change therefore contributed some 7.48 W m-2 towards the overall forcing, leaving a balance of 45.32 W m-2 as the contribution from the dynamic feedback system. That yields an amplification factor of 7.0, a sensitivity value of 8.47 °C for a doubling of CO2, and a climate feedback factor of 2.1 °C per W m-2.
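    Taking the quoted figures at face value (the 16 °C and 52.8 W m-2 inputs are Wasdell’s own, not checked independently here), the chain of arithmetic is:

    $$\mathrm{AF}=\frac{52.8}{7.48}\approx 7.1,\qquad \Delta T_{2\times}\approx 7.1\times 1.2\ {}^{\circ}\mathrm{C}\approx 8.5\ {}^{\circ}\mathrm{C},\qquad \frac{8.5}{4}\approx 2.1\ {}^{\circ}\mathrm{C}\ \mathrm{per}\ \mathrm{W\,m^{-2}},$$

    which matches the quoted 7.0 / 8.47 °C / 2.1 values to rounding.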

    Rapid Climate Change in Far-from-Equilibrium Conditions
    Historically climate change at a global level has been slow and in conditions of dynamic thermal equilibrium. Net radiative imbalance has remained close to zero and the earth system has responded to change at a pace that allowed continuous adaptation of the bio-geo-chemical systems. Within those conditions there have been examples of comparatively rapid change in limited sub-system behaviour where specific tipping points have been activated by the slow global change, and the sub-system has moved from one stable state to another. All the work on climate sensitivity is based on paleo records of slow, close to equilibrium behaviour at a global level. Those conditions no longer apply.

    The Earth System Sensitivity would indicate an expected rise in temperature at equilibrium of 3.8 °C. There is therefore an expected rise of 3.0 °C still in the pipeline, to which we are already committed. If non-CO2 GHGs are included then the expected increase would rise to 5.4 °C (4.6 °C still in the pipeline).

    Historically a change in CO2 concentration of 100 ppm has taken place over a period of some 10,000 years. Humanity has now generated the same change in the space of a single century, one hundred times faster than at any point in the historical record (apart perhaps from the effects of the impact of a massive asteroid).

    Net Radiative Imbalance during the past has not exceeded 0.01 W m-2. Anthropogenic forcing over the last century has generated a net radiative imbalance of between 1.0 and 3.0 W m-2. This rate of global heating is of the order of 300 times the historical maximum. It has pushed the earth system significantly away from equilibrium and activated increasing time-delay between forcing and the eventual achievement of a new state of dynamic thermal equilibrium.

    Under these conditions a range of feedback processes are brought into operation that can be considered negligible when the system is very close to equilibrium:
    Time delay in mixing of ocean layers (thermal inertia of the deep ocean heating) leads to relative heating of the surface, increased stratification, less up-welling of cold nutrient-rich water, decay in plankton take-up of CO2.
    Increased acidification of the surface layer leads to lowered efficiency of the ocean sink of atmospheric CO2.
    Hotter ocean surface combines with hotter atmosphere to increase the water-vapour feedback and so enhance the endothermic phase-change feedback that increases forcing while bypassing the temperature-sensitive radiative damping negative feedback.
    Heat transfer from equatorial to high latitude polar regions is partially taken up in the endothermic phase-change of net ice melt. The resultant decrease in albedo constitutes a positive feedback which is also partially independent of the temperature-sensitive radiative damping negative feedback.
    The greater the net radiative imbalance the longer the time-lag to establish new dynamic thermal equilibrium. Non-temperature-sensitive feedbacks, driven by increased CO2 concentration or by energy-flux in distinction from increased temperature, all contribute to amplification of the forcing, so increasing the time-lag and setting up second-order feedback reinforcement.
    These non-temperature-sensitive feedbacks continue to accelerate global heating even during periods of increased heat-transfer to deep ocean with consequent slowing of the rate of change of average global surface temperature.
    The pace of change overwhelms the capacity for smooth adaptation, evolution and mobility of the biological systems leading to patterns of die-back and burn that transfer carbon from biomass to atmosphere. That increases the carbon-cycle feedback dynamics.

    Taken all together these phenomena enhance the system sensitivity and increase the amplification factor beyond the value of the Earth System Sensitivity previously developed from slow and close-to-equilibrium patterns of change. The value of the amplification factor of 6.5, representing a Sensitivity value of 7.8 °C for a doubling of the concentration of atmospheric CO2, should therefore be taken as a conservative minimum figure in our current situation.
    Rapid climate change, in conditions of dis-equilibrium, precipitates the activation of an interconnected series of sub-system tipping-elements [24]. That in turn drives turbulence and inherent unpredictability in the global climate system. There is also an increasing frequency of extreme events in local weather conditions.
    At the overall global system level, the increasing power of amplifying feedback dynamics could push the system beyond the critical threshold which signals the onset of a period of self-amplifying or “runaway” climate change for which there is currently no modelling capacity. The subject of the boundary conditions of runaway behaviour in the earth climate system as a whole is addressed in the second section of this paper under the title of “Beyond the Stable State”.
    ___

    Notice that Ken Caldeira’s response is based on a paper with Pagani from 2006 and Wasdell cites a paper from him, dated 2009.
    Notice that I could not find the Engelbeen post here at RC, though I did not check all his comments, but he posted it at WUWT.

  45. wili:

    @ ALewis “70 times more effective as a green house gas than CO2, but 70 is a good number for one year”

    I’m not sure I follow your wording here, but you should be aware that Shindell et alia in “Improved Attribution of Climate Forcing to Emissions“ found that the actual short-term global warming potential of methane was 105 times that of CO2.

    http://www.sciencemag.org/cgi/content/abstract/326/5953/716

    reCaptcha: mickshin rejoiced

  46. prokaryotes:

    The Pagani paper

    High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations

    Climate sensitivity—the mean global temperature response to a doubling of atmospheric CO2 concentrations through radiative forcing and associated feedbacks—is estimated at 1.5–4.5 °C (ref. 1). However, this value incorporates only relatively rapid feedbacks such as changes in atmospheric water vapour concentrations, and the distributions of sea ice, clouds and aerosols [2]. Earth-system climate sensitivity, by contrast, additionally includes the effects of long-term feedbacks such as changes in continental ice-sheet extent, terrestrial ecosystems and the production of greenhouse gases other than CO2. Here we reconstruct atmospheric carbon dioxide concentrations for the early and middle Pliocene, when temperatures were about 3–4 °C warmer than preindustrial values [3–5], to estimate Earth-system climate sensitivity from a fully equilibrated state of the planet. We demonstrate that only a relatively small rise in atmospheric CO2 levels was associated with substantial global warming about 4.5 million years ago, and that CO2 levels at peak temperatures were between about 365 and 415 ppm. We conclude that the Earth-system climate sensitivity has been significantly higher over the past five million years than estimated from fast feedbacks alone.

    The magnitude of Earth-system climate sensitivity can be assessed by evaluating warm time intervals in Earth history, such as the peak warming of the early Pliocene ∼4–5 million years ago (Myr). Mean annual temperatures during the middle Pliocene (∼3.0–3.3 Myr) and early Pliocene (4.0–4.2 Myr) were ∼2.5 °C (refs 3, 4) and 4 °C (ref. 5) warmer than preindustrial conditions, respectively. During the early Pliocene, the equatorial Pacific Ocean maintained an east–west sea surface temperature (SST) gradient of only ∼1.5 °C, which arguably resembles permanent El Niño-like conditions [6]. Meridional [5,7] and vertical ocean temperature gradients [8] were reduced, and deep-ocean ventilation enhanced, relative to today [9,10]. Deterioration in Earth’s climate state from 3.5 to 2.5 Myr led to an increase in Northern Hemisphere glaciation [11]. By ∼2 Myr, subtropical Pacific meridional SST gradients resembled modern conditions [5], and the Pacific zonal SST gradient (∼5 °C) was similar to the gradient observed today, with a strong Walker circulation [6].
    Tectonics and changes in ocean [12–14] and atmospheric circulation [15,16] were potentially important factors in climate evolution during this time. http://www.geo.umass.edu/courses/geo763/Pagani.pdf

    Re eric’s comment: “Contrary to popular view, the Eocene data are simply not good enough to ‘challenge’ climate models with midrange sensitivity. See e.g. the open access paper The early Eocene equable climate problem revisited by Huber and Caballero (2011) in Climate of the Past.”

    From the abstract you link, it states “We find that, with suitably large radiative forcing, the model and data are in general agreement for annual mean and cold month mean temperatures, and that the pattern of high latitude amplification recorded by proxies can be largely, but not perfectly, reproduced.”

    The models are not perfect either so I don’t see the problem here with the “popular view”, especially when assessing a wider range of studies.

  47. Hank Roberts:

    > Engelbeen
    Dunno; one of these eight or so hits might lead you to it:

    http://www.google.com/search?q=site%3Arealclimate.org+%2BEngelbeen+%2B2005+%2Bregression+%2Bcorrelated+%2BVostok

    > biological systems

    I do wonder whether there are big surprises waiting for the physicists once biological systems are better modeled. I’d suspect there are cross-discipline areas that simply can’t be modeled yet that while they may produce only slight wiggles in the models, would produce tipping points/crashes/excursions in reality.

    Since evolution has changed the biology of the planet so dramatically between each of the major extinctions, the past isn’t much of a guide to _what_ could be modeled.

    But is there any way to get at this better by looking at the paleo work?

    For example this:

    http://www.clim-past.net/5/297/2009/cp-5-297-2009.html
    Clim. Past, 5, 297-307, 2009
    http://www.clim-past.net/5/297/2009/
    doi:10.5194/cp-5-297-2009
    Ecosystem effects of CO2 concentration: evidence from past climates

    Or looking at older work, this one (it’s a Letter):
    http://www.nature.com/nature/journal/v429/n6991/abs/nature02566.html
    suggested that a new core drilling investigation around a group of hydrothermal vent complexes ought to be able to resolve the question about timing a methane excursion. I haven’t found out whether the drilling was done. There’s a copy
    http://folk.uio.no/hensven/nature/svensen_etal_nature04.pdf

    This is stuff the modelers presumably know far more about than blog readers — I’d love to hear more from the people who are doing the modeling, but definitely don’t want dumb questions to slow down the actual work they’re doing. Just very curious.

  48. Pete Dunkelberg:

    @ 44: “The whole international strategic response to climate change is based on the output of the computer ensemble.”
    I don’t think so, one reason being that humanity’s response is based on denying this, as you have noticed elsewhere.

    @ 46: Re Eric plus Huber_and_Caballero_2011_The_early_Eocene_equable_climate_problem_revisited.pdf.
    If you look into that paper, they find that the Eocene tropics were warmer than was thought 10 to 15 years ago. This facilitates solving the problem via higher pCO2. Lots more. Applying a sensitivity of 7 to that pCO2 would really get earth cooking.

  49. prokaryotes:

    Re Pete Dunkelberg, again i quote from the Wasdell paper:

    The higher the concentration of any particular greenhouse gas, the less efficient it becomes at inhibiting infra-red radiation in the particular wavelength zone associated with its specific molecular structure. A long history of experimental verification has shown the relationship between concentration and absorption efficiency to be logarithmic. In particular, the change associated with a doubling of the concentration of carbon-dioxide is known to reduce its efficiency as a greenhouse gas. The forcing associated with each doubling is a constant 4 watts per square metre (W m-2) at the earth surface. That requires a change of 1.2 °C in surface temperature to re-balance the energy budget. Logarithmic functions of this kind produce a constant output for any halving or doubling of the parameter across a given range.
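    A minimal numeric sketch of the logarithmic relationship quoted above, using the stated 4 W m-2 and 1.2 °C per doubling with no feedbacks; the concentrations below are just example values:

```python
import numpy as np

def co2_forcing(c_ppm, c0_ppm=280.0, f_2x=4.0):
    """Forcing (W/m^2) relative to c0_ppm, under the quoted rule that each
    doubling of CO2 adds a constant f_2x W/m^2."""
    return f_2x * np.log2(c_ppm / c0_ppm)

def no_feedback_warming(c_ppm, c0_ppm=280.0, dt_2x=1.2):
    """No-feedback temperature response (deg C) under the same logarithmic rule."""
    return dt_2x * np.log2(c_ppm / c0_ppm)

for c in (180, 280, 440, 560, 1120):
    print(c, round(co2_forcing(c), 2), round(no_feedback_warming(c), 2))
```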

  50. Hank Roberts:

    Given any one model — is the climate sensitivity something that emerges from a variety of assumptions chosen for that particular model, based on how it comes out? Or from each run of that model?
    (if you had computer and time enough to let each scenario run go a few thousand years of scenario time, til the temperature leveled off and started to decline — which I think isn’t practical)

    Or does the model run a collection of scenarios, for a few hundred years each, through the first fast feedbacks, based on a single sensitivity number, to see what the other factors produce?

    Or — I’m sure I don’t have this clear enough, I’m aiming for the high school level or simpler words here — can one of the modelers give an idea of what the various CMIP groups are doing with their models?

  51. Edward Greisch:

    44 prokaryotes: “a climate sensitivity of 8 °C” OH S***!

    I hope the CMIP5 simulations can answer questions like whether an 8 degree C climate sensitivity is real. If it is correct, we are in a whole lot more trouble than I thought, which was a lot.

    Same for 43 Aaron Lewis: “we need to think about our current GHG concentration as 500 CO2-eq in 2011.” Because a doubling of equivalent CO2 would be 560 equivalent ppm, which is too soon.

    Another question for the models: Has the methane from Arctic wetlands taken it out of our hands yet? Is there still any action strong enough to turn the situation around?

    Dream list for the model: Let it be an engine for a game that gamer type regular people can play in which they can act out the 2050s. Design a popular Sim-Earth game around CMIP5 and sell a lot of games at low prices. Make it run on many platforms and the web. But make it all-too-real. The gamer dies of starvation. Maybe this is how we communicate it to the world.

  52. prokaryotes:

    Edward Greisch:”Let it be an engine for a game that gamer type regular people can play in which they can act out the 2050s. Design a popular Sim-Earth game around CMIP5 and sell a lot of games at low prices. Make it run on many platforms and the web. But make it all-too-real. The gamer dies of starvation. Maybe this is how we communicate it to the world.”

    NASA Launches New MMORPG Moonbase Alpha
    http://www.g4tv.com/thefeed/blog/post/705908/nasa-launches-new-mmorpg-moonbase-alpha/

    But since it has become very quiet about this project… sad.

  53. wili:

    A humble request:

    Can we have a main post examining the validity or lack thereof of the Wasdell paper that prokaryotes has brought to our attention?

    If it is garbage, it should be pretty easy for the experts here to shoot down and point out where the holes are. If it is spot on, we need to rethink a number of assumptions, at least.

    [Response: I gave it a quick skim. He is very confused and adds in and removes feedbacks in a rather willy-nilly manner. His CS value is based solely on the fact that the last ice age was ~5 deg C colder and had 180 ppmv CO2. This is no good for calculating the Charney Sensitivity (because you are neglecting all other forcing terms), and it is a poor estimate of the Earth System Sensitivity because of the difficult-to-deal-with seasonality of the orbital forcing for creating the ice sheets in the first place. I was trying to remember where the name Wasdell came up before, and it was with a terrible piece in New Scientist accusing IPCC of neglecting water vapour feedback etc. He was wrong then, and he is wrong now. - gavin]

  54. prokaryotes:

    Climate Sensitivity with David Wasdell

    Climate Sensitivity is made up of two fundamental parts. The first is the effect of doubling the concentration of atmospheric carbon dioxide on its own, holding all other system parameters constant. The second is the amplification of the primary change by a range of other system variables, namely the dynamic feedback system.
    [..]
    The effect of doubling CO2 concentration on its own is extremely accurately known from observation, theoretical calculation and laboratory testing.
    It stands at 1.2 °C and is represented by the black line on the semi-log scale below. The forcing generated by such an intervention is also accurately known to be 4.0 W m-2.
    The relationship between the two figures is governed by the Stefan-Boltzmann law concerning the energy radiated to the cold spatial sink by a “black body” at a given temperature. The ‘black body’ value is adjusted to take account of the emissivity of the planet. The change in radiation from the earth generated by a change of 1 °C in surface temperature is 3.3 W m-2.

    In climate models, the value of the amplification factor depends on which feedback mechanisms are taken into account, and on the competence of the modelling of the various feedback mechanisms and their complex interactions.

    Temperature-Forcing ratio. This definition answers the question about the equilibrium temperature change required to balance the effect of any given CO2 forcing. It is presented in degrees per watt per square metre, °C/(W/m2). Since a doubling of atmospheric concentration of carbon dioxide delivers a forcing of 4 W/m2, the temperature-forcing ratio is one quarter of the climate sensitivity.

    Concentration-temperature ratio. This final definition relates the number of parts per million (by volume) of the atmospheric concentration of CO2 required to generate a shift of one degree in equilibrium temperature. Measured in ppm/°C, it is specific to a given level of concentration and changes in step with the logarithmic decay in efficiency of CO2 to act as a greenhouse gas as its concentration increases. For instance, if the concentration-temperature ratio is 20 ppm/°C when the concentration is 280 ppm, then it will increase to 40 ppm/°C when the concentration is 560 ppm, and 80 ppm/°C for a concentration of 1120 ppm.
    With this set of definitions in mind we can now proceed to explore the four main approaches to determining the increase in equilibrium temperature consequent upon any given increase in the atmospheric concentration of carbon dioxide.
    [..]
    This paper has therefore adopted a graphical presentation using a semi-logarithmic scale in which the curves of normal log functions display as straight lines. The device enables clarity of comparison between a variety of amplification factors applied to the logarithmic relationship between carbon-dioxide concentration and the compensatory change in equilibrium temperature as enhanced by correlative feedback dynamics.
    [..]
    One completely unanticipated outcome of using the semi-log display is the almost perfect symmetry between the 180 ppm and the 440 ppm values with respect to the pre-industrial benchmark. Implications of this symmetry are drawn out later in the paper, for now we simply note that the change in CO2 concentration from 180 ppm to 280 ppm may be expected to have the same effect as the increase in CO2 concentration from 280 ppm to 440 ppm, namely a shift of 5°C in the average surface temperature of the planet rather than the 2°C currently predicted as the equilibrium response to a concentration of 440 ppm.
    [..]
    The amplification factor (AF) differentiates between the role of the change in concentration of atmospheric CO2, and that of the feedback system. It is defined as the ratio by which the feedback system multiplies the contribution of the forcing from any given change in atmospheric concentration of CO2. Like climate sensitivity, its value is also constrained by the condition of dynamic thermal equilibrium. The value of climate sensitivity is obtained by multiplying the effect of doubling CO2 concentration (1.2°C) by the amplification factor representing the contribution of the interdependent set of feedbacks in the earth system dynamic. The relationship is represented by the equation:
    S = 1.2 × AF °C
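
    For anyone who wants to check the arithmetic in the excerpt, a minimal Python sketch is below. It simply reproduces the quoted relations, taking the excerpt’s own figures (4.0 W/m2 per CO2 doubling, 3.3 W/m2 per °C of surface warming) at face value; the amplification factors are picked for illustration only, and none of this is an endorsement of the argument.

        # No-feedback response and the excerpt's S = 1.2 x AF relation, using only
        # the figures quoted above (4.0 W/m2 per doubling, 3.3 W/m2 per degC).
        no_feedback = 4.0 / 3.3                # ~1.2 degC for doubled CO2, feedbacks held fixed
        for AF in (1.0, 2.5, 4.0):             # illustrative amplification factors
            S = no_feedback * AF               # climate sensitivity, degC per doubling
            tf_ratio = S / 4.0                 # temperature-forcing ratio, degC per (W/m2)
            print(AF, round(S, 2), round(tf_ratio, 2))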

    ___
    I suggest RC puts up a blog post, dedicated to the Wasdell paper…

  55. CM:

    > I suggest RC puts up a blog post, dedicated to the Wasdell paper…

    I suggest not.
    But this existing post might help clarify a few things:
    http://www.realclimate.org/index.php/archives/2008/04/target-co2/

  56. Hank Roberts:

    Gavin, thanks for the inline reply above.

    Proc, belatedly looking the guy up
    http://scholar.google.com/scholar?q="David+Wasdell"+climate
    Wasdell is a religious/psychology writer, blog-flogging a new book. I think you’ve been following a red herring on this one. If you must continue, would you persist at your own blog? Way off topic for here, now that Gavin’s pointed out the guy’s previous mistakes.

  57. prokaryotes:

    CM “I suggest not.”

    From the 2008 article: “…the conclusion that the Earth System sensitivity is greater than the Charney sensitivity is probably robust. And that is a concern for any policy based on a stabilization scenario significantly above where we are now.”

    The paper from Wasdell touches on a lot of new ground and basically brings up an entirely new approach to quantifying climate change, Anthropocene climate change. The article from 2008 also leaves room for a lot of discussion; for example, it does not even mention equilibrium. To reject a broader discussion of the Wasdell paper is a bit odd, I think, unless you present a precise argument why it would not be useful, especially in light of the policy implications.

    [Response: A good discussion of Earth System Sensitivity is found in Lunt et al (2010) (I'm a co-author) and I don't think people disagree that the ESS is likely to be larger than the CS. That is not the issue. Nor is whether this is interesting or useful to calculate. My disagreement with what Wasdell has argued is all in how he has constructed his argument, not whether it is worth talking about. - gavin]

  58. prokaryotes:

    Gavins response to wili: “in New Scientist accusing IPCC of neglecting water vapour feedback etc. He was wrong then, and he is wrong now.”

    I cannot read the entire article in the NS, but I wonder why, a few years later, other climate scientists warn of the water vapour feedback… Bottom line, I would not reject a paper just because its author was proven wrong in the past. And I cannot see any ill intent here.

    Water Vapor Feedback Loop Will Cause Accelerated Global Warming, Professor Warns http://www.sciencedaily.com/releases/2009/02/090219152132.htm

    Gavin “This is no good for calculating the Charney Sensitivity (because you are neglecting all other forcing terms),”

    I don’t think so, because he uses an amplification factor and a logarithmic scale, and he also notes briefly that his CS estimate still underestimates the feedback cycle because of the other GHGs.

  59. M:

    Prok:
    1) From your sciencedaily link: ““Everything shows that the climate models are probably getting the water vapor feedback right,”

    That means that Dessler et al. were _confirming_ that models were already doing the right thing.

    2) “for now we simply note that the change in CO2 concentration from 180 ppm to 280 ppm may be expected to have the same effect as the increase in CO2 concentration from 280 ppm to 440 ppm”

    This assumes that there were no forcings on the Earth system other than CO2 in the historical era. We know, however, that that is wrong: changes in orbital dynamics led to ice sheet retreat, which, by itself, contributed close to half of the total temperature change. Because we don’t have those orbital changes or those ice sheets today, we wouldn’t expect as large a response.

    [Response: All other things being equal, the forcing from 180 to 280 ppm is 5.35*ln(280/180) = 2.4 W/m2, and from 280 to 440, it is 5.35*ln(440/280) = 2.4 W/m2 as well. So to that extent the statement is correct. What is wrong is associating a temperature change of 5 deg C to the former on the basis of the ice age/interglacial difference. This assumes that everything that changed between those two periods was because of the CO2 - it wasn't. - gavin]
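
    The two forcing numbers in the inline response come straight from the standard simplified expression F = 5.35 ln(C/C0) W/m2 (Myhre et al., 1998); a minimal sketch:

        from math import log
        # Radiative forcing from a change in CO2 concentration, F = 5.35 * ln(C/C0)
        print(5.35 * log(280.0 / 180.0))   # LGM (180 ppm) to pre-industrial (280 ppm): ~2.4 W/m2
        print(5.35 * log(440.0 / 280.0))   # pre-industrial (280 ppm) to 440 ppm:       ~2.4 W/m2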

  60. Martin Vermeer:

    Implications of this symmetry are drawn out later in the paper, for now we simply note that the change in CO2 concentration from 180 ppm to 280 ppm may be expected to have the same effect as the increase in CO2 concentration from 280 ppm to 440 ppm, namely a shift of 5°C in the average surface temperature of the planet rather than the 2°C currently predicted as the equilibrium response to a concentration of 440 ppm.

    …and that’s where Wasdell already goes off the rails. A large chunk of the 5°C temperature shift from glacial to interglacial was due to albedo loss — hey, huge ice sheets at low-ish latitudes were melting away — which will not be repeated in the same way from 280 to 440 ppm. And this is anyway a longer time scale thing.

  61. wili:

    Thanks for the responses.

    Martin’s comment points up to me that whenever you compare two specific warming periods you end up getting into the specifics of what helped drive that particular warming. If the vast albedo shift of a heavily glaciated Northern Hemisphere turning into mostly open land and water is not a factor we have to worry about so much this time, what other particular events could potentially come into play in the next few decades?

    I had noticed that Wasdell had published something earlier that you disagreed with. I tried not to assume that one perhaps ill-thought-through article condemned all his future articles to be similarly ill-conceived.

    In light of your response above to prokaryote at #57:

    “A good discussion of Earth System Sensitivity is found in Lunt et al (2010) (I’m a co-author) and I don’t think people disagree that the ESS is likely to be larger than the CS. That is not the issue. Nor is whether this is interesting or useful to calculate. My disagreement with what Wadsell has argued is all in how he has constructed his argument, not whether it is worth talking about.”

    let me amend my humble request–could we have a main post that looks at ways one might come up with a reasonable approximation of how much larger ESS is likely to be than the CS?

    Thanks again.
    –wili/John Harkness

    (reCaptcha: icepatie cordata,)

  62. Hank Roberts:

    > … disagreement … in how he has constructed his argument

    He’s no doubt a very nice man.
    But so are many people far out in various political directions.
    They all need help getting the science right, including him.
    You can see that just from the titles of his work.
    http://www.meridian.org.uk/About/Director/Pro-About_the_Director1.htm

    (space in URL added to get past the overeager spam filter)

  63. prokaryotes:

    Martin Vermeer:”A large chunk of the 5oC temperature shift from glacial to interglacial was due to albedo loss…”

    I think the periodic albedo loss of the earth in the past is a good indicator for assessing/projecting the fast-advancing artificial albedo loss today, which is projected to advance in parts quite rapidly (year-to-decade timescales). Still, it is not clear to me why his approach is not considered further because of the historical temperature mark of 5°C he uses. I find it makes a lot of sense to use the natural cycle as a basis.

    Wasdell: “‘If the Charney sensitivity, supported by our modern computer models, projects that a doubling of the concentration of atmospheric carbon-dioxide leads to a temperature rise of 3°C at equilibrium, then why, in the empirically measured behaviour of the planetary system, does an increase of only 56% in CO2 concentration (from 180 ppm to 280 ppm) lead to a 5°C change in temperature?’
    If we take into account the non-linear relationship between the variables as expressed in the semi-log (base 2) presentation, the 56% increase in absolute value of the CO2 concentration across this particular range equates to only 63.4% of the effect of doubling, i.e. a forcing of 2.54 W/m2. Applying the Charney Sensitivity to this proportion yields an increase of only 2°C for the change from coldest point of the ice-age to the pre-industrial benchmark. Historical data tells us that the shift should be 5°C. It just does not compute. The inescapable conclusion is that the computer modelling ensemble, together with the Charney sensitivity and supported by Hansen’s “empirical sensitivity” are all omitting something fundamental.
    They are grossly under-representing the contribution of the complex feedback system to the amplification of the effects of change in CO2 concentration.”
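
    The arithmetic in that quoted passage is easy to check. A minimal sketch, assuming (as the excerpt does) logarithmic CO2 forcing with 4 W/m2 per doubling and a 3°C Charney sensitivity:

        from math import log
        frac = log(280.0 / 180.0) / log(2.0)   # fraction of a doubling represented by 180 -> 280 ppm
        print(frac)                            # ~0.64 (the excerpt's 63.4%)
        print(frac * 4.0)                      # ~2.5 W/m2 of CO2 forcing
        print(frac * 3.0)                      # ~1.9 degC expected from the Charney sensitivity alone
        # The remaining ~3 degC of the glacial-interglacial difference is the point made in the
        # inline responses above: it came from ice-sheet albedo, other GHGs and orbital forcing,
        # not from feedbacks missing in the models.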

    And reading now..

    Earth system sensitivity inferred from Pliocene modelling and data
    http://pubs.giss.nasa.gov/cgi-bin/abstract.cgi?id=lu07000g

  64. Hank Roberts:

    Oh, by the way, Wasdell cites Engelbeem at RC here:

    http://www.meridian.org.uk/Resources/Global%20Dynamics/ClimateSensitivity/frameset.htm?p=8

    (space added to get around the spam filter)

    The pointer is meant to take readers to:

    http://www.realclimate.org/index.php/archives/2005/11/650000-years-of-greenhouse-gas-concentrations/comment-page-1/#comment-5875

    Eric replied inline to that, pointing out where it’s wrong — but I doubt many readers get far enough to read that; Wasdell clearly did not bother to mention the response.

    I suspect Tamino would have a bit to say about his curve-fitting.

  65. prokaryotes:

    Hank Roberts, if you look at other data with David Wasdell, make sure to watch his latest interview too (at my blog).

    What I start to understand now is that the issues surrounding climate sensitivity are nothing new and are acknowledged by leading climate scientists. And in this regard it would be nice to see a blog post dedicated to the latest science.

  66. Dr Nick Bone:

    Wasdell’s paper seems pretty simplistic for the reasons Gavin mentions… you can’t assume that all the temperature difference between LGM and Holocene was due just to the CO2 difference.

    Even under the Earth System Sensitivity approach, with albedo changes treated as a slow feedback to greenhouse gas changes, CO2 is not the only GHG to consider: methane and N2O also increased from LGM to Holocene. This is why Hansen et al got an ESS of 6°C for doubled CO2 rather than 7.8°C.

    But even then it’s not clear that Hansen et al have treated the orbital variations properly. Consider two thought experiments:

    1. Back during the LGM there is a huge series of volcanic eruptions, which increase the CO2 level, but of course leave the orbital forcings unchanged. Allow a couple of thousand years for albedo feedbacks and other slow feedbacks to complete. At the new equilibrium, CO2 level is now doubled at 360ppm, but no other changes in GHGs. What is the Earth’s new temperature?

    2. Opposite thought experiment. Orbital variations from LGM to Holocene are as for real history, but now the carbon cycle feedbacks are switched off. CO2 remains at 180ppm and other GHGs remain at LGM levels. Albedo feedbacks on the orbital forcings still happen. What is the Earth’s new temperature?

    If ESS is 6 degrees then the answer to question 1 is “Holocene maximum or a bit warmer” ie the CO2 change without orbital change has ended the ice age prematurely. Whereas the answer to question 2 is “Not very different from LGM” ie the orbital changes without CO2 changes have not had much effect at all, even allowing for albedo feedbacks. Those might be the right answers, but I don’t think we know that, and it seems rather an extreme attribution of the changes between LGM and Holocene.

    So 6 degrees is probably close to an upper bound on ESS rather than a lower bound. I had a look at the other literature on ESS based on Pliocene temperatures and CO2 including Lunt et al, Pagani et al and Tripati et al. Most of those produce an ESS in the range of 4-6 degrees; certainly higher than Charney, but not 2.5 times higher.

  67. wili:

    “So 6 degrees is probably close to an upper bound on ESS rather than a lower bound.”

    Not sure I follow you there. It seems to me that there are too many unknowns (and doubtless even more unknown unknowns) to be able to put a firm upper bound on ESS at this point, especially as it relates to our current situation where we are spewing CO2 into the atmosphere at a rate a couple of orders of magnitude faster than has ever happened in the paleo-record.

  68. Hank Roberts:

    This is the blog you’ve been wishing for.

    They’re not doing high drama. Wasdell (and others *cough* Romm *cough*) believe high drama is necessary.

    Public reaction appears to be proving them right (dammit).
    But drama often includes oversimplification, and sometimes flat errors.
    You want to blog Wasdell? Point out that he’s got the right idea but made some mistakes in details as pointed out above — rather than echoing errors.

    Opportunity: doing the suggested reading; see what’s being done.

    http://pubs.giss.nasa.gov/abs/sc02400t.html
    Schmidt, G.A, 2010: Enhancing the relevance of palaeoclimate model/data comparisons for assessments of future climate change.

    —-excerpt from the opening of the topic—–

    …. Some 600 papers …. We discussed some … back in 2008.
    Now that CMIP5 is gearing up for a similar exercise, it is worth looking into what has changed – in terms of both the model specifications, the requested simulations and the data serving to the wider community. Many of these issues are being discussed … CLIVAR newsletter (Exchanges no. 56).

    Got that? (right click and “save as”)

  69. CM:

    Re: Earth System Sensitivity, some reading: the Lunt et al. paper Gavin mentioned above is freely available here:
    http://pubs.giss.nasa.gov/abs/lu07000g.html

    And the discussion in the NAS Climate Stabilization Targets report looks helpful:
    http://www.nap.edu/openbook.php?record_id=12877&page=217

  70. Dr Nick Bone:

    Re 67: “Not sure I follow you there. It seems to me that there are too many unknowns (and doubtless even more unknown unknowns) to be able to put a firm upper bound on ESS at this point”

    The logic is that if ESS is MORE than 6 degrees, then the temperature difference between LGM and Holocene would be bigger than what we observe/ reconstruct. (Even if the orbital variations themselves have minimal/no effect – see my thought experiment 2.)

    You can push the bound a bit by assuming a slightly higher temperature difference between LGM and Holocene (say 6 degrees rather than 5 degrees) and a slightly lower range of CO2 / other GHG variation (say 185ppm to 275ppm CO2 rather than 180ppm to 280ppm). But that still doesn’t take ESS up to 8 degrees.
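
    The bounding argument can be put into one line of arithmetic. A minimal sketch, in which all the numbers are illustrative assumptions (3.7 W/m2 per CO2 doubling, and roughly 0.7 W/m2 for the LGM-to-Holocene rise in CH4 and N2O, broadly in the spirit of Hansen’s ESS estimates):

        from math import log

        def implied_ess(dT, co2_lgm, co2_hol, other_ghg_forcing=0.7, f2x=3.7):
            # ESS ~ dT * F_2xCO2 / (total LGM-to-Holocene greenhouse forcing)
            f_co2 = 5.35 * log(co2_hol / co2_lgm)
            return dT * f2x / (f_co2 + other_ghg_forcing)

        print(implied_ess(5.0, 180.0, 280.0))   # ~6 degC per doubling
        print(implied_ess(6.0, 185.0, 275.0))   # ~7.9 degC: still under 8, even with stretched assumptions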

    Incidentally, I think Wasdell might be right but for the wrong reason, because the ESS is not the whole story. We need to look at carbon cycle feedbacks as well (i.e. the fact that Nature will respond to higher temperature by spitting out its own CO2 and methane into the atmosphere, adding to the CO2 we’ve already dumped there ourselves). Once you add those feedbacks in, we probably are up to about 8 degrees rise in response to us doubling CO2.

    See some of my previous postings here:
    http://www.realclimate.org/index.php/archives/2010/02/good-news-for-the-earths-climate-system/comment-page-8/#comment-162621

  71. prokaryotes:

    Soil respiration & Denitrification Feedbacks and modeling
    What I have not read anywhere yet is how we quantify feedbacks from soil respiration (CO2) and denitrification (N2O). When I read about the bio-geochemical feedback this might be covered there, but so far I have not seen any specific assessments.

    How does the modeling of these feedbacks work? Also, under special circumstances there are other phenomena:

    Soil carbon and climate change: from the Jenkinson effect to the compost-bomb instability
    More recently there has been a suggestion that the release of heat associated with soil decomposition, which is neglected in the vast majority of large-scale models, may be critically important under certain circumstances. Models with and without the extra self-heating from microbial respiration have been shown to yield significantly different results. The present paper presents a mathematical analysis of a tipping point or runaway feedback that can arise when the heat from microbial respiration is generated more rapidly than it can escape from the soil to the atmosphere.
    http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2389.2010.01312.x/full
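
    (As a toy caricature of the mechanism described in that abstract, and emphatically not the paper’s actual model: decomposition speeds up with temperature, decomposition releases heat, and the outcome depends on how fast that heat can escape. All parameter values below are invented for illustration.)

        # Toy self-heating soil model (illustrative only): respiration increases with
        # temperature (Q10), respiration releases heat, and heat escapes at a rate
        # proportional to the temperature difference with the air above.
        def run(heat_loss_rate, years=200.0, dt=0.01):
            T, C, Ta = 10.0, 50.0, 10.0              # soil temp (degC), soil carbon (kg C/m2), air temp
            k0, q10, heat_per_kg = 0.005, 2.5, 0.4   # made-up rate constants
            peak = T
            for _ in range(int(years / dt)):
                resp = k0 * q10 ** ((T - 10.0) / 10.0) * C
                T += (heat_per_kg * resp - heat_loss_rate * (T - Ta)) * dt
                C -= resp * dt
                peak = max(peak, T)
            return round(peak, 1), round(C, 1)

        print(run(heat_loss_rate=1.0))    # efficient heat escape: temperature barely moves
        print(run(heat_loss_rate=0.02))   # poor heat escape: self-heating runs away until the carbon is spent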

    New study focuses on nitrogen in waterways as cause of nitrous oxide in the atmosphere
    The new study reports that streams and river networks are the source of at least 10 percent of human-caused nitrous oxide emission to the atmosphere worldwide, which is three times the amount estimated by an earlier report from the Intergovernmental Panel on Climate Change (IPCC). http://www.eurekalert.org/pub_releases/2010-12/uond-nsf122010.php

    Denitrification takes place under special conditions in both terrestrial and marine ecosystems. In general, it occurs where oxygen, a more energetically favourable electron acceptor, is depleted, and bacteria respire nitrate as a substitute terminal electron acceptor. http://en.wikipedia.org/wiki/Denitrification#Conditions_required

    Ocean’s Harmful Low-Oxygen Zones Growing, Are Sensitive to Small Changes in Climate http://climateforce.net/2011/07/12/feedbacks-climate-change-reducing-ocean’s-carbon-dioxide-uptake-and-accelerates-hypoxia-states/

  72. Richard Pauli:

    Referring people to a new poster “The Software Architecture of Global Climate Models Key to Diagrams” from Kaitlin Alexander, Steve Easterbrook

    http://climatesight.files.wordpress.com/2011/08/poster.pdf

    http://climatesight.org/2011/08/16/wrapping-up/

  73. Hank Roberts:

    > Dr. Nick Bone … previous postings
    Good pointer, thank you for the reminder.

    Carbon cycle feedbacks were in models at least as early as CMIP3, right? I know the marine biologists have been worrying about plankton changes for a long while, though I rarely see those discussed by modelers.

    Just poking around, I found this paper, originally out in 2008 for AR4, revised in 2011:

    http://nldr.library.ucar.edu/repository/assets/osgc/OSGC-000-000-002-064.pdf

    Atmos. Chem. Phys., 11, 1417–1456, 2011
    http://www.atmos-chem-phys.net/11/1417/2011/
    doi:10.5194/acp-11-1417-2011

    Emulating coupled atmosphere-ocean and carbon cycle models with
    a simpler model, MAGICC6 – Part 1: Model description and
    calibration

    “… This study presents the most comprehensive AOGCM and carbon cycle model emulation exercise to date. We use an updated version of the MAGICC model, which was originally developed by Wigley and Raper (1987, 1992) and which has been updated continuously since then (see e.g. Raper et al., 1996; Wigley and Raper, 2001; Wigley et al., 2009).

    Several amendments to MAGICC have been spurred by new results presented in the IPCC AR4 as well as by the increased availability of comprehensive AOGCM and carbon cycle model datasets. For example, land/ocean temperature evolutions for both hemispheres were calculated for each AOGCM allowing for a more in-depth analysis of optimal heat exchange parameterizations in MAGICC.

    Emulations with a simple model like MAGICC6 can by no means replace research into more sophisticated carbon cycle and general circulation models. Rather, what MAGICC6 offers primarily is a method to extend the knowledge created with AOGCMs and carbon cycle model runs in order to provide estimates of their joint responses and to extrapolate their key characteristics to a range of other scenarios….”

    [extra paragraph breaks added for online readability --hr]
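
    For readers wondering what this kind of emulation looks like in practice, a minimal two-box energy-balance sketch is below. It is far simpler than MAGICC and is not MAGICC’s actual formulation; every parameter value here is an illustrative assumption.

        import numpy as np

        def emulate(co2_ppm, dt=1.0, lam=1.2, c_mix=8.0, c_deep=100.0, kappa=0.7):
            # Two-box energy balance: a mixed-layer box exchanging heat with a deep-ocean box,
            # driven by CO2 forcing. lam = feedback parameter (W/m2/K); c_mix, c_deep = heat
            # capacities (W yr/m2/K); kappa = exchange coefficient (W/m2/K). All values assumed.
            forcing = 5.35 * np.log(np.asarray(co2_ppm) / 280.0)
            t_mix, t_deep, out = 0.0, 0.0, []
            for f in forcing:
                flux = kappa * (t_mix - t_deep)
                t_mix += (f - lam * t_mix - flux) / c_mix * dt
                t_deep += flux / c_deep * dt
                out.append(t_mix)
            return np.array(out)

        co2 = 280.0 * 1.01 ** np.arange(140)              # 1% per year CO2 growth; doubling near year 70
        temps = emulate(co2)
        print(round(temps[69], 2), round(temps[-1], 2))   # transient warming at ~doubling and at year 140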

  74. Didactylos:

    “A line count of source code is a good proxy for complexity.” No. No, it’s not. It’s just one of the few measures we have, and we have to make do.

    It fails to distinguish between needed complexity and byzantine snarl-ups. Between the complexity of what the code does, and the code itself.

  75. Pete Dunkelberg:

    Thank Heaven for Joe Romm and Climate Progress! Hank (# 68) we have a great problem. The problem at first looks like fossil fuels and increasing CO2, but why aren’t we solving this problem? Because of fossil fueled politics. Not because of the cost of renewable energy. Fossil fuel appears economical because significant costs of its use are not attributed to it. Solar energy for example could be ubiquitous. It could be on most rooftops and car tops. From the rooftops it could fuel cars and send power to the smart grid we could have.

    Who but Joe Romm at Climate Progress puts it all together – science, political lunacy and alternative energy – every single day and does a very good job of putting two and two together and getting four time after time despite the complexity of all the intertwined problems he deals with? And which commenters at which blog (*cough*) can’t decide within a factor of two what their favorite number is? :)

    Why can’t we (humans overall and especially the US) respond rationally to the great problems facing us? The whole business of deny, delay and make personal attacks on scientists is well crystallized by Juan Cole.

    And so our national debate is stunted and distorted. Instead of arguing over the best ways of dealing with our most pressing problems, we are reduced to disputing about whether a problem even exists. The latter is a rhetorical device of wealthy special interests designed to derail the ordinary workings of democracy.

    Instead of working on solutions to clear problems, public discourse has been forcibly dumbed down to the level of debating whether the problems even exist. So, Hank, the great need is to get the message out, and the message is “For crying out loud Virginia we really do have a serious problem.” Anything humanly beneficial requires a two thirds majority in 21st century democracy, so the message must get out to far more people than those who currently get it.

    Real Climate makes a valuable contribution to public knowledge of science. It is a benefit to its readers, to all the other science blogs, thence to the world wide online community, thence to all the teeming human billions. Ditto for Climate Progress, and the latter addresses human as well as scientific matters. Climate Progress integrates things that people can actually do something about. We can’t do anything about climate sensitivity. It simply is what it is. And, Hank, …

    The exact value of sensitivity is not what matters in human terms. This is because corporate powers are determined to keep on burning carbon until they cook the last dollar out of us, no matter how much carbon it takes. What might stop them? Climate disruption might make business as usual impossible before we are totally cooked. Expect casualties. Lots of casualties. Or, humankind might rise up and make the obvious changes. What will motivate more people to constructive action sooner? Discussion of the exact value of sensitivity, or pounding away at fossil fueled politics day after day?

    Please, no more uncalled for remarks about Wasdell. He’s a religious spokesperson. You can’t expect him to use differential equations. He is concerned about human suffering and looming danger, as one would hope all religious spokespersons would be. Alas, too many have converted to the Holy Rightwing Church where greed is a sacrament not a sin. As a religious spokesperson he can reach people who tune science out. His audience is not going to take up atmospheric physics, but they need to understand that suffering will come if we persist in burning carbon as usual. There is a great human need to get beyond the numbed debate about whether there even is a problem. Let a hundred Wasdells bloom! If you want to be constructive, write to him privately about how to better express some tech details.

    So, Hank, the crucial need is to get the message out that there really is a problem, and that it has aspects (politics, not sensitivity) that we can do something about if we will. Focus on that, and

    Thank Heaven for Climate Progress!

  76. prokaryotes:

    Pete Dunkelberg, no, it is absolutely irrelevant that Wasdell also happens to be what you call a religious spokesperson. There are many scientists who could also be considered religious spokespersons. So I really don’t understand your argument.

    I look at Wasdell’s message and data and what he has to say. He does a great job with messaging; he says things that are easy to understand and points to risks. At least to me it was not obvious that climate sensitivity has so far been considered to be underestimated, which is crucial for decision making. In this regard Wasdell helped to bring awareness to the topic. Because I believe I follow the development of climate science closely, I wonder how many people besides me are still not aware that we are dealing with an assessment which is not realistic. This in light of media which in the past painted a picture of overestimation.

    Besides Gavin’s arguments about the flaws of Wasdell’s draft, I cannot understand why his figures are dismissed so quickly. I don’t think he was considering all GHGs in his numbers, as I pointed out above.

    Because of what I learned during the discussion here at RC, I will shift my focus to other people when it comes to work on climate sensitivity, namely Pagani, Lunt, Tripati, Gavin’s work and Hansen of course. But I will also read future Wasdell releases. On the bottom line, I fear that if even I cannot understand why Wasdell is disregarded so quickly, most decision makers cannot either. In this sense I ask again for a better analysis of his work, also because his motives are not ill-intended. Maybe debunking denial has become too much of a concern, and better messaging of the science at hand, and where it falls short, must be improved.

    What is more? In his latest interview he calls the denial what it is: “A crime against humanity”. We need this kind of messaging to bring better awareness to the topic. Scientists must foster a stronger message to buckle up the public, based on their findings. And that also means better messaging when it comes to underestimation in work which is used as the basis for policy making!

    There are already enough extremes which can be used as examples to warn the public. In this regard Bill McKibben is great at messaging.

    Contacting Wasdell is not easy because there are no contact details on his website and the email addresses in the draft paper do not work. Finally I found his email, but why should I contact him? That should be done by people with better insight into the flaws of his work. He is asking for contributions to the paper.

    So at last I point again to the interview with Wasdell from July, where he talks about his friend Lovelock and calls the denial what it is…
    http://climateforce.net/2011/08/14/state-of-planetary-emergency-we-have-a-problem-david-wasdell-at-taellberg-forum-2008/

  77. prokaryotes:

    Correction: The 8-part interview with Wasdell from July can be found here http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/

  78. Kevin McKinney:

    “[extra paragraph breaks added for online readability --hr]”

    Blessings upon you and your house forever for that!

  79. Brian Dodge:

    “Climate disruption might make business as usual impossible before we are totally cooked.”

    At which point anthropogenic GHG emissions move from being a forcing (independent of the state of the climate) to a feedback (responding to the state of the climate); in this case, a negative feedback – as the climate gets warmer and less conducive to civilized BAU, fossil fuel consumption will fall, and eventually natural processes will start removing CO2 faster than we can put it into the atmosphere. I call this the Dodge Iris effect, which will act to stabilize the climate – at some new unit root equilibrium, which may allow survival of the remaining humans.

  80. wili:

    @# 70 “I think Wasdell might be right but for the wrong reason”

    I suspect the same.

  81. Martin Vermeer:

    prokaryotes, it should be possible to criticize the “good guys” when they get it wrong. Wasdell is undoubtedly well-intentioned, but in science that’s just not good enough. In its own way this is just as damaging as the “scientific denialists” like Spencer talking sensitivity down. An embarrassment and a liability to competent climatology.

  82. prokaryotes:

    Oh come on Martin, he is not a climate scientist, and if you read the first page, he is trying everything to make this a community effort. To compare him with Spencer is not very nice.

    I suggest every scientist should handle such work in a more constructive way, to show the public a) that climatology really is that complicated and b) to use this situation to bring attention to the topic. Overall isn’t this why RC exists in the first place? Everything else is not constructive and rather seems a bit arrogant.

    Now people say he was reasonable, and right (even if for the wrong reason?!). Nevertheless I see your point, but then again, there has been no real coverage of climate sensitivity and its underestimation for a few years now! At least not in the media I follow. Why? Just use this draft as an opportunity to roll out the “real” science about CS.

  83. Richard Pauli:

    In the case of Wasdell — we perhaps see mistaken calculations that point to a resulting state which is not yet proven – but is nevertheless both possible and plausible – namely, runaway tipping points of no return.

    In the case of denialists like Spencer, Lindzen, et al. – we have mistaken calculations describing conditions more widely declared to be impossible and implausible – namely, global cooling or warming without an industrial CO2 influence.

    [Response: Both cases are implausible. It doesn't matter whether Wasdell is rightly concerned about the issue, or whether Spencer and Lindzen are not. Only the logic and actual support for the claims are important in deciding whether they have merit. - gavin]

  84. wili:

    Brian Dodge @#79: I think it is important to consider possible human actions as feedbacks, but it is even more difficult in this case to know what to expect. The main feedback from hotter temps in much of the world that can afford them would be increased use of air conditioning, mostly run on coal-powered electricity. If recent history is any indication, even very low EROEI sources of very dirty ff like tar sands are going to be used up. An immediate and total collapse of all world civilization may have the effect you claim, but that does not seem to be in the cards, as far as I can see. Meanwhile, desperate people do more and more desperate things–cutting down forests to burn to stay warm; hunting down every last wild animal to stay fed…

    It looks quite plausible and quite likely at this point that we will extract and burn nearly every ounce of fossil fuel that is recoverable. Energy is power. And power does not yield easily to persuasion. Wars are fought over the kinds of resources that we are asking people to walk away from voluntarily. We must keep asking and trying to persuade, but we can’t be too surprised that powerful forces are actively working against us and will continue to be.

  85. Dr. Skip T. Teck:

    How’s that whole “polar amplification” thing holding up for you these days? BWAHAHAHAHAHAHAHAHHAHAAHA

    Absolutely loving watching your world implode in slow motion!!!! And there’s not a damn thing you can do about it.

  86. richard pauli:

    #83 Gavin: – agreed that deciding merit must rule science… but can I ask you to reconsider the importance of plausibility and risk in selecting areas of study?

    In 2006 you posted a fine essay on runaway tipping points of no return, in which you concluded:

    “Much of the discussion about tipping points, like the discussion about ‘dangerous interference’ with climate often implicitly assumes that there is just ‘a’ point at which things tip and become ‘dangerous’. This can lead to two seemingly opposite, and erroneous, conclusions – that nothing will happen until we reach the ‘point’ and conversely, that once we’ve reached it, there will be nothing that can be done about it. i.e. it promotes both a cavalier and fatalistic outlook. However, it seems more appropriate to view the system as having multiple tipping points and thresholds that range in importance and scale from the smallest ecosystem to the size of the planet. As the system is forced into new configurations more and more of those points are likely to be passed, but some of those points are more globally serious than others. An appreciation of that subtlety may be useful when reading some of the worst coverage on the topic.”

    http://www.realclimate.org/index.php/archives/2006/07/runaway-tipping-points-of-no-return/

    Of course, toss out meritless studies, but shouldn’t there be a type of scientific triage where studies of plausible, high risk scenarios receive the corrective attention they deserve?

  87. Kevin McKinney:

    #85–borehole!

    Also, watch that IPP.

  88. Hank Roberts:

    Richard, Wasdell doesn’t have a study.
    He doesn’t know what’s already been studied.
    He doesn’t talk about what’s in CMIP5 to be studied.
    He claims stuff isn’t being done that is being done.
    D’oh.
    Digression avoids discussing reality.

    Look at the CMIP5 article, please.

    Please.

    That’s the topic here — the studies being done for the next IPCC.

    Those are the studies being done.

    They do cover the range of concern.

    Talk about what’s really happening, not about unreliable accusations.

  89. Edward Greisch:

    Reply to Hank Roberts, CM, prokaryotes, wili and others: I just finished reading “Delusional Democracy” by Joel S. Hirschhorn. This book has a great deal to say about the way climate science is being treated, but not directly. Namely, we live in a plutocracy, and we had better save democracy quick. Science requires democracy.

    I started reading “Global Warming and Political Intimidation” by Raymond S. Bradley today. Bradley was one of the authors on the Hockey Stick paper with Michael Mann. Same problem: The plutocracy is against excessive truth, as all non-democratic governments are. The problem is systemic to the decay of democracy. It is not Climate Science alone that is in trouble, although it may seem that way.

    Wasdell’s sensitivity may not be the real issue. Sensitivity is clearly an important issue, but as I read this whole page, it seems to me that Wasdell’s sensitivity could be a symbol for something deeper. That something deeper would be the Political Intimidation that we all know is going on. The high end of the climate sensitivity distribution does need to be checked out carefully.

    Dodge Iris effect: Roger that. If the iris is open, somebody survives.

    86 richard pauli: Yes, those high risk scenarios and tipping points deserve our attention.

  90. Hank Roberts:

    http://shewonk.files.wordpress.com/2011/07/ears.jpg?w=300&h=195
    Wish: some science topics people can join only after doing the reading?

  91. prokaryotes:

    Edward Greisch, I agree, something larger seems to be at work. The weather in my region I can only describe as super strange, but the denial keeps on. Then look at the news and imagine how climate change might act on the world.

    What if?
    What if we already face an evolving disastrous situation and governments decide to prepare by building arks? Of course not everybody gets the chance to join; that’s why you keep the denial up.

    At this point it is either keep the denial up and start praying, or worldwide, combined, large-scale efforts to reduce emissions.

    Read the Wasdell paper again; he is underestimating (yes, yes, he comes to the right conclusion but is technically wrong) …

  92. dhogaza:

    How’s that whole “polar amplification” thing holding up for you these days? BWAHAHAHAHAHAHAHAHHAHAAHA

    The joke is on the troll, of course …

  93. Edward Greisch:

    CMIP5 archive: “poorly understood feedbacks” and “develop additional experiments that might build on and augment the experiments described here.”

    91 prokaryotes: “right conclusion but is technically wrong” Confusion is settled one way: additional experiments.

    How can a retired robotic weapons engineer come out of retirement and help with the experiments that will help CMIP5 nail down the sensitivity?

    Or rather the sensitivity function generating mathematics, since sensitivity depends on time, the condition of the planet, and other stuff.

  94. Kevin McKinney:

    #92–Oh, yes, indeed it is.

    By the way, I got an anonymous email response to an RC comment, awhile back, that sounded quite a bit like this one.

    Gutsy, that anonymous thing–nothing like having the courage of your convictions, I always say.

  95. Truthseeker:

    For those that think that climate models are worth the effort, try this;

    http://lorenzo-thinkingoutaloud.blogspot.com/2009/03/computer-models-and-cognitive-failure.html#0

    [Response: What's the logic here? Because some physicists/models have been wrong because of incorrect assumptions, it is therefore not worth doing any calculations? Strange. - gavin]

  96. Didactylos:

    Truthseeker: this is why it is vitally important to validate models. And you know what? Scientists do validate models! Isn’t that amazing?

    It’s like they know that models are only a simplified version of the world, and that they contain all the assumptions we hold!

  97. spyder:

    Just to put a bit more stress on the modeling, here is another relatively comprehensive analysis of several long-term research studies:
    http://arstechnica.com/science/news/2011/08/climate-change-causing-species-to-change-habitat-faster-than-expected.ars

  98. Bengt Randers:

    How will the simulations take Peak Oil, Peak Gas and Peak Coal into account? From what I have heard from Uppsala, many models are using the IEA’s most optimistic figures. There is not that much oil and coal. The IEA is not an expert on these matters; it is a political organisation working for the OECD. When they say “Fields yet to be found” in their WEO 2010, they can’t get the numbers to add up.
    http://aleklett.wordpress.com/2011/07/14/global-conference-on-global-warming/

  99. Hank Roberts:

    Anyone know where the discussion of Salby’s ozone paper is? All I have is the Science reference I gave above that mentions some think something omitted:

    “a group of researchers claims they can already see the ozone hole slowly recovering. Many others, however, say the paper, now in press in Geophysical Research Letters, leaves out critical information needed to clinch the case.”

  100. Hank Roberts:

    This refers to CMIP3 models:

    GEOPHYSICAL RESEARCH LETTERS, VOL. 38, L16704, 6 PP., 2011
    doi:10.1029/2011GL048237
    http://www.agu.org/journals/gl/published.shtml

    Are there two types of La Nina?
    Key Points
    La Nina events in the central and eastern Pacific SST are highly correlated
    There is a strong asymmetric character between El Nino and La Nina events
    CMIP3 models simulate the two types of El Nino more independently than La Nina

  101. Hank Roberts:

    oops — mispasted inquiry 20 Aug 2011 at 11:03 AM
    re Salby, sorry; entirely out of place

  102. john burgeson:

    I need a recommendation for a SHORT and readable book (or article) which is readable by a HS graduate and sets forth the case for climate change as clearly as possible. My grandson is not scientifically oriented, and at age 28

  103. Hank Roberts:

    > recommendation for a SHORT and readable book
    I’ll post one or two, replying over in the open thread:
    http://www.realclimate.org/index.php/archives/2011/08/unforced-variations-aug-2011/

  104. john burgeson:

    Thanks, Hank. Appreciated.

  105. Meow:

    @102: Try this, which is from a lively historical perspective.

  106. Denis Royer:

    @ CMIP5 simulations: Glad that you mention the role of hobbyists in climate science. I would rather term them “amateur climatologists”, with reference to amateur astronomers, whose contributions to astronomy need no further mention. The analogy is obvious, with data mining and micro-computers replacing sky observation and telescopes.
    From a retired nuclear physicist (and an amateur astronomer)

  107. William Freimuth:

    Earlier I posted a rant (#4); please forgive me and accept my heartfelt gratitude for your scientific research.

  108. Kevin McKinney:

    #102–

    A very short article doing roughly that:

    http://doc-snow.hubpages.com/hub/Global-Warming-Science-A-Thumbnail-History

  109. Hank Roberts:

    http://www.agu.org/pubs/eos/eo1131.shtml

    Paywalled ( $20/yr for AGU membership) — worth a look:

    VOLUME 92 NUMBER 31 2 August 2011

    View entire full-size issue. [PDF 5.15 MB]

    FEATURE
    Guidelines for Constructing Climate Scenarios
    How best can scientists understand and characterize uncertainty in climate predictions, and what are some key considerations when selecting and combining climate model outputs to generate scenarios? Addressing these questions in the context of recent research leads to some possible guidelines for creating and applying climate scenarios.
    [PDF]
    By P. Mote et al.

  110. Geoff Sherrington:

    The journal “Analytical Chemistry” in 1986 reported a summary of analytical chemistry of lunar material collected during Apollo missions. Author, G.H. Morrison.
    Many laboratories submitted their replicate analyses of the same material, to give a “within laboratory variance” (here WLV). More than a score of the world’s top laboratories analysed most of the elements, so it was possible to compute a “between laboratory variance”, BLV. In many cases the WLV was less than the BLV, meaning that the laboratories were optimistic about their capabilities. Unfortunately, the BLV was larger than many had hoped for, but it was the variance that had to be used because there was no known way to discard some laboratories and keep others.
    To the theme. After CMIP3, I blogged repeatedly that modellers should submit the results of all test runs, excluding those with known, excusable errors, not just those that had a good feel to them. In this way, a within-model variance can be found. When all modellers report, a between-model variance can be calculated and compared. If it is calculated with the inclusion of the within-model variance, rather than on 1 or 2 preferred model outcomes per modeller (analogous to the lunar rocks example), a better estimate of confidence in models can be derived.
    Do you know if this is going to be done?
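
    For what it is worth, the calculation being suggested here is essentially a variance-components decomposition: treat each modelling group’s ensemble of runs like a laboratory’s replicate analyses, then compare the spread within each model to the spread between models. A minimal sketch with made-up numbers (it says nothing about whether CMIP5 will report such runs):

        import numpy as np

        runs = {   # hypothetical: some scalar diagnostic from repeated runs of each model
            "model_A": [3.1, 3.3, 3.0, 3.2],
            "model_B": [2.6, 2.8, 2.7],
            "model_C": [4.0, 3.7, 3.9, 4.1, 3.8],
        }

        within = np.mean([np.var(v, ddof=1) for v in runs.values()])   # mean within-model variance
        model_means = [np.mean(v) for v in runs.values()]
        between = np.var(model_means, ddof=1)                          # variance of the model means
        print("within-model variance:  %.3f" % within)
        print("between-model variance: %.3f" % between)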

  111. Geoff Sherrington:

    Re 110 – date error. The paper is ANALYTICAL CHEMISTRY, VOL. 43, NO. 7, JUNE 1971. “Evaluation of Lunar Elemental Analyses”. George Morrison was at Cornell University.

    In the past I have been troubled by the significance of some modelling in GCMs because the uncertainty was ill-defined. In 110 I suggest application of a method learned at high expense from an older study. Although older, the fundamentals have not been overtaken by new developments.

  112. Ray Ladbury:

    Geoff Sherrington, Thank you for that utterly irrelevant suggestion, based on an utter and complete misunderstanding of how climate models, climate science and science in general work.

  113. Geoff Sherrington:

    112. Ray Ladbury. And the suggestion is irrelevant because …?