Unforced variations: Oct 2011

Open thread for October…

409 comments on this post.
  1. Pete Dunkelberg:

    How much of the observed Arctic amplification comes from black carbon? Mark Z. Jacobson thinks a lot of it does. fat pdf.

    [Response: BC does have an outsize effect in the Arctic (because of the effect on snow albedo etc.), but BC levels have been dropping in the Arctic since the 1980s (collapse of communism, Clean air acts etc.), so whatever effect it is having is going down quite rapidly. There is more that can be done which would probably help, but I would not assign it a dominant role. Important, yes, but not dominant. – gavin]

  2. Kevin McKinney:

    From the shameless self-promotion department, a couple of milestones I’d like to acknowledge (especially since I owe them, in part, to the interest of RC readers):

    My Callendar article just hit its 1000th page view:


    It was preceded slightly by the (much older) one on Fourier:


And the new articles on radiation in the atmosphere are off to a good start, having passed the 100-page-view mark last month:

    (The search for the “solar constant”)

    (Measuring surface- and back-radiation.)

    Thanks to all who’ve checked them out so far.

  3. Pete Dunkelberg:

    Linda’s first CO2 experiment. An improved link over the one from J Bowers in another thread.

  4. caerbannog:

    Dr. Mann has just responded to a particularly pernicious denier hit-piece in the Vail Daily — it deserves wider exposure, so I’m putting up the link to it here: http://www.vaildaily.com/article/20111001/EDITS/110939988/1021&ParentProfile=1065

    And here’s a link to the hit-piece that prompted Mann’s response: http://www.vaildaily.com/article/20110930/EDITS/110929829/1021&ParentProfile=1065

    Interestingly enough, all you will see there is this: “This is an invalid article or has been removed from our site.”

  5. Snapple:

    Dr. Mann helped write one part of the 4th IPCC Report that received a Nobel Prize.

    Martin Hertzberg claims the IPCC science is false, but our government’s National Intelligence Council, which is made up of 16 intelligence agencies, accepts the IPCC science.

Dr. Thomas Fingar, former Deputy Director of National Intelligence for Analysis, said in his June 25, 2008 testimony before Congress:

    “Our primary source for climate science was the United Nations Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report, which we augmented with other peer-reviewed analyses and contracted research. We used the UN Panel report as our baseline because this document was reviewed and coordinated on by the US government and internationally respected by the scientific community.”

    I think that’s a pretty helpful character witness.

The NIC has (recently?) declassified and posted many studies on their site, which I describe and link to here.


    Perhaps Cuccinelli would like to sue them, too.

  6. Pete Dunkelberg:

    Way to go Mike Mann! Noteworthy is the comparison of caerbannog #4

    “Interestingly enough, all you will see there is this: “This is an invalid article or has been removed from our site.””

    with this:

  7. Snapple:

    Here is how Hertzberg is described on another site:

    Dr. Martin Hertzberg is a long time climate writer, a former U. S. Naval meteorologist with a PhD in Physical Chemistry from Stanford and holder of a Fulbright Professorship. He is a co-author of Slaying the Sky Dragon – Death of the Greenhouse Gas Theory and a member of an international group of scientists calling themselves the Slayers.


    It’s hard to see how someone could have a PhD in Chemistry and not accept that CO2 is a greenhouse gas.

  8. harvey:

    He’s a blow ‘em up man…


  9. Imbroglio:

    @caerbannog: The magical disappearance of Hertzberg’s screed may be connected to the magic word ‘libelous’ in Mann’s response. I’m sure Mann has better things to do than sue for libel but perhaps someone at the Vail Daily decided that discretion was the better part of vilification. The article’s still in the Google cache at time of writing, though it will presumably disappear as soon as Google re-crawls the URL. But of course it’s nothing we haven’t seen a hundred times before. I’ve saved a copy to my own computer, mainly because I loathe the policy of quietly ‘disappearing’ embarrassing articles, as practised by some online publications: if you think you’ve made a mistake, stand up and admit it, don’t try to cram it down the memory hole.

  10. Imbroglio:

    Apologies, I think the Google Cache link got chewed up. Here’s a short version: http://goo.gl/cyJWj.

  11. Edward Greisch:

    Please sign my petition at http://wh.gov/gtV.

    Stop Global Warming by shutting down the coal industry.
If we do not stop Global Warming [GW] now, the desertification will continue and increase. Some time between 2050 and 2055, the land surface will be 70% desert and agriculture will collapse. Collapses due to small climate changes have happened many times before. If agriculture collapses, civilization collapses. If civilization collapses, everybody or almost everybody dies. We must prevent this by shutting down the coal industry. Let the electric companies figure out how to make electricity without making CO2, as long as they do so. Set a time limit of the end of 2015 to reduce the CO2 from a power plant by at least 95%.

  12. Edward Greisch:

    Go to
    and start another petition. If you get 5000 signatures, somebody will pay attention to it. You have to get 25 signatures to get the petition to be viewable without going to a special URL.

  13. Pete Dunkelberg:

    Trees have been helping us, says the New York Times.

    Scientists have figured out — with the precise numbers deduced only recently — that forests have been absorbing more than a quarter of the carbon dioxide that people are putting into the air by burning fossil fuels and other activities. It is an amount so large that trees are effectively absorbing the emissions from all the world’s cars and trucks.

  14. caerbannog:

    I posted a diary about Mann’s Vail Daily piece over at dailykos.com, hoping that some folks there would pick up that ball and run with it — and run with it they did! The diary made the “recommended” list, and lots of folks must have taken up my suggestion to click on the “recommend” link over at Vail Daily. Mann’s piece has collected 170 “recommends” so far (about two orders of magnitude higher than the average number of “recommends” bestowed on a typical op/ed piece there).

I’m not harboring any illusions about this making any significant difference overall, but what the heck — every little bit of “viral” marketing helps!

  15. Brian Dodge:

    Dr. Hertzberg was a blow ‘em up man; according to his website he now “…serves as an expert consultant for attorneys involved in litigation related to accidental explosions and fires.”

His statement “Knowledgeable scientists, including the more than 30,000 such as myself who have signed the Oregon Petition” (in the disappearing rant), and OISM’s lax standards as to who is a scientist, reminds me of a joke whose punchline is “I don’t believe I’d a’told that one, brother.”

  16. Mike:

    I’m covering related rates in my calculus course. I want to do an example for the students on the expansion of the Earth’s atmosphere as temperature rises. Let T be the ave temp, V be the volume, h the height (or depth) of the atmosphere and t be time. Then

    V=some function of T. T is a function of t. And we want to find dh/dt.

    But I don’t know what to use for V(T) or how to define h. Any suggestions? I need to keep it simple.
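One way to keep it simple is sketched below as a toy calculation. Everything in it is an illustrative assumption rather than an established result: the atmosphere is treated as an isothermal ideal-gas column so that the scale height H = RT/(Mg) stands in for h, and the warming rate is an invented round number. The chain rule then gives dh/dt = (R/(Mg)) dT/dt directly.

```python
# Toy model for the related-rates example: an isothermal ideal-gas
# atmosphere, with the scale height H = R*T/(M*g) standing in for "h".
# This ignores the lapse rate, moisture and circulation entirely.
R = 8.314      # J/(mol K), universal gas constant
M = 0.02897    # kg/mol, mean molar mass of dry air
g = 9.81       # m/s^2, surface gravity

def scale_height(T):
    """Scale height in metres of an isothermal column at temperature T (K)."""
    return R * T / (M * g)

# Related rates: h = (R/(M*g)) * T, so dh/dt = (R/(M*g)) * dT/dt.
dT_dt = 0.02                    # assumed warming rate, K per year
dh_dt = (R / (M * g)) * dT_dt   # metres per year

print(round(scale_height(288.0)))  # roughly 8400 m at T = 288 K
print(round(dh_dt, 2))             # a fraction of a metre per year
```

A linear V(T) (ideal gas at constant pressure) would work the same way; the point for a calculus class is just the chain rule, since the real atmosphere is far messier.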

  17. Ray Ladbury:

Pete Dunkelberg, To me, the news about carbon sequestration by vegetation seems to be a double-edged blade. Yes, it has helped us out, but CO2 has continued to climb exponentially. And as temperatures continue to rise, and more of the planet descends into drought, and more topsoil is washed away in impulsive rain events, this could become a substantial positive feedback in the carbon cycle. Bastrop County, TX may have been an excellent example of the type of feedback I mention.

  18. Hank Roberts:

    > expansion of the Earth’s atmosphere as temperature rises.
    > … V=some function of T …

    Mike, I’m not a physicist, but I don’t think it’s that simple.
    Try this: http://www-das.uwyo.edu/~deshler/Atsc4400_5400_Climate/PierreHumbert_Climate_Ch1.pdf

  19. Pete Dunkelberg:

Ray, yes, much of the NYT article is about extensive tree deaths caused by beetles moving north in a warmer west coast environment, and extensive tree deaths from fire, drought and floods around the world.

  20. Septic Matthew:

    11, Edward Greisch: Some time between 2050 and 2055, the land surface will be 70% desert and agriculture will collapse. Collapses due to small climate changes have happened many times before. If agriculture collapses, civilization collapses. If civilization collapses, everybody or almost everybody dies.

    Is civilizational collapse now on-topic?

  21. Septic Matthew:

    Now is a good time to address this paper:


    The authors estimate a most likely increase of 1.6K by 2080 if the atmospheric concentration of CO2 doubles gradually throughout that time span.

    Personally, I think that the transient climate sensitivity is the single most important quantity wrt climate change for public policy purposes. The authors provide one way to estimate it. Theirs is probably not the last word.

  22. dhogaza:

    He is a co-author of Slaying the Sky Dragon – Death of the Greenhouse Gas Theory and a member of an international group of scientists calling themselves the Slayers.

    Even Judith Curry, with her tendency to keep an open mind to all things skeptical regarding climate science, has openly described this book as being horsepucky.

  23. Pete Dunkelberg:

Mike, you’re on the right track in that climate and weather are how familiar physical and chemical processes work out on a planetary scale. Having a planet in your equations complicates things a bit. Hank gave you a top-notch university physics textbook reference. Understanding the Forecast [second edition any day now], aimed at non-majors, is a gentler start. There is online material for both of these books, and others too, isn’t there?

    Here is a wonderful article explaining the basics back in 2000:
    Held and Soden 2000. Water Vapor Feedback and Global Warming. Annu. Rev. Energy Environ.

    [beware a notorious “journal” with a name like that but not an annual review]

    Now back to your original question: if all the complicating factors that are constantly changing are left out and if you settle for half an atmosphere, you could work with the 500 millibar height, where half the mass of the air is below, half above.

Here’s another one that might work: use the Stefan-Boltzmann law: the energy radiated away goes up as the 4th power of temperature.

Consider a 1 square meter surface receiving from 200 to 300 watts of energy as visible light non-stop, and radiating the energy away (the only way for a planet to cool itself, other than just getting hotter forever). Look at the delta T of the surface as the incoming watts vary. The outgoing radiation will be infrared.
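That exercise can be sketched in a few lines, assuming emissivity 1 and pure radiative balance (both simplifications; the numbers are illustrative, not a climate calculation):

```python
# Equilibrium temperature of a surface that radiates away exactly what it
# absorbs: sigma * T^4 = S, so T = (S / sigma)^(1/4).
sigma = 5.67e-8   # W m^-2 K^-4, Stefan-Boltzmann constant

def equilibrium_T(S):
    """Temperature (K) balancing an absorbed flux S (W/m^2), emissivity 1."""
    return (S / sigma) ** 0.25

for S in (200, 240, 300):
    print(S, round(equilibrium_T(S), 1))
# A 50% increase in input (200 -> 300 W/m^2) raises T by only ~11%,
# because outgoing radiation grows as the 4th power of temperature.
```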

  24. Geoff Beacon:

    Excuse me for referencing my own blog posting but this is the only source of the preliminary work that Kim Swales and colleagues are doing on the elasticity of demand for household energy consumption. He has told me

    We have undertaken econometric work on the elasticity of demand for household energy in the UK. We get values of around 0.4 for the short run and over 1 for the long-run.

If this preliminary result is borne out, it means that Hansen’s carbon fee could have dramatic results. I favour taxing carbon to subsidise jobs, because it may be an easier policy to sell in the UK.

I think it is time that climate scientists grappled with economists, because economists have the power and many are climate-ignorant.

  25. Hank Roberts:


Colder stratosphere -> loss of stratospheric ozone -> new record low

  26. Richard Hawes:

    Science Fiction Literary Assistance Required:-

    Although I have worked in the oil patch since 1974, my M.Sc. is in glacial geomorphology (as was to be my PhD before I had a bad hair day and fell off Albert Peak near Revelstoke), and so I am “climatologically aware”, if no longer “practising”.

    I am looking for the title and author of a science fiction short story from the 1960’s or 1970’s. This estimated time period is based upon the degree of technical understanding and technical development in the writing about computers, satellites and climate control.
    I read it in an American-published paperback compendium called something like “Greatest Science Fiction Short Stories, Volume 1 or 2”, which I purchased between 1972 and 1980. It has been lost in one of several packings and moves and unpackings.

The time period, I think, is implied as being some time in the very near (geological) future, around 2500 CE. The location is on board a manned orbital space station that contains the computer and weather control systems that operate the weather on a colonised planet, not named, but somewhat like earth. The weather system has to be geo-engineered, or it will revert to the pre-human colonial conditions. This was uninhabitable for human life because of its extreme cold and storms (?). The computer is controlled by a single climate scientist / geo-engineering computer operator, who presumably is on rotation from “groundside”.
    The station is visited by … how best to describe him, a young man of the Bullingdon Club persuasion … from the planet, from “groundside”. There is a total social disconnection between the two people. The young man clearly despises the geo-engineering computer operator as being a scientist, a boring person with a boring and pointless life, a technoid. However, the young man also has some rudimentary computer skills, and perceives the climate control computer as just another toy. While the computer operator is elsewhere, the young man deliberately asks the computer a self-referential question as a malicious prank against the computer operator. In the story, there is absolutely no way out of the self-referential loop. Since the computer is focused on the question, it cannot control the climate, which starts to go irreversibly out of control. The young man tires of the prank. He tells the computer operator what he has done, and to stop the loop, so as to regain control of the climate. The operator replies that he cannot, because there is no means of breaking the self-referencing loop. The computer operator immediately knows full well what the young man has done. The young man begins to realise what he is responsible for. They stare at one another.
    End of story.

    The writing style, so far as I remember, was more Ray Bradbury than Isaac Asimov, but pithy. I have never forgotten the story. Needless to say, I want to find it again.

  27. Pete Dunkelberg:

    Amazon has the second edition: http://www.amazon.com/Global-Warming-Understanding-David-Archer/dp/0470943416/

    But Dr. Archer, the price has jumped way up to $62.34 from the first edition $62.34 while the page count has gone down from 288 to 207. And that’s for a paperback. Most 200 page paperbacks cost less.

  28. WebHubTelescope:

    Scientists have figured out — with the precise numbers deduced only recently — that forests have been absorbing more than a quarter of the carbon dioxide that people are putting into the air by burning fossil fuels and other activities. It is an amount so large that trees are effectively absorbing the emissions from all the world’s cars and trucks.

    This is the “residence time” vs “adjustment time” issue. The claim is that the residence time of CO2 is quite short because of carbon cycling but that the time it takes to migrate to deep sequestering stores is long. I came up with a model for this based on the master equation here: http://theoilconundrum.blogspot.com/
The residence idea really has to be clarified, because the skeptics are having a field day with it. The problem is that they don’t understand the mechanisms; by introducing some fundamental concepts, such as diffusion, you can really squelch the concerns.

If climate scientists aren’t thinking about it this way, then maybe you can learn something new.
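The residence-time versus adjustment-time distinction can be illustrated with a deliberately crude two-box sketch. This is not the master-equation model at the link above; the reservoirs and rate constants below are invented purely for illustration:

```python
# Two boxes: the atmosphere exchanges rapidly with a surface reservoir
# (biosphere + mixed-layer ocean), while only a slow one-way leak moves
# carbon from the surface box into deep sequestration.
k_ex = 1.0 / 5.0      # fast exchange, ~5 yr timescale (illustrative)
k_deep = 1.0 / 200.0  # slow deep-ocean leak, ~200 yr timescale (illustrative)
dt = 0.1              # years, forward-Euler step

atm, surf = 1.0, 0.0  # a unit pulse of CO2 added to the atmosphere
trace = []
for _ in range(int(500 / dt)):
    flux_ex = k_ex * (atm - surf)   # two-way exchange relaxes the boxes
    flux_deep = k_deep * surf       # permanent removal to the deep ocean
    atm += -flux_ex * dt
    surf += (flux_ex - flux_deep) * dt
    trace.append(atm)

# Individual molecules leave the atmosphere on the fast ~5 yr timescale,
# but the pulse itself decays over centuries: a large airborne fraction
# remains long after the "residence time" has elapsed.
print(round(trace[int(5 / dt)], 2), round(trace[int(400 / dt)], 2))
```

The fast exchange sets the residence time of a molecule; the slow leak sets the adjustment time of the pulse. Conflating the two is exactly the skeptics’ error.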

  29. Pete Dunkelberg:

    @ 25, loss of ozone:
    Canada had naturally taken the lead in tracking Arctic ozone, but now they don’t want to anymore, notes the Rabbet:

    USA could do better too notes ever lovin’ Romm:

    P S @ 27: oops on the book price. It went up from about 40 to about 60 bucks for a shrinking book.

  30. John Mashey:

    re: Vail Valley Times
    We were just there at Vail a month ago, before Steve Schneider conference in Boulder.

Small publications are easily vulnerable to such things, as discussed in part in poor science reporting. They simply do not typically have the expertise to deal with such pieces, but they can be helped, or at least urged to avoid such things.

Still, I suggest that we should praise VVV for being responsive, far more so than various much larger publications. They screwed up, but then acted decisively and ought to get credit for that.

  31. pjclarke:

    The open-armed embrace of Dr Hertzberg over at WUWT is something to behold. One hopes we are witnessing a defining moment. As I write the Wattbots are searching the internet’s various caches for the original text with a view to reproducing it in full, complete with violations of the second law of thermodynamics and the defamatory codicils. Go for it, Anthony.

  32. Meow:

    @21: Very briefly perusing that paper, I came upon Figure 13 and scratched my head. Every one of its PDF curves sums to > 1. What’s up with that?

    CAPTCHA: edorrow Colclough

  33. David B. Benson:

On the sci-fi theme, I’m finding Empress of Eternity by L.E. Modesett, Jr. worth reading, despite some reviewers who didn’t care for it; it has a strong climate cycle component, which is why I mention the fantasy novel here.

  34. Meow:

@21: On further reading, I see that the paper projects transient climate sensitivity forward to 2030, showing a handy 90% confidence interval in Fig. 6. The 2008 90% CI is 1.3K-2.6K, which p.15 says “is reduced by 45% by 2030” by “assimilation of surface temperature data up to…2030”. Table 1 then gives the “most plausible” 2030 CI as 1.3K-2.0K.

Which immediately raises the question: what “surface temperature data up to…2030”?

    Well, that turns out to be data satisfying the constraint, says p.17, that “the temperature increase between now and then is no more than expected with our currently-calculated sensitivity parameter.”

    Let me get this straight: the paper calculates a sensitivity parameter’s year-2008 CI using modelling based upon historical forcing and temperature data, then generates synthetic temperature data for the years 2008 (?)-2030 based upon that CI, then uses that additional data to derive a 2030 CI?

    I could well be missing something, but this seems circular.

  35. caerbannog:

    As I write the Wattbots are searching the internet’s various caches for the original text with a view to reproducing it in full, complete with violations of the second law of thermodynamics and the defamatory codicils.

    I suspect that at this point, Hertzberg would much prefer that the defamatory bits remain buried. Unfortunately for him, his WUWT “allies” are determined to keep digging them back up.

    IIRC, there’s an old saying that goes something like “With friends like that….”

  36. Hank Roberts:

    > Padilla et al.
    First hit in Google is “Cloud wars” at Climate Etc.

  37. Richard Palm:

    One of my online acquaintances is asking “What’s the control?” when testing predictions of anthropogenic global warming theory, and I’m not sure I know the answer. I assume that what he’s getting at is what are the results being compared to, in determining whether the data are meaningful?

  38. Lawrence McLean:

    Re #36, Richard Palm,
The question “What is the control?” regarding AGW is either naive or sarcastic. The convenience of being able to perform experiments with controls is not available in many scientific areas; examples where this convenience is not available include astronomy, geology and climate science. In these sciences, other techniques must be used to validate theories. That the convenience of experimental controls is not available does NOT invalidate these sciences; it just makes them a bit harder, and the scientists involved need to be a bit smarter in order to be effective!

  39. Lawrence McLean:

    #24, Geoff Beacon,
    Before any of the measures you suggest can be implemented, there is a battle that needs to be fought and won. That battle is to discredit the so called “Free trade” economic dogma that has infected governments worldwide. It even has its own police force (the World Trade Organization). Unless trade tariffs (which have unjustifiably been discredited by neoclassical economics) are implemented to protect the industries in those nations that implement the measures you suggest, then those measures will destroy the economies in which they are implemented.

  40. Septic Matthew:

    32, meow: Every one of its PDF curves sums to > 1. What’s up with that?

    Each pdf has a peak > 1.

    35, Hank Roberts,

    In response to a paper that is in press you go to a blog? You did much the same when you decided not to read the 110 pp of AOAS that I referenced. Didn’t you?

    31, pjclarke,

    I agree. Hertzberg is an embarrassment.

  41. Septic Matthew:

    34, meow: Let me get this straight: the paper calculates a sensitivity parameter’s year-2008 CI using modelling based upon historical forcing and temperature data, then generates synthetic temperature data for the years 2008 (?)-2030 based upon that CI, then uses that additional data to derive a 2030 CI?

    I asked the corresponding author if they would be willing to share code and data. Maybe after the paper is published.

    In order to make a prediction about 2030 some assumption about what happens between now and then is necessary. Consider other predictions for the rest of the century that also make such assumptions (e.g. point process model for volcano eruptions) and use parameter estimates from models. With their code, or another implementation of the nonlinear Kalman filter, other people can make complementary predictions based on complementary assumptions.

  42. Hank Roberts:

    > What’s the control

    The planet without burning fossil fuel, which changed far less quickly.


    “When we model previous switches in climate, we can compare the model to the results of real-world experiments recorded in ocean sediments and ice cores. But when we model the future, we have no empirical basis to judge the model’s accuracy. If we take no action until we are completely confident the models are correct, then the only use for the models will be to explain what happened. Our insistence on a tested model is part of the reason society is continuing to conduct the largest experiment ever done, the experiment of increasing the atmospheric concentration of greenhouse gases.

    It will be another 20 years before the climate changes that are predicted to be associated with the greenhouse effect become large enough to be unambiguously differentiated from naturally occurring variations in climate….”


    As my dentist puts the same concept:
    “Brush only the teeth you want to keep.”

  43. Hank Roberts:

    PS for Richard Palm — point your online acquaintance to:

  44. Edward Greisch:


    Climate Change and Society

    Page 40: The IPCC was formed….to blunt the activism that was beginning to emerge in the scientific community.

    Ah Ha! So RC is afraid of activism for fear of a repeat of something or other that the rest of us don’t know about. Tell us the whole story so that there will be too many people who know….

  45. Darv:

    Richard Hawes @ 26
    “The Monkey Wrench” Gordon R. Dickson in “The Penguin Science Fiction Omnibus”. Aldiss, B. ed 1973
    I bought my copy new in 1987 so good second hand numbers should be available.

  46. Matt McIrvin:

A more sophisticated variant of the “no global warming since 1998” meme seems to be developing: it’s something like “statistically significant cooling since 2001 or 2002”, based on curve fits to satellite measurements. The people pushing this don’t seem to have any specific alternative climate hypothesis they’re advocating; it’s more pure sowing of doubt based on a fairly short run of data, but at least they don’t seem to be depending on the 1998 maximum any more to drive the trend they’re claiming.

    Anyway, it’d probably be good to have a FAQ on this.

  47. Kevin McKinney:


    I think so.

    Some folks want to hang doubt on mainstream climate science by reifying the controlled lab experiment as the only ‘real’ way to do science. Of course, this rules out climate science since, as your friend’s question says (and as frequently expressed by many) “we don’t have a spare Earth or three lying about.” Hence, the thought goes, we can’t possibly know anything about the topic. . .

Climate science is a huge topic. And some subfields do, in fact, allow experimentation. For instance, John Tyndall’s work uncovering the greenhouse properties of CO2, water vapor, and other gases was done (very elegantly, too) in the lab. That’s probably the most famous example, but I’m sure that there are others one could cite.

But many sciences–both under and outside the climate science ‘umbrella’–are not particularly amenable to the controlled lab experiment–or at least, not as the only way of learning anything. Many sciences require something called ‘field work,’ where the scientists go out into the real world and measure things in situ. It can be tough, it can be tedious, it can be messy, and it can be a challenge to isolate the thing you actually want to measure–but it can be, and is, frequently done. (Think of ecology, biology, and glaciology, to name just three fields.)

    (By the way, “in situ” measurement was the topic of my “Fire From Heaven” articles, linked at the top of the thread–there is now roughly two centuries worth of efforts to measure radiation and heat transfer in the atmosphere, just a small portion of which I sketched in those two pieces.)

    Such measurements can be combined with careful analysis to build up a logical, coherent picture of how the climate system works. There are ways to approximate experimentation: one observes the changing inputs that Nature gives you: what happens to weather when a large volcanic eruption occurs? What happens when the solar minimum comes round, and is especially low? (A current ‘experiment’ now coming to an end, apparently.) What happens to weather when the Arctic sea ice has four years of very low summer minima in a row? (Another current “experiment.”)

    You may also be able to see how the system worked in more widely-varying conditions in the deep past: what happened when the Panama isthmus closed? What sorts of climatic conditions are associated with extremely high CO2 levels? Do climatic conditions sensibly correlate with Milankovitch cycles of insolation changes?

Finally, of course, you have the GCMs–the ‘virtual Earths’ that come closest to allowing controlled experiments on a planetary scale. Model experiments are the current (and likely, future) gold standard for attribution studies–if you can build a simulation based upon physical principles that reproduces a past climate, you can then vary the ‘virtual CO2’ or ‘virtual aerosols’ or ‘virtual sunlight’ and see what happens. It’s precisely because the model runs are so convincing as experiments that denialists spend so much time attacking their validity. (For most of these folks “unvalidated models” appears to be a single word–even though every model goes through a validation process.)

    In summary, there are many “controls”:

    –For the whole system, we have modeling and paleoclimatic data.

–For atmospheric, oceanic, and terrestrial processes, we have in situ measurement and remote measurement (i.e., satellite sensing), combined in some instances with lab experiments.

    –For basic physical properties, we have lab experiment, verified by in situ and remote measurement.

    The demand for a single “control” is much too simple, radically underestimating both the scope of the problem, and the depth and breadth of the knowledge already acquired.

  48. J Bowers:

    Cambodia suffers worst floods in a decade

  49. Kevin McKinney:


    Yes. This tactic has a history by now. See my article:


  50. Kevin McKinney:

    #45 and my (as yet unmoderated) response–I should have added that I’m always on the lookout for updates, so if you have specific examples in mind, I’d very much appreciate a pointer!

  51. Martin Vermeer:

    Meow #32, I don’t see that… visually they scale to ~1 for me. Do you have access to their data?

    Meow #34, yes it is circular where temperatures and transient sensitivities themselves are concerned, as they use “observations” from a GCM. But the uncertainty estimates (which is their focus) will be meaningful nevertheless — provided the GCM, and the forcings driving it, aren’t too far off. That’s how I take it.

    BTW the paper is still “in review” (where?) according to this. Hope the reviewers catch some of the “warts” like in the intro, “after about 70 years given a 1% CO2 doubling rate”. Surely they mean “after about 70 years given a 1% CO2 annual rate of growth” (which produces a doubling in 70 years).

    Nice stuff… funny that they’re coming from Mechanical and Aerospace Engineering…
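The arithmetic behind that reading is easy to verify with the standard compound-growth formula (nothing from the paper itself):

```python
# Doubling time for compound growth at rate r: solve (1+r)^t = 2,
# giving t = ln(2) / ln(1+r).
import math

r = 0.01  # 1% per year
doubling_time = math.log(2) / math.log(1 + r)
print(round(doubling_time, 1))  # 69.7 years, i.e. "about 70"
```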

  52. wili:


    “scientists who found the new fields of outgoing methane in the Arctic region have not defined yet whether it is the consequence of hydrates failure or result of high activity of sea microorganisms. To know this for sure they should first analyze the samples they gathered during the expedition.”

    What kind of sea microorganism activity would be likely to create massive quantities of methane?

  53. Septic Matthew:

    51, Martin Vermeer, I am glad you liked it. On the pdf itself it says “J. Climate, in press”.

    Surely they mean “after about 70 years given a 1% CO2 annual rate of growth”

    It’s funny no one caught that in review. But most papers have a few odd locutions.

    “Warts and all”, right now I think it is the best work on the most important topic.

    Meow #32, I don’t see that… visually they scale to ~1 for me. Do you have access to their data?

    They’re pdfs: the areas under the curves are 1, not the peaks of the curves.
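A concrete check of that point, using nothing but standard normal-distribution arithmetic (no data from the paper):

```python
# A probability density can exceed 1 at its peak; only the area under
# the curve must equal 1. A Gaussian with sigma = 0.1 peaks near 4.
import math

def gaussian_pdf(x, mu=0.0, sigma=0.1):
    """Normal density with the given mean and standard deviation."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

peak = gaussian_pdf(0.0)   # about 3.99 -- far above 1

# Trapezoidal integration over +/- 1 (ten sigma) still recovers unit area.
n, a, b = 10000, -1.0, 1.0
h = (b - a) / n
area = h * (sum(gaussian_pdf(a + i * h) for i in range(1, n))
            + 0.5 * (gaussian_pdf(a) + gaussian_pdf(b)))

print(round(peak, 2), round(area, 4))
```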

  54. Hank Roberts:

Nicolas Gruber, “Warming up, turning sour, losing breath: ocean biogeochemistry under global change,” Phil. Trans. R. Soc. A vol. 369, no. 1943, pp. 1980–1996 (28 May 2011), doi:10.1098/rsta.2011.0003

    “… the ocean’s biogeochemical cycles and ecosystems will become increasingly stressed by at least three independent factors. Rising temperatures, ocean acidification and ocean deoxygenation will cause substantial changes in the physical, chemical and biological environment, which will then affect the ocean’s biogeochemical cycles and ecosystems in ways that we are only beginning to fathom.

    Ocean warming will not only affect organisms and biogeochemical cycles directly, but will also increase upper ocean stratification. The changes in the ocean’s carbonate chemistry induced by the uptake of anthropogenic carbon dioxide (CO2) (i.e. ocean acidification) will probably affect many organisms and processes, although in ways that are currently not well understood.

    Ocean deoxygenation, i.e. the loss of dissolved oxygen (O2) from the ocean, is bound to occur in a warming and more stratified ocean, causing stress to macro-organisms that critically depend on sufficient levels of oxygen. These three stressors—warming, acidification and deoxygenation—will tend to operate globally, although with distinct regional differences.

    The impacts of ocean acidification tend to be strongest in the high latitudes, whereas the low-oxygen regions of the low latitudes are most vulnerable to ocean deoxygenation. Specific regions, such as the eastern boundary upwelling systems, will be strongly affected by all three stressors, making them potential hotspots for change.

    Of additional concern are synergistic effects, such as ocean acidification-induced changes in the type and magnitude of the organic matter exported to the ocean’s interior, which then might cause substantial changes in the oxygen concentration there. Ocean warming, acidification and deoxygenation are essentially irreversible on centennial time scales, i.e. once these changes have occurred, it will take centuries for the ocean to recover.

    With the emission of CO2 being the primary driver behind all three stressors, the primary mitigation strategy is to reduce these emissions.”

    [extra paragraph breaks added for readability — hr]

  55. Paul S:

    #21, Septic Matthew – ‘The authors estimate a most likely increase of 1.6K by 2080 if the atmospheric concentration of CO2 doubles gradually throughout that time span.’

    Surely that only holds if climate is currently in equilibrium? What should happen by 2080 is that we get ‘pipeline’ warming associated with the ~100ppm increase seen so far + an extra 1.6K transient response to a doubling (increase to 780ppm).

    For comparison, the IPCC A2 scenario reaches about 660ppm by 2080. The A2 scenario includes other long-lived GHGs, so the overall forcing is probably similar to that of 780ppm CO2. The IPCC model ensemble has a transient response of around 1.6K too, yet the ensemble-mean projection is about 2.5K of warming from 2000.

    ‘Personally, I think that the transient climate sensitivity is the single most important quantity wrt climate change for public policy purposes.’

    Transient sensitivity is a useful metric but I would think it far more important for policymakers to understand that there could be a large amount of warming waiting to occur after we manage to stabilise CO2 concentration. Hence it is incomplete without also knowing the equilibrium response. Another nugget of knowledge that would be very useful for policymakers would be the length of time it takes for equilibrium to be achieved. I don’t see this discussed very often.
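
    These numbers can be sanity-checked with the standard simplified CO2 forcing expression, F = 5.35 ln(C/C0) W/m² (the Myhre et al. fit); treating transient warming as scaling linearly with forcing, as below, is an illustrative assumption, not a model result:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    # simplified CO2 radiative forcing fit, in W/m^2
    return 5.35 * math.log(c_ppm / c0_ppm)

F_2X = co2_forcing(560.0)  # forcing for doubled CO2, ~3.7 W/m^2

def transient_warming(c_ppm, tcr=1.6):
    # transient warming, assumed to scale linearly with forcing
    return tcr * co2_forcing(c_ppm) / F_2X

dT_now = transient_warming(390.0)  # warming at ~390 ppm, roughly 0.8 K
dT_2x = transient_warming(560.0)   # 1.6 K at a full doubling, by construction
```

    On this crude scaling, the ~110 ppm rise to date accounts for roughly half the transient warming of a doubling, before any of the "pipeline" equilibrium warming is counted.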

  56. Martin Vermeer:


    > They’re pdfs: the areas under the curves are 1, not the peaks of the curves.

    Precisely. That’s why I wondered how Meow could be so sure… the eye is not very good at integration.

  57. Hank Roberts:

    Martin, did you email them a pointer to the comments on errors in the review draft? Email is at the bottom of that page you linked to.

  58. Meow:


    > Precisely. That’s why I wondered how Meow could be so sure… the eye is not very good at integration.

    You’re right about that. Visually I had estimated each curve’s area to be significantly > 1, but having measured the blue curve’s area in pixels, I find it’s ~1, so I withdraw that criticism.

  59. Meow:


    > Meow #34, yes it is circular where temperatures and transient sensitivities themselves are concerned, as they use “observations” from a GCM. But the uncertainty estimates (which is their focus) will be meaningful nevertheless — provided the GCM, and the forcings driving it, aren’t too far off. That’s how I take it.

    But they’re estimating the uncertainty in the 2030 transient sensitivity by assimilating the synthetic 2008-2030 “temperature record”, which appears, in turn, to be derived from the 2008 transient sensitivity. How can that procedure possibly yield a 2030 TCS with a smaller uncertainty than that of the 2008 TCS?

  60. Maya:

    “Mann’s piece has collected 170 “recommends” so far ”

    281 with the one I added. You’re right, in the grand scheme of things, it’s not important, but if even one person notices, and goes “hmmm” and starts to read to find out more about the science of this whole global warming thing, and maybe clicks on one of those links at the bottom … then maybe it’s one more person who’ll understand, who’ll care, who’ll try to make a difference.

  61. Septic Matthew:

    55, PaulS,

    Yes. The other really important quantities are (1) the equilibrium climate sensitivity; and (2) how long it takes to achieve 99% of equilibrium after it gets halfway there.

    59, Meow: How can that procedure possibly yield a 2030 TCS with a smaller uncertainty than that of the 2008 TCS?

    I think they mean to say that, with temperature records available in 2030, the estimated transient climate sensitivity will then be more precise; their computation with a simulated future illustrates how that can come about. However, I often write something like “In 20 years we’ll have a much clearer idea which of the models is most accurate,” so you could make a case that I am projecting my own belief onto them.

  62. Septic Matthew:

    does anyone else have references to good estimates of the transient climate sensitivity — peer-reviewed papers?

  63. Hank Roberts:

    Many (tho’ many are paywalled) from Scholar’s “Related Articles” link:

  64. Septic Matthew:

    Here’s one:

    Schwartz, S.E. Heat capacity, time constant, and sensitivity of Earth’s climate system. J. Geophysical Research, 112, D24S05, doi: 10.1029/2007JD008746, 2007. [BNL-79148-2007-JA]

    [Response: Here’s another: Foster, G., Annan, J.D., Schmidt, G.A., Mann, M.E., Comment on “Heat Capacity, Time Constant, and Sensitivity of Earth’s Climate System” by S. E. Schwartz, J. Geophys. Res., 113, D15102, doi: 10.1029/2007JD009373, 2008. -mike]

  65. Septic Matthew:

    Hank Roberts,

    Thank you.

  66. flxible:


    > What kind of sea microorganism activity would be likely to create massive quantities of methane?

    one just never knows about unknowns. ;)

  67. Hank Roberts:


    — excerpt follows —

    … Between 18 and 20 kilometres up, over 80 per cent of the existing ozone was destroyed. “The loss in 2011 was twice that in the two previous record-setting Arctic winters, 1996 and 2005,” says Nathaniel Livesey …
    … But we don’t know why the stratosphere stayed cold for so long. “That will be studied for years to come,” Santee says.

    Climate change could be partly responsible. That may seem counter-intuitive, but global warming occurs only at the bottom of the atmosphere. “Climate change warms the surface but cools the stratosphere,” Harris explains.

    In 2007 the Intergovernmental Panel on Climate Change concluded that “there has been global stratospheric cooling since 1979”. “Whether that is because of climate change is speculation,” Santee says.

    — end excerpt —

    Livesey: http://science.jpl.nasa.gov/people/Livesey/
    Harris: http://www.cei.cam.ac.uk/directory/nrh1000@cam.ac.uk
    Santee: http://science.jpl.nasa.gov/people/Santee/

    Are there other predictions for, or ways in hindsight to explain, the stratosphere showing a cooling trend for over 30 years?

  68. Martin Vermeer:

    > I think they mean to say that, with temperature records available in 2030, the estimated transient climate sensitivity will then be more precise; their computation with a simulated future illustrates how that can come about. However, I often write something like “In 20 years we’ll have a much clearer idea which of the models is most accurate,” so you could make a case that I am projecting my own belief onto them.

    No, I think this is about right. The new information is how global mean temperatures will respond to much larger (and properly known!) forcings 2008-2030 according to one model. The response itself cannot be trusted any better than the GCM used, but the propagation of uncertainties may be realistic (or that’s the idea I get).

    By the way, SM #64 is a good illustration of why one should work with climate scientists when doing climate related work, or risk egg on one’s face :-)

  69. Andy:

    ref 26. I’m fairly sure the writer was Harry Harrison and I think the title was “The Invisible Idiot”

  70. Barton Paul Levenson:

    Hertzberg is the guy advising left-wing crazy Alexander Cockburn. See:


    [Response: Also mentioned here “Cockburn’s form” – gavin]

  71. Barton Paul Levenson:


    I tried to sign your petition, but the “SIGN THIS PETITION” button was grayed out and wouldn’t do anything.

  72. ldavidcooke:


    Hey Pete,

    Please do not forget the role that convection plays in part of the heat transport. Rather than just radiative emission, a lot of energy goes into evaporation or sublimation; that heat is transported up to between 2 and 6 km and radiates out from that elevated altitude. The other missing link is the change in the adiabatic wet/dry transition height. Increase the atmospheric heat content and you increase that height; complicate that with changes in aerosol CCNs and added water-vapor/condensation UV absorption and re-emission as LW, and you have a wonderful story to share. I am concerned that many get stuck in the first chapter or chorus and miss the rest…

    Dave Cooke

  73. Peter Backes:

    Geoengineering raises its head again:


    Significant quote:

    Jane Long, an associate director of the Lawrence Livermore National Laboratory and the panel’s co-chairwoman, said that by spewing greenhouse gases into the atmosphere, human activity was already engaged in climate modification. “We are doing it accidentally, but the Earth doesn’t know that,” she said, adding, “Going forward in ignorance is not an option.”

  74. Richard Hawes:

    Richard Hawes @ 26
    “The Monkey Wrench” Gordon R. Dickson in “The Penguin Science Fiction Omnibus”. Aldiss, B. ed 1973
    I bought my copy new in 1987 so good second hand numbers should be available.
    Thanks Darv …. this story has stuck with me a long long time, especially dealing with some of the trogs and cornucopians in the oil patch.

  75. Thomas:

    Hank @67.
    There is a short, simple explanation for stratospheric cooling. The upper atmosphere obtains heat input from two sources: solar UV, and upgoing longwave from below. It loses heat via longwave emission. The solar UV input is (to first order) unaffected by the CO2 concentration. There is an excess of IR emission over absorption, since the upper atmosphere sits above the effective radiating level of the planet, so increasing the IR opacity increases the efficiency of cooling via LW emission. So the upper atmosphere should cool with an increasing concentration of greenhouse gases. Of course, subtle changes in chemistry and/or circulation could complicate matters, but that’s it in a nutshell.
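
    This argument gives a concrete number via the standard "skin temperature" estimate for an optically thin top layer: σT_skin⁴ = OLR/2 in the simplest grey, non-solar-heated case. A minimal sketch (the 239 W/m² OLR value is the usual global-mean figure, assumed here for illustration):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
OLR = 239.0      # global-mean outgoing longwave radiation, W m^-2 (assumed)

# effective radiating temperature: sigma * T_eff^4 = OLR
t_eff = (OLR / SIGMA) ** 0.25

# skin temperature of an optically thin top layer: sigma * T_skin^4 = OLR / 2
t_skin = (OLR / (2 * SIGMA)) ** 0.25
# t_eff ~ 255 K, t_skin ~ 214 K: the topmost layer sits well below the
# effective radiating temperature even before any forcing change
```

    The skin temperature comes out a factor of 2^(1/4) colder than the effective radiating temperature, which is why the uppermost atmosphere is cold to begin with and why its equilibrium tracks the OLR rather than the surface.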

  76. Paul S:

    Hank Roberts – ‘Are there other predictions for, or ways in hindsight to explain, the stratosphere showing a cooling trend for over 30 years?’

    Ozone depletion also causes stratospheric cooling (I’m sure there must be a RealClimate post on this). Temperature trends in the stratosphere have been flat for about 15 years now, perhaps partly due to a small recent ozone recovery(?).

    I found this recent paper a few weeks ago (Forster et many als. 2011). It’s an assessment of CCM (chemistry-climate model) performance against stratospheric trends but obviously discusses the factors forcing the trends along the way.

  77. ldavidcooke:

    Hey Dr. Schmidt and Co.,

    Is it possible we can revisit Drs. Lin and Chambers “how clouds work” again, though just from a vertical RH profile, based on the radiosonde/CALIPSO/CLOUDSAT, ocean versus land, data and not the Dr. Lindzen SST basis? It would be nice to see how this plays into the PSC/MSC formation, increasing drought mechanics and atmospheric heat content without dragging in CCNs. (I know, my fault, I just did not understand the mechanics at the time and jumped on a Colorado State grad student’s paper without thinking things through.)

    Dave Cooke

  78. Hank Roberts:

    > SM
    > … In response to a paper that is in press you go
    > to a blog?

    Nope. I looked for mention and noted that it’s already a hot item in the discussions among the septic crowd.

    > You did much the same when you decided not to read
    > the 110 pp of AOAS that I referenced. Didn’t you?


    But thanks for asking.
    Always happy to disabuse you. (wry grin)

    I think we’re getting somewhere.

    Aside — email addresses @princeton.edu given for the authors of the Padilla ‘in press’ paper don’t work; I’d hoped the author could be invited to look in here.

  79. Hank Roberts:

    > SM
    > In response to a paper that is in press you
    > go to a blog?

    Nope. I mention it’s already a hot topic there. I don’t go there.

    > You did much the same when you decided not
    > to read the 110 pp of AOAS that I referenced.
    > Didn’t you?

    Nope. Thanks for asking.

  80. Septic Matthew:

    64, mike, inline. Thank you much. It’s high on my current reading list. I also got some papers from Stephen E. Schwartz, cited inline.

    Here is an introduction to the method used by Padilla et al. http://www.image.ucar.edu/~nychka/manuscripts/nychka_anderson2.pdf

    The book that is cited by them has a price tag of $203, so I must skip it for now.

    68, Martin Vermeer: No, I think this is about right.

    I hope the “no” refers to my “projection” comment, and the “about right” to how the uncertainty declines with time.

    You are right about statisticians working with climate scientists. Dr. Douglas Nychka at NCAR is an example. Statistics (and statisticians) grow by addressing new challenges in fields that statisticians have not studied until then. I hope that this topic (the transient climate sensitivity) stimulates some interest at the 2012 Joint Statistical Meetings in San Diego: http://www.amstat.org/meetings/jsm/2012/index.cfm

    “Statistics: Growing to serve a data-dependent society.”

    I also hope that Padilla et al will expand their exposition and submit it to the Annals of Applied Statistics, as McShane and Wyner did with theirs.

    Thanks to all for your help.

  81. Septic Matthew:

    Thanks to all above for your comments. To mike (64 inline) that paper is high on my reading list, thanks again. 68, Martin Vermeer — I agree on the necessity to work with climate scientists.

    Each year more statisticians get involved with climate scientists. Hopefully, this topic (estimating transient climate resonse) will be on the schedule next year in San Diego: http://www.amstat.org/meetings/jsm/2012/index.cfm

  82. Mal Adapted:

    Los Alamos National Laboratory is hosting the Third Santa Fe Conference on Global and Regional Climate Change Oct. 31 thru Nov. 4. A lot of good science has come out of LANL, but the conference program is dismaying. I’m not familiar with many of the names on it, but I do know a few of them, e.g. Lindzen, Singer and Monckton! What can the conference organizers be thinking?

    [Response: The organizer is Peter Chylek, and I have no idea. – gavin]

  83. Patrick 027:

    Re 75 Thomas, 67 Hank Roberts

    Stratospheric cooling more here:
    The goal was a brief explanation. My own attempts there pretty much failed at being brief. But maybe another go at it:

    1. First distinguish between transient and full equilibrium responses (with only the Planck response; add in Charney feedbacks and then non-Charney feedbacks later):

    If a sudden jump in greenhouse gas concentration occurs, you can have cooling in the upper atmosphere that disappears later (assuming heat capacity is distributed in a sufficient manner?). For example, in a grey gas case with no direct solar heating above some height z (assuming z is above the tropopause and approximating the upper atmosphere as being in pure radiative equilibrium), the net LW flux above z must be constant with height when in equilibrium and equal to the OLR at TOA (outgoing longwave radiation at the (effective) top of the atmosphere). Increasing LW opacity will at first reduce OLR. The skin layer’s temperature will fall in response, coming to equilibrium with the reduced OLR. But the radiative imbalance causes heat to accumulate below. Eventually a full equilibrium is achieved when the OLR is restored, and the skin temperature must be restored as well.

    2. But if opacity is only increased in some bands and not others, the restored OLR can have a different spectral distribution than before. Depending on how the spectrum of atmospheric opacity has changed, some may be displaced from the bands that exert greatest influence on the skin layer, and thus result in a persistent cooling near TOA. On the other hand, if optical thickness is increased in an atmospheric window (assuming surface emissivity is high enough in that window, to generalize this more), this tends to warm the skin layer by intercepting OLR that is originating in warmer places.

    3. Other effects of changing the spectrum can occur due to the shape of the Planck function over the spectrum and over temperature (consider whether the lapse rate, in terms of the Planck function, is convex (tends to cause net cooling) or concave (tends to cause net warming) – in particular, on the distance scale of moderate opacity (~ mean free path of photons, maybe give or take?) – this will vary over frequency – even in a grey gas, one may go from convex to concave at some point in the spectrum). In pure LW radiative equilibrium, net warming and net cooling at different parts of the LW portion of the spectrum must balance; otherwise the imbalance will be balanced by solar heating and/or convective heating/cooling (such as at the surface or in the troposphere, or regionally/seasonally/etc. in the stratosphere (convection) or in the stratosphere due to the ozone layer). The skin Planck function is half of OLR in the absence of direct solar heating or other complications; the temperature may range from half the brightness temperature of OLR to near the same brightness temperature of the OLR depending on where it is determined in the spectrum.

    4. If there is direct solar heating sufficiently (depending on opacity) near TOA, adding LW opacity thins the skin layer, removing solar heating from it, bringing the equilibrium skin layer temperature closer to pure LW equilibrium value.

  84. Hunt Janin:

    I need a professional-quality color photo for use on the front cover of my coauthored book, “Rising Sea Levels.”

    Since from a photographic point of view, sea level rise is not a good subject (it is much too slow), I’m thinking of using a dramatic photo of a storm surge hitting the shore.

    Anything WRONG with this idea?

    [Response: Hmm…. There is an implicit attribution which is not what you are trying to convey (I think). However, there are photos of impacts, and in particular increased erosion, that might convey what you want more directly. I’m thinking of images of Shishmaref falling into the sea, or abandoned houses on the Carolina barrier islands. Gary Braasch might be your man. – gavin]

  85. dhogaza:

    Hunt, Gavin’s suggestion of Gary Braasch is an excellent one. Here’s a link to the “climate change” category on his website.

  86. chris:

    Mal Adapted (re @80)

    As someone who researches in rather non-controversial arenas (Mol. Biol./Medical Biophys.) unrelated to climate science, I am routinely flabbergasted by some of the (wilfully) dismal rubbish that is occasionally published in climate science presumably in pursuit of dreary agendas.

    One area that seems a focus for some of this stuff is the insinuation of “evidence” for low climate sensitivity. As an outsider I tend to take papers rather for granted unless something stands out as being patently dodgy. There’s no escaping that fact that Petr Chylek’s effort to insinuate low climate sensitivity by ludicrous datapoint selection of ice core temperature proxies and dust levels in his 2008 GRL paper (with U. Lohmann) is a deeply flawed analysis (see, for example, the comment by Annan and Hargreaves Clim. Past, 5, 143–145, 2009; http://www.clim-past.net/5/143/2009/cp-5-143-2009.pdf ). The nature of this particular flaw is rather similar to that which Lindzen and Choi made in their insinuation of low climate sensitivity (negative feedback) from a similarly astonishing selection of data points comparing TOA radiative flux in response to surface temperature variation. I don’t think it’s unreasonable to point out that Dr. Spencer has also made some rather flawed analyses in pursuit of negative feedbacks/low climate sensitivity (as have a very small number of other authors). I don’t see why we should have to “pussy-foot” around this subject – the papers (and their rebuttals) are there in cold black and white, and those authors wrote them.

    So perhaps Dr. Chylek has some sympathy for that particular point of view (the pursuit of low climate sensitivity), which is shared by some of his meeting participants you mentioned. Incidentally, I don’t think this is a problem for the science. One might say that the assumptions underlying our knowledge base should be tested towards destruction, and we might even be reassured by the fact that some individuals who do their damnedest to pursue the contrary point of view tend, in fact, to reinforce the pukka science. Perhaps we might be saddened by the observation that their efforts are so puny, scientifically speaking.

    Anyway, that might be relevant to the question of why Dr. Chylek has organized a meeting with some odd participants.

  87. M:

    Mal Adapted: Add to your list Easterbrook, Scafetta, Morner, Loehle… plus Schwartz on climate sensitivity and Garrett on his wacky thermodynamic economy theory… they’re not all as bad as Singer and Monckton, but not people I associate with high quality science. And probably another half-dozen names there are people who might not be contrarians themselves, but are often quoted by the Moranos of the world nonetheless.

    And yet, there are a number of top-notch names there too – I have to wonder what they’ll think when they have to sit through the junk from the above folks…

  88. Edward Greisch:

    71 BPL Thanks.

  89. Edward Greisch:

    DotEarth http://community.nytimes.com/comments/dotearth.blogs.nytimes.com/2011/10/02/a-map-of-organized-climate-change-denial

    A Map of Organized Climate Change Denial

    has over 130 comments. Links to a very good very expensive book on the sociology of GW.
    See also:
    “Bloomberg’s Bombshell Report on “The Koch Method”: How to Steal, Cheat and Lie Your Way to the Top” on

  90. Patrick 027:

    Re my last comment:

    1. Increasing LW opacity will at first reduce OLR. – assuming no direct solar heating of the air above some height, or otherwise assuming a distribution such that temperature still declines with height, at least on the vertical distance scale that is similar to a unit of optical thickness, at least within a sufficient optical depth from TOA down into the atmosphere.

    2. On the other hand, if optical thickness is increased in an atmospheric window (assuming surface emissivity is high enough in that window, to generalize this more), this tends to warm the skin layer by intercepting OLR that is originating in warmer places.
    – generalizes to any atmospheric window down to a sufficiently warm level – for example, a humid or cloudy air mass (provided the clouds are not too high in the troposphere or else that the lapse rate is still positive going into the stratosphere, etc.) would optically act like an elevated surface that may be cooler (unless there’s an inversion under it) than the actual surface but still supply a relatively ‘warm’ upward LW flux that the air above must adjust to if it has optical thickness (of the emission/absorption type, as opposed to pure scattering) in that band.

  91. David Young:

    I hope someone will help me again because after significant research, my numerical analysis questions just won’t go away. I was pointed to a video by Cambridge University Isaac Newton Institute of P. Williams discussing time stepping errors and he shows some rather alarming things for simple models, namely, that the apparently standard leapfrog scheme with Robert-Asselin filter is pretty dissipative and damps the real oscillations in the system. He studied a few explicit methods of different orders and seemed to conclude that higher order was good and that the problems with leapfrog are not hard to fix. Then I went back to my graduate school text and I found that “The leapfrog scheme is also prone to nonlinear instability.” Comments?
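
    The damping Williams discusses is easy to reproduce on the test oscillator dz/dt = iωz, whose exact solution keeps |z| = 1 forever. A minimal sketch (ν = 0.1 is just a representative filter coefficient, not taken from any particular GCM):

```python
import cmath

def leapfrog_ra(omega=1.0, dt=0.1, steps=2000, nu=0.1):
    # Leapfrog with a Robert-Asselin filter on dz/dt = i*omega*z.
    # The exact solution has |z| = 1 for all time.
    z_prev = 1.0 + 0.0j                  # time level n-1 (filtered)
    z_curr = cmath.exp(1j * omega * dt)  # exact first step to start the scheme
    for _ in range(steps):
        z_next = z_prev + 2 * dt * (1j * omega) * z_curr  # leapfrog step
        # RA filter: nudge the middle level toward the mean of its neighbours
        z_filtered = z_curr + nu * (z_next - 2 * z_curr + z_prev)
        z_prev, z_curr = z_filtered, z_next
    return abs(z_curr)

amp_unfiltered = leapfrog_ra(nu=0.0)  # stays very close to 1
amp_filtered = leapfrog_ra(nu=0.1)    # decays well below 1: spurious damping
```

    With ν = 0 the amplitude stays within a fraction of a percent of 1 over ~32 periods; with ν = 0.1 it decays to roughly a third. That is the dissipation Williams highlights. The textbook point is separate but related: the undamped scheme’s neutral computational mode is what feeds nonlinear instability, which is why a filter is used at all (and why Williams proposes modifying it rather than removing it).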

  92. Hank Roberts:

    For anyone wondering about SM’s mention of AOAS, here’s what that’s about:




  93. Edward Greisch:

    71 BPL: A whitehouse.gov account is required to sign Petitions. That may be the problem. You have to create an account. It is not as user friendly as a Macintosh.

  94. David B. Benson:

    Barton Paul Levenson @71 — First register; unfortunately that causes a login. So logout. Now start over and log in. Then you can sign.

    Remember that it’s good enuf for government work.

  95. Septic Matthew:

    90, David Young:

    It is nearly always possible to come up with an interesting and relevant example that defeats each scheme.

    Sorry, but you always have to experiment a lot with problems similar to yours and with different schemes. I always use an implicit scheme with stepsize adjustment and error corrections; and I nearly always test the final results against the simplest Euler’s method with diminishing step sizes. It takes a long time, and with large enough systems can be discouraging.

    With these large, meaning high-dimensional, climate systems run for long times, I am glad that I am not the one responsible for claiming that they achieve what they aim for.

    I am not an expert — I merely have some good books and have spent time in the library stacks with other good books. Every differential equation solver can be defeated with a problem reasonably similar to the problem that you are working on. As far as I can tell.
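
    To illustrate the kind of experiment meant here, compare explicit and implicit Euler on the classic stiff test problem dx/dt = -λx (the values of λ and the step size are chosen only to make the point; the exact solution decays toward zero):

```python
def forward_euler(lam, dt, steps, x0=1.0):
    # explicit Euler: x_{n+1} = x_n + dt * f(x_n)
    x = x0
    for _ in range(steps):
        x += dt * (-lam * x)
    return x

def backward_euler(lam, dt, steps, x0=1.0):
    # implicit Euler: x_{n+1} = x_n + dt * f(x_{n+1}), solvable exactly here
    x = x0
    for _ in range(steps):
        x /= (1.0 + dt * lam)
    return x

# stiff case: lam*dt = 10, far outside explicit Euler's stability region
x_explicit = forward_euler(lam=100.0, dt=0.1, steps=50)   # blows up
x_implicit = backward_euler(lam=100.0, dt=0.1, steps=50)  # decays, as it should
```

    The explicit step multiplies the error by (1 - λΔt) = -9 each step and explodes, while the implicit step multiplies by 1/(1 + λΔt) and stays stable at any step size; which behaviour you get from a real model depends on similar stability-region arithmetic, hence the need to experiment.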

  96. Martin Vermeer:

    > 64, mike, inline. Thank you much. It’s high on my current reading list. I also got some papers from Stephen E. Schwartz, cited inline.

    SM #79, mike is pulling your leg, making fun of you unearthing, with unerring ‘skepticism’, the one paper that is seriously flawed… he is the number four author on the “comment” he links to, that demolishes the Schwartz paper. But by all means, read both ;-)

    Off topic, why am I reminded of some women serially finding men that are bad for them? Something in the water?

  97. ldavidcooke:


    Hey David,

    I do not pretend to know anything about models; however, to me the biggest issue is the weight of the modifiers of the initial parameters. You cannot simply plug in a stepping value and not have the system quickly detach from reality. At one point in a former life, doing capacity planning, we built a model that used the past and derived a weighting factor from projected sales; of course, that was a fiasco. Then we discovered we could apply a range-bounded random value, with the weight adjusted by the change in a leading economic indicator and the industry projections of the customer. In that manner we were able to stay within 1% of variability over a 6-year period with a bi-monthly cycle.

    Given this, it suggests that maybe we have a disconnect in model creation. It is important that the initial values are correct and the interdependencies well defined. The next step is to examine the range of the modifiers and to define that range based on a weighting with 1-4 normalized standard deviations. The issue is to weight the range of the modifiers so that they change in step with the change in the results of the prior step. I.e., if the value of a modifier from a previous step caused one of the calculated steps to show a large change, there needs to be feedback to the bounded-range modifier to deselect a 4th and possibly 3rd standard deviation in the next step, though in nature it has been demonstrated that repetitive outliers can exist, though eventually they terminate the trend and reverse.

    In short, you will have to re-initialize each time. The issue is not so much the model’s results; rather, the static modifying factors are not real-world. Yes, they simplify the model and help point a direction; but in the end they will not track real-world events more than three to maybe ten cycles, depending on your resolution and interrelationship description. As to leapfrogging with a linear modifying factor, this is like starting over and plugging in a larger-amplitude, inaccurate modifying factor. If you attempt a coarse or low-resolution run, you still must run a high-resolution modifying factor; otherwise all you are doing is amplifying the error.

    As to what this has to do with your mechanics and formula adjustment filters, I can be no help. However, I can tell you that tight model tracking is rocket science and should not be left up to the student or amateur. To devise a model with a high number of variables, with a high range of values, such that the result appears to be chaotic, takes more man-hours and expertise than any simplifying mathematical tool kit can account for.

    Dave Cooke

  98. Hunt Janin:

    Re sea level rise photo (83 & 84 above):

    Many thanks, Gavin and dhogaza. I’m emailing Gary Braasch to see if he has such a photo.



  99. Hank Roberts:

    > Padilla
    already Cited by 1

  100. Hank Roberts:

    Oops. “Padilla Cited by 1”
    link broke when posted; Scholar pointed to:

    or http://www.princeton.edu/~gkv/papers/Xie_Vallis11.pdf

    The passive and active nature of ocean heat uptake in idealized climate change experiments [PDF] from princeton.edu — P Xie, GK Vallis – Climate Dynamics, 2011 DOI 10.1007/s00382-011-1063-8

    It begins:
    “Abstract: The influence of ocean circulation changes on heat uptake is explored using a simply-configured primitive equation ocean model resembling a very idealized Atlantic Ocean. We focus on the relative importance of the redistribution of the existing heat …”
    and ends
    “… Evidently, the warming occurs first in the mixed layer and then, on the multi-decadal timescale, in the main thermocline and southern ocean (Fig. 3), with a potential localized cooling in high northern latitudes due to a weakening of the MOC, with a significant warming of the abyss only on century- to multi-century timescales. Note that a warming of the thermocline only, with a depth of 1 km, produces a sea-level rise of about 20 cm per degree, and a warming of the entire water column, with a lower average coefficient of thermal expansion but greater volume, translates to a rise of about 50 cm per degree. Thus, understanding how the heat uptake reaches the deep ocean, and on what timescales, is an important problem that we need to better …”
  101. Icarus:

    Simple question: We hear that there is a new ‘ozone hole’ over the Arctic due to a cooling stratosphere, and a cooling stratosphere is (as I understand it) a classic ‘fingerprint’ of an enhanced greenhouse effect, so would it be reasonable to attribute this ozone hole to rising anthropogenic greenhouse gas emissions?

  102. Hank Roberts:

    > unearthing, with unerring ‘skepticism’,
    > the one paper that is seriously flawed

    Yup, it does seem to keep happening. How come?

    > Schwartz

    One of his edited versions of the forcings picture:

    “Added to the figure (light blue bar) is an estimate of the total (direct plus first indirect) forcing, -1.2 W m-2, and the associated uncertainty range: -0.6 to -2.4 W m-2.”


    “Added to the figure (green bar at bottom and associated uncertainty range) is the estimate from the 2001 IPCC report2 of the total forcing projected for 2100, where the uncertainty denotes the range of estimates for different emission scenarios.”

    His theme since the 1990s — uncertainty, no effect _yet_ from CO2, e.g.

    His presentations have long been popular in, er, uncertain circles.
    That’s hard to understand unless all they care about is short term, because
    his conclusion seems to always be along the lines
    ‘We ain’t seen nothin’ YET but for sure it is a’comin’ …’

  103. Deep Climate:

    Said and Wegman 2009: Suboptimal scholarship

    Today I present an analysis of a 2009 article by Yasmin Said and Edward Wegman of George Mason University. “Roadmap for Optimization” was published in the inaugural edition of WIREs Comp Stats, one of a new family of Wiley publications conceived as a “serial encyclopedia”.

    As the title implies, the article was meant to provide a broad overview of mathematical optimization and to set the stage for subsequent articles detailing various optimization techniques. However, my analysis, entitled Suboptimal Scholarship: Antecedents of Said and Wegman 2009, demonstrates the highly problematic scholarship of the “Roadmap” article.

    * No fewer than 15 likely online antecedent sources, all unattributed, have been identified, including 13 articles from Wikipedia and two others from Prof. Tom Ferguson and Wolfram MathWorld.

    * Numerous errors have been identified, apparently arising from mistranscription, faulty rewording, or omission of key information.

    * The scanty list of references appears to have been “carried along” from the unattributed antecedents; thus, these references may well constitute false citations.

  104. ldavidcooke:


    Hey Hank,

    Concur; note that the increase in volume should be constrained within the thermocline. Hence, you should have to distribute the expansion factor. Though the water column covering the polar basins should be uniform as they fill to feed the THC. I guess the question is: what happens if you fill the basins faster than they empty? Does the THC rate increase, or does the spillover inflate the area above the whole of the abyssal plain?


  105. Harmen:

    “I need a professional-quality color photo for use on the front cover of my coauthored book, “Rising Sea Levels.””

    A photo of Banksy’s masterpiece in London?

  106. Hank Roberts:

    > Color photo
    Maybe a predictive drawing? http://climatechangepsychology.blogspot.com/2009/10/sea-level-rises-would-flood-phillyand.html

  107. Russell:

    Given the Nobel news, brace yourselves for howls of outrage from WUWT at the Committee’s failure to declare Viscount Monckton and Lubos Motl co-Laureates in Peace, Physics & Medicine:

    For curing Bright’s Syndrome by using cosmic ray neutrinos to cause faster than light climate change in 11 dimensions.

  108. wili:

    Picking up on the topic of the enormous and unexpected ozone hole over the Arctic discussed by hank, thomas, patric, icarus, etc:

    Is there any chance that the dramatic increases in methane output from Siberian Continental Shelf that Shakhova and others have been reporting on is related to this?

    –methane is a powerful ghg, 105 times the global warming potential of CO2, so one expected result of a massive release would be that the stratosphere would cool as the troposphere warmed

    –being lighter than other common atmospheric gasses, much of the methane would rise to the stratosphere, and there react with the ozone, destroying it (and, in the process, producing CO2 and H2O, both ghg’s in their own right–and am I remembering wrong, or did recent studies not suggest that increases of stratospheric water vapor were a more powerful driver of gw than had previously been assumed?)

    –all that lighter-than-air methane rising rapidly would, I should think, increase the vortex that was keeping in all that cold.

    Is anyone else making these connections, or am I way off base? If the latter, please let me know where I erred.

    –a large mass of rising gas would presumably add t

  109. David Young:

    I guess there is no comment from Gavin on the startling evidence (both empirical and theoretical) of large numerical errors in climate models. Sceptic Matthew, it’s true that many schemes can be defeated by challenging problems. That’s the point of using modern methods that are more robust.

    What is most disturbing about the Williams material is that if a dissipative scheme is at the heart of most climate models, then the empirically observed fact that climate models seem to converge in average properties or patterns could be simply an artifact of the numerical scheme.

    It is shocking to me that such studies have not been done until recently. In any other field, people would have insisted on it.

    [Response: What is shocking is how willing you are to believe the worst without any actual knowledge of the subject. Methods can always be improved (and are being), but the biggest problem with this genre of argument – I call it the ‘a priori climate models can’t work’ argument – is that it is trivially refuted by the fact that climate models do actually work – they match climatologies, seasonality, response to volcanoes, ozone holes, ice ages, dam bursts, ENSO events etc. etc. pretty well (albeit not perfectly, and with clear systematic problems that remain foci of much research). One shouldn’t be complacent, but people like yourself who come in with a preconceived notion that climate modelling is fatally flawed end up revealing far more about their preconceptions than they do about climate models. – gavin]

  110. David Young:

    Gavin, finally you respond. I have no preconceived notions. You are reading my mind, a rather prejudiced way to do things. I merely know from 30 years’ experience that numerical errors are a common source of large errors in models of everything from aircraft flow problems to climate models. Perhaps Williams is likewise the victim of preconceived notions. My challenge to you (which, judging from your last post, you are not taking seriously) is to actually do what people in most fields do, namely, do Paul Williams’ checks on your model. If it’s fine, it’s fine. But your lack of curiosity is puzzling. I think Bob Richtmyer would be concerned for your reputation.
    By the way, the way errors can vitiate model results is documented in some of the references I cited in previous posts. Perhaps you should look at some of them. They are indeed mainstream mathematics, where there are real standards of proof.

    [Response: “Finally”? Sorry, but I have both a job and a life, and playing games with you is not my no. 1 priority. I do find it amusing that you conflate the fact that I am not hanging on every one of your wise words with a lack of respect for Ulam, Richtmyer and Lorenz (and why not add Feynman and Hawking just for fun as well?). Whether you appreciate it or not, checks and improvements of all sorts are ongoing at all climate model development centers (oh, look, just like other fields!), and that occurs completely independently of anything you suggest, or I do. Williams stuff does indeed look interesting – but expecting me to suddenly drop everything I am doing and move into a new sub-domain of climate modeling in less than 24 hrs because you are ‘shocked’ at the state of the field is, shall we say, a tad optimistic. – gavin]

  111. David Young:

    Gavin, have you viewed the P. Williams presentation? I’m giving you a fair shot here to look into it. Your response is so typical of the reaction of engineers and scientists to mathematical theory, namely, that our models “work.” Where is the proud history of modeling in engineering? Von Neumann, Ulam, even Lorenz are standards we should aspire to, not denigrate.

  112. Brian Dodge:

    “What kind of sea microorganism activity would be likely to create massive quantities of methane?” wili — 3 Oct 2011 @ 8:51 AM

    anaerobic decomposition (google biogas generators) – perhaps related to early algal blooms, out of sync with trophic consumer species.

  113. Hank Roberts:

    > being lighter than other common atmospheric gasses,
    > much of the methane would rise to the stratosphere


  114. John Mashey:

    re: 108
    David Young:
    So, I see Williams conference presentations, including several at Fall AGU. (It would be nice if someone actually cited what they’ve seen, instead of giving a vague reference that makes people waste time looking for it.) I’ll try to attend one of those sessions and see what live expert reaction is … since watching a video does not provide such feedback.

    A while back, I observed that technical people skeptical of climate models seemed to have different reasons that sometimes related to their own technical discipline. Perhaps you might read that discussion and let us know about your discipline and/or technical experience. I’ve found it quite valuable to understand where people are coming from in such discussions. For instance, the key to one discussion was the realization that someone had had a bad experience with protein-folding. In another case, the simulation experience was financial modeling.

  115. Hank Roberts:

    “… 6. Transient response to the well-mixed greenhouse gases
    Mar 28, 2011 – Global mean surface air warming due to well-mixed greenhouse gases … gases (WMGGs: essentially carbon dioxide, methane, nitrous oxide, …


  116. wili:

    How long must we tolerate those who are either willfully ignorant or actively trolling for emotional reactions to obviously faulty data? Isn’t that what the borehole is for?

  117. David Young:

    Mal Adapted: You are not well adapted on this one. Los Alamos has a reputation for scientific independence and being difficult for administrators to control. That’s probably a good thing. Science is advanced by allowing all ideas to be heard and Los Alamos has a stellar history in this regard. I could name the names from the past, but perhaps you already know them (on second thought, perhaps you don’t). Stan Ulam’s autobiography should be required reading for scientists in graduate school, especially climate scientists.

  118. David Young:

    John, my field is fluid dynamics, i.e., the solution of the Navier-Stokes equations which govern weather and climate. This is not about application-specific experiences but about the mathematics of the system, which is common to all fields. I’ve seen the Gavin response many times. It’s understandable but counterproductive and not scientific. It’s basically, “Yes, I know there is a theoretical problem, but my simulations ‘work’.” I hope Gavin takes note of this: in my experience the next step is that someone like Paul Williams goes off and shows that the current numerics is causing large errors and then people scramble to fix the problem, often claiming that they knew about the problem all along.

    [Response: What on earth makes you think I have any objection to people fixing problems that are found? I do it all the time and so do most of my colleagues. “Objecting to your condescension” != “thinking models are perfect”. – gavin]

  119. MalcolmT:

    Hunt @ 84,
    The people who made The Hungry Tide ( http://www.imdb.com/title/tt2011296/ ), about the effect of rising sea level on Kiribati, may well have relevant stills.

  120. David Young:

    Yea, I’m not expecting you to drop everything and fix the problem. It’s just ironic that people continue to insist that model simulations have meaning, or at least imply that. You can look up my publications on the web (I’m using my real name because this is a serious scientific discussion). We have developed models that do meet numerical tests for grid convergence. You are tempting me to get out my checkbook and write a check to the Heartland Institute, or maybe write my congressman asking that climate modeling be monitored by Los Alamos, where there are still a few mathematicians in the theoretical group.

    But the main point, and I hope you see this, is that science demands that these things be investigated very carefully. You don’t have to do it yourself, but I’m sure you have a mathematically inclined scientist on your team. You need to assign a sceptical team member to break the models; that’s what I do myself. It is very helpful.

    [Response: I agree – and we break the models all the time. Why not simply ask as what we do instead of assuming that we don’t do anything? By the way, LANL does a huge amount of climate modeling in collaboration with NCAR (i.e. http://climate.lanl.gov/) – again, something that is easily asked and answered in contrast to your assumptions. – gavin]

  121. David Young:

    Gavin, I hope you have the fortitude to post this. I am not saying that you are claiming that “the models are perfect” I am saying that due diligence with regard to issues of numerical consistency and accuracy should be a high priority — a higher priority than publishing more and more papers based on the model results.

  122. Kevin McKinney:


    So, it’s more important to keep on documenting a tool’s potential usefulness than actually to use it?

    Especially during what some see (with evidentiary support) as a crisis for which that tool has a crucial diagnostic role?

    Don’t know much at all about the modeling issues raised in this discussion, but that last statement just seems bizarre to me. By all means, keep checking model validity and reliability, but “drop everything?” Really??

  123. ldavidcooke:


    Hey Dr. Young,

    We are talking about a group that is 20 yrs into a 50 yr project. They have a vast accumulated data set and have been substituting the various discrete processes in for large-scale patterns as confirmation is established. As to the ability to account for fluctuations of discrete processes, they have improved well over the last 15 yrs. At issue is that the conversion of the systems from large scale to discrete is difficult to extract from an apparently chaotic system.

    As to the main issue with modeling as a whole: the idea that there is one solution set, when the influence of a discrete process runs a knife edge which can switch by chance. You can plan for the chance by weighing factors, though you cannot predict direction or amplitude. The other issue is the basis of homogenization, in essence the idea that all variables for a given grid block are the same. Smaller grid blocks increase the resolution; but if you do not account for neighboring influences prior to roll-up, you miss the range of probability and, in essence, damp down the potential change.

    As a whole I believe the NCAR, NASA and NOAA teams have done an excellent job with the resources allowed. To me it seems wrong to criticize when it is clear that additional expertise could compress the development time and effort. Which is better: hobbling the horse so it takes all day to go a mile, or judiciously putting funds and resources in place so that it can reach the goal in 2 min? It really is easy to sit on the sidelines throwing spitballs; it takes more (edited) to help lift the load. Care to share your preference or expertise?

    Dave Cooke

  124. ldavidcooke:

    Hey All,

    Now back to the Arctic ozone hole. As we know, an unusual pattern set up prior to the formation. A Blocking High at high latitude formed over the region of the NAD. The shift of the winds dramatically changed the heat/wv flow into the Arctic region last Fall. It appears that as of the second week in Jan. the Arctic circulation formed two separate pools, a strong pool over the Northern Eur-Asian region and a weaker one over the NA region. The end result was a very strong vortex with extreme upper-atmospheric cooling parked over the Eur-Asian region, both deforming the N. Jet Stream ellipse more than prior seasonally monitored deviations and providing a much better ground for tri-nitrics and chlorinated compounds to reduce the stratospheric ozone there from the normal 430–460 down to 230 Dobson units, with an area about the size of Germany virtually devoid of ozone.

    As to root cause, GHG, warm water vapor or aerosols, I do not believe so. As to a high level of low latitude heat content escaping, definitely. At issue remains the question of the drivers of long resident Blocking Highs and Cut-Off Lows which seem to be increasing in frequency and now latitude range.

    My best guess will be related to the changes in the flow patterns of both the NH Northern and Southern Jet Streams. That aerosols and GHGs play a part is undeniable. The question is the mechanics.

  125. Maya:

    I thought this was interesting, and didn’t know climate models were the topic of discussion on the open thread until I popped over here to see if anyone else had mentioned this. So, it’s not a commentary on anything, just an “oh look, this is interesting”.


    I believe this is the text of the paper that’s referenced:


  126. Ray Ladbury:

    David Young,
    OK. Let me get this straight. You are claiming that despite:
    1) the overwhelming evidence that the climate is changing,
    2) the overwhelming evidence that the change is due to greenhouse gasses,
    3) the fact that GCMs are in no way essential for attribution of the current warming to greenhouse gasses,
    4) the fact that the small amount of warming we’ve had is already causing serious consequences,

    the mere fact that GCMs might have some imperfections is sufficient for you to write a check to the professional liars at Heartland?

    When might the Discovery Institute or Jenny McCarthy expect their checks?

  127. t_p_hamilton:

    “I hope Gavin takes note of this: in my experience the next step is that someone like Paul Williams goes off and shows that the current numerics is causing large errors and then people scramble to fix the problem, often claiming that they knew about the problem all along.”

    So according to David Young’s reasoning, people shouldn’t be using Navier-Stokes at all, because you never know when a Paul Williams will show that what was thought to be OK wasn’t.

  128. Rob Nicholls:

    Are anthropogenic sulphate aerosol emissions rising at the moment (e.g. due to increased coal burning in China)?

    If so, how big an effect is the rise in sulphate aerosol emissions thought to be having on global temperatures at the moment? Is there much possibility that sulphate emissions will cause a temporary halt to global temperature rises over the next few years or decades? (I suspect not, but I thought I’d ask).

    This question was triggered by a BBC report that I saw recently ( http://www.bbc.co.uk/news/science-environment-14002264 “Global warming lull down to China’s coal growth”) suggesting that a study had concluded that the “lull” in global temperature rise between 1998 and 2008 was due to sulphate aerosols arising from a sharp increase in coal burning in China (a bit like the lull in warming from the 1950s to 70s caused by US and European coal burning).

    I’ve only been able to see the abstract of that study, (“Reconciling anthropogenic climate change with observed temperature 1998–2008” by Robert Kaufmann, Heikki Kauppi, Michael Mann [editorial note–this is not RealClimate contributor Michael E. Mann] and James Stock, see http://www.pnas.org/content/108/29/11790.abstract ); the abstract paints a more complex picture than the BBC report. (The abstract mentions the solar cycle and the change from El Nino to La Nina as dominating anthropogenic effects between 1998 to 2008, because CO2 rises were partially offset by the influence of rapid increases in sulphate aerosols from increased coal burning).

  129. Mal Adapted:

    David Young,

    I’m acquainted with much LANL history, and I first heard of Stanislaw Ulam as a schoolboy. Regardless, I don’t claim expertise in nuclear physics, or climate science. Your comments here suggest that you consider yourself expert enough to contend with genuine climate experts like Gavin. Are you aware of the Dunning-Kruger effect?

  130. Thomas Lee Elifritz:

    Anybody want to discuss the Younger Dryas? Didn’t think so.

    No matter. I see Thomas Lowell is out there promoting his personal crackpot theory ahead of the GSA meeting this week in Minneapolis. I predict many intellectual meltdowns and the usual heated arguments at this meeting. Yay!

    The more the merrier as far as I’m concerned. What’s the difference?

    [Response: This? – very unlikely in my opinion. The watershed drainage area for lakes is much greater than the surface available for evaporation, and so for evaporation to control major lake level changes you need to be in really arid conditions (think Lake Chad). Doesn’t seem likely for the boreal regions – even during the ice age termination. – gavin]

  131. Hank Roberts:

    >> David Young
    >> write my congressman asking that climate
    >> modeling be monitored by Los Alamos

    answered by:

    > LANL does a huge amount of climate modeling
    > in collaboration with NCAR (i.e. http://climate.lanl.gov/)
    > … easily asked and answered
    > in contrast to your assumptions. – gavin]

    The exchange between real scientists helps us readers learn a lot.

    Ignore the kibitzers, please.

  132. Thomas Lee Elifritz:

    I was referring to his newswire self promotional article. ‘Not that there’s anything wrong with that’. A little self promotion never hurts, even when you might be wrong. It’s amusing though that we still can’t pin down a simple 9500 cubic kilometer water leak thirteen thousand years ago, when our planet is still hemorrhaging fresh water as we speak. Luckily though, we are getting it back in droves in the form of increased humidity.

    I also find it amusing that nobody comments on the almost obvious climate inversion going on in the Midwest – massive almost permanent morning dews far beyond anything I have ever observed, and cloudless high pressure days unlike anything I have observed in my fifty years of weather observing. Hot days, cold nights (almost desert like) along with the increased humidity even in the presence (or in this case absence) of any clouds.

  133. Hank Roberts:

    For Rob Nicholls:

    I skimmed the results, you may find something to answer your questions, e.g. this article might help:

    “Anthropogenic SO2 emissions increased alongside economic development in China at a rate of 12.7%yr−1 from 2000 to 2005. However, under new Chinese government policy, SO2 emissions declined by 3.9 % yr−1 between 2005 and 2009….”

  134. sidd:

    Eminent scientist David Young wrote:

    “…startling evidence…large numerical errors in climate models…It is
    shocking…30 years experience…large errors…climate models…your lack of
    curiosity…would be concerned for your reputation…errors can vitiate model results…
    real standards of proof…giving you a fair shot here…Your response is so
    typical…Von Neuman, Ulam, even Lorentz…My field is fluid dynamics…
    Navier-Stokes equatons…seen the Gavin response many times. Its understandable but counterproductive…large errors…people continue to insist that model simulations have
    meaning…Im using my real name because this is a serious scientific discussion
    …check to Heartland Institute…sceptical team member…fortitude to post this
    …due diligence…numerical consistency and accuracy…higher priority than

    Dude, can you turn down that whistle? All the dogs are freaking out…


  135. Russell:

    Could this be the David Young seen attending a heartland function on the outskirts of Tulsa?


  136. CM:

    A variation on #118–120:

    It’s just ironic that people can continue to insist that relativity has meaning. It’s basically, “Yes, I saw the press release about faster-than-light neutrinos, but my GPS ‘works’.”


    Paul Williams, though, seems to be doing interesting and constructive work addressing model uncertainties. He also seems unlikely to be writing any checks to the Heartland Institute (he’s quoted in the Sunday Times as saying he keeps a record of all the climate skeptics’ threats, “partly to provide a list of suspects if I ever disappear, but mainly because it’s funny to read how many times people can call you a ‘caulkhead'”.)

    (Kevin, there’s a lecture on the time-stepping issue that I found surprisingly accessible, even though I am a Bear of Very Little Math. And you at least will want to check out his paper on “Meteorological phenomena in Western classical orchestral music”…)

  137. David B. Benson:

    David Young — Whatever imperfection you are troubled about has probably already been resolved (and likely some time ago). There is a good bit about climate models in “The Discovery of Global Warming” by Spencer Weart:
    and in more depth in
    “A Climate Modelling Primer” by Henderson-Sellers
    and “Introduction to Three-Dimensional Climate Modeling”, 2nd edition, by Warren M. Washington and Claire Parkinson,
    with possibly now a book about the history of climate model development. In any case, you could check the code from both NCAR and GISS if you wanted to. [I myself am satisfied that the lesson of Emmy Noether’s (first) theorem has been learnt for these codes.]

  138. Brian Dodge:

    “Robert-Asselin filter is pretty dissipative and damps the real oscillations in the system.”
    “…in my experience the next step is that someone like Paul Williams goes off and shows that the current numerics is causing large errors…”

    “The effects of the RAW filter on the climatology and forecast skill of the SPEEDY model,” Javier Amezcua, Eugenia Kalnay, and Paul Williams (the horse’s mouth, as it were)

    “In a recent study, Williams (2009) introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy.”
    “…in tropical surface pressure predictions, five-day forecasts made using the RAW filter have approximately the same skill as four-day forecasts made using the RA filter.”

    I wouldn’t necessarily call 25% improvement a correction of ‘large errors’ – YMMV

  139. Hank Roberts:

    > dewpoint
    You can find papers on trends. I just glanced through scholar. Example:

    Theor Appl Climatol (2009) 98:187–195, DOI 10.1007/s00704-008-0094-5
    “Trends in extremes of temperature, dew point, and precipitation from long instrumental series from central Europe”
    Published online: 7 February 2009, © Springer-Verlag 2009
    “… While precipitation at Potsdam does not show pronounced trends, dew point does exhibit a change from maximum extremes during the 1960s to minimum extremes during the 1970s….”

  140. David Young:

    The thing that struck me about Williams’ presentation was the simple test integration where the solution is a sine wave for Y and a cosine for X. The RA result reduces the amplitude by 50% over just a few wavelengths. He picked nu = delta t so that d is third order in delta t. So the formal order of the scheme is maintained at second order. However, the problem here is that the errors accumulate in time; they add up, and eventually the signal will disappear. Thus, only in a system where the dynamics are strongly forced all the time will the signal be accurately predicted. You might be lucky in any given problem, but when looking at small effects, this kind of thing makes it questionable. Williams’ example using the Lorenz attractor is perhaps an extreme case, but it shows that the choice of delta t can have a big effect on long-term simulations, exactly where modelers claim that climate models have the correct patterns. This example shows that numerics can have a strong effect on these patterns. In other words, just having a stable attractor does not guarantee that you can track it accurately. It’s nice to see Williams making these points.

    There are more things like this: Keyes and Knoll in the Journal of Computational Physics (around 2002-2005 somewhere) is excellent on the advantages of implicit methods in large simulation codes. Williams considers only explicit schemes. Advection is very accurately discretized by the Streamline Upwind Petrov-Galerkin method. Finite differences are not as good, especially on grids that are not very uniform. Just look for T. J. R. Hughes for SUPG.
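
    For anyone who wants to see the damping Williams demonstrates without sitting through the video, here is a toy reproduction (my own sketch, not Williams’ code: a leapfrog-integrated harmonic oscillator with the Robert-Asselin filter applied each step, plus his RAW variant for comparison; nu = 0.2 and alpha = 0.53 are just illustrative values):

    ```python
    import math

    def oscillation_amplitude(alpha, nu=0.2, omega=1.0, dt=0.2, nsteps=500):
        """Leapfrog-integrate dx/dt = -omega*y, dy/dt = omega*x
        (exact solution: x = cos(omega*t), y = sin(omega*t), amplitude 1),
        applying a time filter every step.  alpha = 1 is the classic
        Robert-Asselin filter; alpha ~ 0.53 is Williams' RAW variant."""
        xm, ym = 1.0, 0.0                                    # level n-1 (t = 0)
        x, y = math.cos(omega * dt), math.sin(omega * dt)    # level n   (t = dt)
        for _ in range(nsteps):
            xp = xm - 2.0 * dt * omega * y                   # unfiltered leapfrog step
            yp = ym + 2.0 * dt * omega * x
            dx = 0.5 * nu * (xm - 2.0 * x + xp)              # filter displacement
            dy = 0.5 * nu * (ym - 2.0 * y + yp)
            xm, ym = x + alpha * dx, y + alpha * dy          # filter the middle level
            x, y = xp + (alpha - 1.0) * dx, yp + (alpha - 1.0) * dy  # RAW nudges n+1 too
        return math.hypot(x, y)                              # exact answer is 1.0

    ra_amp = oscillation_amplitude(alpha=1.0)     # visibly damped
    raw_amp = oscillation_amplitude(alpha=0.53)   # stays much closer to 1
    ```

    With the classic filter (alpha = 1) the amplitude decays steadily even though the scheme is formally second order — after 500 steps here it is well under half the true value — while sharing the filter displacement between the two time levels (the RAW trick) keeps it near 1.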

  141. ldavidcooke:

    Hey Dr. Young,

    I guess many of us are missing the point, not to discredit the work of Dr. Williams; however, wrt the scheme of things: it is not unlikely for small feedback loops to occur between variables and spread. However, as the cycle periods are not in phase, they naturally drift out of sync. As long as your steps are in sync with the modeled variable you would be fine. The problem is trying to find the lowest common denominator when combining multiple variables. Sure, you can try to bound the systems for smaller cycles; however, then it is no longer representative of the object of the model. Put differently, the processes do not match, so the result becomes artificial.

    As to gridding, my experience goes back to the former CLI Rembrandt video codec in your two conference rooms that were installed in the ’90s: the conversion of 15000 pixels, or 90 MB of digitized data, to 1.5 MB of DS-1 transport. It was accomplished via FTs to compress the intensity and color/gamma data into large data blocks. After the initial image was created, persistence maintained that which had not changed, or reduced the priority in the stack of that which changed little. The bulk of the data traffic was that which changed frequently. (Maybe it is time to entertain graphic/optical modeling and let the resultant be reflected by the color temperature.)

    It seems that most of our models have to change in lockstep, and if we attempt to change all variables at some prorated rate we have removed natural variability, which we then try to correct by injecting seasonality via a Holt-Winters curve filter.

    So I believe the real question is what is the purpose of introducing various smoothing techniques when the process that they are modeling is not smooth?

    Dave Cooke

    (PS: Re: kibitzers. Hank, the DOE project is unlikely to be focused on NCAR support; most often it is DoD. I believe it is CPU cycle time where NCAR/UCAR gets its greatest LANL or LLNL support, seconded by shared data.)

  142. Dan Schillereff:

    Indirectly climate-related, and certainly not as climate-related as the on-going modelling discussion but this graphic is perhaps of interest showing considerable recent (past 48h) seismic activity beneath Katla. Is this a warning for a potentially global-scale eruption?


  143. Hank Roberts:

    > is this a warning ….?

    You’re asking the climatologists? (grin)
    You might better ask that question at the site you point to.

    They have publications, including one on forecasting eruptions: “Long-term and short-term earthquake warnings based on seismic information in the SISZ (PDF)”

    It’s on the page at:

    One way to phrase the question is: does a 48-hour period give useful information about any trend, or do you need a longer sample to distinguish a change from the natural variation? That’s a pretty basic statistics question for any data set — how much data you need depends on how noisy it is (how much natural variation).
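
    A back-of-envelope sketch of that sample-size point (my own toy numbers, nothing to do with the actual Katla data): if independent daily counts scatter with standard deviation sigma, the standard error of an n-day mean shrinks as 1/sqrt(n), so a sustained shift only clears a z-standard-error threshold after roughly (z·sigma/shift)² days:

    ```python
    import math

    def days_needed(sigma_per_day, change, z=2.0):
        """Days of independent daily data needed before a sustained shift of
        size `change` exceeds z standard errors of the n-day mean
        (standard error = sigma_per_day / sqrt(n))."""
        return math.ceil((z * sigma_per_day / change) ** 2)

    # e.g. daily quake counts scattering with sigma = 10 about their mean:
    days_needed(10, 5)    # a shift half the size of the noise takes ~16 days to show
    days_needed(10, 20)   # a shift twice the noise shows up in a day
    ```

    Which is the point: a 48-hour burst only means something if it is large relative to the day-to-day scatter.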

  144. Hank Roberts:

    Comments on the recent WSJ piece about climate science elsewhere abound.
    E.g. http://sci-ence.org/neutrino/
    Hat tip to: http://www.metafilter.com/108152/If-Einstein-might-be-wrong-about-relativity-how-can-we-really-trust-any-scientist

  145. David Young:

    Dave Cooke, I don’t quite understand your post. The idea is that in the limit of finer grids and smaller time steps, the numerical error goes to zero, and so all the scales are tracked correctly. There is a relatively complete theory for this, at least for some types of problems, for example the analysis of elastic structures, where things are elliptic. The theory is the basis for the numerical solution of these problems. It gets harder as you go to hyperbolic systems and then to the high-Reynolds-number Navier-Stokes equations. In any case, there are some mathematical things that can help. I mentioned some of them in the previous post. If you can access the Journal of Computational Physics, the survey paper I mentioned is very good.
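
    A toy version of that grid-convergence check (not from any climate code; just forward Euler on dx/dt = -x) is to halve the step and confirm the error falls by the factor the scheme’s formal order predicts:

    ```python
    import math

    def euler_error(dt, t_end=1.0):
        """Global error at t_end of forward Euler applied to dx/dt = -x, x(0) = 1."""
        x = 1.0
        for _ in range(round(t_end / dt)):
            x += dt * (-x)  # one explicit Euler step
        return abs(x - math.exp(-t_end))

    # Halving dt should roughly halve the error for a first-order scheme:
    e1, e2 = euler_error(0.01), euler_error(0.005)
    observed_order = math.log2(e1 / e2)   # close to 1.0
    ```

    The same refinement study, run on a full model, is exactly the kind of verification being asked for here.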

    There is a great thread on Climate Etc. about chaos, ergodicity and attractors that goes to the issues for Navier-Stokes, at least the continuous theory. We still need a comprehensive discrete numerical theory for this problem. The only people I know of who work on this are the CFD people with a more mathematical bent. I mentioned Hughes before. There is a lot of work on Discontinuous Galerkin methods and a pretty sound mathematical basis for it. The literature on this is vast, but these methods are a lot better than finite differences, which, it sounds to me, are the basis of all the climate models. Those methods are 50 years old, and reworking these codes could pay very big dividends. That’s my only point.

    The problem here on Real Climate is that to rise above the noise of treating models as black boxes that are the basis of a thousand papers, you have to raise your profile enough to arouse the wolves (not Schmidt, but some of those who don’t use their real names). Actually, the most snide of the posts has been removed (that’s gratifying. Thanks, Gavin.) I’ve experienced this before, people are always defensive when someone tries to ask them to improve numerics. But I detect that climate science (at least the subset represented here) is more defensive than usual. I understand why, they have been assaulted on a professional and personal level often by ignorant people with a political agenda. Once a war gets started, it’s hard to get back to normal.

    Anyway, my only point is that things could almost certainly be improved dramatically by a careful examination of the numerical PDE literature from 1980 to the present. I would suggest that the modelers spend some time with experts already in Federal employ. Phil Colella at Berkeley or David Keyes from Columbia are two who are approachable.

  146. Thomas Lee Elifritz:

    A little off topic again, if that is possible on an open thread, but Thomas Lowell certainly has pushed a lot of buttons with his public relations effort – now I see WUWT has picked up the feed in order to bash the status quo a bit on the uncertainties. Is there anyone else out there who thinks this is perhaps not the optimum manner that science ought to be publicized?

  147. David Young:

    By the way, one other critical thing. Once you clean up your numerics, you can really address the critical questions about ergodic behaviour of the system, stability, and sensitivity to parameters. With all due respect, just running models with little control of numerical error tends to degenerate into data fitting, i.e., you change the free parameters, the forcings, maybe even the time step or spatial gridding, etc. to match data. It becomes impossible to systematically address the real issues because you can never distinguish numerical error from the actual physical effects. And you eventually are forced to face the fact that this way of doing it breaks down when you try a different problem, change boundary conditions, etc.

  148. Russell:

    144 re 135

    I am delighted that 144 demonstrates that David Young is not among those illustrated in 135


    and I wish him a good deliverance from the district depicted.

  149. David Young:

    I now understand why GISS people are sensitive about this issue of numerical errors. It appears that Hansen was criticized for using too coarse a grid back in the 1990’s. I claim it’s much more critical now, and I’ll explain why.

    Simulations can have many purposes. If you just want the long term patterns, then IF you are lucky, your attractor is strong enough that bad numerics is not a guarantee of failure. But, if you want details and smaller quantitative things, like sensitivities then numerics is critical.

    If I want to compute the lift of an airfoil, numerics is not that critical because the lift is a very large number. The drag is 100 times smaller and to compute it, I need much better numerics and much finer grids. Now, this is not a linear scale. Let’s suppose your numerical scheme is second order in space and time. If your grid spacing is Delta X, then the truncation error is O(Delta X)**2. Note, this is just measuring local error, global error can still be large. But, this will give us a necessary condition for accuracy. Now if I double the number of grid points, Delta X is halved and thus truncation error goes down by a factor of 4. However, in climate modeling there are 3 space dimensions, so to get this factor of 4, I need twice as many points in each of 3 directions, resulting in 8 times as many points and at least 8 times as much computer time. It can actually be much worse than 8 because of poor algorithms like banded sparse matrix solves, etc. Now, to get the error down a factor of 100 to get the drag, I need 1,000,000 times as much computer time.

    What people like Curry are demanding is sensitivities to parameters and to really compute those accurately may require much better numerics. The Hansen doctrine may give things that look reasonable, but quantitatively, they are probably not good enough for the next step in modeling, sensitivities.

    By the way, there is a whole literature on this issue of computing sensitivities for time dependent systems. In any case, I still claim that numerical issues are a huge deal and deserve much much more attention. No Gavin, you don’t have to drop everything to look at it. But Hansen should hire some good people to get it right. It’s a huge undertaking and may require a new computer model. Just think of it as employment security Gavin. If there is a serious problem with the models, then they need you to fix the problem.

  150. David Young:

    Sorry, there is an error in that last post. To get a factor of 100 reduction in the error, you need 1000 times as much computer time. Still, it’s not a very good rate of return.
    If your linear solver is, say, a banded one, you take another hit of a factor of 10.
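
    That corrected scaling is easy to sanity-check (a back-of-the-envelope sketch only; `cost_factor` is a made-up helper, and real solver cost also depends on the algorithm, as noted above):

```python
def cost_factor(error_reduction, order=2, space_dims=3, scale_dt=False):
    """Relative compute cost to cut truncation error by `error_reduction`
    with a scheme of the given order: the mesh width h must shrink by
    error_reduction**(1/order), and cost grows like h**-(dims), plus one
    more power of h if the time step must shrink with the mesh."""
    h_refine = error_reduction ** (1.0 / order)
    dims = space_dims + (1 if scale_dt else 0)
    return h_refine ** dims

# Second-order scheme, 3-D grid, time step held fixed:
print(cost_factor(100))                  # 1000.0
# With the time step also refined alongside the mesh (CFL-limited):
print(cost_factor(100, scale_dt=True))   # 10000.0
```

    For the earlier example, `cost_factor(4)` gives the factor of 8: halving h in 3-D costs 8x for a 4x reduction in truncation error.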

  151. Meow:

    @144: I’m pretty sure that most climate modelers understand their models’ numerical shortcomings. A cursory search reveals active work to improve them. See, e.g., Evans et al, “A Fully Implicit Solution Method Capability in CAM-HOMME” (CCSM model), Dugas & Winger, “A CRCM5 Description”, Weijer et al, “A fully-implicit model of the global ocean circulation”, etc.

    I expect that “writ[ing] a check to Heartland Institute” (@120) will not assist these efforts, since I have been unable to find any relevant research sponsored by that organization.

  152. David Young:

    Meow, your first reference is indeed on the right track, and Salinger is good. They know about all the good methods. However, if you read the fine print, they say that their preconditioner is actually slower than the baseline code and that they are working on a better one. That’s good, and we’ve been through that too in the 1980’s. The problem is that their baseline code is so complex, with some terms treated implicitly and others explicitly, that it’s tough. Their spectral element method is used ONLY to test other methods that are used in the real codes. By the way, spectral element methods REQUIRE variable order, something which is often neglected.

    However, the point is that rewriting at least one model is required so that it is designed to use the best methods. I’ve done this myself. Tacking a modern accelerator onto a legacy code helps, but is orders of magnitude worse than really doing it fully implicitly. Certainly even the AR5 simulations are based on the legacy codes and, I’ll wager, on finite differences. The AR5 simulations are indeed sensitive to the period of calibration. See Judith Curry’s material for an example.

    At any rate, the point is that the evidence from multiple sources says that the uncertainties are much larger than one would get the impression from reading the 1000 papers based on running the models.

    You know, sometimes the scientific process involves strong criticism. There are 2 responses. The first is to really understand it and realize that you will be embarrassed if you don’t fix the problem. The second is to try to suppress the criticism or, even worse, get editors fired, etc. Perhaps even Heartland has a place in the debate if the World Wildlife Fund does. It is a free country.

  153. David Young:

    Meow, look at the outlined box in the upper left hand corner of your first reference. What you see is that climate models have gone BACKWARD recently and have “returned to fully explicit methods.”
    The authors are trying to address the problems caused by this seeming insistence on repeating history. Maybe I do need to write that check after all.

  154. Rob Nicholls:

    Hank Roberts, thanks v much for your help
    re: SO2 emissions from China. (posts 128 and 133).
    It does seem that sulphur dioxide emissions from China have fallen since 2006.

    (see e.g. p.10 of http://www.atmos-chem-phys.net/11/9839/2011/acp-11-9839-2011.pdf )

  155. Patrick 027:

    Re David Young – Floating the idea of supporting the Heartland Institute (which isn’t even all about climate (anti)science; you’ll have ‘collateral damage’) raises a red flag, doesn’t it? I don’t see how increasing the errors in public opinion would reduce errors in computer modelling.

  156. Hank Roberts:

    > rewriting at least one model … so it is designed
    > to use the best methods. I’ve done this myself.

    If you do want to make a contribution,
    do consider contributing to these:

    Clear Climate Code
    Open Climate Code

  157. ldavidcooke:

    Hey Dr. Young,

    First, thanks for taking the time to respond. Sorry for not prefacing my video codec spiel with the idea that video compression is not unlike creating a model of a real-world event. The digitized data is a representation of the light reflecting from a series of objects. The compression effort is similar to converting the measured values to a grid. The issue with data compression is the attempt to create a high-resolution data representation with 1% of the data. This would be similar to taking 7200 data collection stations having 10 variables with ranges that can span 15000 discrete values per minute and trying to represent a trend in just one of the variables, which is dependent on the others either in a group or directly.

    The point being that maintaining a smooth representation requires predictive analysis: knowing that a value is changing, the rate at which it is changing, and having the ability to determine whether the change is an anomaly, noise, or valid data. (Hence part of the purpose of the FT filtering.)

    The tools you had been talking about so far appear rather rudimentary, IMHO. The management of an evolving data set, growing out of a series of equations that feed the next calculation point, must have advanced light-years from a mere 20 years ago. I will review the references you have suggested to see where I have gone wrong.

    (I concur that reducing grid size, as well as increasing steps, would be equivalent to increased resolution; hence error values would decrease. The instance I was referencing regarded the summation of a series of larger grids created from many discrete data points, while maintaining step size and encoding reversibility.)

    As to defensive, I believe you could characterize it that way. The problem is that if you leave a crack, some will try to wedge it open to create a door. The flip side is that confusion, or a lack of clarity about what is known or unknown, may partially explain why a defensive posture is required. (It is when there is a misunderstanding that doubt occurs; the hard part is trying to be open and at the same time not have to defend what you have no control over.)

    As this takes things way off topic, I will quit here and attempt to review your points, not that it matters, but so that at least I better understand the point you are attempting to make. So far it is proving a struggle…

    Dave Cooke

  158. dhogaza:

    David Young:

    The second is to try to suppress the criticism or even worse get editors fired, etc.

    Yeah, and he wonders why people get “defensive” when all he’s doing is making objective statements meant to help poor, ignorant climate scientists improve the quality of their work …

  159. dhogaza:

    David Young:

    Now if I double the number of grid points, Delta X is halved and thus truncation error goes down by a factor of 4. However, in climate modeling there are 3 space dimensions, so to get this factor of 4, I need twice as many points in each of 3 directions, resulting in 8 times as many points and at least 8 times as much computer time.

    Climate models build their atmosphere of pizza boxes, not cubes, as the atmosphere is thin. Decreasing the grid size doesn’t necessarily imply that the pizza boxes become thinner. I think NASA GISS Model E has a couple of dozen layers in its atmosphere, but it’s been a while since I’ve looked. If you care, you can go look at it yourself.

    However, you somewhat surprisingly forgot to point out that the simulated time interval between steps needs to be reduced if the grid size is reduced, as that upper left box you like puts it:

    “However, finer model grids require a superlinear reduction in the time step size to account for the smaller spatial scale and increased multiscale interactions ”

    Which I think gets you your factor of 8 back.
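
    The time-step constraint being quoted is the CFL stability limit, and it is easy to see in a toy problem (first-order upwind advection of a sine wave; nothing here resembles an actual climate code):

```python
import math

def max_amplitude(cfl, n=100, steps=200):
    """Advect a sine wave (plus a tiny grid-scale wiggle) with first-order
    upwind differencing on a periodic grid; return the final max |u|.
    The scheme is stable only for Courant number c*dt/dx <= 1, which is
    why refining the grid forces the time step down with it."""
    u = [math.sin(2 * math.pi * i / n) + 1e-12 * (-1) ** i for i in range(n)]
    for _ in range(steps):
        # u_new[i] = u[i] - C * (u[i] - u[i-1]); u[-1] wraps around (periodic)
        u = [u[i] - cfl * (u[i] - u[i - 1]) for i in range(n)]
    return max(abs(x) for x in u)

print(max_amplitude(0.8) < 1.0)   # True: stable (and dissipative)
print(max_amplitude(1.2) > 1e6)   # True: the grid-scale mode explodes
```

    Halve dx at fixed dt and the Courant number doubles, which is why refinement in space drags the time step down with it.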

  160. David Young:

    Hank, these things take a team, trust me on this. Open source is fine, but only if Hansen turned over his team to me would I think about such an endeavor. First order of business is to get Gavin to go back to school (only kidding). If NCAR or DOE wants to start a new team, that’s the best approach. They could let John Bell or David Keyes or even Phil Colella lead it. There are younger guys around too. They would know what to do and not be infected by the Hansen doctrine that despite large numerical errors my results look good (and support something I already believed).

    [Response: Please leave the strawman personalisations at home. If you want to taken seriously, imagining supposed ‘doctrines’ that have no reference, cite or antecedent and are actually contradicted by the person you ascribe them to, is neither interesting nor clever. – gavin]

  161. dhogaza:


    However, finer model grids require a superlinear reduction in the time step size to account for the smaller spatial scale and increased multiscale interactions

    And climate modelers have been reducing model grid sizes as time has gone on.

    So we have at least two ways to reduce errors – increase resolution is one way. Using implicit methods another.

    Let’s address David Young’s accusation that climate modelers are ignorant of the methodologies he touts, and therefore incompetent. From the upper left hand box at the link he likes:

    To maintain scalability, a number of climate models have been returning to fully explicit methods developed several decades ago.

    So obviously the modelers involved are not only aware of implicit methods, but have implemented at least partially implicit solutions and have made a conscious decision to go back to explicit methods. Presumably because they believe that the benefits of greater resolution due to smaller grid sizes outweighs the benefits of the implicit methods they’d incorporated into their models previously.

    David Young might want to find out which models are being discussed and talk to the implementors before insisting this is a step backwards, that the modelers are obviously ignorant of improved methods, incompetent, not up to David Young’s level of expertise, not fit to sit at his table, etc.

    You might find out they know more than you claim they do, and aren’t as incompetent as you claim they are …

  162. Ray Ladbury:

    David Young, So, again, let me get this straight. You are going to write a check to the professional liars at the Heartland Institute because their model is so much better… Oh, wait. That’s right, they don’t have a model. They don’t have any researchers, or research or evidence. They just have…well, lies.

    Dude, you sure you’ve thought this through?

  163. David B. Benson:

    I am quite unsure just what various commenters are concerned about regarding the solution of various PDEs. However, (numerical) dispersion and dissipation are treated (rapidly) in
    which has a link to the interesting table of contents.

  164. ldavidcooke:

    RE: 161

    Hey dhogaza,

    Do not get me wrong, I am not saying one is right the other wrong. Nor do I suggest that the expertise of the current systems are inferior. (BTW, the quote you associate with me is not mine.)

    What I am saying is, if we disregard the attitude, is there any validity to the criticism? So far from my limited view point probably not. The hard part is trying to understand before making a judgement.

    When we consider that some of the base code of many of the current feeders is likely Fortran running on a mainframe in the basement, it may be possible that the climate team could use a bit of extra support. As to a reorg: nada; the folks who have been doing the heavy lifting have struggled with shortages in personnel and systems since the early ’90s. They are well equipped to maximize output with minimal capacity. (They make up the difference by thinking smarter.)

    The conversion from implicit to explicit modeling is likely more for the purpose of openly showing the engine and feeds. It’s a bit like showing your work, which was a criticism prior to the UEA fiasco. An implicit differential would be more economic coding. However, the request for climate science to be open, drives the need for un-integrated data engines that specialize in solving one function. From there you can build an over-arching system which manages the feeds and data tables.

    This allows the distribution of processing, right-sizing of the resources, and very small steps of discrete functions. At the same time, the data filters in the data tables can be adjusted without propagating error to dependent functions.

    As to opinions expressed either way, it is pretty difficult to understand if you have no knowledge of the motivation for change. Those of us who have had a horse in the race may not always agree on which horse to beat, but it is clear that those who are doing the job are the designated experts, else they would not be in the poll.

    Dave Cooke

  165. David Young:

    I struggled with whether or not to respond. Really, I’m not saying anyone is incompetent. However, I do have my concerns about you, Dhogaza. The reason for the work in the reference you took out of context is to GO TO IMPLICIT methods because of their well known advantages. The difficulties are immense, but it’s important. The comment about scalability is just about massive parallelism. Implicit methods can be made parallel too. In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
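
    For readers following the implicit-versus-explicit thread, the standard stiff test equation y′ = −1000y shows what is at stake (a toy sketch, not a claim about any particular model):

```python
def euler_steps(lam=1000.0, h=0.01, steps=100, implicit=False):
    """Integrate y' = -lam*y, y(0) = 1 (the true solution decays to ~0).
    Explicit (forward) Euler multiplies by (1 - lam*h) each step and is
    unstable whenever lam*h > 2; implicit (backward) Euler divides by
    (1 + lam*h) and stays stable for any step size."""
    y = 1.0
    for _ in range(steps):
        y = y / (1.0 + lam * h) if implicit else y * (1.0 - lam * h)
    return y

print(abs(euler_steps(implicit=True)))    # tiny: correct decay
print(abs(euler_steps(implicit=False)))   # astronomically large: blew up
```

    Explicit Euler needs lam*h < 2, i.e. a time step set by the fastest scale in the problem; backward Euler can take the step size the physics of interest dictates. The trade-off is that each implicit step requires solving a (generally nonlinear) system, which is where the parallelism question re-enters.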

    I get the feeling here that the regulars on this web site are not the scientists, but others. Either the scientists already know what I’m telling them (and that’s fine), or they don’t want to respond to someone outside their community (and that’s fine too), or for them this is just a venue to keep the dhogazas of the world attacking their perceived opponents. Or maybe it’s a venue to demonize legitimate players in the debate. I don’t know, but perhaps I should have checked before naively assuming that the site was about the science. I’ve received no feedback here indicating that any of the scientists understood what the issues were. I’m sure the better ones know, like Paul Williams. The question is what is the next step.

    The bottom line is that multiple lines of evidence that I’ve uncovered quite easily strongly suggest that the numerical errors in climate models are larger than is generally acknowledged by people in the field in their statements to the general public. It’s in the literature, but is perhaps not emphasized here. Further, the idea that there is no legitimate scientific controversy about these things is probably not something even Gavin Schmidt would endorse. I would recommend Climate Etc. as a better place for a discussion of all aspects of this issue. It’s a much friendlier place, without the humorlessness that is quite obvious here. And some of the guest posts there, like the one on chaos, ergodicity, and attractors, show obvious deep knowledge of the latest theory in the field.

  166. sidd:

    We are blessed, indeed, to have such an intellect as David
    Young to tutor us. How best ought we avail ourselves of it?

    It has been suggested that he join the Clear Climate Code
    initiative or the Open code initiative. We are fortunate that
    he did not take more offense at the idea. The pedestrian
    efforts at Clear/Open Climate code are far below his stature and

    Another measure of his greatheartedness is that he kindly considers
    taking over the “Hansen team”, before regretfully discarding them as
    being “infected by the Hansen doctrine.” No, infected drudges such as
    Dr. Schmidt must go back to school, one taught by such luminaries as
    himself, where they may be cleansed, if that is possible, of heresy.
    If not, he will, no doubt, have further suggestions in mind for their

    What, then, are we to do? Let us drink from his font of wisdom again.
    We must assemble a completely new, untainted group of acolytes who are
    capable of fully absorbing the radiant effulgence of his genius. Then,
    and only then, can climate science be delivered from the abyss of
    ignorance and error.

    This will, of course, be very expensive, but such a beneficial
    project will surely be amply funded by the DOE, NSF, NCAR and
    all the others. Mr. Young has modestly suggested others for the lead
    role, but of course, he is the only one for the job. At the very least
    I am sure that he would accept a senior advisory position, with, at
    minimum, a seven figure salary. Any less would be an insult.

    Certainly, he is even now finishing his proposal for the project and
    when it arrives on the desks of the heads of the funding agencies,
    a great singing of Hosannas will echo in the halls and also a deafening
    scratching sound of pens upon checkbooks.

    Unfortunately, I suspect the gray reality will be a letter asking him to
    reapply after he has actually published something relevant to climate
    science. Such is the fate of genius in its own time.

    But all may not be lost. Mr. Young is a man of means, and has previously
    offered, out of his innate generosity, to fund the research with a check
    to the Heartland Institute. Why does he need the (possibly corrupt)
    agencies of a smothering government ? He can do this himself, and such
    a man of conviction will rise to the challenge.

    I await news of his continuing success, and anticipate accolades and
    laurels showered upon him and rose petals under his feet, and, perhaps
    even, (dare I say it) an appointment in Sweden with the King.
    Be still, my beating heart.


  167. David B. Benson:

    There are lots of papers with different approaches. Here is a quotation from
    A low numerical dissipation immersed interface method for the compressible Navier–Stokes equations by K. Karagiozis, R. Kamakoti & C. Pantano in J. Comput. Physics:
    In this paper, we partially address the stability problem of an immersed interface method for the compressible Navier–Stokes equations by utilizing the theory of summation-by-parts (SBP) operators [69] and [70] to derive a stable immersed interface approximation for the advection derivatives. Numerical experiments suggest that this approach prevents the appearance of spurious numerical instabilities, which otherwise create shock-like regions around complex boundaries. Moreover, different from IIM formulations, the new approach completely eliminates the need to deal with jumps at the object boundary. Finally, the method is combined with semi-implicit time integration to remove any stiffness present in the operators and the implicit equations are solved explicitly for the particular case of constant transport properties.

    [The reCAPTCHA oracle proclaims escape orisatt.]

  168. TFK:

    The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did. Yet they are branded ‘liars’?

    If the WWF have a seat at the table, so should Heartland. And David Young’s comment (I’ll write a check) was meant to elicit a response. I’m surprised you took his bait.

    The lack of serious response to the David Young comments is revealing.

  169. Meow:


    In any case, for a stiff system, implicit is so much better, parallelism hardly matters.

    Got any cites to support this extraordinary proposition? And have you loaded up on Intel put options?

    CAPTCHA: ingMEn perfectly

  170. Brian Dodge:

    I find that the scariest metric of model inaccuracies comes from the ocean, the Arctic Ocean: http://www.woodfortrees.org/plot/nsidc-seaice-n/from:1979.6/every:12. The slope triples after 1995.

    Perhaps the damping of the aperiodic oscillations of energy transport, by what David Young believes to be shockingly unstudied large numerical errors in climate models, underestimates summer Arctic sea ice loss (as well as glacier loss, ice shelf loss, and Greenland & Antarctic ice sheet loss) because of nonlinear mechanisms. E.g. if your model gives a correct average temperature of below zero over the Arctic, but damps the peak values of temperatures above zero in the actual data (presently much above, compared to the historic record), it will underestimate melt rates. The decrease in albedo from melting, and the corresponding increase in energy available for melt during the summer when the sun is shining, isn’t balanced by the increase in albedo from freezing during the winter, because the sun is no longer shining. The nonlinearities of heat versus temperature from melting may share some similarities to evaporative nonlinearities, and to underestimation due to overdamped response to aperiodic oscillations. There is observational evidence that the models underestimate the increase in rainfall[1] and extreme precipitation[2].
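
    The threshold effect described above (melt accrues only while the temperature is above freezing, so damped peaks at the same mean give less melt) can be sketched with a toy positive-degree-day-style proxy; `melt_proxy` and its numbers are purely illustrative, not any actual melt parameterization:

```python
import math

def melt_proxy(mean, amplitude, samples=1000):
    """Positive-degree-day-style proxy: melt accrues only while the
    temperature is above 0 C, so it depends on the peaks, not the mean.
    Temperature follows a sinusoidal cycle about `mean` with the given
    `amplitude`; damping the amplitude at fixed mean cuts the melt."""
    total = 0.0
    for k in range(samples):
        t = mean + amplitude * math.sin(2 * math.pi * k / samples)
        total += max(t, 0.0)  # only above-freezing excursions contribute
    return total / samples

print(melt_proxy(-2.0, 8.0))   # undamped peaks: substantial melt
print(melt_proxy(-2.0, 4.0))   # same mean, damped peaks: much less
```

    Two series with identical means but different peak amplitudes give very different melt, which is the sense in which an overdamped model can get the average right and the melt wrong.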

    Model inaccuracies may lead some to be cheerily optimistic. A billion here, a billion there, and pretty soon you’re talking real money.[3]

    Do dissipative economic models lead to probability distributions that falsely preclude Black Swans?

    [1] Science 13 July 2007: Vol. 317 no. 5835 pp. 233-235 DOI: 10.1126/science.1140746 “How Much More Rain Will Global Warming Bring?”, Frank J. Wentz*, Lucrezia Ricciardulli, Kyle Hilburn and Carl Mears
    [2]Science 12 September 2008: Vol. 321 no. 5895 pp. 1481-1484 DOI: 10.1126/science.1160787 “Atmospheric Warming and the Amplification of Precipitation Extremes”, Richard P. Allan and Brian J. Soden
    [3] http://www.ncdc.noaa.gov/img/reports/billion/timeseries2011prelim.pdf

  171. Meow:

    @165: Please write a paper describing your hypothesis and the evidence supporting it. Show how implementing it would improve a current model. Consider collaborating with the CAM-HOMME team. Or download CCSM4 and do it yourself. We eagerly await your well-researched and -reviewed insights.

  172. Martin Vermeer:

    TFK #168

    Yet they are branded ‘liars’?

    They are liars. You haven’t done your homework.

    If the WWF have a seat at the table, so should Heartland.

    To have a seat at the table where a problem is addressed, you must acknowledge the problem.

    The lack of serious response to the David Young comments is revealing.

    Why would baiting require a serious response? Any ‘scientist’ addressing a scientific issue with a political argument is not himself being serious. Such individuals are best ignored, as they are rarely any good at science, lousy at intellectual honesty, and high maintenance. This heuristic has served me well — life’s just too short.

  173. Hank Roberts:

    > the regulars on this web site are not the scientists
    Often true; I’m not. Some of us try to take the obvious questions and be helpful. Some, well, I’ve posted this before as a comment when the fierceness level starts to bother me.

    Others have other reactions.

    The sidebar link identifies the Contributors, all scientists:

    Visitors who are scientists often mention their background; some link to their publications/website.

  174. Patrick 027:

    Re TFK: The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did.

    Did Copernicus cherry-pick his data or rely on misunderstandings? Was it harder to explain the observations using Copernicus’s idea than it was to use the idea he was replacing? Was there an entrenched industry that would be hurt by continued reliance on Ptolemy?

    (Not everyone who disagrees with mainstream ideas has reasons to back up that decision; they don’t all turn out to be a Galileo, Newton, or Einstein. Just because an idea becomes accepted doesn’t make it wrong.)

  175. David B. Benson:

    Everybody attempts to write their numerical codes so as to satisfy the dictates of Emmy Noether’s (first) Theorem. I’ve previously posted a reference to books describing how climate GCMs are built; I’ve also provided two comments with links describing various means of avoiding numerical dissipation. I opine that quite a bit is known by everybody working on such codes in a wide selection of technical fields.

  176. Ray Ladbury:

    David Young, what utter complete horsecrap! Some of us, Sir, are real scientists–many of us, in fact, and we realize that for a complicated system, you must look at ALL the evidence.

    The attribution of climate change to anthropogenic CO2 is not dependent on GCMs. As I said before, a relatively simple, two-box model and basic physics are sufficient for that. I would note that neither Arrhenius nor Tyndall had need of a GCM.

    In fact, GCMs are among the most effective tools for limiting climate sensitivity on the high side. Don’t like the models? Well, you ought to be more worried rather than less.

    In any case, you seem to be utterly uninformed about how climate scientists actually use their models. Like most scientific models, their utility is in providing understanding of physical systems rather than for “answers”. But, I’m sure you don’t care. You are more interested in speaking to your own denialist echo chamber at Heartland. Me, I’ll stick with the real scientists.

  177. Lawrence Coleman:

    I’d just like to bring this governmental scam in Australia to your attention. In light of the terrible droughts we have had over the past decade, the government has built desalination plants in Queensland, Western Australia and New South Wales to supplement the water supply in case of another crippling drought. Now, these plants were paid for wholly and solely by Australian taxpayers… but here’s the rub: these plants were only designed to be used in the event of a drought, yet they are still being used, actually taking precedence over natural water flowing in our now abundant river systems after the record deluge we suffered last year and up to January this year. In fact, water from a large river is being pumped out to sea (wasted) while copious amounts of fossil fuels are being used to keep these desalination plants running 24/7. And, wait for it: the residents in many of our major cities are being slugged with high water rates, rising at up to 26% a year, for the privilege of using this wonderful desalinated water. The amount of power these plants consume is horrendous and comes from coal and oil sources. All our scientific bodies have stated that these plants should only be used when there is insufficient water in our river systems, which is clearly not the case now. These plants and our populace are simply being used as cash cows, despite the environmental damage being done.
    This is an environmental travesty on a huge scale. How can we as a country possibly say that we are being environmentally responsible when this wholesale wastage of fossil fuels is taking place?

  178. dhogaza:

    David Young:


    I struggled with whether or not to respond. Really, I’m not saying anyone is incompetent.

    Earlier this

    I guess there is no comment from Gavin on the startling evidence (both empirical and theoretical) of large numerical errors in climate models.

    Along with a bunch of crap suggesting that climate modelers are unaware of research dating back to the 1980s. And that if Gavin learned to be competent, he should view it as “job security”.

    Yes, you did, and you have continually accused them of being incompetent, and climate science at large of academic misconduct (your “get editors fired” comment).

    You struggle with whether or not to respond because you’re going to get ripped open due to your inconsistency …

    Like most other deniers.

    Look, if you don’t like the heat, don’t enter the kitchen.

  179. Meow:

    FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.

    But I’m sure that the “researchers” at Heartland have considered this issue and have an airtight case for all the attractors being centered on temperate modern conditions. Or at least they will if we’re considerate enough to write sufficient checks to fully fund their “research”.

    CAPTCHA: sight mosedli

  180. dhogaza:


    Do not get me wrong, I am not saying one is right and the other wrong. Nor do I suggest that the expertise behind the current systems is inferior. (BTW, the quote you associate with me is not mine.)

    Actually, I think the first quote was yours, but … I’m not disagreeing with you. You understand that David Young is, ummm, “pushing it”, to put it mildly.

  181. dhogaza:


    The conversion from implicit to explicit modeling is likely more for the purpose of openly showing the engine and feeds. It’s a bit like showing your work, the lack of which was a criticism prior to the UEA fiasco. An implicit differential would be more economical coding. However, the request for climate science to be open drives the need for un-integrated data engines that specialize in solving one function. From there you can build an over-arching system which manages the feeds and data tables.

    Well, my bet was on the modelers coming to realize that the problems that trip up some explicit models don’t apply to their particular GCM, and therefore choosing to drop the computational expense of implicit methods in favor of the greater resolution yielded by more efficient, finer-grained explicit models.

    I mean, it’s clear that Young’s first claim, that climate modelers are unaware of the explicit vs. implicit dynamic is false. So there must’ve been some reason for these particular modelers to move “backwards” as Young puts it (and I’m sure I’m not the only one who thinks that they had good reasons to do so, and that “backwards” is not the proper label).

    Anyway, I’m sure that David Young will contact the appropriate researchers, freely making available his incredible world-class expertise in modeling to them, so the models can be further improved …

  182. dhogaza:

    Either the scientists already know what I’m telling them

    I previously pointed out that some climate modelers, at least, are aware of the power of implicit methods, and that the link you cite itself points out that some modelers have moved back to explicit methods.

    This is not evidence of ignorance. Rather, I suspect it’s evidence of greater expertise than you hold.

    In other words, you’re all about “implicit methods are the only approach that works”, while the modelers referenced in that link are obviously willing to change methodology depending on circumstance (which I suspect may hinge on available computing power, but I don’t know).

    Their understanding is nuanced.

    Your position is absolutist.

    I suspect that you’re the lightweight here … sorry, just judging by the evidence on the table.

  183. dhogaza:

    Young: In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
    Meow: Got any cites to support this extraordinary proposition? And have you loaded up on Intel put options?

    Actually, from my 15-minute skimming of explicit as opposed to implicit methods … it’s not so extraordinary.

    But, David Young has not mathematically proven that explicit modeling methods of climates must yield a stiff system …

    Nor has he mathematically shown that if true, it actually causes climate models to fall apart as he insists they do.

    He just pontificates that this is true, then accuses climate modelers of being ignorant of the existence of implicit methods.

    I think he’s probably about an hour or so ahead of me in google time, we can all probably catch up to his bullshit if we’re willing to put in a little study time. Thus far he’s just smoke and mirrors …
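    For readers trying to follow the stiff-system debate, here is a toy illustration (my own, not taken from any commenter or from any GCM) of what is actually at stake: on a stiff equation, explicit (forward) Euler blows up at step sizes where implicit (backward) Euler stays perfectly stable.

    ```python
    # Toy stiff ODE: dy/dt = -lam * y, exact solution y(t) = exp(-lam * t).
    # Forward Euler is stable only for dt < 2/lam; backward Euler is stable
    # for any dt > 0 (it is A-stable).
    lam = 1000.0
    dt = 0.01          # well above the explicit stability limit 2/lam = 0.002
    steps = 100

    y_explicit = 1.0
    y_implicit = 1.0
    for _ in range(steps):
        # forward Euler: y_{n+1} = y_n + dt*f(y_n); amplification |1 - lam*dt| = 9
        y_explicit = y_explicit + dt * (-lam * y_explicit)
        # backward Euler: y_{n+1} = y_n / (1 + lam*dt); amplification 1/11 < 1
        y_implicit = y_implicit / (1.0 + lam * dt)

    print(abs(y_explicit) > 1e50)   # True: the explicit solution has exploded
    print(0.0 < y_implicit < 1.0)   # True: the implicit solution decays toward 0
    ```

    The point being argued is not whether implicit methods have this advantage on stiff problems (they do), but whether a given GCM, at the resolutions used, is stiff in a way that makes the extra cost per implicit step worthwhile.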

  184. David B. Benson:

    I’m reading the paper by K. Karagiozis, R. Kamakoti & C. Pantano I cited in an earlier comment. I’m beginning to understand why researchers prefer explicit and semi-implicit methods for solving the compressible Navier–Stokes equations.

    Possibly some commenters here should actually become familiar with the available literature before commenting? Otherwise it begins to look to me like stuff fit for The Bore Hole.

  185. ldavidcooke:


    Hey dhogaza,

    There are many advantages related to explicit functions. As to issues of numerical errors or artifacts, many can be avoided by simply running the equation and using the shortest cycle time/step value to feed a model engine’s data array/table.

    It is a way to define a broad set of grid sizes and, depending on the fastest-changing variable for that grid size, step the model in a manner that creates an accurate digital representation of an analog process. In essence, you can run variables at different step sizes, sampling the data at any two points, and get a very good replication. This allows for more accurate representation of changing or cyclical variables without injecting heterodyning.

    Put another way, it allows for functions to change at a “natural” rate and yet interact when they hit the edges of their range, without interacting when closer to their mean.

    A Large Scale example: You might not want to code a hard and fast rule into your model, that when there was a neg. PDO moving to a neutral state that the NH Southern Jet Stream drifted North and a Blocking High took up residence centered at 104 deg. W. and 32 deg. N. If you could modify the rule to make a condition that when a volcano belched 1cu km of light gray aerosols, in the 165um range, with a density of 1000ppm/km^3 at an altitude of 8km at 65 deg. N the year before, well that might be a valuable modeling element.

    The problem with many implicit models is that you are unlikely to be able to represent high-resolution (regional/local) events well when all cycles have the same start/stop points and there is no means of injecting long-cycle functions. You would have to take many small steps for all of your calculations, stop the run, modify a function, start the run, get to the next injection point….
    In my experience, with respect to modeling multiple functions/events, implicit modeling is not very efficient in either CPU cycles or man-hours if the functions have different time domains or grid sizes.
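    A minimal sketch of the multi-rate idea described above (my own illustration, not Cooke’s or any model’s actual scheme): sub-step the fast variable inside each step of the slow one, so each process advances at a rate suited to its own time scale.

    ```python
    # Two coupled toy variables: 'fast' relaxes quickly (rate 5.0 per time unit),
    # 'slow' accumulates a small fraction of 'fast'. Stepping 'fast' at the slow
    # step dt_slow = 1.0 would be unstable for forward Euler (|1 - 5*1.0| = 4 > 1),
    # so it is sub-cycled at dt_fast = dt_slow / substeps instead.
    dt_slow = 1.0
    substeps = 10
    dt_fast = dt_slow / substeps        # 0.1, safely inside the stability limit 2/5

    slow, fast = 0.0, 1.0
    for _ in range(100):                # 100 slow steps
        for _ in range(substeps):       # sub-cycle the fast process
            fast += dt_fast * (-5.0 * fast)
        slow += dt_slow * 0.01 * fast   # slow process samples 'fast' once per step

    print(fast < 1e-6)                  # True: the fast variable has relaxed
    print(0.0 < slow < 1.0)             # True: the slow variable stayed bounded
    ```

    The design choice is exactly the trade-off described in the comment: the fastest-changing variable sets the inner step, while slower processes are advanced less often, saving CPU cycles without destabilizing the fast dynamics.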

    Dave Cooke

  186. Harmen:

    For something completely different…

    I found this interesting lecture about Tyndall..

    Tyndall: His work and scientific heritage
    EPAIreland, 8 Oct 2011

    Professor Richard Somerville, Distinguished Professor Emeritus and Research Professor at Scripps Institution of Oceanography at the University of California, San Diego.

    This public lecture celebrated the life of John Tyndall and served as an introduction to the 3 day Tyndall Conference held from September 28th – 30th 2011. For more information, see http://tyndallconference2011.org/


  187. Marcus:

    #179 Meow

    FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.

    This is a thought that always comes to me when I see one of the “skeptics’” pet arguments, that “climate is too complex to be simulated on a computer”.

    You put some forcing on a system you depend on, a system which you yourself claim is far too complicated to be understood… how can that lead to the position that it is not advisable to at least limit the forcing?

    To assert that the system does not respond at all, or only a little, and that the newly awkward behaviour it shows has nothing to do with the forcing, presumes that the system is quite understandable to you, or at least that you are quite confident you understand it.
    In my opinion this is clearly a contradiction.


  188. wili:

    If we can pause briefly from engaging with bore-hole worthy Heartland Institute supporters for a moment–Does anyone have links or references for recent papers on the expected effects of an increasingly ice-free Arctic on Northern Hemisphere weather patterns? Thanks ahead of time.

  189. Hank Roberts:

    Yes. Published thus far in 2011:

  190. Hank Roberts:


    “… while the semi-arid forest can cool itself well enough to survive and take up carbon, it both absorbs more solar radiation energy (through the albedo effect) and retains more of this energy (by suppressing the emission of infrared radiation)….

    … what happens when the opposite process – desertification – takes place? … desertification, instead of hastening global warming, as is commonly thought, has actually mitigated it, at least in the short term. By reflecting sunlight and releasing infrared radiation, desertification of semi-arid lands over the past 35 years has slowed down global warming by as much as 20%, compared with the expected effect of the CO2 rise over the same period. And in a world in which desertification is continuing at a rate of about six million hectares a year, that news might have a significant effect on how we estimate the rates and magnitude of climate change….”

  191. CM:

    On credibility:

    David Young #165,

    If you really want a discussion just about the science, uninterrupted by the peanut gallery, well, duh, don’t try to have it on a blog; and if you do, don’t pander to the peanut galleries of certain other blogs by telling the host that his work is meaningless, that he should go back to school, and that you’d be happy to take over the team he works on. Given your deliberate choices to ignore these simple precautions, forgive me if I find your dismay at the reactions just a tad disingenuous. The exchanges between you and Gavin on the previous thread were interesting, but this posturing is just boring.

    Dhogaza #183,

    > he’s probably about an hour or so ahead of me in google time

    Well, if he is who he says, he’s published work on computational fluid dynamics and probably knows quite a bit about the problems of modeling turbulent flows; those are good credentials to have for the issues he’s raised. From what has transpired here, though, he hasn’t done much homework about climate modeling before presuming to lecture Gavin.

    Here are some instructive results from Google Scholar advanced search for the author “DP Young” plus a few pertinent terms, with “GA Schmidt” as a control:

    “fluid dynamics”: DP Young 24 hits, GA Schmidt 17.
    “numerical instability”: GA Schmidt 2, DP Young 0.
    “climate models”: GA Schmidt 71, DP Young 0.

  192. dhogaza:


    From what has transpired here, though, he hasn’t done much homework about climate modeling before presuming to lecture Gavin.

    That’s the point … it’s clear he has a good theoretical grounding in the subject but he’s done nothing to show that the *potential* problems that can arise from explicit methods actually apply to the climate models he claims are broken. The fact that at least some groups have moved from explicit to implicit to explicit methods makes it clear that Young’s claims that climate modelers are ignorant of such potential problems is bogus. It seems clear enough that they’re making informed decisions …

  193. Hank Roberts:

    Well, his more nuanced contributions are at JC’s place, e.g.: http://judithcurry.com/2011/10/02/wedges-reaffirmed/#comment-117764

  194. wili:

    Hank, thanks, but I did actually already search Google Scholar with those words, and nothing of relevance comes up, at least not in the first few pages. So if you have an actual article, or better terms to search under, that would be helpful.

    It would seem to me that helping the world better understand what is in store–as the planet changes from one with a mostly icy Arctic all year round to one where the Arctic is increasingly more open than icy for more and more of the year–would be a central function of a site such as this (rather than constant bantering with obvious trolls).

    Again, any real help toward finding such studies would be appreciated.

  195. wili:

    Sorry if that last post came off as snarky. I did find this pdf: http://www.arctic.noaa.gov/future/docs/ArcticAND_Globe.pdf
    and I can probably find some more recent work by searching under the authors’ names that are cited there.

    I would still be interested in any suggestions.

  196. Hank Roberts:

    Wili, did you notice these, in the first page of results to your question?

    “… periods of Arctic amplification are evident from analysis of both warm and cool periods over at least the past three million years. Arctic amplification being observed today is expected to become stronger in coming decades, invoking changes in atmospheric circulation, vegetation and the carbon cycle, with impacts both within and beyond the Arctic….”

    Arctic Warming Ripples through Eurasia
    JA Kelmelis – Eurasian Geography and Economics, 2011 – Bellwether Publishing
    … One of the largest changes that can be expected is a drastic alteration of marine ecosystem composition …

    Aren’t those along the lines you’re asking about?

  197. Pete Dunkelberg:

    Hank @ 190:




  198. Hunt Janin:

    Re sea level rise:

    For my book on this subject, I’d like to know when and where the first scientific measurements of sea level rise were made. Any ideas?

  199. harvey:



  200. Hank Roberts:

    reports on http://www.nature.com/ngeo/journal/vaop/ncurrent/full/ngeo1282.html

    Nature Geoscience | Letter
    Solar forcing of winter climate variability in the Northern Hemisphere
    Received 18 April 2011
    Accepted 07 September 2011
    Published online 09 October 2011

    More links in the article at New Scientist:
    —excerpt follows—

    “The authors emphasize that cooler temperatures in Northern Europe are accompanied by warmer ones further south, resulting in no net overall cooling. “It’s a jigsaw puzzle, and when you average it up over the globe, there is no effect on global temperatures,” Adam Scaife, head of the UK Met Office’s Seasonal to Decadal Prediction team, told BBC News.

    The UV measurements could lead to better forecasting. “While UV levels won’t tell us what the day-to-day weather will do, they provide the exciting prospect of improved forecasts for winter conditions for months and even years ahead. These forecasts play an important role in long-term contingency planning,” Ineson told Reuters.

    The scientists emphasised that several other factors, such as declining levels of sea ice and El Nino, may have played a role in the unusually chilly winters, reports The Independent, which quotes Ineson as saying: “There are a lot of different factors that affect our winter climate. However, the solar cycle would probably have been acting in a way that gave us those cold winters.”

  201. AIC:

    Re 200:

    good coverage at:

    “…The researchers emphasise there is no impact on global warming.”

  202. Hank Roberts:

    BBC’s quotes include a bit more:
    –excerpt follows—-

    “The one big caveat is whether SIM’s data is accurate.

    Scientists in the field appear to believe it is – but as the UV changes it sees are so large compared with previous methods, they would prefer confirmation.

    Commenting in Nature Geoscience, Katja Matthes from the Helmholtz Centre in Potsdam, Germany, describes the results as “intriguing, albeit somewhat provisional”.

    “The trends seen in the SIM observations are still under discussion and remain to be confirmed,” she writes.

    She also points out that SIM measures only a proportion of the ultraviolet region of the spectrum….”

  203. wili:

    But hasn’t sunspot activity been rapidly increasing this year? So shouldn’t that mean a warmer winter in Europe and parts of the US?

  204. Paul Tremblay:

    Has anyone followed the debate between skeptical science and Pielke Sr here:


    Perhaps as a non-scientist I am missing something, but it strikes me that Pielke has to be one of the most dishonest debaters ever. He makes a claim that CO2 forcings are only 20%. Other posters point out specifically why he is wrong. He evades, throws out irrelevant information, and finally concludes with the statement that no one really knows for sure and the question is not important, so let’s move on. When cornered, he won’t answer directly, but raises mostly irrelevant questions.

    Am I missing something, [edit]?

  205. Hank Roberts:

    Wili, the story above is reporting a brand new result, a surprising one, that has not been confirmed yet from other data and instruments.

    So there’s no good way to make a prediction about the weather based on it.

    What if adding more ultraviolet changes aerosols to make the sky hazier and more reflective? Could that happen? Googling speculatively turns up:

    “It is hypothesized that many substances in the atmosphere, including volatile organic compounds (VOCs) and their oxidation products, very fine particles and others absorb and/or utilize UV energy. The long-term UVI trends and its main controlling factors in four seasons during the previous 2 decades are discussed, UV energy consumption by atmospheric chemical and photochemical processes, is especially important during summer….”
    Analysis of ultraviolet radiation in clear skies in Beijing and its affecting factors

    Jianhui Bai, LAGEO, Institute of Atmospheric Physics, Chinese Academy of Sciences, Beijing 100029, China
    24 September 2011

    Joan Baez once said something like: “if you are faced with a choice between a hypothetical situation and a real one, choose the real one.”

    We don’t know yet.

  206. Daniel Bailey:

    Any who choose to participate in the discussion threads with Dr. Pielke at SkS, please show restraint and exercise decorum. He is a guest and respect is to be given.

    Thank you.

  207. David B. Benson:

    wili @203 — There is a lag between the sunspot cycle and warmth/coolth just as there is a lag between solar insolation and the warmest/coolest parts of the annual cycle.

  208. ldavidcooke:

    RE: 205

    Hey Hank,

    I believe that the greatest influence of the added 11-yr-cycle UV may be isolated to the N. Jet Stream pattern. I have no source other than 8 yrs of NOAA, NCDC, SRRS observations, but the interesting things I have seen emerge over the last 11 years are the changes in the Walker Circulation and the N. Jet Stream’s seasonal elliptical pattern. I believe this may also have been suggested by some of the researchers associated with the Hadley Centre UV/SIM story.

    The problem is trying to associate which is which. Is the extra volume contained in the current N. Jet driven by UV heating or by cross-zonal equatorial heat flow? Which I believe returns the discussion to what drives the equatorial heat increases: is it a reduction in cloud cover caused by CO2 warming, aerosol population changes, or added tropopause UV energy reducing high-altitude wv condensation?

    VOCs not withstanding, have you any insights you can share?

    Dave Cooke

  209. wili:

    Thanks, David. That makes sense.

    Sun spots do seem to be increasing rather rapidly, though, so maybe we can expect a somewhat milder MN winter in 2012-2013? (I know, I know, the study is only preliminary and other factors play large roles.)




  210. Ray Ladbury:

    I would note that there is a piece on climate denialism over at the American Institute of Physics website:


    It has already been visited by the sorts of idiots that think you can even lie to physicists about how science works.

  211. Former Skeptic:

    Daniel @206:

    I agree with your call for civility, and would like to thank you and the other folks over at SkS for being so persistent and patient in the RPSr. threads.

    Having said that, RPSr.’s answers – or rather the lack of them – are very revealing. While I would not go as far as to call him “dishonest”, as Paul (@204) opines, his evident evasiveness in the face of the very direct, cogent and polite questions from dana, Albatross, Tom Curtis et al. is both disappointing and expected (given past evidence both here and elsewhere on the intertubes).

    Good luck in continuing the discussion, though I have a sneaking suspicion that RPSr. will soon leave in a huff again upon another self-perceived slight on his person.

  212. Hank Roberts:


    Pumping CO2 into hot rock, deep enough it acts as a supercritical fluid — then drawing some back out to run turbines, then reinjecting the cooled gas. Still takes energy overall to sequester the CO2, and transfers some geothermal heat to the surface — but uses the heat in the reservoir. New idea.

  213. caerbannog:

    Folks attending the GSA meeting in Minneapolis might be treated to a freak show tomorrow outside the main entrance of the conference center.

    Check out http://www.minnesotansforglobalwarming.com/m4gw/2011/10/michael-mann-is-coming-to-town.html for details (if you can stomach it).

  214. Ray Ladbury:

    Of course sunspots are increasing–we’re moving into solar max. To my knowledge, though, the Sun is still below normal activity for this portion of the cycle. However, given the much prolonged solar min of the last cycle, it’s hard to say what normal should be for this one.

  215. Hank Roberts:

    > wili
    > maybe we can expect … Minnesota

    Why would you expect a correlation between solar cycles and winters in Minnesota? Nobody seems to have published the idea anywhere. Do look, maybe I missed something.

    I did find a contrary suggestion (from Lockwood, Harrison, Woollings and Solanki, speculating about cooling):

    Environ. Res. Lett. 5 (April-June 2010) 024001

    Are cold winters in Europe associated with low solar activity?


    “… this is a regional and seasonal effect relating to European winters and not a global effect.”

  216. Sphaerica (Bob):

    Gavin et al

    I’d like to make a request for a post, if you don’t mind. I frequently run into people with gross misconceptions about how the models work. They are insistent about all of their misconceptions. I do point them (with glee) to the excellent two part post here at RC on climate model FAQs.

    But it would be nice (both fun to read, and useful for those people who don’t understand) if you could provide an English language synopsis of the actual structure of the specific models (and model components) that you work with. In business computing this would be termed a “Functional Design,” a document that describes (for the layman) how the programs work without getting into the nitty-gritty and getting bogged down in the actual computing details.

    It would be a great read, and it would also be a great resource for refuting those people who think that climate models work by tweaking parameters and forcing temperatures to rise with CO2 levels through presumed correlations, rather than through the cumulative effects of individually modeled physical relationships.

    Please consider doing so, if you have the time.

  217. Hunt Janin:

    I second the point made in 216 above.

  218. Richard Bird:

    Sphaerica: that would be an excellent idea and really helpful.

    I have a specific problem in accepting this version of modelling as stated by the Skeptical Science site:

    Amplification: ” How does this work? The amount of water vapor in the atmosphere exists in direct relation to the temperature. If you increase the temperature, more water evaporates and becomes vapor, and vice versa. So when something else causes a temperature increase (such as extra CO2 from fossil fuels), more water evaporates. Then, since water vapor is a greenhouse gas, this additional water vapor causes the temperature to go up even further—a positive feedback.

    How much does water vapor amplify CO2 warming? Studies show that water vapor feedback roughly doubles the amount of warming caused by CO2. So if there is a 1°C change caused by CO2, the water vapor will cause the temperature to go up another 1°C. When other feedback loops are included, the total warming from a potential 1°C change caused by CO2 is, in reality, as much as 3°C ”

    My problem is with the statement: “If you increase the temperature, more water evaporates and becomes vapor”.

    The author is not implying a direct causal link from CO2 to WV content. He is saying that a temperature increase causes an increase in WV content. But that would hold for ANY global temperature increase, even a natural one. That would imply:

    A natural rise in temperature produces more water vapour – which increases temperature – which produces more water vapour – which increases temperature – and so on…. a runaway feedback effect which must lead eventually to either atmospheric saturation or heat catastrophe. I assume this doesn’t happen in reality. (!)

    My understanding of the physics involved is that warmer air has the CAPACITY to hold more water vapour. That is all. Can someone clarify how the principle of Amplification is supposed to work, and why there is not a runaway chain?

  219. tamino:

    Pursuant to Sphaerica (Bob)’s request for a post on the structure of models…

    Kate (of ClimateSight) did such a study as a summer project (she’s studying to be a climate scientist). I believe she will even have a poster presentation on the topic at the upcoming AGU meeting. Perhaps she would be happy (perhaps even delighted) to contribute a guest post to RC, or perhaps do one in collaboration with RC regulars (Gavin?).

    Just a thought…

  220. dhogaza:

    Richard Bird:

    A natural rise in temperature produces more water vapour – which increases temperature – which produces more water vapour – which increases temperature – and so on…. a runaway feedback effect which must lead eventually to either atmospheric saturation or heat catastrophe. I assume this doesn’t happen in reality. (!)

    You’re making the common improper assumption that a positive feedback must inevitably lead to a runaway feedback. Do some googling for stuff like “convergent series positive terms” …

  221. Hank Roberts:

    For Richard Bird
    You’re asking a frequently answered question (it might even be answered in the FAQs under the “Start Here” link at the top of the page; try there).

    Briefly: water vapor condenses making clouds.
    Clouds (some kinds at some altitudes) increase albedo, reflecting incoming sunlight away.

  222. Maya:

    Richard, the amplification is limited because the atmosphere does not have an infinite capacity to hold water, and you aren’t applying an unlimited forcing. Think of it in terms of a pot of water with a candle under it – it gets warm to a certain point, but no more, even if you just leave the candle there. Putting the lid on the pot will keep some of the heat trapped, but you still reach a point where the water isn’t going to get any hotter.

    Of course, the planet is a lot more complex than a pot of water. :)

  223. Sphaerica (Bob):

    218, Richard Bird,

    Just to clarify what others have already said, you said:

    [The author] is saying that temperature increase causes an increase in WV content. But that would imply that for ANY global temperature increase, even a natural increase.

    This is absolutely true. Feedbacks are a result of temperature change, regardless of the forcings behind that change.

    The water vapor feedback is, in fact, an important element that greatly increases climate sensitivity.

    You go on to say:

    A natural rise in temperature produces more water vapour – which increases temperature – which produces more water vapour – which increases temperature – and so on…. a runaway feedback effect which must lead eventually to either atmospheric saturation or heat catastrophe.

    This is incorrect, as dhogaza and Hank both point out.

    There are multiple factors, but in a nutshell consider a diminishing series. Using made up numbers, what if the increase in water vapor results in half as much warming as a previous increase in water vapor? You have:

    1 + 1/2 + 1/4 + 1/8 + … = 2

    The point is that if the increase in temperature ∆T1 from an increase in water vapor ∆V0 from an initial increase in temperature ∆T0 is not as great as the initial increase (i.e. ∆T1 < ∆T0) then you do not have a runaway and so do not have a problem.

    And, as Hank points out, there are other factors (such as increased condensation, creating increased clouds, creating increased albedo).

  224. Meow:

    @220: Or, to state it simply right here: doubling CO2 directly causes a temperature jump dT1. That jump makes the atmosphere take up water vapor sufficient to raise the temperature an additional dT2. That jump, in turn, causes the uptake of more water vapor sufficient to raise the temperature an additional dT3, and so on indefinitely. But dT3 < dT2, and dT4 < dT3, etc. Since each term is smaller than the last, the series of temperature increases dTx converges to a finite value, much as the series

    1 + 0.5 + 0.25 + 0.125 + ...
    converges to 2.
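    The convergence argument can be put in closed form (standard geometric-series algebra; the symbols f and ΔT₀ are my shorthand, not from the comments above): if each round of feedback adds a fixed fraction f of the previous round’s warming, then

    ```latex
    \Delta T_{\text{total}} \;=\; \Delta T_0 \sum_{n=0}^{\infty} f^{\,n}
    \;=\; \frac{\Delta T_0}{1-f}, \qquad |f| < 1 .
    ```

    With f = 1/2 this gives 2ΔT₀, matching the 1 + 0.5 + 0.25 + … = 2 example; as f approaches 1 the total grows without bound, which is the runaway case.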

  225. CM:

    Bob #216 — excellent suggestion.

    Meanwhile, I think this lovely paragraph from an otherwise dry review article deserves wider readership:

    For a balanced view, it is useful to watch an animation of the output of such a model, starting from an isothermal state of rest with no water vapor in the atmosphere and then “turning on the sun,” seeing the jet stream develop and spin off cyclones and anticyclones with statistics that closely resemble those observed, watching the Southeast Asian monsoon form in the summer, and in more recent models, seeing El Niño events develop spontaneously in the Pacific Ocean. (Held and Soden, “Water Vapor Feedback and Global Warming”, Annu. Rev. Energy Environ. 2000. 25:441–75)

    (Question: Where can one watch an animation like that? Preferably with blow-by-blow commentary pointing out the interesting features to an untrained eye?)

  226. Meow:


    The point is that if the increase in temperature ∆T1 from an increase in water vapor ∆V0 from an initial increase in temperature ∆T0 is not as great as the initial increase (i.e. ∆T1 < ∆T0) then you do not have a runaway and so do not have a problem.

    Not quite. What matters is not the relationship between ∆T0 (caused by CO2 in this example) and ∆T1 (caused by the “first round” of WV increase), but that between ∆T1 and ∆T2, ∆T2 and ∆T3, etc.: as long as ∆T2 < ∆T1, ∆T3 < ∆T2, etc., the feedback converges to a finite value. This is true even if ∆T1 > ∆T0. So, purely hypothetically, if doubling CO2 causes ∆T0 = 1K, and that causes ∆T1 = 1.5K, and that causes ∆T2 = 0.75K, etc., the total resulting ∆T is 4K.
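    Meow’s hypothetical numbers can be checked directly (a throwaway sketch; the 1 K, 1.5 K and ratio-½ figures are the purely hypothetical ones from the comment above):

    ```python
    # dT0 = 1.0 K directly from CO2; the first water-vapor round adds 1.5 K,
    # and every subsequent round adds half of the previous one.
    dT0, dT1, ratio = 1.0, 1.5, 0.5

    total = dT0
    term = dT1
    for _ in range(200):        # plenty of terms for the series to converge
        total += term
        term *= ratio

    # Closed form: dT0 + dT1 / (1 - ratio) = 1 + 1.5 / 0.5 = 4 K.
    print(round(total, 6))      # → 4.0
    ```

    This confirms the point: ∆T1 > ∆T0 is harmless as long as the ratio between successive rounds stays below 1.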


  227. David B. Benson:

    For those who find that the climate model FAQs are not enough, try “A Climate Modelling Primer” by Henderson-Sellers.

  228. pmp5:

    I know it isn’t your job to address what happens on other blogs, but I was wondering if somebody here would comment on this.

    On her blog, Judith Curry stated:

    “The deep uncertainty is associated with our reliance on projections from climate models, which are loaded with uncertainties and do not adequately treat natural climate variability.”


    I was wondering if there was any work really addressing the last point. Do models adequately treat natural climate variability? Does the natural climate have significantly more or less variation (variance?) than the models?

    I asked on her site, and I got a response related to mean values, but I know you can model means without doing a good job of modelling variation in other fields.


    [Response: It’s not really clear what she is referring to. Climate models obviously do have internal variability and also respond to natural forcings. Whether that is ‘adequate’ depends entirely on what the purpose is. If you are looking to find the climate change signal in the noise, the models have a range of noise levels (which encompass the variability seen in the real world) and you can test how sensitive your attribution is to the different amounts of internal variability (see Santer et al, 2009; 2010 for instance). If you are interested in the sensitivity of internal modes to different forcings, I’d say the models were adequate for some things – the NAO, the Southern Annular Mode – and not yet adequate for something like ENSO. Without a little context or specifics it’s hard to say more. – gavin]

  229. Bob Loblaw:

    Another way of visualizing feedback – think of a microphone and speaker in a large hall. A person talks into the microphone, the voice comes out over the speaker, and gets picked up again by the microphone. Total amount of noise is more than just the single event of the first broadcast of the person’s voice.

    How much noise? It depends. Each time the microphone picks up the speaker’s output, the sound gets fed through the amplifiers and sent back out the speakers. If each pass through the amplifier is smaller than the previous one, we get a nice echo effect and a finite amount of sound. If each pass is bigger than the previous one, we get that horrible ear-shattering squealing sound we’ve heard at so many weddings. That’s the run-away feedback.

  230. Joe Cushley:

    Interesting, almost throwaway comment about a change in climate in the 14th century exacerbating the course of the Black Death Plague in Europe.


    But I thought that was during the Mediaeval Warm Period?! ;-)

  231. prokaryotes:

    Last month i posted my first Tshirts i designed here, with related climate themes. I tweaked the designs now and just released “Climate”.

    It features the following text: “The oceans are warmer today than they were 30 years ago, means there’s about, on average, 4 percent more water vapor lurking around” – Kevin Trenberth, Scientist

    Feedback is welcome, especially on what you think would make other great “text additions” relating to climate science, with an educational aspect.

    Currently only available in Europe but soon in the USA as well.

    Have a look

  232. Ray Ladbury:

    OK, pet peeve here. It drives me absolutely nuts when “dissenting” scientists like Roger, Judy and Roy make vague, unverifiable, unfalsifiable statements about the skill of climate models or other supposed shortcomings in the case for anthropogenic causation. It is utterly unscientific. It falls into the category of what Pauli decried as “so bad it’s not even wrong!” It sounds more like a theological debate than a scientific discussion.

    I realize that they have no model and can therefore make no predictions, but is this really the best they can do?

    Vague is worse than wrong. Vague is bu****it

  233. SecularAnimist:

    Gavin wrote re: Judith Curry: “Without a little context or specifics it’s hard to say more.”

    Denigrating “climate models” without context or specifics is a common denier tactic. Judith Curry’s sweeping, context-free, unspecific generality will no doubt be copied-and-pasted all over the web within hours by people who don’t have the slightest idea what she’s talking about.

  234. Richard Bird:

    Hank and others: Thanks for the responses on amplification. All clear, the essential factor is the initial ∆T1. Whether the series will converge or diverge depends on that value. (As pointed out elsewhere, if other factors such as albedo, ocean circulation etc. are involved, sufficient to produce ∆T1=1, then the ‘runaway’ scenario becomes real. Well, fingers crossed on that for now I suppose…)

    My next question was to have been: Is an amplification factor also applied in models to natural forcings such as solar irradiance, and if so is that well estimated? The obvious point being, if net solar forcing post 1900 is under-estimated, then the CO2 forcing may be over-estimated. However I see there has been a lot of discussion around that in 2005: http://www.realclimate.org/index.php/archives/2005/12/natural-variability-and-climate-sensitivity/#comments

    So I will change my question to: has anything new developed around that subject since 2005? eg on cloud feedbacks?

    A purely empirical observation here: London. Oct 2 2011 (near the fall equinox so giving a median picture) Blue sky, little wind; 29 deg C. Oct 4 2011, overcast with low cloud, little wind: 19 deg C. 10 deg change presumably due to cloud cover. I observed similar variations while driving from London to South of France on Oct 5 & 6, passing from blue sky to thin cloud, heavy cloud and blue sky again all day. In the space of 10-20 miles each time: Thin cloud = 2 deg drop, heavy cloud = 4-5 deg drop. Those were local variations over very short timescale. So over 24 hours, 10 deg drop seems right. Not very scientific but interesting.

    Empirically, if thick cloud causes a 10 deg drop, a 2.5% reduction in low cloud cover -> 0.25 deg C ∆T.

    IPCC energy balance diagram: reflection of solar energy + aerosols etc. = 77 W/m². Rough guess, 70 W/m² reflection by clouds alone? Assumed AGW effect is 1.7 W/m². Empirically, reduction of cloud cover by 2.5% -> 1.7 W/m² increase in energy at the Earth’s surface. By all means shoot me down here, but a simplistic ‘back of envelope’ estimate of a 2.5% variation in area of cloud cover would account for most of the ‘gaps’ of 1.7 W/m² and 0.25 deg C between estimated natural warming and recorded warming as set out in Hansen’s 1988 paper. (I think the figure was lower than 1.7 W/m² in 1988 but I don’t have the exact figure.)
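    For what it’s worth, the envelope arithmetic as the commenter states it (a sketch of his numbers only; the attribution to cloud cover is questioned in the replies that follow):

```python
# Richard's stated numbers: ~70 W/m^2 reflected by clouds (rough guess),
# a 2.5% change in cloud area, and a 10 deg C clear-vs-overcast difference.
cloud_reflection = 70.0   # W/m^2, assumed cloud share of the ~77 W/m^2 reflected
delta_cover = 0.025       # 2.5% change in cloud area

print(cloud_reflection * delta_cover)  # 1.75 W/m^2, near the quoted 1.7 W/m^2
print(10.0 * delta_cover)              # 0.25 deg C
```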

    I would be interested to know if there is any reliable data on global area of cloud cover over last 40-50 years, and whether any correlation with either solar activity or temperature is apparent.

  235. Hank Roberts:

    > the ‘runaway’ scenario becomes real. Well, fingers crossed

    Nope. Try one of these for more help:

    Seriously, it seems you’re retracing the steps many others have already followed, starting with ideas that either you’re coming up with on your own, or that come from somewhere people are publishing stuff that’s not reliable.

    It’s no surprise logic and deduction fail; nobody knows all the basics needed to figure out what’s going on just by thinking without looking stuff up.

    But, really, reading the Start Here and FAQs will avoid a lot of the most commonly encountered misapprehensions.

    The traditional Usenet method — “post what you know and await correction” — invites a lot of recreational typing, but isn’t all that efficient.

  236. Ray Ladbury:

    Richard Bird,
    But clouds also warm–it depends on which clouds when and where.

  237. Septic Matthew:

    228, gavin inline: Without a little context or specifics it’s hard to say more. – gavin]

    It was the quote that was removed from its context. For the context, click on the link. I reproduce it here:


  238. Hank Roberts:

    For Richard Bird:
    Tried this search (you can improve on it; look for key words, narrow the range, try Scholar): http://www.google.com/search?q=data+on+global+area+of+cloud+cover

    From the first few hits, this:
    which says:

    “The short ISCCP time record, covering only about 22 years, shows some signs of a slower variation. Cloud amount increased by about 2% during the first three years of ISCCP and then decreased by about 4% over the next decade. ISCCP began right after one of the largest El Ninos this century (in 1982-83) and the eruption of the El Chichon volcano, both of which may have caused some changes in clouds. There were other, weaker El Ninos in 1986-87, 1990-91 and 1992-94 and another volcanic eruption (Mt. Pinatubo) in 1991. Note however, that there appear to be no related changes in cloud top pressure/temperature or optical thickness (the small change in 1991 is caused by including the extra sunlight reflected by volcanic aerosol with clouds which affects the cloud top pressure/temperature determined for very thin cirrus clouds). The surface temperature shows a decrease of about 1 K from 1983 to about 1991. Since there a number of events occurring during this time period, the cause of these cloud variations is not yet understood.

    Such variations are referred to as “natural” variability, that is the climate varies naturally for reasons that are not fully understood. The problem for understanding climate changes that might be produced by human activities is that the predicted changes are similar in magnitude to those shown here. The difference between natural and human-induced climate change will only appear clearly in much longer ( >= 50 years) data records….”

    As to why you need more than 50 years, with that particular data set, see generally the method explained at:


  239. Kevin McKinney:

    #230–To respond more or less seriously to a jocular comment:

    Interesting, almost throwaway comment about a change in climate in the 14th century exacerbating the course of the Black Death Plague in Europe.


    But I thought that was during the Mediaeval Warm Period?! ;-)

    Yes, but the end of it.

    Don’t know about the much-touted English wine industry, but those Greenland Norse felt an increasing climatic pinch during the 14th century. (For example, the more northerly “Western settlement”–around present-day Nuuk (formerly Godthab)–was abandoned during this time.) Last known contact with Scandinavia (before the modern era, of course) was 1410.

  240. Chris Colose:

    Richard Bird,

    All of the answers so far on why positive feedbacks can converge are presented in a linear framework of feedback analysis, and are actually incapable of yielding useful information on whether a “runaway” effect occurs as the traditional feedback factor goes to one, where the linear analysis becomes invalid.

    Typically, we define some reference system, such as the Planck restoring feedback (i.e., the increased radiant emission to space in a warmer world, or less emission on a cooling planet) and then define feedbacks relative to that. In a linear world view, if we call the temperature anomaly ∆T0 for a planet where only the Planck radiative restoring response were active (usually ~1.2 C per doubling of CO2), then the real temperature anomaly is ∆T = ∆T0/(1-f), where f is a feedback factor that depends on how much a particular variable (such as water vapor) changes with temperature, and on how much that change actually impacts Earth’s radiation balance. If f ~ 0.5 then climate sensitivity is doubled, but OLR still increases with temperature and you converge to a stable solution.

    Physically, you can think of 1-f as being related to the slope of a line in G vs. T space, where G is the net TOA energy balance (outgoing thermal emission – incoming absorbed sunlight) and T is the surface temperature; note that G increases with T. See Figure 3.1 here to see this visually. The less steep the slope, the higher the climate sensitivity, since the surface temperature must rise more in order for the planet to re-establish radiative equilibrium. Water vapor feedback makes the outgoing radiation more linear than T^4, so the Planck restoring feedback still wins out in the end (allowing for stability), but it isn’t as effective.

    More generally, when f reaches one, you reach a bifurcation point but you have to know something about the global structure of the energy surface in G-T space, rather than the local departure from a stable equilibrium point. When f goes to one, it doesn’t need to be a “runaway” effect (but it could be). This would happen if the absorbed solar radiation were high enough, and you reached a point where the water vapor feedback were strong enough so that OLR did not increase with temperature.
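    The linear relation ∆T = ∆T0/(1−f) described above is easy to tabulate. A minimal sketch, taking the ~1.2 C no-feedback Planck response per CO2 doubling mentioned in the comment:

```python
def equilibrium_dT(f, dT0=1.2):
    """Linear feedback analysis: dT = dT0 / (1 - f), valid only for f < 1."""
    if f >= 1.0:
        raise ValueError("f >= 1: linear analysis is invalid (possible runaway)")
    return dT0 / (1.0 - f)

# Climate sensitivity grows nonlinearly as the feedback factor approaches 1
for f in (0.0, 0.25, 0.5, 0.75):
    print(f, round(equilibrium_dT(f), 2))
# f = 0.5 doubles the 1.2 C no-feedback response to 2.4 C
```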

  241. Septic Matthew:

    234, Richard Bird: I would be interested to know if there is any reliable data on global area of cloud cover over last 40-50 years, and whether any correlation with either solar activity or temperature is apparent.

    Here is a start, though it isn’t cloud cover per se , but the change in forcing effected by the cloud cover. At least I think it’s a start.


  242. dhogaza:

    Richard Bird:

    A purely empirical observation here: London. Oct 2 2011 (near the fall equinox so giving a median picture) Blue sky, little wind; 29 deg C. Oct 4 2011, overcast with low cloud, little wind: 19 deg C. 10 deg change presumably due to cloud cover.

    Why presumably due to cloud cover? Why not an influx of cool marine air leading to the switch from clear skies to clouds?

    While thinking about this, please re-read Hank’s post 235 above …

  243. David B. Benson:

    Bacterial Communication Could Affect Earth’s Climate, Researchers Discover

    Biogeochemistry is certainly complex.

  244. Edward Greisch:

    Do you know anything about a botched meeting on the Amazon a la
    I don’t know anything about the alleged meeting, so I need some help.

  245. Paul Briscoe:

    I hope you don’t consider this too far off topic, but I am having an exchange with a very persistent blogger who claims that CO2 and temperature changes during transitions between glacial and interglacial periods are inconsistent with established theory. Please can you confirm whether my argument is correct:

    CO2 can only exert a forcing if its atmospheric concentration changes. The larger the change, the larger the forcing.

    If the rate of change in CO2 concentration is very slow, global temperatures can almost keep up with the effects of the changing CO2, so the net forcing at any point in time is CLOSE TO ZERO.

    The difference in CO2 levels between glacial and interglacial periods was at most 100ppm (and at least 100ppm lower than today) and the change took place over thousands of years in response to the gradual changes in ocean temperature. Consequently, the very gradual increase in CO2 levels could never have exerted sufficient warming effect (it’s strictly a feedback in this case) to overcome the effects of other changes due to albedo and Milankovitch cycles. Instead, its concentration simply rose and fell in response to the ocean temperature changes caused by other things.

    So in that particular scenario, CO2 could only ever have amplified the effects of other things (indeed, the large changes in temperature cannot be explained without it).

    I think the point you’re missing is that CO2 cannot “protect” the Earth against cooling due to other processes unless its own concentration is rising quite fast…… whereas going into a glacial period it would actually be falling due to falling ocean temperature, hence amplifying the cooling.

  246. Kevin McKinney:

    #244–Ed, you remember the 2009 Copenhagen Conference! What’s confusing you is that Revkin doesn’t make clear that he’s only talking about a sub-event dealing with the Amazon; the larger conference dealt comprehensively with climate change issues, from sea level rise to–well, just about any related issue you choose:


    #245–I’m an amateur, Paul, but I’m sorry to say that I think you’re wrong. “Forcing” usually denotes a term in the energy budget during a particular time, and need not be changing–for instance, insolation is a forcing, and varies only within relatively tight limits. (Indeed, they used to call it the “solar constant.”)

    You are right, of course, that CO2 changes during glacial cycles are feedbacks, but that’s not because they are slow, it’s because they are driven by temperature changes initially.

    And according to RC’s own Dr. David Archer, it’s quite possible that human-emitted CO2 could protect the planet from the next glacial:


    Of course, that would be a (very) long-term benefit; we would first have to suffer through the much nearer-term disasters in order to enjoy it.

  247. Ray Ladbury:

    Paul Briscoe,
    Your correspondent is full of fetid dingo kidneys. The forcing is proportional to the CO2 concentration, not to its change. The predominant negative feedback is radiation of IR photons from the top of the atmosphere into space–which depends on the temperature at TOA. If CO2 increases, it will take a while for the atmosphere to heat up sufficiently that the radiation out again equals radiation in.

    In any case, for glacial/interglacial cycles CO2 is a feedback, not a forcing. The main driver is small changes in insolation due to changes in Earth’s orientation, orbit, etc.–Milankovitch cycles. I wish these idjits would just learn some science.

  248. Pete Dunkelberg:

    Risk is the right term – the human term. How much damage are we risking, and how soon? Uncertainty in model projections, given a fixed emissions scenario, is in timing: a projection of a certain result plus or minus Delta X by year 2100 means it might happen as early as 2090 or not until 2110. Curry issue answered, QED.

  249. Sphaerica (Bob):

    FYI, for anyone else interested in the models, this is a tremendous read:

    Climate Models: An Assessment of Strengths and Weaknesses

    (also available here).

    In some ways, it does exactly what I was asking of Gavin, except that it is a general answer. I was asking for (and still would like to see) a discussion of one specific, actual climate model, how it is designed and how it operates.

    This link also looks to be of interest in a more scatter-shot way, although I haven’t even perused it much yet.

    [Feedback on the quality/accuracy/value of either of these links is appreciated.]

  250. Chris Colose:

    Paul (245)

    The radiative forcing from CO2 is defined as the net change in the top of atmosphere energy budget between the initial and final state, and has nothing to do with how fast it got there. If you had two identical planets in every way (same sunlight, orbital configuration, same albedo, etc) except that one had 100 ppm of CO2 and another had 1000 ppm of CO2, the latter would be warmer, and one could predict this without knowledge of the history of CO2 concentration. So even if CO2 were rising in small increments, so would the temperature.

    Also keep in mind that since the radiative forcing is proportional to the logarithm of concentration, even a 100 ppm rise can mean a lot when you start from a low background state.

    I talked more about CO2 in its glacial-interglacial role in this RC post. The CO2 role is probably small in the 41 kyr obliquity band but is large, and comparable to the modern CO2 forcing, in the 100 kyr band and helps provide for the full magnitude (and global character) of the Antarctic ice core record (see e.g., Jouzel et al., 2007, Science)
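    To put numbers on the logarithmic point, a quick sketch using the common simplified forcing expression ∆F = 5.35 ln(C/C0) W/m² (Myhre et al. 1998), which is not from the comment itself but is the standard approximation:

```python
import math

def co2_forcing(c, c0):
    # Simplified logarithmic CO2 radiative forcing, W/m^2
    return 5.35 * math.log(c / c0)

print(round(co2_forcing(280, 180), 2))  # glacial -> interglacial ~100 ppm rise: 2.36 W/m^2
print(round(co2_forcing(560, 280), 2))  # doubling from preindustrial: 3.71 W/m^2
```

The ~100 ppm glacial-to-interglacial rise thus yields a forcing comparable to most of a modern CO2 doubling, because it starts from a low background concentration.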

  251. Paul Briscoe:

    In hindsight, I should have posted about this first.

    The latest blog post from BBC Weatherman and Climate Correspondent Paul Hudson is now going viral all over the internet:

    “Met Office wakes up to solar influence on climate”


    I don’t believe that Paul has any agenda, but his wording is unhelpful to say the least. He begins his post:

    “For as long as I have been a meteorologist, the mere suggestion that solar activity could influence climate patterns has been greeted with near derision.”

    As you can probably imagine, some elements are having a field day given that this comes from a source such as the BBC!

  252. Hank Roberts:

    Paul, would you point to that “very persistent blogger” so others can look? I think you get into confusion because various definitions are used. See the fairly old post at: http://www.realclimate.org/index.php/archives/2005/09/what-is-a-first-order-climate-forcing/ for example.

  253. David B. Benson:

    Paul Briscoe @245 — This amateur states you have done quite well. There are no (large) inconsistencies.

  254. Hank Roberts:

    Another for Paul, this also from 2005:

    “… The current global mean top-of-the-atmosphere (TOA) radiative forcing concept with adjusted stratospheric temperatures has both strengths and limitations. The concept has been used extensively in the climate research literature over the past decades and has also become a standard tool for policy analysis endorsed by the Intergovernmental Panel on Climate Change (IPCC). The concept should be retained as a standard metric in future climate research and policy. However, it also has significant limitations that have been revealed by recent research on forcing agents that are not conventionally considered and by regional studies. Also, it diagnoses only one measure of climate change (equilibrium response of global mean surface temperature). The committee believes that these limitations can be addressed through the introduction of additional forcing metrics. Table 4-1 gives a list of these metrics and summarizes their strengths and limitations. Detailed discussion of each is presented below.


    Global-annual mean adjusted radiative forcing at the top of the atmosphere is, in general, a reliable metric relating the effects of various climate perturbations to global mean surface temperature change ….”

  255. Paul Briscoe:

    Thanks everyone for your comments.

    Chris Colose, I understand the point you make regarding the definition of forcing (I’m a former microbiologist, so physics was not a speciality for me!) and I can see that if you increase CO2 concentration from A to B it will lead to a specific rise in temperature regardless of the rate of increase.
    What I’m trying to get my head around is this: in the present situation, where CO2 is rising fast, global temperature always has a lot of catching up to do to get towards equilibrium, whereas during glacial/interglacial transitions, where CO2 increased very slowly, the amount of “residual warming” from CO2 (for want of a better term) still in the pipeline at any particular point in time would surely be small. Does this make sense and if so is there a better way of making the point?


  256. David B. Benson:

    Paul Briscoe @255 — Yes, since the forcing was small the climate stayed closer to a relaxed, i.e., equilibrium, state at most times. There were exceptions. The most notable was during the Younger Dryas, and to a lesser extent during the so-called 8.2 kya event. In addition, during meltwater pulse 1A the sea level rose so fast that the climate must have been somewhat out of equilibrium. Despite these perturbations, the general idea of a smooth S-shaped transition from LGM to Holocene over about 5000+ years generally fits well with rather simple linear model concepts; nothing fancy required to understand it.

  257. jyyh:

    #243 I knew it! erimaassa.blogspot.com/2011/02/waste.html.

  258. dhogaza:

    Sphaerica (Bob)

    In some ways, it does exactly what I was asking of Gavin, except that it is a general answer. I was (and still would like to see) a discussion of one, specific, actual climate model, how it is designed and how it operates.

    Perhaps if you were to outline what is missing from the NASA GISS Model E documentation and links to underlying papers (for the physics), maybe they could add documentation that would help you out?

    Or are you asking about the model framework (i.e. the way it manages timesteps etc). For me, that’s the easy part, maybe you’d like to see a primer on how discrete time-step modeling works and is managed in software?

  259. Hank Roberts:


    —excerpt follows—

    “… Kiehl ran his model of the ancient climate with clean skies, and found that the cold-pole problem largely disappeared. With clouds forming in unpolluted air, the poles warmed up much more than the tropics, giving a climate within a few degrees of the one that actually existed.

    The resulting model is a very good match for the Palaeocene-Eocene thermal maximum, says Paul Pearson of Cardiff University in the UK, who uses fossil animals to study past climates. Pearson says Kiehl’s model is the first to reproduce the temperature distribution revealed by the fossils.

    “It’s reassuring,” Kiehl says. “If this is the explanation, there isn’t anything drastically wrong with our climate models.”

    Kiehl presented his work on Tuesday at a Royal Society meeting on warm climates of the past in London.

  260. J Bowers:

    Rick Perry officials spark revolt after doctoring environment report

    “The scientists said they were disowning the report on the state of Galveston Bay because of political interference and censorship from Perry appointees at the state’s environmental agency.

    By academic standards, the protest amounts to the beginnings of a rebellion: every single scientist associated with the 200-page report has demanded their names be struck from the document.
    But scientists said they still hoped to avoid a clash by simply avoiding direct reference to human causes of climate change and by sticking to materials from peer-reviewed journals. However, that plan began to unravel when officials from the agency made numerous unauthorised changes to Anderson’s chapter, deleting references to climate change, sea-level rise and wetlands destruction.
    Mother Jones has tracked the changes.”

    Mother Jones: Perry Officials Censored Climate Change Report

  261. Hunt Janin:

    Can anyone give me the date and details of the first scientific study of sea level rise?

  262. J Bowers:

    More background on the censored Texas report, which was a continuation of ongoing research including peer reviewed:

    The Holocene evolution of the Galveston estuary complex, Texas: Evidence for rapid change in estuarine environments. Anderson et al (2008). Geological Society of America

    Apparently, the agency staff had approved the report, but the managers directly appointed by Perry were the ones who censored it.

  263. Hunt Janin:

    I’m not a scientist but do have a background in international relations and in history. This leads me to believe that there is virtually no chance that global warming can be slowed appreciably, let alone stopped, in the foreseeable future. If so, what will the consequences be?

  264. Hank Roberts:

    > Hunt Janin
    > … what will the consequences be? …

    Did your background lead you to expect success when international controls on chlorofluorocarbons were proposed? Have you been surprised by the result?


  265. Jack:

    How about this? Christopher Columbus was partly responsible for the Little Ice Age? Amazing.


  266. Hank Roberts:

    > Columbus
    Hm, do Nevle and Kaplan have published papers? Do they cite Ruddiman?

  267. john byatt:

    Comedy in Australia
    I love this Pickering cartoon, I’d tell em the same thing, my unprinted comment was, “the only thing missing is the thrown away sceptic papers in the bin…. Here are a few that Pickering can use,

    “leprechauns cause global warming”
    “Ice age coming Dec 2011″
    “Plants on Venus growing with extra CO2 and loving it”


  268. David B. Benson:

    Tonight there was quite a post-sunset show to the south of west. The horizon was red, above that orange and then yellow. Looking carefully I could just pick out the green band before the upper sky became blue, tending to midnight blue (as the color is called). It wasn’t an extended rainbow but in its own way even more spectacular.

  269. Hank Roberts:

    “One of the reasons CO2 fertilization may have accelerated plant growth in parts of Europe and North America over the past few decades may be the fact that we’ve inadvertently been fertilizing the plants with nitrogen, as well as CO2. We’ll actually be talking about this in GEOB400 in a couple weeks. For the 6.7 billion or so of you who were unable to register this semester – yes, yes, the class is too small, I hear that all the time – I wrote about this on Maribo a few years ago, in a cross-post with Eli and Tamino ….”

  270. Paul S:

    Hunt Janin – Can anyone give me the date and details of the first scientific study of sea level rise?

    Your best port of call for that kind of information is probably the IPCC First Assessment Report: http://www.ipcc.ch/ipccreports/far/wg_I/ipcc_far_wg_I_chapter_09.pdf

    Looks like there were a couple of studies which identified long-term sea level rise in the 1940s, possibly prompted by Callendar’s work(?).

  271. Pete Dunkelberg:

    Colombus and the LIA?
    Faust, Franz X., Cristóbal Gnecco, Hermann Mannstein, Jörg Stamm, 2006: Evidence for the Postconquest Demographic Collapse of the Americas in Historical CO2 Levels. Earth Interact., 10, 1–14.
    doi: 10.1175/EI157.1 – online at

    Don’t forget the Amazon region.

  272. Hunt Janin:

    Can anyone give me a definitive statement on the relationship between sea level rise and storm surges?

    I think I understand this relationship in general terms but want to be belt-and-braces sure before I go into print.

  273. JCH:

    Hunt Janin – at Climate Etc. Fred Moolten had a series of comments about SLR and storm surges. Can’t say they were enthusiastically received, but they made sense to me.

  274. David B. Benson:

    From several papers (I won’t attempt to provide links here) one learns that over the satellite era:
    (1) global precipitation has remained constant;
    (2) mid and high latitude precipitation has increased, globally;
    (3) semi-tropical and tropical precipitation has (therefore) decreased globally.

    I find point (3), especially the tropical precipitation decrease, averaged over the entire globe, difficult to understand. I’m not doubting the results of the observations, just not comprehending it. Takers?

  275. Hank Roberts:

    > precipitation

  276. Hank Roberts:

    “… At this point, we don’t know if the Great Filter is in front or behind us. http://en.wikipedia.org/wiki/Great_Filter [wikipedia.org]. The basic idea of the Great Filter is that the easiest explanation of the Fermi Paradox is that there’s some set of events that make life unlikely to reach the interstellar level. That could be behind us, if for example life arising is unlikely or multicellular life arising is unlikely. But at least some filtration has to be in front of us….”

    Hat tip for the observation to Joshua Z. at Slashdot, in

  277. ldavidcooke:

    RE: 276

    Hey Hank,

    The Hanson “Filter” description would appear to have multiple holes. We have been given to find several non-organic sources for amino acids that could, under specific ionic influences, form RNA bundles. By the same token there is no evidence so far to suggest that the amino acids found in stellar material may not have been organic initially. So to say that one observation is exclusive of alternate explanations is simply silly.

    As to the premise that for a given set of variables there are unknown gating factors, that should be obvious. The real work requires observation, such that for a given collection of baryonic particles the local conditions may result in a predictable fashion. It is when mankind first specifies the process and then observes the real world that knowledge becomes someone’s pipedream.

    Life as has been found on this planet is not a pipedream. However, it is predicated on several hard and fast rules. Who is to say that what we currently see could not have been built from a former experiment, destroyed in a previous nova?

    It is likely that, in the 4.8 Gy since the pre-Sol nova, some portion of the RNA-bearing material would have journeyed sufficiently far to have populated a similar life form elsewhere.

    This would suggest that if conditions are ripe for life to arise here, then it should also arise in a nearby locality. If not, it would suggest that the local conditions are truly unique.

    If local conditions are unique and we have not identified them, it suggests we have much more work to do to discover what makes the local conditions different. We are now searching for water and are contemplating changing over to liquids generally. It is more likely that a combination of temperature and radiating energy intensity will be the gating element. (Otherwise, why not look to magma for a life form?)

    Given this, life as we contemplate it is tenuous at best. Yet life abounds on this planet without regard for the gating factors we currently apply when searching for life in the cosmos.

    In the end, life here will be consumed in the photosphere of a dying star. The local conditions will have been consumed as fuel and the remains expelled from the vicinity at a very rapid rate. As we currently exist in a hydrogen-sparse part of the Galaxy, the chance of renewal of local life is small. The “Great Filter” is likely the same for us as it is for our Galaxy and eventually the Universe. If not entropy, then acceleration to beyond baryonic-matter physics rules.

    My question to you is, “What does the introduction of the “Great Filter” have to do with unforced variations”?

    Dave Cooke

  278. Kevin McKinney:

    #276–Dave, surely it’s obvious? Disastrous ecological mismanagement has been proposed as a filter mechanism more than once. In that case the filtering would certainly be in front of us. (And note that extinction is not logically necessary to achieve “filtering,” since humans could potentially survive for a very long time even if a technologically advanced civilization failed to do the same. If we aren’t ‘visible’ at interstellar distances, we don’t count vis a vis the Fermi paradox.)

    On a cheerier subject (to me, especially), a milestone: I just noticed that my summary review of Dr. Archer’s The Long Thaw has now passed its thousandth page view. One slightly muted cheer, and many thanks to RC readers who helped make that happen. . .

  279. ldavidcooke:


    Hey Kevin,

    In context, life, not civilization or social colonies…

    As to ecological filtering, even bacteria are limited by their waste products. It is diversification/population vs. predation and local conditions (resources and waste disposal) which act as the filter in the case of ecological concerns. (If you advance to discussing technical human societies, the rules remain the same; just the name of the category changes.)

    Dave Cooke

  280. Meow:

    @276: On the Fermi Paradox, I’m most convinced by two arguments. The first posits that most civilizations that develop technology that’s radio-visible over many light-years use it for only brief periods, until they devise something better — like fiber-optic cables. The second posits that only beings capable of a rare, fine balance between cooperation and competition can collaborate well enough to develop radio-visible technology. Given how we’re approaching AGW, I’d argue that humans are close to the too-competitive edge of the balance.

    CAPTCHA: detelevi Warner

  281. mike:

    [Editorial Comment by Mike Mann: Physicist Mark Boslough compares recent Einstein vindication with repeated vindication of climate science: “Some scientific principles are so powerful that contradictory data become suspect. On closer inspection, the data turn out to be wrong. Einstein’s relativity and human-caused global warming–which requires energy to be conserved–have that aspect in common. Observations of faster-than-light neutrinos turn out to be just as wrong as satellite measurements of a cooling Earth. In both cases the scientists who collected the data made a mistake.”]

  282. wili:

    I would be interested in anyone’s insights into a (maybe) hypothetical situation:

    According to Shakhova (March 2011), the methane coming from the East Siberian Arctic Shelf is about 8 million tons per year. For the purposes of simplicity I will round this up to 10 megatons. If we multiply this by the 105-times global warming potential of methane (Shindell 2006), we get about one gigaton of CO2-equivalent from this source per year (compared to about 30 gigatons of CO2 from all direct human activity).
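    The back-of-envelope arithmetic above can be sketched in a couple of lines (a check of the rounding only; the 10 Mt/yr figure and the 105 GWP are the rounded values assumed in the comment, not measured quantities):

    ```python
    # Rough check of the CO2-equivalent figure above.
    # Assumptions: ~10 Mt CH4/yr from the ESAS (rounded up from the
    # ~8 Mt/yr Shakhova estimate) and a methane GWP of 105.
    CH4_MEGATONS_PER_YEAR = 10.0   # assumed ESAS emission, Mt CH4/yr
    GWP_CH4 = 105.0                # assumed global warming potential

    # megatons -> gigatons: divide by 1000
    co2e_gigatons = CH4_MEGATONS_PER_YEAR * GWP_CH4 / 1000.0
    print(co2e_gigatons)  # ~1.05 Gt CO2-equivalent per year
    ```

    That is, roughly one gigaton CO2-equivalent per year against the ~30 Gt of direct human CO2 emissions cited.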

    Reports earlier this year were that there was a ‘dramatic increase’ in methane release from the Arctic, which prompted a sudden mission by US and Russian researchers, put together ‘on short notice,’ to investigate.

    My question is, if ‘dramatic’ here ends up being an increase of an order of magnitude, how long would it take for this forcing to significantly affect temperatures, particularly in the Northern Hemisphere?

    The answer involves at least two considerations that I can identify:

    How fast would the methane mix into the broader atmosphere?

    How long would it take for the heat to build up that would be held in the troposphere by this new quantity?

    (I suppose that whether this is a one-time emission or a new and increasing level of regular annual emissions from the Arctic would also have an effect on the answer.)

    Thanks ahead of time for any insights or speculation.

  283. Hugh Laue:

    Congratulations Gavin on your AGU prize. Your scientific integrity is inspirational.

  284. Pete Dunkelberg:

    Mike, I agree in principle but did you give the wrong link? I can’t find the quote.

    By the way general relativity (now with dark matter) also won another round:

    “Proof is in the cosmos: General relativity confirmed (again)
    New findings come from study of light from distant galaxies”

  285. wayne davidson:

    #268 David, reads like clean air… Did you see the dark streaks just above the horizon?

    Does anyone know when we will get ice charts up and running? AMSR-E chose a bad time to break down; there is some hugely interesting flushing going on.

  286. Martin Vermeer:

    Mike #281, the relativistic effect described is called the “Sagnac Effect”. As it is used in the laser gyros that are part of inertial navigation equipment, as a geodesist I should have thought of this myself. It is also a well-known correction in GPS positioning. Damn, there goes my chance of fame ;-)

    One article

  287. Pete Dunkelberg:

    Gavin @ Kate http://climatesight.org/2011/10/18/a-conversation-with-gavin-schmidt/#comments
    is so nice. But Gavin, where do you get those temperatures? (compared to other projections of transient sensitivity mentioned recently)?

    Gavin: Well, we don’t have a perfect crystal ball for exactly what “business-as-usual” means, but the kind of projections that people have been looking at – which involve quite high increases in population and minimal changes in technology – you are talking about global temperature changes, by about 2070, of somewhere between two, three, five degrees Celsius, depending a little bit on the scenario, and a little bit on how sensitive the climate actually is.

    – Pete Dunkelberg

  288. MapleLeaf:

    I suppose that someone has to waste time debunking this latest nonsense:


    High (uncritical) praise received from, surprise surprise Ross McKitrick and Bob Carter.

  289. David B. Benson:

    wayne davidson @285 — The horizon locally is a hilly area a few kilometers to the west and south. The effect wasn’t due to clean air; I know what a properly clean post-sunset should look like. The reddish lower down is due to the aerosols generated in the Willamette Valley. The more yellow-tan might actually be aerosols in the stratosphere; I’ve never seen the green before.

    I don’t recall dark streaks that time but when present I attribute those to clouds between me and the post-sunset.

  290. Dave:

    WUWT have their lead article featuring Anthony Watts attempting to reproduce an experiment in Al Gore’s Climate 101 video. Based upon Watts’s article and video, it appears unlikely that the original experiment in the Climate 101 video was genuine.

    Obviously, this neither proves nor disproves anything – other than that the very simple experiment “that can be reproduced in any High School Physics Department” will probably not show what the original video purports to show.

  291. Ray Ladbury:

    Not sure what the WTF post is meant to show. I’m not sure I’d trust Tony Watts to trim his own nose hair without giving himself a lobotomy.

  292. Dave:

    #291 Ray

    … For what it is worth, I predict that the “story” will find its way into the MSM (probably via Fox) with the angle being that this is more junk science promoted by Al Gore and that it is riddled with “untruths and misrepresentations”. This will then lead others to question why demonstrably “fake” experiments are used as “propaganda” in this way.

    This will then have fallout elsewhere in the MSM, and learned people will end up trying to explain to lay people why the theory is correct – but the experiment shown in the video is flawed, wrong, or whatever else you want to say about it. At some point in the not too distant future you will find yourselves discussing it.

    …. All of which could have been avoided if someone had done their job properly, actually done the experiment properly, checked the results properly and proved their work before committing it to being published and distributed.

  293. J Bowers:

    Re, 290 Dave.

    Linda’s first CO2 experiment

    Watts is Gore-bashing and using Gore as a proxy for his adulators to bash climate science. That’s all.

  294. Richard Bird:

    Hi. I asked this question a while ago but I didnt get a clear answer. Sorry I can’t find the original post.

    Q1: Are there any known climate models which can fully simulate using natural forcings alone the reconstructed temperature variations of approx +/- 0.5 deg C either side of the historic average, which are generally agreed to have occurred over the last 1100 years?

    And a new Q2:

    Q2: What would be a reasonable estimate for the reduction in incident solar irradiation on 1 sq m. of earth surface due to dense low level cloud cover, relative to a clear sky condition, either in W/sm or as a percentage of the clear sky irradiation. Assuming: a median date at an equinox, a median time of day say noon +/- 2 hours, a median latitude such as 45 deg North.

    Thanks in advance for any responses.

  295. Pete Dunkelberg:

    Dave, when I mentioned that mess weeks back I was pointed to this:

    As for Watts, if you read all the way to the end, Watts admits that CO2 absorbs infrared and gets hot, that this is well known, and even that slight traces of CO2 are routinely sensed to regulate room ventilation. In short, the demonstration that CO2 absorbs infrared cannot be a fraud. It is simply true.

    There are complete online Chemistry courses (part of “virtual school”) with arrangements to use a high school lab for some experiments.
    Now suppose that you videotaped a basic demonstration: A + B yields C.
    Later you notice that the first part, dissolving A in beaker 1, does not show up very well. Maybe there was reflection off the side of the beaker that you didn’t notice at first.
    So the next day you do that part over. All your beakers were properly washed and put away.
    But you use the beaker that had been first used for B, and it has a scratch. In addition there is now a small spot on your lab coat that was not there before.

    Now along comes some genius like Watts who decides he is a Video Detective and goes all picky on you and screams FRAUD!

    He is automatically wrong. A + B does indeed yield C, and you are simply demonstrating to the class how to set it up. Your video technique is beside the point.

    And in the original case here, Watts knows that CO2 + infrared yields heat. You know it too. Hence, whether Gore is personally handy with the equipment is beside the point.

    This is yet another demonstration of why no one in the reality oriented community pays attention to Watts.

  296. Pete Dunkelberg:

    What is the Walker circulation doing? Is it getting weaker or stronger? Or just moving laterally? This from Nature
    h/t Romm

    “While there appear to be many factors that govern interannual variability in east African long-rains precipitation, convective activity during [the March to June season] has steadily declined in eastern Africa for the past 30 years as the convective branch of the Walker circulation has become more active over the Indian Ocean,” the paper states.

    The authors say the IPCC report got it wrong because of the many unknowns that govern rainfall. Long rains have many inputs that are not well understood.

    Conventional modeling suggests that the tropical Walker circulation will become weaker due to climate change, resulting in more rainfall in eastern Africa. In the IPCC report, 18 out of 21 models predicted greater precipitation in the region. But recent studies, including this one, argue for a strengthening of the Walker circulation. This study uses observational data to show that the Walker circulation has extended westward, which makes precipitation more likely over the Indian Ocean and droughts the norm in eastern Africa.

  297. Septic Matthew:

    281, mike

    Any sensible person would demur and say something like “I appreciate the support, but really you shouldn’t be comparing me to Einstein. Current climate science is in any case not nearly so precise as relativity theory.”

  298. ldavidcooke:

    Re: 297

    Hey Matthew,

    The question is in the limitations of Einsteinian physics, as it improved on the work of Newton… (IE: Post event-horizon, or “dark energy” in the absence of sufficient “dark matter”.)

    As to climate science, it remains to be seen, IMHO. Have you seen much wrt photo-chemistry or work (i.e., conversion of energy) yet…?

    As wrt Dr. Mann’s comment, no doubt! The science is still young, the foundations are well laid, next is the attempt to raise the roof, (and I hope an attempt to leave out the walls).

    Dave Cooke

  299. Maya:


    Not exactly world-shattering news, at least to folks here. I am amused that WTF is already on the attack.

  300. Septic Matthew:

    2011 most transformative biofuel technology:


    It facilitates production of fuel from non-food sources and produces water as a by product.

  301. Bob Loblaw:

    Re: Richard Bird @ 294, question 2

    There is no fixed answer, because “dense” is not a fixed optical property. I’ve seen things so dark that the number is probably at most a few percent. I’ve seen heavy overcasts that let a lot of light through. It is possible to define, measure, and model optical properties of clouds so that a more specific question can be answered. “Optical depth” would be a good place to start a Google search.
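    To illustrate why “dense” has no fixed answer, here is a hedged sketch using generic textbook approximations (the asymmetry parameter g ≈ 0.85 and the simple conservative-scattering formula are standard illustrative assumptions, not values from any particular paper or the model Bob has in mind):

    ```python
    import math

    def direct_beam_transmission(tau, solar_zenith_deg):
        """Beer-Lambert direct-beam transmission through optical depth tau,
        along a slant path set by the solar zenith angle."""
        mu = math.cos(math.radians(solar_zenith_deg))
        return math.exp(-tau / mu)

    def diffuse_transmission(tau, g=0.85):
        """Crude conservative-scattering estimate, T ~ 1 / (1 + (3/4) tau (1-g));
        g ~ 0.85 is typical for water-droplet clouds."""
        return 1.0 / (1.0 + 0.75 * tau * (1.0 - g))

    # Example: thick stratus (tau ~ 30) at 45 deg N, equinox noon
    # (solar zenith ~ 45 deg).
    print(direct_beam_transmission(30, 45))  # essentially zero direct beam
    print(diffuse_transmission(30))          # roughly 20-25% gets through, diffusely
    ```

    The point of the sketch is the same as Bob’s: the answer swings from a few percent to most of the light depending entirely on the cloud’s optical depth, which is why the question needs to be posed in terms of optical properties rather than “dense”.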

  302. Witgren:

    This is making the rounds on the internet; hopefully RC can make some preliminary comments on it while it wends its way through peer review:

    “Global warming ‘confirmed’ by independent study”


    [Response: Thanks, I had not seen that yet. Amusing that Muller is surprised that their results match that of several independent groups. He could have figured this out in about five minutes, as I did (see here). Of course, then he wouldn’t be getting all the attention he is getting. Sheesh–eric]

  303. Dan:

    In case this has not already been posted, a new “independent” study “confirms” global warming:

  304. wayne davidson:

    #289 David,

    A way to judge clean air is the colour of the sun’s disk at sunset: very bright and slightly yellow was the pre-industrial-age norm – neither red nor orange, and so bright it is impossible to look at. My blog again deals with very strange high-altitude cloud streaks perpendicular to the sun’s position. I believe these are cloud precursors: massive CCN aggregates in the lower stratosphere. Until proved otherwise I call them cloud seeds. I am interested in observations of them outside of Canada. http://eh2r.blogspot.com/

  305. CM:

    Eric @302,

    But their slightly faster-warming “very rural stations” (discussion at Tamino’s) are another easy-to-grasp nail in the coffin of UHI-based denial (and looks set to be more widely publicized than Menne et al. 2010). Isn’t that a redeeming quality?

    [Posted twice. How could my reCaptcha response “rchob wayahuga” possibly be incorrect? :-) ]

  306. CM:

    On the other hand, deniers will probably just switch to blaming global surface temperature records for RHO (Rural Heat Ocean) bias. They will excoriate GISS and Hadley/CRU for hiding the Sheep Albedo decline. Watch WUWT for a new citizen science project with photo galleries of how shockingly, shoddily shorn of sheep rural stations have grown across the country.

  307. Hank Roberts:


    “From the second paper:

    The small size, and its negative sign, [of the urban heat island effect] supports the key conclusion of prior groups that urban warming does not unduly bias estimates of recent global temperature change.

    This pretty much confirms the recent work of Menne et al. (2010) and the Watts paper and will hopefully put this issue to bed. Indeed, the third paper, which specifically mentions Watts in the abstract in relation to SurfaceStations, then goes on to show that US station quality makes little impact on the recorded trend….”

  308. Kevin McKinney:

    “How could my reCaptcha response “rchob wayahuga” possibly be incorrect? :-)”

    Well, it certainly SOUNDS authentic. . .

    [“because squilthc”]

  309. Hunt Janin:

    I’ve read that “sea level rise is progressively changing coastlines. The legal implications for the seaward boundaries between neighboring coastal states are neither straightforward nor foreseeable.” (Source: Houghton et al., “Maritime boundaries in a rising sea”, Nature Geoscience, Vol. 3, December 2010.)

    If anybody knows something about this issue, I’d like to hear from them for my coauthored book on sea level rise.

  310. CM:

    Hunt #309,

    I know squat about it, but someone’s aggregated a nice set of articles here:

  311. Michael Reisner:


    Need some help. I recently gave a lecture on climate change to a class with over 250 students. About 8 hours after the lecture, the entire class list received an email from a gmail account that is not listed as an email address for any student registered at the university for at least the last three years. The email sender went to great lengths to try to disguise the source of the email.

    Before I use this as a teachable moment for the class next week, I would like to gather as much information as possible.

    Here the text of the email:

    Dear Students,

    Attached to this page is a list of hundreds of peer-reviewed papers that support skeptical arguments against anthropogenic global warming. With all due respect to Dr. Reisner and his information; the list is simply being made available for you to assess the information that has been collected against global warming. An exhaustive search of the provided peer-reviewed papers will show that substantial scientific evidence has been set forth against AGW by many qualified scientists. This is a fact that is not being made available to students in universities by their instructors or their textbooks. Please critically review the information.

    It links to a blog site (Popular Technology.net) with a list of “900 peer reviewed papers” arguing against climate change.

    Is anyone familiar with this blog website, including who is behind it, etc.? I would appreciate any information you can provide.

  312. Septic Matthew:

    I have a question. After reading the Padilla et al paper that I linked earlier, I got to thinking that the transient climate response (over 70 years as Padilla et al used, or over any specific range from 40 – 100 years) was the most important single quantity that needs to be known for public policy purposes, because the equilibrium response will most likely occur over a span of 2,000 – 4,000 years. Paraphrasing a widely cited quote from James Hansen, this is the quantity most relevant to the lives of James Hansen’s grandchildren. If there is, as I suggest, a single most important quantity for public policy purposes, what do you think it is? Perhaps, from thinking about Barton Paul Levenson’s simulations, it might be the expected time to civilizational collapse.

  313. Hank Roberts:

    An interesting collection of papers (titles in English)
    (regret I don’t read Japanese to see what’s written about the papers, but many of them look worth some comments by those who’d follow the subjects covered):

    東京大学大気海洋研究所 横山祐典研究室
    (Atmosphere and Ocean Research Institute, the University of Tokyo)

  314. Ray Ladbury:

    Michael Reisner
    Oh, Poptech. Yeah, I know it. Let us just say that I have interacted with this individual on several occasions, and on none of them has he ever passed the Turing test. He will post a list of “400” or “800” or however many papers he claims refute anthropogenic climate change. But if you ask him about any single paper, he merely repeats that he has 800 papers that… He doesn’t realize that Energy & Environment isn’t peer-reviewed, and doesn’t understand what the papers say well enough to see they don’t support his case. Just doesn’t understand.

    The guy couldn’t find his tuckus with both hands and a GPS. Hates Sourcewatch, hates Snopes, hates any tool that might expose him for the lying libertarian loon he is (IMHO). He can be fun if you have a cruel streak and like to play with your prey.

    My guess, one of your students went crying to him and supplied the list of students.

  315. Paul Tremblay:

    >>It links to a blog site (Popular Technology.net) with a list of “900 peer reviewed papers” arguing against climate change.

    I’ve had to answer to denialists who use this site as well. Sorry I can’t give you specific reasons why poptech is wrong. I usually go through a few of the papers and point out that they don’t support poptech’s position. Then I point out that Oreskes did a study published in *Science,* a reputable journal, and found that 600 articles supported AGW, and none refuted it. The study is a bit old, but the numbers haven’t changed that much.

    Underlining the discrepancy, I note that if poptech were right, then not only is Oreskes wrong, but really, really wrong, and *Science* would be negligent in publishing Oreskes, or at least in not correcting her.

    I end by asking which source they think correct, an established journal, or a website?

  316. Paul Tremblay:

    Sorry. I should have written that Oreskes found 600 peer-reviewed articles, not mere articles (which everyone here probably knows, anyway).

  317. David B. Benson:

    wayne davidson @304 — Red sky at night, sailor’s delight. In addition, a red post-sunset sky was common in New Mexico (1950s–1960s, before the Four Corners coal burner); plenty of dust in the air.

    In the pix on your site, all I see are the clouds just above the horizon. In any case, that particular evening there were no clouds, but obviously there were plenty of aerosols to scatter the light over the horizon so I could see it.

  318. john byatt:

    #311, Poptech has been banging on about his list of papers for years now,
    Greenfyre may still be watching; most are over it. They do know the name of poptech, though.

    post comment here at


  319. john byatt:

    Michael #311 , here is another take down of Poptech (Andrew) at skeptical science,



  320. Hank Roberts:

    > poptech

    “… Carbon Brief … brings us:

    Analysing the ‘900 papers supporting climate scepticism’: 9 out of top 10 authors linked to ExxonMobil

    “Using our paper to support skepticism of anthropogenic global warming is misleading.” Part II of our analysis of the 900+ climate skeptic papers

    Hat tip to http://greenfyre.wordpress.com/denier-vs-skeptic/ for the links

  321. dhogaza:

    Michael Reisner:

    It links to a blog site (Popular Technology.net) with a list of “900 peer reviewed papers” arguing against climate change.

    As Ray said, “poptech” is a well-known crank.

    If you have some time to start skimming the papers he cites, you’ll find that quite a few of them don’t, as Ray hints, argue against (anthropogenic) climate change.

    A few of those examples should make it clear to your students, at least those who are paying attention, that the dude isn’t capable of understanding the papers he cites.

  322. wayne davidson:

    #317 David, the sailors were drunk when singing this fable. Red skies usually indicate moisture, dust or haze; if short-lived on a really clear night, a red horizon betrays a sun approximately 5 degrees below the horizon, via Rayleigh scattering. In Montreal, the evening before Irene’s passing had a shiny red twilight after sunset, so I have not noticed this good-weather forecasting power from red. However, on a clear night you will always see those dark streaks at sunset, very interestingly perpendicular to the sun’s rays, suggesting a polarized-light effect – aerosols revealed by horizontal light rays are a clue. If you see them all the way from the sun below the horizon to your position, it means rain, lots of it.

  323. Kevin McKinney:


    IIRC, several authors of papers on his list have stated that their work is not in disagreement with AGW, and some have even asked (to no avail) to have their work removed from the list.

  324. Daniel Bailey:

    @ Michael Reisner

    You may also wish to review this Skeptical Science page for your students as it covers the history of peer-reviewed, published climate science papers:

    The direct link to the visualization itself is:

  325. Hank Roberts:

    Whoever does “poptech” must have a 24-hour search engine running, because, like some of the older arcana of Usenet, anyone who mentions the name will see someone pop up to comment. Have a look, for example, at the thread around this post; as someone says a bit before that, a few people chase each other around the Internet posting the same thing over and over. And, lo:
    http://renegadeconservatoryguy.co.uk/global-warming-the-debate/comment-page-1/#comment-10536 (quite typical of many similar exchanges out there).

  326. Former Skeptic:

    Michael Reisner @ 311:

    Poptech (Andrew) has been debunked in numerous places. Greenfrye’s posts listed above would be the best place to start for a detailed explanation to the list’s flaws.

    If you want to see the sort of unpleasant crank Poptech is, feel free to observe when he visited PZ Myers’ blog (starting from post #197). Notice how deranged he becomes when challenged with facts, albeit with R-rated language typical at Pharyngula. To quote PZ himself when he addressed poptech:

    Your credibility is in tatters. All I see here is evidence that you’re a histrionic idiot.

    Go away. One of the surest kooksigns I know is when a thread is taken over by a monomaniacal nitwit who keeps it going and going and going with his inane obstinance…and there you are.

  327. Susan Anderson:

    Wayne Davidson, interesting, you appear to provide some science to back up this old saying:

    “red at night, shepherd’s delight
    “red in the morning, sailors take warning”

    Though I take this kind of thing with a grain of salt, it is one useful input.

  328. Hank Roberts:

    > red sky

    That observation goes way back; the general circulation of the atmosphere from west to east means most weather comes from the west.

    Scholar turns up among many others:

    Mnemonic strategies: Creating schemata for learning enhancement
    PS Goll – Journal article; Education, 2004 – questia.com
    … (19) A mariner’s rhyme, “Red sky at night; sailor’s delight;/Red sky at morning, sailor take warning,” will find a Biblical association in Matthew 16:2-3 “When it is evening you say, ‘The weather will be fair, for the sky is red.’ And in the …”

    Meteorology and Weather Lore
    D Brunt – Folklore, 1946 – JSTOR
    … If at sunset the western sky is a dull grey, we conclude that bad weather is coming up from the west. The red sky at morning which threatens bad weather is generally of a totally different character, and is of the nature of rather thick and fairly high cloud illuminated from below. …

  329. Septic Matthew:

    314 Ray Ladbury: and on none of them has he ever passed the Turing test.

    That is an amazingly good dig! applause, applause.

  330. RussH:

    Can anyone explain to me why global temperatures decreased from ~1940 to ~1975? I’ve been told that this was due to aerosol particles or sulphates in the atmosphere but from what source? Industry, WW2, natural?

    OK, so if this was the reason for the dip, then why does it mirror the trend in global temperatures associated with the Atlantic Multidecadal Oscillation? How can one explanation ignore the other? And the rise after 1975 also mirrors the AMO going into a warm phase again – how can that be? Surely if the forcing is a combination of the AMO and GHGs, then the GHG forcing may be much lower than expected for one reason or another?

  331. Hank Roberts:

    For RussH:



    Environmental and Energy Engineering
    Insights on global warming John H. Seinfeld
    Article first published online: 11 OCT 2011
    DOI: 10.1002/aic.12780

  332. wayne davidson:

    Hank, the power of words transcends generations, even when they are false. In this case they might explain the Bermuda Triangle, graveyard of many ancient ships: a sailor in the Caribbean seeing red sky to the west at night might relax and smoke his pipe in delight, only to spend the morning fighting the wrath of a hurricane raging in from the east. Fascinating that even a science paper puts some credence in it, although red sky can be seen between cloud systems moving from the west, with baroclinic-driven winds from the south bringing thunderstorms so loud. In essence the clearest nights usually indicate a nice next day, but I give it 50% probability. It’s the clues in the sky one must look for; fast-moving cirrus usually have a story to tell.

    >The red sky at morning which threatens bad weather is generally of a totally different character, and is of
    >the nature of rather thick and fairly high cloud illuminated from below. …

    This one is better, but meaning eastbound weather, which indicates a likely nice day (if staying in place) following the Cirrus at the end boundary of a weather system. The lower clouds are clearing eastwards making the lower sky horizon open but the top blocked. This way only red gets through and the bottom end of high clouds are magnificent with red hues like a Van Gogh painting.

  333. John E. Pearson:

    Haven’t been on here for a while.

    I don’t know if the results of BEST have been discussed here today or not.


    Here’re the final two paragraphs of Muller’s column in the Rupert Murdoch-owned Wall Street Journal today. This is interesting because Anthony Watts agreed that he’d accept whatever Muller found.

    When we began our study, we felt that skeptics had raised legitimate issues, and we didn’t know what we’d find. Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been very careful in their work, despite their inability to convince some skeptics of that. They managed to avoid bias in their data selection, homogenization and other corrections.

    Global warming is real. Perhaps our results will help cool this portion of the climate debate. How much of the warming is due to humans and what will be the likely effects? We made no independent assessment of that.

  334. Brian Dodge:

    I’ve been reading Curry’s and Watts blogs on the release of Berkeley Earth Surface Temperature papers, and it appears that one new meme is “we skeptics never actually said there’s no warming in the last n years, or since the Little Ice Age, which this recent data is just part of; we question the accuracy of the trends (more or less positive, but which may include zero), and the attribution to anthropogenic CO2″. There are still some “I can safely claim that the conclusions of these four papers are false – without even reading them” diehards, and a little goalpost moving – “In concluding, I will remind everyone that the REAL problem with the surface temperature data set lies with the ocean data.”

    recaptcha – riskty Springer (well, I think it’s funny)

  335. David B. Benson:

    wayne davidson — Here in the Pacific Northwest, the sailors have it about right. While there aren’t any sailors in New Mexico, one can replace sailors with shepherds and have it about right.

    The only vertically oriented feature associated with sunsets that I’ve observed are sundogs, rather rarely.

  336. Hank Roberts:

    PS for RussH, another fellow put forth the same notion you did about the 1940s-1970s and ocean temps; see the response inline: http://www.realclimate.org/index.php/archives/2011/10/global-warming-and-ocean-heat-content/comment-page-6/#comment-217342

  337. Susan Anderson:

    don’t know where my childhood version of the meme got distorted, but I think the shepherd variant is probably UK or elsewhere in Europe.

    BEST is interesting – Muller chasing fame? Gets credit no matter what. Is the data treatment really either new or useful? Just asking.

  338. Hank Roberts:

    Ever feel a little funny when you reply yet again to rebunking? I do. I think this guy’s figured it out:


    “… whoppers and true tales of science illiteracy …..”

  339. Hank Roberts:

    At Moyhu’s blog, Saturday, October 22, 2011
    A combined KMZ file for BEST, GHCN, GSOD and CRUTEM3

  340. David B. Benson:

    Susan Anderson @337 — BEST pushed the instrumental record back to 1800 CE. That might be useful.

  341. wayne davidson:

    335 David, horizontal dark streaks, although vertical ones break the monotony. Look carefully at the sun's position; you will see them perpendicular to the sun's rays.

    #327 there was ancient science in it, which surprisingly tried to simplify the complexities of weather. Sounds familiar… I wish to hear more; perhaps other sayings fare better.

  342. kevin McKinney:

    #328–“D. Brunt–JSTOR–”

    That would be Sir David Brunt (though in his pre-knighthood days.) He served as President of the Royal Met Society, and–interestingly for climate change buffs–was part of the committee who administered the curious “viva voce” examination Guy Callendar survived as part of the publication process for “The Artificial Production of Carbon Dioxide. . .” in 1938.

    That’s a story I retail, along with much other “curious and forgotten lore” in my article on the history of the science of back-radiation and surface radiation:


  343. Hank Roberts:

    http://www.achangeinthewind.com/2011/10/translating-climate-science.html recommends and points to

    Good one.

  344. Edward Greisch:

    304 wayne davidson: Check http://spaceweather.com. 2 July 2011 “NOCTILUCENT CLOUDS: Last night, a bank of rippling electric-blue noctilucent clouds spilled across the Canadian border into the lower United States. In doing so, the glowing clouds made the farthest excursion of the year away from their usual polar realm.  Reports of bright NLCs are coming in from Oregon, Washington, Montana, Minnesota, South Dakota and elsewhere.  Visit http://spaceweather.com for images and observing tips.  ”
    “I took my camera to a spot along Washington’s Hood Canal for a panoramic view,” says Rosenow. “It was a visually stunning display that stretched as far as the eye could see.” NLC reports are also coming in from Oregon, Montana, North and South Dakota, and Minnesota, and in Europe as far south as France. (Stay tuned for updates.)
    Back in the 19th century, these mysterious clouds were confined to polar regions. In recent years, however, NLCs have spread toward the equator, appearing in places such as Utah, Colorado, and perhaps even Virginia. Is this a sign of climate change? Some researchers think so. Sky watchers at all latitudes are encouraged to be alert for electric blue just after sunset or before sunrise; observing tips may be found in the 2011 NLC gallery.
    UPDATED: 2011 Noctilucent Cloud Gallery
[previous years: 2003, 2004, 2005, 2006, 2007, 2008, 2009]
    311 Michael Reisner: I have heard of software that backtracks stuff like that, but it was many years ago. Named Angel or Devil or Diabolo or something like that. Check with Your local cyber-warfare branch …..

  345. Pete Dunkelberg:

    “red sky at night” and all that – color words may not mean the hue they do to you, especially from old sources.

  346. prokaryotes:

    Compare the information about Anthony Watts from Wikipedia and Sourcewatch



  347. john byatt:

    Anyone like to put this from BEST Paper #4 into layman’s language? Afraid that I will get it wrong. Thanks in advance.

    Given that the 2-15 year variations in world temperature are so closely linked to the
    AMO raises (or re-raises) an important ancillary issue: to what extent does the 65-70
    year cycle in AMO contribute to the global average temperature change? (Enfield,
    2006; Zhang et al., 2007; Kerr, 1984.) Since 1975, the AMO has shown a gradual but
    steady rise from -0.35 C to +0.2 C (see Figure 2), a change of 0.55 C. During this same
    time, the land-average temperature has increased about 0.8 C. Such changes may be
    independent responses to a common forcing (e.g. greenhouse gases); however, it is
    also possible that some of the land warming is a direct response to changes in the
    AMO region. If the long-term AMO changes have been driven by greenhouse gases
    then the AMO region may serve as a positive feedback that amplifies the effect of
    greenhouse gas forcing over land. On the other hand, some of the long-term change in
    the AMO could be driven by natural variability, e.g. fluctuations in thermohaline
    flow. In that case the human component of global warming may be somewhat

  348. isotopious:


    Hank, the temperature was dead flat during that period, because there was a series of La Nina events / cycles. Now if you want to argue that the temperature should have fallen during that time, you might want to remember that you are comparing that cooling forcing on the back of 20000 years of warming, so you’d hardly expect it to immediately start going into an ice age now, would you? I’m pretty sure these things have time signatures!

  349. J Bowers:

    For Michael Reisner, pass these on to the students:

    * Meet The Denominator
    * Poptart’s 450 climate change Denier lies
    * Poptech’s list of Confusion

  350. Hunt Janin:

    I’m now finishing up my coauthored book on sea level rise and have been looking, in vain, for a happy-clappy, positive conclusion. Alas, my best judgment is that, with the exception of the Netherlands and to a lesser extent the UK, no other country is doing very much at the national level. If so, it follows that sea level rise of at least 1.5 meters (probably more) will occur by 2100. If you disagree with this conclusion, please tell me why.

  351. kevin McKinney:

    Hunt (#350), can’t quarrel much with the conclusion per se–though, as phrased in your comment, there’s an implied confusion between mitigation and adaptation efforts. That is, the UK and Netherlands efforts mentioned are presumably adaptive–but it’s the *mitigation* efforts (of which there are surely other nations with noteworthy efforts) which will actually affect SLR.

    But I doubt that observation applies to the book itself.

  352. wayne davidson:

    #344 Edward, Nice examples of noctilucent clouds. Now look at my blog http://eh2r.blogspot.com/, and/or at your sunset horizon sky. These streaks are unique in two ways: one, they are black or blackish, and two, they streak horizontally, perpendicular to the sun's rays. I believe these are aerosols, highly likely CCNs from thundercloud overshoots. They can't be seen at any time other than during twilight, especially at sunset. Digital cameras have a tremendously hard time capturing them, even under powerful magnification. Sort of the opposite of infrared, which is enhanced by CCDs; blackish particulates do not emit light.

  353. Paul S:

    john byatt – Don’t rely on everything I say being completely accurate but I think I can provide some context to the quote:

    ‘AMO’ refers to Atlantic Multidecadal Oscillation – a hypothesised quasi-periodic source of multidecadal variability in surface temperatures around the North Atlantic. I believe I’m right in calling it “hypothesised” because it has only been observed in a statistical sense, without any strong physical theory to support a mechanism that drives temperature changes.

    The AMO has been observed via analyses of detrended historical North Atlantic SST records, which are often called AMO indexes. The BEST paper is a correlation study between an AMO index and their land surface temperature data, although they actually focus on short-term fluctuations (2-15 years) rather than the multidecadal trends. To enable this focus they suppress longer-term trends in the AMO and land surface data.

    They find strong correlation between both datasets and use it to infer that natural ocean SST fluctuations in the North Atlantic may be a key factor in short-term land surface temperature fluctuations. This holds some interest because changes associated with ENSO are typically thought to be more relevant at this scale, but the BEST team seem to conclude that the AMO link-up is stronger. The quote you posted is where they go on to speculate that this may mean some of the warming trend of the past few decades has likewise been driven by AMO changes. The paper itself doesn’t really look at that issue. Note this passage from the paper:
    ‘Correlation does not imply causation. The association between Atlantic sea surface temperature fluctuations and land temperature may simply indicate that both sets of temperatures are responding to the same source of natural variability. However, it is also interesting to consider whether oceanic changes in the AMO may be driving short-term fluctuations in land surface temperature. Such fluctuations might originate as instabilities in the AMO region itself, or they might occur as a non-linear response to changes elsewhere (such as within the ENSO region).’

    One point of interest is that the AMO index fluctuations even track the land surface temperature downturn associated with the 1991 Pinatubo eruption, whereas ENSO does not. Given that this event is undoubtedly caused by an ‘external’ forcing this is strong evidence that at least some of the short-term AMO index fluctuations are simply reactions to the same ‘other’ forces driving land surface temperatures – the fact that it correlates in this case counts against it as a causal factor. As far as I can see, this issue isn’t mentioned in the paper.

    In the bigger picture I think I’m right in saying that climate scientists in general believe that some fraction of Northern Hemisphere warming over the past few decades has been caused by Atlantic ocean circulation changes.
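    For anyone curious what the "suppress the long-term trends, then correlate the fluctuations" approach looks like in practice, here is a minimal sketch on synthetic data. This is my own illustration, not the BEST team's actual data or code; the series, seed, and NumPy calls are all assumptions for demonstration.

```python
import numpy as np

# Illustrative only: synthetic stand-ins for an AMO index and a land
# temperature series. Both share a short-term fluctuation on top of
# independent long-term linear trends.
rng = np.random.default_rng(0)
years = np.arange(1950, 2011)
shared = np.cumsum(rng.normal(0.0, 0.05, years.size))  # common short-term wiggle
amo = 0.002 * (years - 1950) + shared + rng.normal(0.0, 0.02, years.size)
land = 0.015 * (years - 1950) + shared + rng.normal(0.0, 0.02, years.size)

def detrend(y, t, deg=1):
    """Remove a low-order polynomial trend, keeping the fluctuations."""
    return y - np.polyval(np.polyfit(t, y, deg), t)

# After detrending, the correlation reflects the shared short-term
# variability rather than the (independent) long-term trends.
r = np.corrcoef(detrend(amo, years), detrend(land, years))[0, 1]
print(f"correlation of detrended series: {r:.2f}")
```

    As Paul S's quote from the paper notes, a high correlation found this way still says nothing by itself about which series (if either) is driving the other.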

  354. john byatt:

    Thanks Paul, to me they seemed to be jumping from “some land warming”
    to “global warming” much appreciated and I am sure that it will be the main point that the skeptics will pick up on if any of them bother to actually read the report.

    again appreciated

  355. wayne davidson:

    Has there ever been an experiment which most people can do at home mimicking the greenhouse effect?

    Well, for one, I tried about 0.5 ml of sugar mixed with 500 ml of water: 1 minute in the microwave (not to boiling), and the temperature rose from 10.5 to 25.5 C.

    The other way is to test water alone, which went from 10 to 27 C.

    Exciting that 1 part per thousand has a consistent effect; microwave boiling is a bit complex, and it doesn't mimic exactly the warming of the atmosphere. The point is that microwaves, which sit next to infrared on the electromagnetic spectrum, cause a molecular interaction which differs when the water chemistry is very slightly altered. This clearly has the potential to counter the “CO2 is a trace gas” argument which rages on the contrarian side.

  356. Septic Matthew:

    I thought I would take this opportunity to say good-bye, congratulations again on the recent award, and thanks for letting me post here.

    yours truly,


  357. john byatt:

    # 353 Paul. I did not realize that AMO was Muller’s pet theory.

    It's coming together.
    I believe that Tamino may have something on it soon.


  358. CM:

    Septic Matthew #356,

    Are you leaving us? You know, I’ve long thought of Septic Matthew as two different people. Septic’s just another guy who quibbles with the consensus over trivial points (‘acidification’) or rebunks dime-a-dozen ‘skeptic’ talking points. Matthew, on the other hand, takes a real interest in the science, looks up sources, boosts solar energy, and has above all been unfailingly courteous in his exchanges here. I’ll miss that guy, even if I think he worries way too much that saving the planet might burden the economy.

  359. Paul S:

    john byatt – Perhaps you’re referring to Eric’s post over at tamino’s place? I’ve posted there (awaiting moderation) asking Eric to check if what I’ve said above is accurate.

    The AMO as a multidecadal driver of internal climate variability has been widely discussed in the literature for a decade or two. See for example Knight et al. (2005), coauthored by our very own Mike Mann. This 2006 RealClimate post is also highly relevant.

    The apparent innovation of the BEST paper is to conclude that the same processes may also be responsible for short-term climate fluctuations.

  360. John E. Pearson:

    355 Wayne said about trace gases:

    Dunno if it has the “potential to discard “CO2 is a trace gas” argument” or not, but here is a related thought: The fact that CO2 is a trace gas is what makes it possible for us to actually alter its concentration.

  361. Hank Roberts:

    >I’ll miss that guy
    +1 (and hoping it’s just a change in pseudonym)

  362. Hunt Janin:

    I’m now working on the final Conclusions chapter in my coauthored book on sea level rise. If anyone has any thoughts on what SIMPLY MUST be in this chapter, please send them to me.

  363. wayne davidson:

    360, spot on John. i have improved my little experiment and it works well, but its not quite ready for the kitchen.

  364. Hank Roberts:


    From a very badly written article (by a writer who clearly doesn’t understand chemistry or software) obscuring what might be a good product:

    “… The Nest is the iPod of thermostats…. Artificial intelligence figures out when to turn down the heat and when to jack up the air conditioning, so that you don’t waste money and perturb the ozone when no one is home, or when you’re asleep upstairs. You can communicate with the Nest from your smartphone, tablet or web browser.

    Fadell promises the Nest will pay for itself within a year or two of use, and ultimately save you up to 30 percent of your utility bill….”

  365. wili:

    “If anyone has any thoughts on what SIMPLY MUST be in this chapter, please send them to me.”

    feedback and uncertainties.

  366. John E. Pearson:


    “Unusually heavy flooding and typhoons have taken the lives of nearly 800 people and affected more than eight million in an arc that stretches across Thailand, Cambodia, Laos, Vietnam and the Philippines, according to a tally by the United Nations Office for the Coordination of Humanitarian Affairs.

    Another 100 people were said to have died in Myanmar, where reports are difficult to verify.”

  367. SteveF:

    Just a minor bit of blog amusement, but I notice that Judy Curry has posted some reviewers comments on her reply in the back and forth over her “uncertainty monster” paper. The reviewer writes, quite reasonably:

    Curry and Webster state: “The heart of our argument is that the broader scientific and other technical communities (beyond the field of climate science) have higher expectations for understanding and characterizing uncertainty than has been portrayed thus far by the IPCC.” Curry and Webster are entitled to their opinion, but it is inappropriate to “speak for” scientists outside the IPCC process and from other technical fields as a whole without citing support for such claims. Their views should be represented accurately as the views of two individuals, rather than as the unsubstantiated collective view of diverse scientists and scientific communities.

    At the bottom of her post, Judy writes:

    My statement about the broader scientific community refers to YOU (the climate etc denizens).

    This is a fairly, er, idiosyncratic interpretation of what “the broader scientific community” means. I’m not sure that, if you asked most people what was meant by this, they would immediately reply, “ah yes, it’s Oliver Manuel and the assortment of cranks who populate Judy’s blog comments”. Perhaps a more reliable measure might be found here:


  368. dhogaza:


    This is a fairly, er, idiosyncratic interpretation of what “the broader scientific community” means.

    No shit, Sherlock. Man, I wonder what the reviewers would’ve written if they’d understood that the “broader scientific and technical communities” refer to anyone who can muster up a keyboard and an internet connection?

    Curry is just … weird.

  369. Deep Climate:

    Here is the latest of my continuing coverage of Ethical Oil (the meme, as well as the organization and blog, all inspired by Ezra Levant’s 2009 book of the same name).

    The Ethical Institute on oil sands emissions


    Today I’ll take a detailed look at the Ethical Oil position on the oil sands carbon footprint, as seen in former spokesperson Alykhan Velshi’s error-filled and confused post entitled Mythbusting: Are the Oilsands Major greenhouse Gas Emitters?, part of his “Myths and Lies” series.

    I’ll focus on the two most significant problems in Velshi’s piece:

    * Velshi’s original premise was that not only are oil sands greenhouse gas (GHG) emissions relatively insignificant, but that they are actually declining. This has been partially corrected, presumably in response to my initial commentary on this issue, but in such a way as to render his argument completely illogical. And Velshi’s conclusion still repeats the utterly mistaken assertion that oil emissions “are falling”, whereas in fact they are rising at a rapid rate.

    * Ethical Oil’s credibility is further damaged by misleading statements concerning the supposedly tiny contribution of oil sands emissions when compared to total global human and natural emissions. This echoes barely veiled climate “skeptic” arguments in Ezra Levant’s 2009 book that started the whole “ethical oil” rebranding effort. And an examination of Levant’s previous statements on climate science would appear to confirm that a strong anti-science stance is not far from the surface, despite the efforts of Ethical Oil spokespersons to hide it.

    Also see:



  370. Hank Roberts:

    Dale asked in the “Moscow Warming” thread how to find this:

  371. Hank Roberts:

    How do climatologists decide when a linear trend isn’t giving the best fit to the information? I know it’s possible to fool yourself by picking what appear to be patterns out of graphs, we’re built to do that (thanks to great-great…grandparents who fled any spooky shadow and thereby didn’t get eaten by the occasional tiger lurking in the leaves).

    But ya know, it’s _so_tempting_ to think there’s a pattern in something like this, about the time the US Clean Air Act really limited sulfate pollution (addressing acid rain):


    (Aside– and how do -statisticians- decide? biologists? etc. — I know statistics is a very new field and still much discussion is happening among statisticians about what we can and can’t say …)
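    One standard answer from the statistics side of Hank's question is formal model comparison, for example via the Akaike Information Criterion (AIC), which penalizes extra parameters so a wigglier curve only "wins" if it fits substantially better. A hedged sketch on synthetic data follows; this is not any climatologist's actual procedure, and the series and function names are my own invention.

```python
import numpy as np

# Sketch: compare a linear vs. a quadratic trend model with AIC.
# Synthetic series, not real temperatures; lower AIC = better
# trade-off between goodness of fit and model complexity.
rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)
y = 0.01 * t + 0.0008 * t**2 + rng.normal(0.0, 0.1, t.size)  # mildly curved

def aic_polyfit(t, y, deg):
    """AIC of a least-squares polynomial of degree `deg`, Gaussian errors."""
    resid = y - np.polyval(np.polyfit(t, y, deg), t)
    n, k = y.size, deg + 2  # deg+1 coefficients plus the error variance
    return n * np.log(np.mean(resid**2)) + 2 * k

for deg, name in [(1, "linear"), (2, "quadratic")]:
    print(f"{name}: AIC = {aic_polyfit(t, y, deg):.1f}")
```

    On data like these, the quadratic model comes out with the lower AIC; on genuinely trend-free noise, the penalty term would favor the simpler line. Eyeballing a graph, by contrast, gives the curvy model a free pass.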

  372. Pissed Off Penguin:

    Besides all this very interesting science, some Belgian enthusiasts have set up a funny, non-profit, free website in the hope of giving an extra scream to the global leaders at the next climate conference in Durban (South Africa) at the end of November.
    Please defart your penguin to virtual Durban, pimp your penguin and enjoy the party over there.

  373. Hank Roberts:

    For Poul-Henning Kamp, who raised the JASON report in the Moscow paper thread, here’s some previous discussions of it:

    (copy and paste into a search tool if the double quotes break in this blog)

  374. Radge Havers:

    NPR. Same old same old, but at least somebody noticed:

    Class: M

    “A worthy story, broken by the Houston Chronicle, and picked up here and there, but not by NPR, until today.

    Here’s the headline from some other media outlets that paid attention when it was first news:

    Texas officials censored climate change report
    — New Scientist

    Perry Officials Censored Climate Change Report
    — Mother Jones

    And here’s NPR’s headline:
    Scientists Say Texas Agency Edits Out Climate Change

    See the difference? Subtle, but significant. “

  375. ccpo:

    I have a meeting in 40 minutes and am hoping some of us losers (Just kiddin’) are on-line and can help: Statements of emissions are usually given in gross amounts. Does anyone know off-hand the difference between the emissions the Earth can safely cycle and current emissions? That is, what is the number for emissions that would not add to the total atmospheric carbon vs. current emissions? And, can someone quantify that in a way Joe or Josephine Sixpack can understand?


    [Response: Roughly, as a multi-year average, but with a lot of year to year variation, and assuming no drastic, short term changes in forested land cover, it would be: 0.57 * 9GtC/yr = 5.1 GtC/yr. The 0.57 represents the approximate fraction sequestered from the atmosphere on these time scales.–Jim]

  376. Dale:

    The Antarctic Circle has 9 times the ice of the Arctic Circle – WHY? I know that I once saw a write-up on this by a denier at one of the GW blogs. It was meant to prove somehow that some natural phenomenon was going on which debunked sea ice melt. Maybe volcanoes under the Arctic ice or something. I can’t remember exactly. I believe I read a good explanation here some time ago: that it had something to do with the fact that the Southern Hemisphere has more sea area and less land area than the Northern Hemisphere.

    If anybody can give me a link I’d appreciate it. If my post is confusing it’s because I’m confused and haven’t been able to locate what I’m looking for.

  377. Nube:

    I think Jim answered another question. The safe human emissions must be zero!

  378. Daniel Bailey:

    @ Dale

    Umm, that would be largely because Antarctic land ice sits on a small continent a few thousand feet above sea level (in places) whereas Arctic sea ice floats over an abyssal plain thousands of feet below sea level.

    It is more difficult for the warmer ocean water to melt Antarctic land ice because most of that ice is out of its reach. So less melts and more accumulates at the “Bottom of the World” than at the “Top”.

    No magic volcanoes or other mythical stuff needed.

  379. Dale:

    Thanks Daniel. I did find more on the subject. The amount of heat generated by the volcanoes was said to be about .1 watt per square meter while the sun could take or give 100 watts per square meter. The heat effect of volcanoes was negligible. It would have little effect of the ice retreating.

  380. SecularAnimist:

    Radge Havers wrote: “NPR. Same old same old, but at least somebody noticed”

    I heard NPR’s report on the Texas censorship case on Morning Edition this morning, and it was even worse than that headline suggests. The reality of anthropogenic global warming was presented as a matter of “opinion”, not of science.

    Likewise, when the case was mentioned in one of NPR’s hourly news updates last night, the report concluded with “Governor Perry and Texas officials say that global warming is junk science” — yes, believe it or not, “junk science” is a direct quote from NPR’s “news” report, and the claim by a politician who has received millions of dollars from the fossil fuel corporations that global warming is “junk science” was presented by NPR “news” as having equal validity to the views of the world’s scientific community.

  381. Radge Havers:

    Journalistic conventional wisdom. It amounts to explaining problems away with a lot of noncommittal dithering presented with just enough authoritative posturing to make the whole ersatz news concoction thoroughly debilitating to anyone unwary enough to mistake it for useful information.

  382. Hank Roberts:

    re the various assertions about aerosols (in the ocean heat topic), perhaps this will help:

    Something old:
    As I recall — and I’ll have to dig for the cite, it was a few years back —

    The chemistry of particulates from the first coal era (England and US, 1800s) was different in the atmosphere, because of higher ejection velocity, higher altitude and smaller size, more intense sunlight — all factors driving the chemistry — compared to those emitted by India and China’s industrialization.

    For Bob Irvine, who posted the 2009 link on uncertainty about aerosols — there’s a very interesting update on that uncertainty story: the temperature difference between the tropics and the Arctic that’s been so hard to model turns out to depend on the assumption that air in prehistoric times wasn’t much cleaner than air is nowadays. When clean air is assumed, that problem goes away:



    “… Kiehl ran his model of the ancient climate with clean skies, and found that the cold-pole problem largely disappeared. With clouds forming in unpolluted air, the poles warmed up much more than the tropics, giving a climate within a few degrees of the one that actually existed.

    The resulting model is a very good match for the Palaeocene-Eocene thermal maximum, says Paul Pearson of Cardiff University in the UK, who uses fossil animals to study past climates. Pearson says Kiehl’s model is the first to reproduce the temperature distribution revealed by the fossils.

    “It’s reassuring,” Kiehl says. “If this is the explanation, there isn’t anything drastically wrong with our climate models.”

    Kiehl presented his work on Tuesday at a Royal Society meeting on warm climates of the past in London….”


  383. Hank Roberts:

    Here’s a good summary on aerosols:

    The net climate impact of coal-fired power plant emissions [PDF]
    D Shindell… – Atmospheric Chemistry and Physics, 2010

    See the full text, the cites, and the footnotes for this paper’s sources for the information in it.

    “… Emissions from coal-fired power plants until ∼1970, including roughly 1/3 of total anthropogenic CO2 emissions, likely contributed little net global mean climate forcing during that period though they may have induced weak Northern Hemisphere mid-latitude (NHml) cooling.

    After that time many areas imposed pollution controls or switched to low-sulfur coal. Hence forcing due to emissions from 1970 to 2000 and CO2 emitted previously was strongly positive and contributed to rapid global and especially NHml warming.

    Most recently, new construction in China and India has increased rapidly with minimal application of pollution controls. Continuation of this trend would add negative near-term global mean climate forcing but severely degrade air quality.

    Conversely, following the Western and Japanese pattern of imposing air quality pollution controls at a later time could accelerate future warming rates, especially at NHmls.

    More broadly, our results indicate that due to spatial and temporal inhomogeneities in forcing, climate impacts of multi-pollutant emissions can vary strongly from region to region and can include substantial effects on maximum rate-of-change, neither of which are captured by commonly used global metrics. ….”

    [paragraph breaks added for online readability –hr]

  384. ldavidcooke:


    Hey Hank,

    Anthracite versus bituminous: the quality of the fuels changed. The initial boiler/firebox design was for the hotter-burning fuels; when costs increased and availability fell, rather than use coked bituminous, operators would force-feed the softer, faster/cooler-burning coal into the firebox and close the dampers to maintain firebox temperatures, while cutting back the tank feed rates to maintain pressure.

    End result: high ash, high CO, more SO2/Hg/Se… Burn efficiency dropped nearly in half. The ’70s–’80s installation of the ESD stack scrubbers/wash apps extended the initial plant investment.

    Around the ’78–’85 time frame, the EPA started pushing for fluidized-bed boilers with gasification/coking feed chambers. The industry said no, so the government prevented upgrades other than repairs, and we are stuck with the pollution and efficiency difference.

    (Converting to fluidized-bed systems would raise the efficiency. Using pre-treated soft coal, with heat/steam injection, would reduce some pollution.)

    So far, none of the coal-burning replacement designs floated by the Industry/EPA/DOE programs has seriously discussed capture/sequestration of CO2.

    Sorry, for butting in…, please continue…

    Dave Cooke

  385. rustneversleeps:

    @ Jim’s inline comment in #375

    Have to disagree, there. The airborne fraction could be various values other than 0.57 and the atmospheric concentration would continue to rise (likewise for the ocean concentrations). Likewise, if the airborne fraction is “X” it still matters for all values of carbon emissions greater than some quite low threshold level.

    You seem to just take the current emission rate of 9 GtC/yr as a given, so ~5.1 GtC/yr could be “safely” cycled. But by that logic, when emissions were 7 GtC/yr, ~4 GtC/yr would have been “safe”. And yet we know that atmospheric concentrations were inexorably rising when that was the case.

    I think the answer to ccpo’s question is more like 10% (and perhaps closer to 0%) of current net emissions.

    I suspect some miscommunication or misunderstanding going on here. (Wouldn’t be a first for me!)

    [Response: No you are absolutely correct, my mistake. I answered the question of the estimated total amount of emissions that are sequestered, on short term time scales, but that’s not what he was asking. He instead wanted to know how much we could emit and keep the atmosphere at a constant level. It is a much lower number than I gave, the approximate value depending on the time scale in mind, and the changes in the terrestrial and oceanic sinks. On short time scales it’s very low. Thanks for pointing this out. Minor point: the 0.57 is the fraction of emissions sequestered, not those retained.–Jim]
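    The arithmetic in the exchange above, spelled out (numbers taken directly from the comments; a back-of-envelope sketch only):

```python
# Back-of-envelope version of the numbers in this exchange: roughly
# 9 GtC/yr emitted, with ~57% of emissions currently sequestered.
emissions = 9.0        # GtC/yr, rough current anthropogenic emissions
seq_fraction = 0.57    # fraction of emissions taken up by land + ocean sinks

sequestered = seq_fraction * emissions  # ~5.1 GtC/yr absorbed by sinks
airborne = emissions - sequestered      # ~3.9 GtC/yr accumulating in the air

print(f"absorbed by sinks:      {sequestered:.2f} GtC/yr")
print(f"left in the atmosphere: {airborne:.2f} GtC/yr")
```

    As the correction notes, the ~5.1 GtC/yr absorbed today is not a "safe" emission level: sink uptake is driven largely by the existing CO2 excess, so at much lower emission rates the sequestered fraction would not stay at 0.57, and the level that would hold concentrations constant is far lower.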

  386. wayne davidson:

    I perfected the microwave experiment replicating trace elements affecting the temperature of an entire water body, by adding trace amounts of carbon (sugar) to water. 559 ppm equivalent is as low as I can go. The results were very consistent: by adding sugar to water heated in a microwave, a +0.3 C average temperature increase was achieved.

    Water is homogeneous compared to an atmosphere of air, but little longwave radiation reaches the surface directly from the sun's rays, so microwaves must substitute for infrared in this case, simulating the greenhouse effect in the kitchen. Care must be given not to boil water in a microwave; this experiment requires a very small increase in temperature. DO NOT BOIL WATER IN A MICROWAVE.

    After this warning, here is a tentative recipe; it can be perfected.

    you need:
    2 plastic cups weighing about 5 grams each (each must be weighed exactly)
    0.1 gram of white sugar; borrow a 0.1 g scale from a school or lab (unless someone can figure out how to measure this small amount in a kitchen)
    1 very good thermometer capable of reading down to 0.1 °C temperature differences
    two portions of exactly 174 grams of water, preferably distilled or as pure as possible
    a microwave oven


    fill the two plastic cups with exactly the same mass of water; care must be taken in weighing the cups:
    the weight of each cup + 174 grams must be obtained

    after filling them with exactly 174 grams of water, mark each cup,

    mix the water in the cup with a spoon, 20 rotations
    measure its water temperature; 25 °C or room-temperature water is preferred
    place it in the microwave and mark the spot where you place it
    heat the water for 20 seconds
    once done, take the cup out without hesitation
    mix the water with the spoon, 20 rotations
    measure the temperature of the sample

    repeat the same with the other cup, but add 0.1 grams of sugar
    prior to starting
    place the sugar-water cup at exactly the same spot as the plain-water cup

    The results I obtained were very consistent: the sugar sample always warmed up more than the plain-water one, by +.2 to +.5 °C.

    You can keep the cups as is, let them cool down to room temperature and repeat the process,
    making sure water mass is the same. Similar results can be replicated as often as you want.

    Sugar is not CO2 and water is not air, but the fact that you can change the temperature of a volume of matter by adding a trace amount of a chemical may convince those who don’t believe anything can be changed by such small alterations; one may call it anthropogenic on a planetary scale.
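    As a sanity check on the concentration quoted in the recipe, the simple mass ratio works out to the same order of magnitude (a sketch; the “559 ppm equivalent” figure is the commenter’s own and may have been computed on a different basis):

    ```python
    # Mass fraction of sugar in the sugared cup, expressed in ppm.
    sugar_g = 0.1
    water_g = 174.0

    ppm_by_mass = sugar_g / (water_g + sugar_g) * 1e6
    print(f"{ppm_by_mass:.0f} ppm by mass")  # ~574 ppm, same order as the quoted 559 ppm
    ```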

  387. Pete Dunkelberg:

    Hank thanks for that paper-link (383) and many others.

  388. Marcus:

    “It is a much lower number than I gave, the approximate value depending on the time scale in mind, and the changes in the terrestrial and oceanic sinks. On short time scales it’s very low. ”

    Jim: I actually think that on short time scales, your first answer was right. This is why I don’t like the concept of “airborne fraction” – the airborne fraction, in my opinion, is a deceptive number, which has taken hold because it has remained roughly constant for several decades – but only due to coincidence. The reason it has remained roughly constant is that emissions have, coincidentally, risen at exactly the right rate.

    My guess is that for any given 5 year period, the average carbon uptake is mainly driven by the difference between the atmospheric concentration and the effective concentration in the oceans and ecosystems. To first order, this difference doesn’t depend on emissions during that 5 year period. If the oceans and ecosystems are taking up about 5.1 GtC per year, then if we drop emissions to zero, the oceans and ecosystems will _still_ be taking up 5.1 GtC per year*.

    So, if we want concentrations to stay constant for the next 5 years, we would only need to drop emissions to 5.1 GtC/year.

    In the long term, this doesn’t work, of course, because the effective concentration in the ocean and ecosystems will rise over time, slowly approaching the actual concentration in the atmosphere, and as it does so the size of the sink will drop. So in the long term, emissions must drop to zero. See, eg, Figure 1 in http://globalchange.mit.edu/files/document/MITJPSPGC_Rpt110.pdf, which I think is a good visualization of sources, sinks, and concentration en-route to stabilization in a reasonably sophisticated model.

    *There is a possible 2nd order effect which might depend on emissions, because of fast equilibration with the upper mixed layer… I haven’t actually tested a carbon cycle model with a dramatic increase or decrease of emissions in a single year, so don’t know to what extent this would be important. My intuition says not much, but I could be wrong.
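    Marcus’s gradient-driven-sink argument can be sketched in a minimal one-box model (all numbers are illustrative assumptions, not from any published model: the effective land+ocean level is held fixed over the 5 years, following the “to first order” caveat above, and the rate constant is chosen so the initial sink is 5.1 GtC/yr):

    ```python
    # One-box sketch: uptake depends on the atmosphere-sink gradient, not on
    # this year's emissions, so emissions matched to the sink hold the burden
    # flat in the short term, while zero emissions draw it down.
    def run(emissions, years=5):
        atm = 830.0              # GtC in the atmosphere (~390 ppm x 2.13 GtC/ppm)
        eff = 800.0              # illustrative "effective" land+ocean concentration
        k = 5.1 / (atm - eff)    # chosen so the initial sink is 5.1 GtC/yr
        for _ in range(years):
            sink = k * (atm - eff)   # uptake set by the gradient, not emissions
            atm += emissions - sink
        return atm

    print(f"5 yr at 5.1 GtC/yr: {run(5.1):.1f} GtC (flat)")
    print(f"5 yr at 0.0 GtC/yr: {run(0.0):.1f} GtC (declining)")
    ```

    In the long run the “eff” level rises toward the atmospheric value and the sink shrinks, which is why stabilization ultimately requires emissions near zero, as in the MIT figure linked above.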

  389. ldavidcooke:


    Hey Wayne,

    Have you tried a similar experiment with salt yet?

    Dave Cooke

  390. Hank Roberts:

    For David Cooke — can you open this link?


    Or, go to the URL http://www.google.com/search

    (the same way you open the RealClimate website, however that is)

    and type in “microwave salt water” including the quotation marks.

    You’ve said elsewhere you’re using either a very old computer without copy and paste, or an old Android handset.

    However it is you’re able to get to RealClimate,
    you can also get to Google

    Then search on what you think you remember.

    “It’s a poor memory that only works one way.”
    — the Red Queen

    Google solves that problem.

    You say elsewhere you used to be an IT manager; I’m sure someone can figure out a way for you to use a search engine to check facts.

  391. John E. Pearson:

    Too funny to miss.


  392. ldavidcooke:



    I ran a similar experiment; the difference in density affected the differences in microwave heating… My question to Wayne was intended to find out whether he had come to a similar conclusion, or whether the effect of the carbohydrate in solution pointed to a different one. But, thanks for your suggestion…

  393. wayne davidson:

    Great, thanks Hank. David Cooke should stay on topic; he would be better off testing than trying to change the subject.

    I would like to challenge all contrarians, at WUWT, CA, and especially the chaps who claim that trace elements don’t do squat with longwave radiation, to do the test as written in #386. It is repeatable and exceedingly precise, especially with a good weighing scale.

    But I strongly believe grande chef Martha Stewart will do this experiment before any contrarian dares to test their physical chemistry concepts.

  394. Hank Roberts:

    For Motorcyclist (from the NPP satellite thread):
    answering why methane — here’s one of many stories:


    “Big Oil made a miscalculation. They effectively used the media and right wing talk show hosts to destroy public belief in atmospheric Global Warming, then to discredit climatologists and atmospheric scientists in general. But they forgot the ocean scientists…. oceanographers, together with some geologists … have made a pretty strong case that GW is present and serious ….

    First, they noted the enormous amounts of methane sequestered on the ocean floor, particularly under pack ice. They suggested the potential for methane release (e.g. due to disappearance of the pack ice), a potent greenhouse gas on its own, to produce CO2 enhancement and strong GW. This would be a large positive feedback to GW.

    Then increased amounts of methane was detected by aircraft above the ice-free Arctic (see previous blog).

    Most recently, an October National Geographic article, “World Without Ice” (Kunzig), showed evidence of a strong GW event to explain the Paleocene-Eocene Thermal Maximum (PETM) extinctions ….”

  396. John E. Pearson:

    wayne: I like it. Using our kitchen scale I weighed a level quarter cup (12 teaspoons) of granulated pure white cane sugar (scraped level with a knife) and it came in right at 50 grams. Our scale has 5-gram gradations, so I think that is reasonably accurate. A level teaspoon should contain 4 1/6 grams of sugar, which is 41.667 times the amount of sugar you said to put into 174 grams of water. If you put a level teaspoon of sugar into 1.9 gallons of water you should be pretty close to your recipe. I suggest one teaspoon of sugar into 2 gallons of water, as it is “simpler”. We don’t have a candy thermometer or I’d be trying it myself. If you get a good recipe, maybe it’ll go viral and simultaneously disprove both Noelle Nikpour (a Republican strategist who says that scientists are the only people qualified to tell whether scientists are telling the truth, and thus all science is suspect [see the video I linked to above]) and Rush Limbag (a popular talk show host who frequently and loudly makes the trace-element argument).
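    The scale-up arithmetic above checks out; a quick sketch (assuming the commenter’s kitchen conversions: 1 US quarter cup = 12 level teaspoons, and ~3785 g of water per US gallon):

    ```python
    # Verify the teaspoon-per-gallons scale-up of the 0.1 g / 174 g recipe.
    quarter_cup_g = 50.0
    tsp_g = quarter_cup_g / 12        # ~4.17 g of sugar per level teaspoon

    scale = tsp_g / 0.1               # ~41.7x the 0.1 g in the original recipe
    water_g = 174.0 * scale           # water needed to keep the same concentration
    gallons = water_g / 3785.0        # grams of water per US gallon (assumed)

    print(f"{tsp_g:.2f} g/tsp, scale {scale:.1f}x, {gallons:.2f} US gallons")
    ```

    One level teaspoon into about 1.9 gallons does reproduce the original concentration.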

  397. Hank Roberts:


  398. Hank Roberts:

    So, what’s going on here?
    Is this the best (pun intended) way to chart the trend over time?


  399. JCH:

    Hank, you’re copying my graph!

  400. wayne davidson:

    #396, John E. Pearson: It is indeed a challenge to come up with a recipe using kitchen tools. The problem with large volumes is the time required; a 2-gallon microwave load is large, and I rather think the temperature increase would need a much longer clock setting, and the combination would invariably introduce errors. But if you can do the larger 2-gallon mix you suggested, take a graduated sample of about 200 ml from it; that may do the job. In this case you need two flasks as identical as you can find them; disposable plastic cups weigh equally, to within about 0.5 grams, and some have graduated lines on them, which may represent a close enough equal volume for this experiment to succeed. A standard medical thermometer (the larger the better) can differentiate 0.5 °C. I will try the poorer-equipment test myself and report back to see if it works.

  401. Meow:

    @397: That can’t be right: it looks like a hockey stick.

  402. ldavidcooke:


    Hey Wayne,

    Have you considered running the experiment with diluted carbonated water. Certainly would seem to be a more valid model…

    Dave Cooke

  403. wayne davidson:

    John, medical thermometers are very good maximum-reading thermometers within a very narrow range; the water and sugar-water must be at room temperature. The temperature of a 200 ml solution after 20 seconds in the microwave should increase by about 16 °C, smack within the range of a medical thermometer. I highly recommend a Geratherm mercury-free fever thermometer, with the best resolution I have ever seen on the open market.
    Both solutions must be at room temperature; you can use a cheaper alcohol thermometer to check this (ideal starting temperature 23 °C). Purify the water with a Brita or other activated-charcoal filter.

    For exact volume, I suggest a large turkey baster. Let me know if it works. Will rewrite the recipe for any modestly equipped kitchen with your input.
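    As a sanity check, the ~16 °C rise quoted above implies an absorbed power typical of a household microwave (a sketch assuming all microwave energy goes into the water, with c = 4.186 J/g·K for water):

    ```python
    # Implied average absorbed power: P = m * c * dT / t
    m = 200.0    # grams of water (~200 ml sample)
    c = 4.186    # specific heat of water, J/(g*K)
    dT = 16.0    # quoted temperature rise, K
    t = 20.0     # heating time, s

    power_w = m * c * dT / t
    print(f"implied absorbed power: {power_w:.0f} W")  # ~670 W
    ```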

  404. john byatt:

    #402 David,

    you have missed the point completely.

    What do you believe that Wayne is showing by this experiment?

  405. Snapple:

    The Russian Academy of Sciences is having a conference in November titled “Problems of Adaptation to Climate Change.”


    They are going to broadcast the conference on-line.

    Maybe you could give your impressions of this event.

  406. SteveF:

    This is embarrassing for Judy Curry and all involved in this farce of an article:


  407. Snapple:

    Here is where the Russians describe the focus of the conference.

    They want to adapt/mitigate, and they mention reducing emissions of greenhouse gases, but they don’t discuss moving away from fossil fuels. They mention that there may be “benefits” to climate change. They also are considering geoengineering. The Russians already do some geoengineering so it won’t snow so much in Moscow.

  408. ldavidcooke:

    Re: 404

    That the addition of a carbon-based compound to an H2O solution demonstrates a warming differential compared to distilled water. At issue, IMHO: the change in density of the solution adds an unnecessary degree of freedom.

    By initially contrasting the experiment with a non-carbon solution of similar density, there is an opportunity to identify the differences due to density (provided you have a hydrometer). In the absence of a hydrometer, and in an effort to perform the experiment while removing a portion of the difference in density (changing the error to an issue of buoyancy), attempt the same experiment with carbonated water.

    Dave Cooke

  409. Snapple:

    If the Russians think that nuclear winter is a KGB hoax, as a 2011 FBI white paper’s cited source “Comrade J” alleges, then why do the leaders of this Russian conference on climate change seem to be considering geoengineering (“climate stabilization using new technologies”) to cool the planet? The Russians are considering putting aerosols in the atmosphere in order to reflect the sun. Evidently, the Russians are considering creating a “controlled” nuclear winter.

    They don’t say one word about developing renewable energy. http://www.pacc2011.ru/en/about-conference.htm

    Of course, a couple of years ago, President Medvedev said that global warming was a trick. http://www.time.com/time/world/article/0,8599,2008081,00.html?hpt=T2